Martools

A small, forkable Docker Compose stack for marketing-ops experiments and local AI gateways: Postgres and Redis always on; optional Umami, Ollama, LiteLLM, and n8n behind Compose profiles.

What it is (and is not)

Martools is not a full self-hosted CRM or product-analytics platform. It is a working baseline: real databases and cache, plus optional services you can turn on when you need them. You extend it with your own docker-compose.*.yml files or fork per cohort or project.
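Extending the stack can be as simple as dropping an override file next to the base compose file, which `docker compose up` picks up automatically. A minimal sketch — the `metabase` service and its settings here are hypothetical illustrations, not part of the repo:

```yaml
# docker-compose.override.yml (hypothetical example service)
services:
  metabase:
    image: metabase/metabase:latest
    ports:
      - "3000:3000"        # host:container
    depends_on:
      - postgres           # reuse the stack's shared database
```

For per-cohort forks, a named file like docker-compose.cohort.yml passed via `docker compose -f` keeps the baseline untouched.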

How it works

Docker Compose defines a single network of containers. Profiles decide which optional services start: nothing extra runs until you enable the analytics, llm, or automation profile, or the combined full profile.

docker compose up -d
└── postgres (55432)   redis (56379)

docker compose --profile analytics up -d
└── + umami (3030) → Postgres DB "umami"

docker compose --profile llm up -d
└── + ollama (11434) → + litellm (4000) → ollama

docker compose --profile automation up -d
└── + n8n (5678) → Postgres DB "n8n"

On first boot, Postgres runs init scripts that create the umami and n8n databases. The main application database is martools. Redis is ready to serve as a cache or queue backend for services you add later.
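The init step can be pictured as a small SQL script mounted into Postgres's docker-entrypoint-initdb.d/ directory — a sketch of the idea; the actual script name and contents live in the repo:

```sql
-- Runs once, on first boot against an empty data volume
CREATE DATABASE umami;
CREATE DATABASE n8n;
-- The main database (martools, via POSTGRES_DB) is created by the image itself
```

Scripts in that directory are skipped on subsequent boots, so re-running requires wiping the data volume.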

Services and ports

Service   Profile           Host port  Role
Postgres  (default)         55432      Shared DB for stack and optional apps
Redis     (default)         56379      Cache / queues for future services
Umami     analytics, full   3030       Privacy-friendly web analytics
Ollama    llm, full         11434      Local LLM runtime
LiteLLM   llm, full         4000       OpenAI-compatible proxy to Ollama
n8n       automation, full  5678       Workflow automation

Defaults assume localhost development. Do not expose these ports to untrusted networks without TLS, firewalls, and proper authentication (see SECURITY in the repo).

LLM path

With the llm profile, clients can talk to LiteLLM at http://localhost:4000 using the LiteLLM proxy API. Models are configured in config/litellm.yaml (default: llama3.2 via Ollama). Pull weights after start: docker compose exec ollama ollama pull llama3.2.
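Because LiteLLM exposes the OpenAI chat-completions wire format, any HTTP client can talk to it. A minimal stdlib sketch — assuming the llm profile is running and llama3.2 has been pulled:

```python
import json
import urllib.request

# OpenAI-style chat request body; model name matches config/litellm.yaml
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "One-line tagline for a martech stack?"}],
}

def chat(base_url="http://localhost:4000"):
    """POST to the LiteLLM proxy's OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Existing OpenAI SDK clients can usually be pointed at http://localhost:4000 as their base URL and work unchanged.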

For Cursor / Claude and MCP, see the mcp/ examples in the repository.

Quick start

  1. cp .env.example .env and set POSTGRES_PASSWORD; add UMAMI_APP_SECRET if you use analytics.
  2. docker compose up -d for Postgres + Redis.
  3. Enable profiles as needed (see README). First Umami visit: create admin at http://localhost:3030.
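Step 1's .env can start this small — the values below are placeholders for illustration, not defaults shipped with the repo:

```
POSTGRES_PASSWORD=change-me-locally
UMAMI_APP_SECRET=some-long-random-string
```

Generate the secret with something like `openssl rand -hex 32` rather than typing one by hand.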

Full README · Stack notes (what to add next)

License

MIT — LICENSE. Third-party container images remain under their own licenses.