Free during beta — all features available, pricing at v1.0

Simple, storage-based pricing.

Pay for what you store — not per seat. All features are free during beta.

Pricing takes effect at v1.0. No credit card required today.

All tiers are free during beta. The prices below show what takes effect at v1.0.

Free
$0
forever
  • Capture from 8 tools
  • Resume across 4 tools
  • Local full-text search
Get started
Starter
$4.99/mo
~500MB storage
  • Everything in Free
  • 500MB cloud sync
  • Web dashboard
  • Manual LLM Judge (BYOK)
  • MCP server (local)
  • Deterministic session summaries
Get started
Most popular
Pro
$14.99/mo
~2GB storage
  • Everything in Starter
  • Autosync (all modes)
  • Auto-audit (on sync, on PR)
  • Team handoff
  • GitHub + GitLab PR/MR
  • MCP server (local + remote)
  • LLM narrative summaries
  • Project context
  • Living knowledge base
  • AI agent write-back
  • Portable rules (5 tools)
  • Custom LLM gateway
Book a Demo
Team
$14.99/user/mo
1GB per user
  • Everything in Pro
  • 1GB/user storage
  • Team-wide knowledge base
  • Cross-team knowledge sharing
  • Org management + RBAC
  • Audit history + trends
  • Priority support
Book a Demo

Enterprise

Self-hosted Kubernetes deployment on your infrastructure. Your data never leaves your network.

HIPAA-ready, SAML SSO, SLA, dedicated support, and air-gapped deployment options.

Frequently asked questions

When does pricing take effect?

Pricing takes effect at v1.0, which has not yet shipped. During the beta period (everything you see today), all features are completely free, and no credit card is required. We will give advance notice before any billing starts.

Why storage-based pricing instead of per seat?

Session data is the actual cost driver, not headcount. A solo developer running intensive sessions can generate more storage than a large team of occasional users. Storage pricing is honest: you pay in proportion to what you actually use and what it costs us to store and serve.

Do I need to provide an LLM API key?

Only for LLM-powered features like the Judge and narrative summaries. Bring your own key (BYOK); it is never stored on our servers. Any OpenAI-compatible endpoint works, including OpenRouter, LiteLLM, vLLM, Ollama, and Azure OpenAI. Deterministic features such as session capture, resume, search, and basic summaries require no API key at all.
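For illustration, here is a minimal sketch of what "OpenAI-compatible" means in practice: any endpoint that accepts a standard `/chat/completions` POST with a bearer token. The base URL, model name, and environment variable names below are placeholders, not this product's actual configuration; point them at OpenRouter, LiteLLM, vLLM, Ollama, or Azure OpenAI as appropriate.

```python
import json
import os
import urllib.request

# Placeholders -- swap in your own provider's base URL and model.
BASE_URL = os.environ.get("LLM_BASE_URL", "http://localhost:11434/v1")
API_KEY = os.environ.get("LLM_API_KEY", "sk-placeholder")  # your key; BYOK means it stays with you


def build_chat_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build (but do not send) a standard OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("Summarize this session.")
# To actually send it: urllib.request.urlopen(req)
```

Because the request shape is identical across compatible providers, switching from one to another is just a change of `BASE_URL` and key.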

What is the difference between cloud sync and local mode?

The daemon defaults to local-only — sessions stay on your machine. Cloud sync is an explicit opt-in. You can push individual sessions or enable auto-sync. Enterprise users can self-host the entire backend so nothing ever leaves their network.
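As an illustration only, the local-first default with explicit opt-in sync could be pictured as a config like the one below. Every key and value here is hypothetical, not the daemon's actual configuration format.

```yaml
# Hypothetical config sketch -- key names are illustrative, not official.
sync:
  mode: local-only        # default: sessions never leave this machine
  # mode: manual          # opt in: push individual sessions by hand
  # mode: auto            # opt in: auto-sync every session (Pro)
backend:
  url: https://api.example.com   # Enterprise: point at a self-hosted backend
```

The important property is the default: nothing syncs until you explicitly turn it on.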