Pay for what you store — not per seat. All features are free during beta.
Pricing takes effect at v1.0. No credit card required today.
All tiers are free during the beta; the tiers shown here preview v1.0 pricing.
Self-hosted Kubernetes deployment on your infrastructure. Your data never leaves your network.
HIPAA-ready controls, SAML SSO, an SLA, dedicated support, and air-gapped deployment options.
Pricing takes effect at v1.0, which has not yet shipped. During the beta, everything you see today is completely free and no credit card is required. We will give advance notice before any billing starts.
Session data, not headcount, is the actual cost driver. A solo developer running intensive sessions can generate more storage than a large team of occasional users, so per-seat pricing would misprice both. Storage-based pricing is honest: you pay in proportion to what you actually use and what it costs us to store and serve it.
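To make the proportionality concrete, here is a minimal sketch of storage-based billing. The per-GB rate is an invented placeholder, not a published price:

```python
# Hypothetical illustration of storage-proportional pricing.
# RATE_PER_GB_MONTH is a placeholder, not a real published rate.
RATE_PER_GB_MONTH = 0.10  # dollars per GB-month (assumed for illustration)

def monthly_cost(stored_gb: float) -> float:
    """Cost scales with bytes stored, not with seat count."""
    return stored_gb * RATE_PER_GB_MONTH

# A solo developer storing 50 GB of intensive sessions pays more
# than a 10-person team storing 2 GB in total:
solo = monthly_cost(50)  # 5.0
team = monthly_cost(2)   # 0.2
```

The point of the sketch is the shape of the function: cost has no seat-count term at all, only a storage term.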
Only for LLM-powered features like the Judge and narrative summaries. Bring your own key (BYOK); it is never stored on our servers. Works with any OpenAI-compatible endpoint, including OpenRouter, LiteLLM, vLLM, Ollama, and Azure OpenAI. Deterministic features like session capture, resume, search, and basic summaries require no API key at all.
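BYOK works across all of those providers because they speak the same OpenAI-style chat-completions wire format; only the base URL and the key change. A stdlib-only sketch of the request shape (the URL, key, and model name are placeholders, not values this product requires):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible /chat/completions request.

    Swapping providers (OpenRouter, LiteLLM, vLLM, Ollama, Azure OpenAI)
    changes only base_url and api_key; the payload shape stays identical.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            # The key travels only with the request; it is not persisted.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example: pointing at a local Ollama server (placeholder values).
req = build_chat_request(
    "http://localhost:11434/v1", "ollama", "llama3", "Summarize this session"
)
```

Because the daemon only needs the base URL and key at request time, the same code path serves a hosted provider and a fully local model.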
The daemon defaults to local-only — sessions stay on your machine. Cloud sync is an explicit opt-in. You can push individual sessions or enable auto-sync. Enterprise users can self-host the entire backend so nothing ever leaves their network.
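The local-first default and the two opt-in sync modes could look something like the following config sketch. The file layout, key names, and endpoint are all assumptions for illustration, not the daemon's actual schema:

```yaml
# Illustrative only: key names and values are assumptions, not the real schema.
sync:
  mode: local-only            # default: sessions never leave this machine
  # mode: manual              # opt-in: push individual sessions by hand
  # mode: auto                # opt-in: push every new session automatically
  endpoint: https://sync.example.com   # or a self-hosted backend (enterprise)
```

The design choice this reflects is stated in the text above: syncing is never implicit, so the most private mode is also the zero-configuration one.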