Simple, transparent pricing
Start free. Scale as you grow. Self-host for full control.
Free
For individual developers exploring AI context.
- 500 requests/month
- 1M tokens stored
- 1 seat
- Community support
- Self-host available
Pro
For power users who need more capacity.
- 10,000 requests/month
- 20M tokens stored
- 1 seat
- Priority support
- Usage analytics dashboard
- API key management
Frequently asked questions
Can I self-host Memly?
Yes! Memly is BSL-licensed, and every plan includes a license key. Personal self-hosted instances are free forever; team licenses require a paid key.
What happens if I exceed my tier limits?
Memly degrades gracefully to passthrough mode: your LLM calls still work, just without context injection. No hard stops, ever.
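For the curious, here is a minimal sketch of what that fallback can look like; the function and field names are illustrative assumptions, not Memly's actual internals.

```typescript
// Illustrative sketch of quota-aware fallback (not Memly's real implementation).
interface ChatRequest {
  model: string;
  messages: { role: string; content: string }[];
}

interface Quota {
  requestsUsed: number;
  requestsLimit: number;
}

// Hypothetical context-injection step: prepend retrieved memory as a system message.
function injectContext(req: ChatRequest, context: string): ChatRequest {
  return {
    ...req,
    messages: [{ role: "system", content: context }, ...req.messages],
  };
}

// Over the limit? Forward the request untouched instead of rejecting it.
function prepareRequest(req: ChatRequest, quota: Quota, context: string): ChatRequest {
  const overLimit = quota.requestsUsed >= quota.requestsLimit;
  return overLimit ? req : injectContext(req, context);
}
```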
Which LLM providers are supported?
OpenAI, Anthropic, and Google Gemini. Any provider with an OpenAI-compatible API works out of the box. SSE streaming, function calling, and vision are fully supported.
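If you already use an OpenAI-compatible SDK, switching is typically just a baseURL change. Here is a sketch with the official openai Node package; the endpoint URL, model name, and MEMLY_API_KEY variable are placeholders, not documented values.

```typescript
import OpenAI from "openai";

// Placeholder endpoint and env var name; use the values from your dashboard.
const client = new OpenAI({
  baseURL: "https://api.memly.example/v1",
  apiKey: process.env.MEMLY_API_KEY,
});

// Standard chat completion with SSE streaming; context is injected by the proxy.
const stream = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Summarize my last refactor." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```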
Is my code stored on your servers?
In cloud mode, code chunks are stored encrypted in Supabase with row-level security (RLS). For full data sovereignty, use self-hosted mode: your data never leaves your infrastructure.
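To illustrate what row-level security buys you, here is a sketch using the supabase-js client; the code_chunks table and its columns are assumptions for illustration, not Memly's actual schema.

```typescript
import { createClient } from "@supabase/supabase-js";

// Illustrative only: table and column names are assumptions, not Memly's schema.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!
);

// With RLS enabled, this query can only return rows owned by the
// authenticated user; other tenants' chunks are filtered out by the database.
const { data, error } = await supabase
  .from("code_chunks")
  .select("id, path, created_at");

if (error) throw error;
console.log(`visible chunks: ${data?.length ?? 0}`);
```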