Mem0 (agent memory API)
Overview
A drop-in memory layer for LLM agents, available in vector and graph (Mem0g) variants. The graph variant builds a directed, labeled knowledge graph alongside the vector store and flags conflicts when a new fact contradicts one already stored. Mem0 leads the 2026 agent-memory benchmarks with a 68.4% LLM Score on multi-hop questions, and it works with any LLM, including local Ollama models.
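To make "works with local Ollama models" concrete, here is a minimal configuration sketch, assuming the open-source `mem0ai` Python package and its provider-config dict; the model names (`llama3.1`, `nomic-embed-text`) are illustrative, not prescribed by this page.

```python
# Sketch: pointing Mem0 at a local Ollama instance for both the LLM and the
# embedder. Assumes the `mem0ai` package; model names are examples only.
config = {
    "llm": {
        "provider": "ollama",
        "config": {"model": "llama3.1", "temperature": 0.0},
    },
    "embedder": {
        "provider": "ollama",
        "config": {"model": "nomic-embed-text"},
    },
}

# With mem0 installed and Ollama running, wiring it up looks roughly like:
#   from mem0 import Memory
#   m = Memory.from_config(config)
#   m.add("Alice prefers dark roast coffee", user_id="alice")
#   m.search("What coffee does Alice like?", user_id="alice")
```

Because everything runs locally, no cloud API key is needed for this setup; the cloud tier only enters the picture at production scale.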
Pros
- Drop-in API — minutes to integrate
- Graph memory (Mem0g) leads 2026 benchmarks
- Conflict detection on contradictory facts
- Works with local LLMs
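The conflict-detection idea from the list above can be sketched in miniature. This is a toy illustration, not Mem0's actual code: facts are (subject, relation, object) triples, and a new fact that contradicts a stored one on the same (subject, relation) edge supersedes it rather than coexisting with it.

```python
# Toy sketch of graph-memory conflict detection (NOT Mem0's implementation):
# one current object per (subject, relation) edge; contradictions are logged
# and the newer fact wins.
from typing import Dict, List, Optional, Tuple

class GraphMemory:
    def __init__(self) -> None:
        # (subject, relation) -> object: the currently believed value
        self.edges: Dict[Tuple[str, str], str] = {}
        # (subject, relation, old_object, new_object) for each contradiction
        self.conflicts: List[Tuple[str, str, str, str]] = []

    def add_fact(self, subject: str, relation: str, obj: str) -> None:
        key = (subject, relation)
        old = self.edges.get(key)
        if old is not None and old != obj:
            # Contradictory fact: record the conflict, keep the newer value.
            self.conflicts.append((subject, relation, old, obj))
        self.edges[key] = obj

    def query(self, subject: str, relation: str) -> Optional[str]:
        return self.edges.get((subject, relation))

mem = GraphMemory()
mem.add_fact("alice", "lives_in", "Paris")
mem.add_fact("alice", "lives_in", "Berlin")  # contradicts the stored fact
print(mem.query("alice", "lives_in"))        # the newer fact wins: Berlin
```

In the real system an LLM judges whether two facts actually contradict (which is part of why graph extraction is LLM-cost heavy, as noted under Cons); the dictionary overwrite here just stands in for that resolution step.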
Cons
- Cloud tier required for production scale
- Graph extraction is LLM-cost heavy
- Less control than Letta's explicit OS-style approach
Compatibility
| Category | Details |
| --- | --- |
| Operating systems | macOS, Linux, Windows |
| GPU backends | n/a |
| License | Open source: free (OSS) core plus managed cloud tiers |
Get Mem0 (agent memory API)
Frequently asked
Is Mem0 (agent memory API) free?
The core library is open source and free; managed cloud tiers are offered for production scale.
What operating systems does Mem0 (agent memory API) support?
macOS, Linux, and Windows.
Which GPUs work with Mem0 (agent memory API)?
None are required. Mem0 is a memory layer, not an inference engine; any GPU needs come from the LLM you pair it with.
Reviewed by RunLocalAI Editorial. See our editorial policy for how we evaluate tools.