These docs are designed to be easily consumed by LLMs: every page can be copied as Markdown or pasted directly into ChatGPT, Claude, or any AI coding tool, and the site follows the llms.txt standard.

Get an API Key

Sign up for Mem0 Platform and start building

Quickstart

Store your first memory in under 5 minutes

Agent Skills

Teach your coding assistant how to build with Mem0:
npx skills add https://github.com/mem0ai/mem0 --skill mem0
Works with Claude Code, Cursor, Windsurf, and any assistant that supports skills. Once installed, your assistant understands Mem0’s full API, framework integrations, and common patterns.

Claude Code Plugin

The OpenMemory plugin gives Claude Code persistent memory across sessions, projects, and teams — automatically.
1. Get your API key

Sign up at app.openmemory.dev.

2. Install the plugin

/plugin add mem0ai/claude-code-plugin

3. Set your environment variable

export OPENMEMORY_API_KEY="your-key-here"

4. Start coding

The plugin activates automatically. It captures decisions at session end, preserves context during compaction, and retrieves relevant memories at session start.

MCP Server Setup

Connect Cursor, Windsurf, Claude Desktop, or any MCP-compatible client to Mem0.
Sign up at app.openmemory.dev, then pick your client:
npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key
For full setup options, see OpenMemory Quickstart.

Universal Starter Prompt

Copy this into any AI tool to start building with Mem0:
I want to start building with Mem0 — a self-improving memory layer for LLM
applications that gives agents persistent context across sessions.

## Mem0 Resources

**Documentation:**
- Main docs: https://docs.mem0.ai
- Platform Quickstart: https://docs.mem0.ai/platform/quickstart
- OSS Python Quickstart: https://docs.mem0.ai/open-source/python-quickstart
- OSS Node.js Quickstart: https://docs.mem0.ai/open-source/node-quickstart
- API Reference: https://docs.mem0.ai/api-reference
- Full LLM-friendly docs: https://docs.mem0.ai/llms.txt

**Code & Examples:**
- Core repo: https://github.com/mem0ai/mem0
- Python SDK: pip install mem0ai
- TypeScript SDK: npm install mem0ai
- Cookbooks: https://docs.mem0.ai/cookbooks/overview

**What Mem0 Does:**
Mem0 is a memory layer for AI apps — managed (Mem0 Platform) or self-hosted
(Open Source). It stores, retrieves, and manages user memories so agents
remember preferences, learn from interactions, and personalize over time.
Sub-50ms retrieval. Dual storage: vector embeddings + graph databases.

**Architecture Overview:**
- Memory is scoped by user_id, agent_id, or run_id
- Core operations: add, search, update, delete
- Memory types: factual (preferences, facts), episodic (past interactions),
  semantic (concept relationships), working (session state)
- Integration pattern: retrieve relevant memories → generate response → store
  new memories

**Quick Usage (Python Platform):**
  from mem0 import MemoryClient
  client = MemoryClient(api_key="m0-xxx")
  client.add("I prefer dark mode and use VS Code.", user_id="user1")
  results = client.search("What editor do they use?", user_id="user1")

**Quick Usage (JavaScript Platform):**
  import MemoryClient from 'mem0ai';
  const client = new MemoryClient({ apiKey: 'm0-xxx' });
  await client.add([{ role: "user", content: "I prefer dark mode." }], { user_id: "user1" });
  const results = await client.search("What editor?", { user_id: "user1" });

**Quick Usage (Python Open Source):**
  from mem0 import Memory
  m = Memory()
  m.add("I prefer dark mode and use VS Code.", user_id="user1")
  results = m.search("What editor do they use?", user_id="user1")

Help me integrate Mem0 into my project. Start by asking what I'm building,
what language/framework I'm using, and whether I want managed or self-hosted.
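The architecture described in the prompt above (memories scoped by user_id, core operations add/search/update/delete, and the retrieve → generate → store loop) can be illustrated with a toy stand-in. This is a minimal sketch, not the mem0 SDK: the class name `ToyMemory` is hypothetical, and where the real library does LLM-based fact extraction and vector search, this sketch uses plain keyword matching.

```python
from dataclasses import dataclass, field
from itertools import count

_ids = count(1)

@dataclass
class ToyMemory:
    """Toy illustration of Mem0's shape: user-scoped memories with
    add / search / update / delete. NOT the real mem0 SDK."""
    store: dict = field(default_factory=dict)  # mem_id -> (user_id, text)

    def add(self, text: str, user_id: str) -> int:
        mem_id = next(_ids)
        self.store[mem_id] = (user_id, text)
        return mem_id

    def search(self, query: str, user_id: str) -> list[str]:
        # Real Mem0 ranks by embedding similarity; here we just match
        # any query keyword, restricted to the caller's user_id scope.
        words = query.lower().split()
        return [text for uid, text in self.store.values()
                if uid == user_id and any(w in text.lower() for w in words)]

    def update(self, mem_id: int, text: str) -> None:
        uid, _ = self.store[mem_id]
        self.store[mem_id] = (uid, text)

    def delete(self, mem_id: int) -> None:
        del self.store[mem_id]

# Integration pattern from the overview: retrieve -> generate -> store.
m = ToyMemory()
m.add("Prefers dark mode and uses VS Code.", user_id="user1")
context = m.search("dark mode", user_id="user1")        # retrieve
reply = f"Noted your preferences. Context: {context}"   # generate (stubbed)
m.add("Asked about editor theme settings.", user_id="user1")  # store
```

The point of the sketch is the scoping: two users never see each other's memories because every operation carries a `user_id`, which is exactly the contract the real `add`/`search` calls in the quick-usage snippets above follow.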

Go Deeper

Platform Quickstart

Get started with the managed API

Open Source

Self-host with full control

Cookbooks

Production-ready tutorials and examples

API Reference

Explore every REST endpoint