Quickstart
Announcing our research paper: Mem0 achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.
Hosted OpenMemory MCP Now Available!
Sign Up Now - app.openmemory.dev
Everything you love about OpenMemory MCP but with zero setup.
- Works with all MCP-compatible tools (Claude Desktop, Cursor, …)
- Same standard memory ops: add_memories, search_memory, etc.
- One-click provisioning, no Docker required
- Powered by Mem0
Add shared, persistent, low-friction memory to your MCP-compatible clients in seconds.
Get Started Now
Sign up and get your access key at app.openmemory.dev
Example installation: npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key
Getting Started with Hosted OpenMemory
The fastest way to get started is with our hosted version - no setup required:
1. Get your API key
Visit app.openmemory.dev to sign up and get your OPENMEMORY_API_KEY.
2. Install and connect to your preferred client
Example commands (replace your-key with your actual API key):
For Claude Desktop: npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key
For Cursor: npx @openmemory/install --client cursor --env OPENMEMORY_API_KEY=your-key
For Windsurf: npx @openmemory/install --client windsurf --env OPENMEMORY_API_KEY=your-key
That's it! Your AI client now has persistent memory across sessions.
Local Setup (Self-Hosted)
Prefer to run OpenMemory locally? Follow the instructions below for a self-hosted setup.
OpenMemory Easy Setup
Prerequisites
- Docker
- OpenAI API Key
You can quickly run OpenMemory with a single command. Set the OPENAI_API_KEY as a global environment variable, or pass it as a parameter to the script.
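The quick-run command itself is missing above. A hedged sketch of both variants, assuming the repository ships a run.sh installer script (the script URL is an assumption — check the repository for the current command):

```shell
# Option 1: set the key as a global environment variable, then run the script
export OPENAI_API_KEY=your_api_key
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash

# Option 2: pass the key inline as a parameter to the script
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash
```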
This will start the OpenMemory server and the OpenMemory UI. Note that deleting the container also deletes the memory store. For a more persistent memory store, follow the instructions below to set up OpenMemory on your local machine.
Setting Up OpenMemory
Getting started with OpenMemory is straightforward and takes just a few minutes to set up on your local machine. Follow these steps:
Getting started
1. First, clone the repository, then follow the steps below:
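A minimal sketch of the clone step. The repository URL comes from the link at the bottom of this page; the openmemory subdirectory path is an assumption based on that link:

```shell
# Clone the mem0 monorepo and enter the OpenMemory project directory
git clone https://github.com/mem0ai/mem0.git
cd mem0/openmemory
```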
2. Set Up Environment Variables
Before running the project, you need to configure environment variables for both the API and the UI.
You can do this in one of the following ways:
- Manually: create a .env file in each of the following directories: /api/.env and /ui/.env.
- Using the .env.example files: copy and rename the example files.
- Using the Makefile (if supported): run the provided make target.
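The copy commands and example file contents were lost from this page. A hedged sketch under stated assumptions — the make target name and the variable names (OPENAI_API_KEY, USER, NEXT_PUBLIC_API_URL, NEXT_PUBLIC_USER_ID) should be checked against the .env.example files in the repository:

```shell
# Copy and rename the example files
cp api/.env.example api/.env
cp ui/.env.example ui/.env
# (or, if the Makefile supports it: make env)

# Example api/.env contents (assumed variable names):
#   OPENAI_API_KEY=sk-xxx
#   USER=<user-id>

# Example ui/.env contents (assumed variable names):
#   NEXT_PUBLIC_API_URL=http://localhost:8765
#   NEXT_PUBLIC_USER_ID=<user-id>
```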
3. Build and Run the Project
You can run the project using the following two commands:
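The two commands are missing above. Assuming the project's Makefile wraps Docker Compose, they plausibly look like:

```shell
make build   # build the Docker images for the API and UI
make up      # start the OpenMemory server and UI containers
```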
After running these commands, you will have:
- OpenMemory MCP server running at: http://localhost:8765 (API documentation available at http://localhost:8765/docs)
- OpenMemory UI running at: http://localhost:3000
UI not working on http://localhost:3000?
If the UI does not start properly on http://localhost:3000, try running it manually:
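A hedged sketch of running the UI manually; the package manager is an assumption (the repo may use pnpm or npm):

```shell
cd ui
pnpm install
pnpm dev   # serves the UI at http://localhost:3000
```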
You can configure the MCP client using the following command (replace username with your username):
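The configuration command was stripped from this page. Based on the local server's address, it plausibly follows this shape — the install-mcp package name and the SSE URL path are assumptions, so verify against the repository README:

```shell
# Point an MCP client (here, Claude) at the local OpenMemory SSE endpoint
npx install-mcp i "http://localhost:8765/mcp/claude/sse/username" --client claude
```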
The OpenMemory dashboard will be available at http://localhost:3000. From here, you can view and manage your memories, as well as check connection status with your MCP clients.
Once set up, OpenMemory runs locally on your machine, ensuring all your AI memories remain private and secure while being accessible across any compatible MCP client.
Getting Started Today
- GitHub Repository: https://github.com/mem0ai/mem0/tree/main/openmemory