πŸ“’ Announcing our research paper: Mem0 achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.

πŸš€ Hosted OpenMemory MCP Now Available!

Sign Up Now - app.openmemory.dev

Everything you love about OpenMemory MCP but with zero setup.

βœ… Works with all MCP-compatible tools (Claude Desktop, Cursor…)
βœ… Same standard memory ops: add_memories, search_memory, etc.
βœ… One-click provisioning, no Docker required
βœ… Powered by Mem0

Add shared, persistent, low-friction memory to your MCP-compatible clients in seconds.

🌟 Get Started Now

Sign up and get your access key at app.openmemory.dev

Example installation: npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key

Getting Started with Hosted OpenMemory

The fastest way to get started is with our hosted version - no setup required:

1. Get your API key

Visit app.openmemory.dev to sign up and get your OPENMEMORY_API_KEY.

2. Install and connect to your preferred client

Example commands (replace your-key with your actual API key):

For Claude Desktop: npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key

For Cursor: npx @openmemory/install --client cursor --env OPENMEMORY_API_KEY=your-key

For Windsurf: npx @openmemory/install --client windsurf --env OPENMEMORY_API_KEY=your-key
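The three commands above differ only in the --client flag. If you are scripting the setup, a small loop can generate each command (a sketch that only echoes the commands; it does not run the installer):

```shell
# Print the install command for each supported client (does not execute npx).
for client in claude cursor windsurf; do
  cmd="npx @openmemory/install --client $client --env OPENMEMORY_API_KEY=your-key"
  echo "$cmd"
done
```

Replace your-key with your actual API key before running any of the printed commands.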

That’s it! Your AI client now has persistent memory across sessions.

Local Setup (Self-Hosted)

Prefer to run OpenMemory locally? Follow the instructions below for a self-hosted setup.

OpenMemory Easy Setup

Prerequisites

  • Docker
  • OpenAI API Key

You can get OpenMemory up and running quickly with the following command:

curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash

You should set the OPENAI_API_KEY as a global environment variable:

export OPENAI_API_KEY=your_api_key

You can also set the OPENAI_API_KEY as a parameter to the script:

curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash
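Because the script needs the key at run time, it can help to confirm OPENAI_API_KEY is actually set in your current shell first. A minimal sketch (it only reports on the variable; it does not run the quick-start script):

```shell
# Report whether OPENAI_API_KEY is set before running the quick-start script.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  msg="OPENAI_API_KEY is set; safe to run the quick-start script."
else
  msg="OPENAI_API_KEY is missing; run: export OPENAI_API_KEY=your_api_key"
fi
echo "$msg"
```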

This will start the OpenMemory server and the OpenMemory UI. Note that with this quick setup, deleting the container also deletes the memory store. For a more persistent memory store, follow the instructions below to set up OpenMemory on your local machine.

Setting Up OpenMemory

Getting started with OpenMemory is straightforward and takes just a few minutes to set up on your local machine. Follow these steps:

Getting started

1. Clone the repository and move into the project directory:

# Clone the repository
git clone https://github.com/mem0ai/mem0.git
cd mem0/openmemory

2. Set Up Environment Variables

Before running the project, you need to configure environment variables for both the API and the UI.

You can do this in one of the following ways:

  • Manually:
    Create a .env file in each of the following directories:

    • /api/.env
    • /ui/.env
  • Using .env.example files:
    Copy and rename the example files:

    cp api/.env.example api/.env
    cp ui/.env.example ui/.env
    
  • Using Makefile (if supported):
    Run:

    make env
    
  • Example /api/.env

OPENAI_API_KEY=sk-xxx
USER=<user-id> # The user ID you want to associate the memories with

  • Example /ui/.env

NEXT_PUBLIC_API_URL=http://localhost:8765
NEXT_PUBLIC_USER_ID=<user-id> # Must match the USER value in /api/.env
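The env files above can also be written from the shell. A sketch with placeholder values (sk-xxx and demo-user are stand-ins; replace them with your own key and user ID, and run this from the openmemory/ directory):

```shell
# Create both env files with placeholder values.
mkdir -p api ui
cat > api/.env <<'EOF'
OPENAI_API_KEY=sk-xxx
USER=demo-user
EOF
cat > ui/.env <<'EOF'
NEXT_PUBLIC_API_URL=http://localhost:8765
NEXT_PUBLIC_USER_ID=demo-user
EOF
```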

3. Build and Run the Project

You can run the project using the following two commands:

make build  # builds the MCP server and UI
make up     # runs the OpenMemory MCP server and UI

After running these commands, you will have:

  • The OpenMemory MCP server running at http://localhost:8765
  • The OpenMemory dashboard (UI) at http://localhost:3000

UI not working on http://localhost:3000?

If the UI does not start properly on http://localhost:3000, try running it manually:

cd ui
pnpm install
pnpm dev

You can configure the MCP client using the following command (replace username with your username):

npx @openmemory/install local "http://localhost:8765/mcp/cursor/sse/username" --client cursor

The OpenMemory dashboard will be available at http://localhost:3000. From here, you can view and manage your memories, as well as check connection status with your MCP clients.
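To confirm both services are actually listening, you can probe the default ports used in this guide. A sketch that assumes curl is installed and only checks reachability (any HTTP response counts as "up"):

```shell
# Probe the default OpenMemory ports; prints "up" or "down" for each service.
for port in 8765 3000; do
  if curl -sS -o /dev/null "http://localhost:$port" 2>/dev/null; then
    state="up"
  else
    state="down"
  fi
  echo "localhost:$port is $state"
done
```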

Once set up, OpenMemory runs locally on your machine, ensuring all your AI memories remain private and secure while being accessible across any compatible MCP client.

Getting Started Today