Welcome to the Mem0 quickstart guide. This guide will help you get up and running with Mem0 in no time.
Installation
To install Mem0, you can use pip. Run the following command in your terminal:
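Mem0 is distributed on PyPI as the `mem0ai` package:

```shell
pip install mem0ai
```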
Basic Usage

Initialize Mem0
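A minimal initialization sketch. This assumes the default setup, which uses OpenAI for the LLM and embedder and reads `OPENAI_API_KEY` from the environment:

```python
import os

from mem0 import Memory

# The default configuration uses OpenAI models, so the API key must be
# available before constructing the Memory object.
os.environ["OPENAI_API_KEY"] = "your-api-key"  # placeholder

m = Memory()
```

For non-default backends, construct the instance with `Memory.from_config(config)` instead; the configuration options are covered later in this guide.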
Store a Memory
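A sketch of storing a memory with `add()`, which accepts the text to remember plus the organization parameters described later; the user id and metadata below are illustrative:

```python
from mem0 import Memory

m = Memory()

# Store a fact about a user; metadata is optional and attached as-is.
result = m.add(
    "I am working on improving my tennis skills. Suggest some online courses.",
    user_id="alice",
    metadata={"category": "hobbies"},
)
```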
Retrieve Memories
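A sketch of retrieving stored memories, assuming the `get_all()` and `get()` methods of the `Memory` class (memory ids are returned by `add()` and `get_all()`):

```python
from mem0 import Memory

m = Memory()

# Fetch every memory stored for a given user.
all_memories = m.get_all(user_id="alice")

# Fetch a single memory by its id.
memory = m.get("<memory-id>")
```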
Search Memories
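A sketch of semantic search over stored memories, assuming the `search()` method, which ranks memories by relevance to the query within the given scope:

```python
from mem0 import Memory

m = Memory()

# Return memories relevant to the query, scoped to one user.
related = m.search("What are Alice's hobbies?", user_id="alice")
```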
Update a Memory
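A sketch of updating an existing memory in place by id, assuming the `update()` method; the replacement text is illustrative:

```python
from mem0 import Memory

m = Memory()

# Replace the content of an existing memory.
m.update(memory_id="<memory-id>", data="Likes to play tennis on weekends")
```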
Memory History
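A sketch of inspecting how a memory changed over time, assuming the `history()` method, which returns the sequence of revisions for a memory id:

```python
from mem0 import Memory

m = Memory()

# List the change history (creations, updates) for one memory.
history = m.history(memory_id="<memory-id>")
```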
Delete Memory
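A sketch of deleting memories, assuming the `delete()` and `delete_all()` methods for removing one memory or every memory in a scope:

```python
from mem0 import Memory

m = Memory()

m.delete(memory_id="<memory-id>")  # delete a single memory by id
m.delete_all(user_id="alice")      # delete all memories for one user
```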
Reset Memory
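A sketch of resetting the store entirely, assuming the `reset()` method; this wipes all memories, so use it with care:

```python
from mem0 import Memory

m = Memory()

m.reset()  # delete everything: all users, agents, and runs
```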
Advanced Memory Organization
Mem0 supports three key parameters for organizing memories:

- user_id: Organize memories by user identity
- agent_id: Organize memories by AI agent or assistant
- run_id: Organize memories by session, workflow, or execution context
Using All Three Parameters
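A sketch of passing all three scoping parameters together; the identifiers below are hypothetical:

```python
from mem0 import Memory

m = Memory()

# Scope one memory to a user, an agent, and a specific run at once.
m.add(
    "Customer prefers email support over phone calls",
    user_id="customer_123",
    agent_id="support_agent",
    run_id="ticket_456",
)

# Searches can use the same scopes to narrow results.
results = m.search(
    "How should I contact this customer?",
    user_id="customer_123",
    agent_id="support_agent",
)
```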
Configuration Parameters
Mem0 offers extensive configuration options to customize its behavior according to your needs. These configurations span different components like vector stores, language models, embedders, and graph stores.
Vector Store Configuration
| Parameter | Description | Default |
|---|---|---|
| provider | Vector store provider (e.g., "qdrant") | "qdrant" |
| host | Host address | "localhost" |
| port | Port number | 6333 |
LLM Configuration
| Parameter | Description | Provider |
|---|---|---|
| provider | LLM provider (e.g., "openai", "anthropic") | All |
| model | Model to use | All |
| temperature | Temperature of the model | All |
| api_key | API key to use | All |
| max_tokens | Maximum number of tokens to generate | All |
| top_p | Probability threshold for nucleus sampling | All |
| top_k | Number of highest-probability tokens to keep | All |
| http_client_proxies | Allow proxy server settings | AzureOpenAI |
| models | List of models | OpenRouter |
| route | Routing strategy | OpenRouter |
| openrouter_base_url | Base URL for the OpenRouter API | OpenRouter |
| site_url | Site URL | OpenRouter |
| app_name | Application name | OpenRouter |
| ollama_base_url | Base URL for the Ollama API | Ollama |
| openai_base_url | Base URL for the OpenAI API | OpenAI |
| azure_kwargs | Azure LLM args for initialization | AzureOpenAI |
| deepseek_base_url | Base URL for the DeepSeek API | DeepSeek |
Embedder Configuration
| Parameter | Description | Default |
|---|---|---|
| provider | Embedding provider | "openai" |
| model | Embedding model to use | "text-embedding-3-small" |
| api_key | API key for the embedding service | None |
Graph Store Configuration
| Parameter | Description | Default |
|---|---|---|
| provider | Graph store provider (e.g., "neo4j") | "neo4j" |
| url | Connection URL | None |
| username | Authentication username | None |
| password | Authentication password | None |
General Configuration
| Parameter | Description | Default |
|---|---|---|
| history_db_path | Path to the history database | "/history.db" |
| version | API version | "v1.1" |
| custom_fact_extraction_prompt | Custom prompt for memory processing | None |
| custom_update_memory_prompt | Custom prompt for memory updates | None |
Complete Configuration Example
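A combined configuration sketch based on the tables above. The model name and connection details are placeholders; adjust them for your deployment:

```python
# Configuration dictionary combining vector store, LLM, embedder, and
# graph store settings. Values marked as assumptions are illustrative.
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "host": "localhost",
            "port": 6333,
        },
    },
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",  # assumption: any supported OpenAI model
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    },
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"},
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "neo4j",
            "password": "change-me",  # placeholder credential
        },
    },
    "version": "v1.1",
}

# from mem0 import Memory
# m = Memory.from_config(config)  # requires the backing services to be running
```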
Run Mem0 Locally
Please refer to the example Mem0 with Ollama to run Mem0 locally.

Chat Completion
Mem0 can be easily integrated into chat applications to enhance conversational agents with structured memory. Mem0's APIs are designed to be compatible with OpenAI's, with the goal of making it easy to leverage Mem0 in applications you may have already built. If you have a Mem0 API key, you can use it to initialize the client. Alternatively, you can initialize Mem0 without an API key if you're using it locally.
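A sketch of the OpenAI-compatible chat completion client. The import path and the exact keyword arguments may differ between versions, so check the current Mem0 documentation; the model name and user id below are assumptions:

```python
# Assumed import path for Mem0's OpenAI-compatible proxy client.
from mem0.proxy.main import Mem0

client = Mem0(api_key="m0-...")  # or Mem0() without a key for local use

# Works like OpenAI's chat completions API, but memories scoped to the
# given user are injected into and extracted from the conversation.
chat_completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "What should I cook for dinner?"}],
    model="gpt-4o-mini",  # assumption: any OpenAI-compatible model name
    user_id="alice",
)
```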
Mem0 supports several language models (LLMs) through integration with various providers.
Use Mem0 OSS
Contributing
We welcome contributions to Mem0! Here's how you can contribute:

1. Fork the repository and create your branch from `main`.
2. Clone the forked repository to your local machine.
3. Install the project dependencies.
4. Install pre-commit hooks.
5. Make your changes and ensure they adhere to the project's coding standards.
6. Run the tests locally.
7. If all tests pass, commit your changes and push to your fork.
8. Open a pull request with a clear title and description.
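Steps 3, 4, and 6 typically map to shell commands like the following. The dependency-install and test commands are assumptions about the project's tooling; check the repository's CONTRIBUTING guide for the exact commands:

```shell
# Install the project in editable mode with dev extras (assumed layout).
pip install -e ".[dev]"

# Install the git hooks defined in .pre-commit-config.yaml.
pre-commit install

# Run the test suite (assumed to be pytest-based).
pytest
```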