🔐 Mem0 is now SOC 2 and HIPAA compliant! We're committed to the highest standards of data security and privacy, enabling secure memory for enterprises, healthcare, and beyond.

Mem0 now supports Graph Memory. With Graph Memory, users can now create and utilize complex relationships between pieces of information, allowing for more nuanced and context-aware responses. This integration enables users to leverage the strengths of both vector-based and graph-based approaches, resulting in more accurate and comprehensive information retrieval and generation.

NodeSDK now supports Graph Memory. 🎉

Installation

To use Mem0 with Graph Memory support, install it using pip:

pip install "mem0ai[graph]"

This command installs Mem0 along with the necessary dependencies for graph functionality.

Try Graph Memory on Google Colab.

Initialize Graph Memory

To initialize Graph Memory you’ll need to set up your configuration with graph store providers. Currently, we support Neo4j and Memgraph as graph store providers.

Initialize Neo4j

You can set up Neo4j locally or use the hosted Neo4j AuraDB.

If you are running Neo4j locally, you also need to install the APOC plugin.

You can also customize the LLM used for Graph Memory from the Supported LLM list, with three levels of configuration:

  1. Main Configuration: If llm is set in the main config, it will be used for all graph operations.
  2. Graph Store Configuration: If llm is set in the graph_store config, it will override the main config llm and be used specifically for graph operations.
  3. Default Configuration: If no custom LLM is set, the default LLM (gpt-4o-2024-08-06) will be used for all graph operations.

Here’s how you can do it:

from mem0 import Memory

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx"
        }
    }
}

m = Memory.from_config(config_dict=config)
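For example, to use a different LLM just for graph operations, you can add an llm block under graph_store. This is a sketch of the precedence rules described above; the model names and Neo4j credentials are placeholders:

```python
# Sketch: LLM precedence for Graph Memory.
# The "llm" under "graph_store" overrides the top-level "llm" for graph
# operations only; model names and credentials below are placeholders.
config = {
    "llm": {
        "provider": "openai",
        "config": {"model": "gpt-4o"},  # applies to non-graph operations
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx",
        },
        "llm": {
            "provider": "openai",
            "config": {"model": "gpt-4o-mini"},  # applies to graph operations
        },
    },
}

# Pass this to Memory.from_config(config_dict=config) as shown above.
```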

If you are using the NodeSDK, you need to pass enableGraph as true in the config object.

Initialize Memgraph

Run Memgraph with Docker:

docker run -p 7687:7687 memgraph/memgraph-mage:latest --schema-info-enabled=True

The --schema-info-enabled flag is set to True for more performant schema generation.

Additional information can be found in the Memgraph documentation.

Here's how to configure Mem0 to use Memgraph:

from mem0 import Memory

config = {
    "graph_store": {
        "provider": "memgraph",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "memgraph",
            "password": "xxx",
        },
    },
}

m = Memory.from_config(config_dict=config)

Initialize Neptune Analytics

Mem0 now supports Amazon Neptune Analytics as a graph store provider. This integration allows you to use Neptune Analytics for storing and querying graph-based memories.

Instance Setup

Create an Amazon Neptune Analytics instance in your AWS account following the AWS documentation.

  • Public connectivity is not enabled by default; if you access the instance from outside a VPC, you must enable it.
  • Once the Amazon Neptune Analytics instance is available, you will need its graph-identifier to connect.
  • The Neptune Analytics instance must be created with the same vector dimension that your embedding model produces. See: https://docs.aws.amazon.com/neptune-analytics/latest/userguide/vector-index.html

Attach Credentials

Configure your AWS credentials with access to your Amazon Neptune Analytics resources by following the Configuration and credentials precedence.

  • For example, add your AWS access key, secret key, and session token via environment variables:
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key
export AWS_SESSION_TOKEN=your-session-token
export AWS_DEFAULT_REGION=your-region
  • The IAM user or role making the request must have a policy attached that allows the following IAM actions on the neptune-graph resource:
    • neptune-graph:ReadDataViaQuery
    • neptune-graph:WriteDataViaQuery
    • neptune-graph:DeleteDataViaQuery
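A minimal identity-based policy granting these actions might look like the following sketch; the Resource ARN is a placeholder you would fill in with your region, account ID, and graph identifier:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "neptune-graph:ReadDataViaQuery",
        "neptune-graph:WriteDataViaQuery",
        "neptune-graph:DeleteDataViaQuery"
      ],
      "Resource": "arn:aws:neptune-graph:<REGION>:<ACCOUNT_ID>:graph/<GRAPH_ID>"
    }
  ]
}
```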

Usage

The Neptune memory store uses the AWS LangChain Python API to connect to Neptune instances. For additional options for connecting to your Amazon Neptune Analytics instance, see the AWS LangChain API documentation.

from mem0 import Memory

# This example must connect to a neptune-graph instance with 1536 vector dimensions specified.
config = {
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-large", "embedding_dims": 1536},
    },
    "graph_store": {
        "provider": "neptune",
        "config": {
            "endpoint": "neptune-graph://<GRAPH_ID>",
        },
    },
}

m = Memory.from_config(config_dict=config)

Graph Operations

Mem0's Graph Memory supports the following operations:

Add Memories

Mem0 with Graph Memory supports both "user_id" and "agent_id" parameters. You can use either or both to organize your memories. In the NodeSDK, use "userId" and "agentId" instead.

# Using only user_id
m.add("I like pizza", user_id="alice")

# Using both user_id and agent_id
m.add("I like pizza", user_id="alice", agent_id="food-assistant")

Get all memories

# Get all memories for a user
m.get_all(user_id="alice")

# Get all memories for a specific agent belonging to a user
m.get_all(user_id="alice", agent_id="food-assistant")

Search Memories

# Search memories for a user
m.search("tell me my name.", user_id="alice")

# Search memories for a specific agent belonging to a user
m.search("tell me my name.", user_id="alice", agent_id="food-assistant")

Delete all Memories

# Delete all memories for a user
m.delete_all(user_id="alice")

# Delete all memories for a specific agent belonging to a user
m.delete_all(user_id="alice", agent_id="food-assistant")

Example Usage

Here’s an example of how to use Mem0’s graph operations:

  1. First, we’ll add some memories for a user named Alice.
  2. Then, we’ll visualize how the graph evolves as we add more memories.
  3. You’ll see how entities and relationships are automatically extracted and connected in the graph.

Add Memories

Below are the steps to add memories and visualize the graph:

1. Add memory 'I like going to hikes':

m.add("I like going to hikes", user_id="alice123")

2. Add memory 'I love to play badminton':

m.add("I love to play badminton", user_id="alice123")

3. Add memory 'I hate playing badminton':

m.add("I hate playing badminton", user_id="alice123")

4. Add memory 'My friend name is john and john has a dog named tommy':

m.add("My friend name is john and john has a dog named tommy", user_id="alice123")

5. Add memory 'My name is Alice':

m.add("My name is Alice", user_id="alice123")

6. Add memory 'John loves to hike and Harry loves to hike as well':

m.add("John loves to hike and Harry loves to hike as well", user_id="alice123")

7. Add memory 'My friend peter is the spiderman':

m.add("My friend peter is the spiderman", user_id="alice123")

Search Memories

m.search("What is my name?", user_id="alice123")

The graph visualization below shows which nodes and relationships are fetched from the graph for the provided query.

m.search("Who is spiderman?", user_id="alice123")

Note: The Graph Memory implementation is not standalone. Memories are added to and retrieved from the vector store and the graph store simultaneously.
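To make the dual-store setup explicit, a config can name both stores; adds then write to both, and searches combine results from both. This is an illustrative sketch only; the Qdrant host/port and Neo4j credentials are placeholder assumptions:

```python
# Illustrative config pairing a vector store with a graph store.
# Provider settings (Qdrant host/port, Neo4j URL, passwords) are placeholders.
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {"host": "localhost", "port": 6333},
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx",
        },
    },
}

# m = Memory.from_config(config_dict=config)
# m.add(...) writes to both stores; m.search(...) queries both.
```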

Using Multiple Agents with Graph Memory

When working with multiple agents, you can use the “agent_id” parameter to organize memories by both user and agent. This allows you to:

  1. Create agent-specific knowledge graphs
  2. Share common knowledge between agents
  3. Isolate sensitive or specialized information to specific agents

Example: Multi-Agent Setup

# Add memories for different agents
m.add("I prefer Italian cuisine", user_id="bob", agent_id="food-assistant")
m.add("I'm allergic to peanuts", user_id="bob", agent_id="health-assistant")
m.add("I live in Seattle", user_id="bob")  # Shared across all agents

# Search within specific agent context
food_preferences = m.search("What food do I like?", user_id="bob", agent_id="food-assistant")
health_info = m.search("What are my allergies?", user_id="bob", agent_id="health-assistant")
location = m.search("Where do I live?", user_id="bob")  # Searches across all agents

If you want to use a managed version of Mem0, please check out the Mem0 platform. If you have any questions, please feel free to reach out to us.