Integrate Mem0 with OpenAI Agents SDK, a lightweight framework for building multi-agent workflows. This integration enables agents to access persistent memory across conversations, enhancing context retention and personalization.

Overview

  1. Store and retrieve memories from Mem0 within OpenAI agents
  2. Build multi-agent workflows with shared memory
  3. Retrieve relevant memories from past conversations
  4. Personalize responses based on the user's history

Prerequisites

Before setting up Mem0 with OpenAI Agents SDK, ensure you have:
  1. Installed the required packages:
pip install openai-agents mem0ai
  2. Valid API keys:
     • OpenAI API key (OPENAI_API_KEY)
     • Mem0 API key (MEM0_API_KEY)

Basic Integration Example

The following example demonstrates how to create an OpenAI agent with Mem0 memory integration:
import os
from agents import Agent, Runner, function_tool
from mem0 import MemoryClient

# Set up environment variables
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["MEM0_API_KEY"] = "your-mem0-api-key"

# Initialize Mem0 client
mem0 = MemoryClient()

# Define memory tools for the agent
@function_tool
def search_memory(query: str, user_id: str) -> str:
    """Search through past conversations and memories"""
    memories = mem0.search(query, user_id=user_id, limit=3)
    if memories:
        return "\n".join([f"- {mem['memory']}" for mem in memories])
    return "No relevant memories found."

@function_tool
def save_memory(content: str, user_id: str) -> str:
    """Save important information to memory"""
    mem0.add([{"role": "user", "content": content}], user_id=user_id)
    return "Information saved to memory."

# Create agent with memory capabilities
agent = Agent(
    name="Personal Assistant",
    instructions="""You are a helpful personal assistant with memory capabilities.
    Use the search_memory tool to recall past conversations and user preferences.
    Use the save_memory tool to store important information about the user.
    Always personalize your responses based on available memory.""",
    tools=[search_memory, save_memory],
    model="gpt-4o"
)

def chat_with_agent(user_input: str, user_id: str) -> str:
    """
    Handle user input with automatic memory integration.

    Args:
        user_input: The user's message
        user_id: Unique identifier for the user

    Returns:
        The agent's response
    """
    # Run the agent, passing the user ID so it can supply it to the memory tools
    result = Runner.run_sync(agent, f"[user_id: {user_id}] {user_input}")

    return result.final_output

# Example usage
if __name__ == "__main__":

    # Preferences will be saved to memory (via the save_memory tool)
    response_1 = chat_with_agent(
        "I love Italian food and I'm planning a trip to Rome next month",
        user_id="alice"
    )
    print(response_1)

    # Memories will be retrieved via the search_memory tool to answer the query
    response_2 = chat_with_agent(
        "Give me some recommendations for food",
        user_id="alice"
    )
    print(response_2)

Multi-Agent Workflow with Handoffs

Create multiple specialized agents with proper handoffs and shared memory:
from agents import Agent, Runner

# Specialized agents
travel_agent = Agent(
    name="Travel Planner",
    instructions="""You are a travel planning specialist. Use get_user_context to
    understand the user's travel preferences and history before making recommendations.
    After providing your response, use store_conversation to save important details.""",
    tools=[search_memory, save_memory],
    model="gpt-4o"
)

health_agent = Agent(
    name="Health Advisor",
    instructions="""You are a health and wellness advisor. Use get_user_context to
    understand the user's health goals and dietary preferences.
    After providing advice, use store_conversation to save relevant information.""",
    tools=[search_memory, save_memory],
    model="gpt-4o"
)

# Triage agent with handoffs
triage_agent = Agent(
    name="Personal Assistant",
    instructions="""You are a helpful personal assistant that routes requests to specialists.
    For travel-related questions (trips, hotels, flights, destinations), hand off to Travel Planner.
    For health-related questions (fitness, diet, wellness, exercise), hand off to Health Advisor.
    For general questions, you can handle them directly using available tools.""",
    tools=[search_memory, save_memory],
    handoffs=[travel_agent, health_agent],
    model="gpt-4o"
)

def chat_with_handoffs(user_input: str, user_id: str) -> str:
    """
    Handle user input with automatic agent handoffs and memory integration.

    Args:
        user_input: The user's message
        user_id: Unique identifier for the user

    Returns:
        The agent's response
    """
    # Run the triage agent (it will automatically hand off when needed),
    # passing the user ID so the specialist agents can use the memory tools
    result = Runner.run_sync(triage_agent, f"[user_id: {user_id}] {user_input}")

    # Store the original conversation in memory
    conversation = [
        {"role": "user", "content": user_input},
        {"role": "assistant", "content": result.final_output}
    ]
    mem0.add(conversation, user_id=user_id)

    return result.final_output

# Example usage
response = chat_with_handoffs("Plan a healthy meal for my Italy trip", user_id="alex")
print(response)

Quick Start Chat Interface

Simple interactive chat with memory:
def interactive_chat():
    """Interactive chat interface with memory and handoffs"""
    user_id = input("Enter your user ID: ") or "demo_user"
    print(f"Chat started for user: {user_id}")
    print("Type 'quit' to exit\n")

    while True:
        user_input = input("You: ")
        if user_input.lower() == 'quit':
            break

        response = chat_with_handoffs(user_input, user_id)
        print(f"Assistant: {response}\n")

if __name__ == "__main__":
    interactive_chat()

Key Features

1. Automatic Memory Integration

  • Tool-Based Memory: Agents use function tools to search and save memories
  • Conversation Storage: All interactions are automatically stored
  • Context Retrieval: Agents can access relevant past conversations
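
Tool-based retrieval relies on the model deciding to call search_memory. If you want memories available on every turn regardless, you can pre-load them into the prompt before running the agent. A minimal sketch of that pattern, reusing the mem0 client and agent from the basic example (the chat_with_context helper is illustrative, not part of either SDK):
def chat_with_context(user_input: str, user_id: str) -> str:
    """Retrieve memories up front and inject them into the prompt."""
    memories = mem0.search(user_input, user_id=user_id, limit=3)
    context = "\n".join(f"- {mem['memory']}" for mem in memories) or "No relevant memories."

    prompt = f"Relevant memories:\n{context}\n\nUser message: {user_input}"
    result = Runner.run_sync(agent, prompt)

    # Persist the exchange so future runs can recall it
    mem0.add(
        [{"role": "user", "content": user_input},
         {"role": "assistant", "content": result.final_output}],
        user_id=user_id,
    )
    return result.final_output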

2. Multi-Agent Memory Sharing

  • Shared Context: Multiple agents access the same memory store
  • Specialized Agents: Create domain-specific agents with shared memory
  • Seamless Handoffs: Agents maintain context across handoffs
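
Because every agent reads and writes through the same MemoryClient under the same user_id, details captured during one specialist's turn are visible to the others. A quick check, assuming the handoff example above has already run for user "alex":
# Memories saved while the Travel Planner handled the conversation are
# retrievable by the Health Advisor's tools, since both share the same store.
shared_memories = mem0.search("trip and dietary preferences", user_id="alex", limit=3)
for mem in shared_memories:
    print(f"- {mem['memory']}")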

3. Flexible Memory Operations

  • Semantic Search: Retrieve relevant memories from previous conversations
  • User Segmentation: Organize memories by user ID
  • Memory Management: Built-in tools for saving and retrieving information
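
Beyond the search and add calls used above, the Mem0 client also exposes management calls for inspecting or clearing a user's memories. A brief sketch, assuming the standard MemoryClient methods get_all and delete_all (check your client version for the exact response shape):
# List everything currently stored for a user
all_memories = mem0.get_all(user_id="alice")
for mem in all_memories:
    print(f"- {mem['memory']}")

# Remove a user's memories entirely (e.g. when they delete their account)
mem0.delete_all(user_id="alice")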

Configuration Options

Customize memory behavior:
# Configure memory search
memories = mem0.search(
    query="travel preferences",
    user_id="alex",
    limit=5  # Number of memories to retrieve
)

# Add metadata to memories
mem0.add(
    messages=[{"role": "user", "content": "I prefer luxury hotels"}],
    user_id="alex",
    metadata={"category": "travel", "importance": "high"}
)
