
To use OpenAI LLM models, set the OPENAI_API_KEY environment variable. You can obtain the API key from the OpenAI Platform.

Usage

import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 2000,
        }
    }
}

# Use OpenRouter by passing its API key
# os.environ["OPENROUTER_API_KEY"] = "your-api-key"
# config = {
#    "llm": {
#        "provider": "openai",
#        "config": {
#            "model": "meta-llama/llama-3.1-70b-instruct",
#        }
#    }
# }

m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller movies? They can be quite engaging."},
    {"role": "user", "content": "I’m not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]
m.add(messages, user_id="alice", metadata={"category": "movies"})

We also support the new OpenAI structured outputs models.

import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai_structured",
        "config": {
            "model": "gpt-4o-2024-08-06",
            "temperature": 0.0,
        }
    }
}

m = Memory.from_config(config)
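The structured outputs provider is used the same way as the regular OpenAI provider; for example, the add call from the Usage example above works unchanged (shown here as a sketch):

# Adding memories works the same way with the structured outputs provider
# (`messages` as defined in the Usage example above)
m.add(messages, user_id="alice", metadata={"category": "movies"})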

Config

All available parameters for the openai config are listed in the Master List of All Params in Config.
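As an illustrative sketch, a fuller config can pass additional parameters such as top_p or api_key (the latter as an alternative to the OPENAI_API_KEY environment variable). Treat the exact set of supported keys as an assumption here and refer to the master list for the authoritative reference.

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 2000,
            "top_p": 1.0,               # assumed supported; see the master param list
            "api_key": "your-api-key",  # alternative to the OPENAI_API_KEY env var
        }
    }
}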