Overview
Mem0 includes built-in support for a range of popular large language models (LLMs). Memory can use the LLM you configure, letting you match the model to your specific needs.
OpenAI
To use OpenAI models, set the OPENAI_API_KEY environment variable. You can obtain an API key from the OpenAI Platform. Once you have the key, use it like this:
```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
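After adding memories, you will usually want to read them back. The sketch below assumes mem0's `Memory` object exposes `search` and `get_all` methods (verify against the current mem0 API reference for your installed version); it only defines a helper, so it can be reused with any of the provider configurations on this page:

```python
def show_memories(m, user_id="alice"):
    # Hypothetical helper around mem0's Memory API; search() and get_all()
    # are assumed method names -- check your installed mem0 version.
    hits = m.search(query="What does the user like to do?", user_id=user_id)
    for hit in hits:
        print(hit)
    # Return everything stored for this user as well.
    return m.get_all(user_id=user_id)
```

Pass in the `Memory` instance created from any of the configs below, e.g. `show_memories(m)`.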
Groq
Groq is the creator of the world’s first Language Processing Unit (LPU), delivering exceptional speed for AI workloads running on its LPU Inference Engine. To use LLMs from Groq, get an API key from their platform and set it as the GROQ_API_KEY environment variable, as shown in the example below:
```python
import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
TogetherAI
To use TogetherAI models, set the TOGETHER_API_KEY environment variable. You can obtain the API key from your TogetherAI account settings page. Once you have the key, use it like this:
```python
import os
from mem0 import Memory

os.environ["TOGETHER_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "togetherai",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
AWS Bedrock
- Before using the AWS Bedrock LLM, make sure you have access to the desired model in the Bedrock console.
- Authenticate the boto3 client using one of the methods described in the AWS documentation.
- Export the AWS_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY environment variables.
```python
import os
from mem0 import Memory

os.environ["AWS_REGION"] = "us-east-1"
os.environ["AWS_ACCESS_KEY_ID"] = "xx"
os.environ["AWS_SECRET_ACCESS_KEY"] = "xx"

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "arn:aws:bedrock:us-east-1:123456789012:model/your-model-name",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
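Because a missing Bedrock credential typically surfaces only at request time, it can help to validate the environment up front. The stdlib-only check below is a hypothetical helper, not part of mem0; it uses boto3's standard variable names (note that boto3 expects AWS_ACCESS_KEY_ID, not AWS_ACCESS_KEY):

```python
import os

# boto3's standard credential/region environment variables.
REQUIRED_AWS_VARS = ("AWS_REGION", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")

def missing_aws_vars(env=None):
    # Return whichever Bedrock-related variables are unset or empty,
    # so a misconfiguration fails fast instead of mid-request.
    env = os.environ if env is None else env
    return [v for v in REQUIRED_AWS_VARS if not env.get(v)]
```

Calling `missing_aws_vars()` before `Memory.from_config(config)` and raising on a non-empty result turns a confusing runtime error into an immediate, explicit one.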
LiteLLM
LiteLLM is compatible with over 100 large language models (LLMs), all through a standardized input/output format. You can explore the available models in the LiteLLM documentation. Make sure to set the API key environment variable for the model you choose.
```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gpt-3.5-turbo",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
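LiteLLM routes each model name to a provider-specific API key. As a rough illustration (the mapping below is a hand-picked subset for the models used on this page, not LiteLLM's actual routing table), a small helper can tell you which environment variable a given model needs before you build the config:

```python
# Illustrative subset of model-name prefixes -> API-key variables;
# consult LiteLLM's own docs for the authoritative mapping.
KEY_FOR_MODEL = {
    "gpt-": "OPENAI_API_KEY",
    "gemini/": "GEMINI_API_KEY",
    "claude-": "ANTHROPIC_API_KEY",
    "open-mixtral": "MISTRAL_API_KEY",
}

def required_key(model):
    # Hypothetical helper: pick the env var a model name implies.
    for prefix, var in KEY_FOR_MODEL.items():
        if model.startswith(prefix):
            return var
    raise ValueError(f"Unknown provider for model {model!r}")
```

For example, `required_key("gpt-3.5-turbo")` returns `"OPENAI_API_KEY"`, which you can then check is set before calling `Memory.from_config`.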
Google AI
To use a Google AI model, set the GEMINI_API_KEY environment variable. You can obtain the API key from Google AI Studio. Once you have the key, use it like this:
```python
import os
from mem0 import Memory

os.environ["GEMINI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gemini/gemini-pro",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
Anthropic
To use Anthropic’s models, set the ANTHROPIC_API_KEY environment variable, which you can find on the Anthropic account settings page.
```python
import os
from mem0 import Memory

os.environ["ANTHROPIC_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "claude-3-opus-20240229",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
Mistral AI
To use Mistral AI’s models, obtain an API key from the Mistral AI console and set it as the MISTRAL_API_KEY environment variable, as shown in the example below.
```python
import os
from mem0 import Memory

os.environ["MISTRAL_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "open-mixtral-8x7b",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
OpenAI Azure
To use Azure AI models, set the AZURE_API_KEY, AZURE_API_BASE, and AZURE_API_VERSION environment variables. You can obtain these values from the Azure portal.
```python
import os
from mem0 import Memory

os.environ["AZURE_API_KEY"] = "your-api-key"
os.environ["AZURE_API_BASE"] = "your-api-base-url"
os.environ["AZURE_API_VERSION"] = "version-to-use"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "azure_ai/command-r-plus",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```