Mem0 Now Supports Azure OpenAI Models in TypeScript SDK
To use Azure OpenAI models, you have to set the LLM_AZURE_OPENAI_API_KEY, LLM_AZURE_ENDPOINT, LLM_AZURE_DEPLOYMENT, and LLM_AZURE_API_VERSION environment variables. You can obtain the Azure OpenAI API key from the Azure portal.
Note: The following are currently unsupported with reasoning models: parallel tool calling, temperature, top_p, presence_penalty, frequency_penalty, logprobs, top_logprobs, logit_bias, and max_tokens.
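If you point the config at a reasoning-model deployment, the sampling parameters above must be dropped from the llm config. A minimal sketch of filtering them out (the helper and its name are hypothetical, not part of the Mem0 API; parallel tool calling is a call-time feature rather than a config key, so it is not listed):

```python
# Config keys that Azure reasoning models reject, per the note above.
UNSUPPORTED_WITH_REASONING = {
    "temperature", "top_p", "presence_penalty", "frequency_penalty",
    "logprobs", "top_logprobs", "logit_bias", "max_tokens",
}

def reasoning_safe(llm_config: dict) -> dict:
    """Return a copy of the llm config without reasoning-unsupported keys."""
    return {k: v for k, v in llm_config.items()
            if k not in UNSUPPORTED_WITH_REASONING}
```

For example, `reasoning_safe({"model": "o1", "temperature": 0.1, "max_tokens": 2000})` keeps only the model key.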
import os
from mem0 import Memory
os.environ["OPENAI_API_KEY"] = "your-api-key" # used for embedding model
os.environ["LLM_AZURE_OPENAI_API_KEY"] = "your-api-key"
os.environ["LLM_AZURE_DEPLOYMENT"] = "your-deployment-name"
os.environ["LLM_AZURE_ENDPOINT"] = "your-api-base-url"
os.environ["LLM_AZURE_API_VERSION"] = "version-to-use"
config = {
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "your-deployment-name",
            "temperature": 0.1,
            "max_tokens": 2000,
            "azure_kwargs": {
                "azure_deployment": "",
                "api_version": "",
                "azure_endpoint": "",
                "api_key": "",
                "default_headers": {
                    "CustomHeader": "your-custom-header",
                },
            },
        },
    }
}
m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller movie? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]
m.add(messages, user_id="alice", metadata={"category": "movies"})
We also support OpenAI's structured outputs via the azure_openai_structured provider. The TypeScript SDK does not support azure_openai_structured yet.
import os
from mem0 import Memory
os.environ["LLM_AZURE_OPENAI_API_KEY"] = "your-api-key"
os.environ["LLM_AZURE_DEPLOYMENT"] = "your-deployment-name"
os.environ["LLM_AZURE_ENDPOINT"] = "your-api-base-url"
os.environ["LLM_AZURE_API_VERSION"] = "version-to-use"
config = {
    "llm": {
        "provider": "azure_openai_structured",
        "config": {
            "model": "your-deployment-name",
            "temperature": 0.1,
            "max_tokens": 2000,
            "azure_kwargs": {
                "azure_deployment": "",
                "api_version": "",
                "azure_endpoint": "",
                "api_key": "",
                "default_headers": {
                    "CustomHeader": "your-custom-header",
                },
            },
        },
    }
}
All available parameters for the azure_openai config are listed in the Master List of All Params in Config.