Supported LLMs
DeepSeek
To use DeepSeek LLM models, set the DEEPSEEK_API_KEY environment variable. You can also optionally set DEEPSEEK_API_BASE if you need to use a different API endpoint (defaults to "https://api.deepseek.com").
Usage
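A minimal sketch of configuring Mem0 with DeepSeek, assuming the standard `Memory.from_config` pattern used for other providers; the model name `deepseek-chat` and the parameter values shown are illustrative, not required:

```python
import os
from mem0 import Memory

# DeepSeek credentials are read from the environment
os.environ["DEEPSEEK_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",   # illustrative model name
            "temperature": 0.2,
            "max_tokens": 2000,
        }
    }
}

m = Memory.from_config(config)
m.add("I'm visiting Paris next week", user_id="alice")
```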
You can also configure the API base URL in the config:
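A sketch of overriding the API base URL through the config rather than the DEEPSEEK_API_BASE environment variable; the key name `deepseek_base_url` is an assumption here, so confirm the exact name against the master list of config params:

```python
config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",
            # assumed key name for the endpoint override; check the
            # Master List of All Params in Config for the exact name
            "deepseek_base_url": "https://api.deepseek.com",
        }
    }
}
```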
Config
All available parameters for the deepseek config are listed in the Master List of All Params in Config.