Config
What is Config?
Config in mem0 is a dictionary that specifies the settings for your LLMs. It allows you to customize the behavior and connection details of your chosen LLM.
How to Define Config
The config is defined as a Python dictionary with a top-level `llm` key that specifies the LLM provider and its configuration. It contains two main keys:
- `provider`: The name of the LLM provider (e.g., “openai”, “groq”)
- `config`: A nested dictionary containing provider-specific settings
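For illustration, a config following this structure might look like the sketch below; the provider and the values shown are placeholders, not defaults.

```python
# Skeleton of a mem0 LLM config; provider name and values are placeholders.
config = {
    "llm": {
        "provider": "openai",        # name of the LLM provider
        "config": {                  # provider-specific settings
            "model": "gpt-4o-mini",
            "temperature": 0.2,
        },
    }
}
```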
Config Values Precedence
Config values are applied in the following order of precedence (from highest to lowest):
- Values explicitly set in the `config` dictionary
- Environment variables (e.g., `OPENAI_API_KEY`, `OPENAI_API_BASE`)
- Default values defined in the LLM implementation
This means that values specified in the `config` dictionary will override corresponding environment variables, which in turn override default values.
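As a rough illustration of this precedence (the key values below are placeholders, not real credentials):

```python
import os

# The environment variable acts as a fallback if no key is set in the config.
os.environ["OPENAI_API_KEY"] = "sk-from-environment"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",
            # An explicit value here wins over OPENAI_API_KEY from the environment;
            # omit it to fall back to the environment variable, and omit settings
            # entirely to use the defaults defined in the LLM implementation.
            "api_key": "sk-from-config",
        },
    }
}
```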
How to Use Config
Here’s a general example of how to use the config with mem0:
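A minimal sketch, assuming the OpenAI provider; the model name and key handling are illustrative rather than required values.

```python
import os
from mem0 import Memory

# Placeholder key for illustration; in practice read it from your environment.
os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    }
}

# Initialize mem0 with the config and store a memory.
m = Memory.from_config(config)
m.add("I prefer vegetarian food.", user_id="alice")
```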
Why is Config Needed?
Config is essential for:
- Specifying which LLM to use.
- Providing necessary connection details (e.g., `model`, `api_key`, `temperature`).
- Ensuring proper initialization and connection to your chosen LLM.
Master List of All Params in Config
Here’s a comprehensive list of all parameters that can be used across different LLMs:
| Parameter | Description | Provider |
|---|---|---|
| `model` | LLM model to use | All |
| `temperature` | Sampling temperature of the model | All |
| `api_key` | API key to use | All |
| `max_tokens` | Maximum number of tokens to generate | All |
| `top_p` | Probability threshold for nucleus sampling | All |
| `top_k` | Number of highest-probability tokens to keep | All |
| `http_client_proxies` | Allow proxy server settings | AzureOpenAI |
| `models` | List of models | Openrouter |
| `route` | Routing strategy | Openrouter |
| `openrouter_base_url` | Base URL for Openrouter API | Openrouter |
| `site_url` | Site URL | Openrouter |
| `app_name` | Application name | Openrouter |
| `ollama_base_url` | Base URL for Ollama API | Ollama |
| `openai_base_url` | Base URL for OpenAI API | OpenAI |
| `azure_kwargs` | Azure LLM args for initialization | AzureOpenAI |
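Provider-specific parameters sit alongside the common ones in the nested `config` dictionary. A minimal sketch, assuming a locally running Ollama server; the model name and base URL are illustrative, not guaranteed defaults.

```python
# Sketch of a provider-specific config using ollama_base_url.
config = {
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.1:8b",
            "temperature": 0.2,
            "ollama_base_url": "http://localhost:11434",  # assumed local Ollama endpoint
        },
    }
}
```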
Supported LLMs
For detailed information on configuring specific LLMs, please visit the LLMs section. There you’ll find information for each supported LLM with provider-specific usage examples and configuration details.