How to define configurations?
The config is defined as an object (or dictionary) with two main keys:

- embedder: Specifies the embedder provider and its configuration
  - provider: The name of the embedder (e.g., "openai", "ollama")
  - config: A nested object or dictionary containing provider-specific settings
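For illustration, the structure above can be written as a plain Python dictionary. The provider, model name, key, and dimension count below are placeholders, not defaults:

```python
# Example config structure: one "embedder" key, holding the provider
# name and a nested, provider-specific "config" dictionary.
config = {
    "embedder": {
        "provider": "openai",  # e.g., "openai", "ollama"
        "config": {
            "model": "text-embedding-3-small",  # placeholder model name
            "api_key": "sk-...",                # your provider API key
            "embedding_dims": 1536,             # placeholder dimension count
        },
    }
}
```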
How to use configurations?
The config is passed to mem0 when the memory instance is initialized.

Why is Config Needed?
Config is essential for:

- Specifying which embedding model to use.
- Providing necessary connection details (e.g., model, api_key, embedding_dims).
- Ensuring proper initialization and connection to your chosen embedder.
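To illustrate the initialization point, here is a minimal sketch. The `validate_embedder_config` helper is hypothetical (it is not part of mem0, just a fail-fast check on the structure described above), and the `Memory.from_config` call in the trailing comment assumes mem0 is installed:

```python
def validate_embedder_config(config: dict) -> None:
    """Hypothetical helper: fail fast if the embedder config is malformed
    before handing it to mem0."""
    embedder = config.get("embedder")
    if not isinstance(embedder, dict):
        raise ValueError("config must contain an 'embedder' dictionary")
    if "provider" not in embedder:
        raise ValueError("embedder config must name a 'provider'")
    if not isinstance(embedder.get("config"), dict):
        raise ValueError("embedder must carry a provider-specific 'config' dict")


config = {
    "embedder": {
        "provider": "openai",  # placeholder provider
        "config": {"model": "text-embedding-3-small", "api_key": "sk-..."},
    }
}
validate_embedder_config(config)  # raises ValueError if the structure is wrong

# With mem0 installed, the validated config is passed at initialization:
#   from mem0 import Memory
#   memory = Memory.from_config(config)
```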
Master List of All Params in Config
Here's a comprehensive list of all parameters that can be used across different embedders:

| Parameter | Description | Provider |
|---|---|---|
| model | Embedding model to use | All |
| api_key | API key of the provider | All |
| embedding_dims | Dimensions of the embedding model | All |
| http_client_proxies | Allow proxy server settings | All |
| ollama_base_url | Base URL for the Ollama embedding model | Ollama |
| model_kwargs | Key-value arguments for the Huggingface embedding model | Huggingface |
| azure_kwargs | Key-value arguments for the AzureOpenAI embedding model | Azure OpenAI |
| openai_base_url | Base URL for the OpenAI API | OpenAI |
| vertex_credentials_json | Path to the Google Cloud credentials JSON file for VertexAI | VertexAI |
| memory_add_embedding_type | The type of embedding to use for the add memory action | VertexAI |
| memory_update_embedding_type | The type of embedding to use for the update memory action | VertexAI |
| memory_search_embedding_type | The type of embedding to use for the search memory action | VertexAI |
| lmstudio_base_url | Base URL for the LM Studio API | LM Studio |
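As an illustration of how the provider-specific parameters combine with the common ones, a local Ollama setup might look like the following sketch. The model name, dimension count, and base URL are placeholders for your own deployment (11434 is Ollama's default port):

```python
# Ollama example: common parameters (model, embedding_dims) plus the
# provider-specific ollama_base_url pointing at the local server.
config = {
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",  # placeholder local embedding model
            "embedding_dims": 768,        # should match the model's output size
            "ollama_base_url": "http://localhost:11434",  # adjust for your server
        },
    }
}
```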