Supported Embedding Models
Ollama
You can use embedding models from Ollama to run Mem0 locally.
Usage
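Below is a minimal sketch of using the Ollama embedder with Mem0 via `Memory.from_config`. The model name and message text are illustrative; note that unless you also configure an LLM, Mem0 falls back to its default (OpenAI) LLM provider.

```python
from mem0 import Memory

# Point Mem0's embedder at a locally running Ollama model.
config = {
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",
        }
    }
}

m = Memory.from_config(config)

# Store a memory; the embedding is generated by the local Ollama model.
m.add("I'm planning a trip to Japan next month.", user_id="john")
```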
Config
Here are the parameters available for configuring the Ollama embedder:
| Parameter | Description | Default Value |
| --- | --- | --- |
| `model` | The name of the Ollama embedding model to use | `nomic-embed-text` |
| `embedding_dims` | Dimensions of the embedding model | `512` |
| `ollama_base_url` | Base URL for the Ollama connection | `None` |
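As a sketch, the parameters above map onto the embedder config like this (the base URL shown assumes Ollama's default local port):

```python
config = {
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",        # Ollama embedding model name
            "embedding_dims": 512,               # dimensionality of the embeddings
            "ollama_base_url": "http://localhost:11434",  # default local Ollama endpoint
        }
    }
}
```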