Supported LLMs
Ollama
You can use LLMs from Ollama to run Mem0 locally. These models support tool calling.
Usage
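A minimal configuration sketch follows. The model name, base URL, and parameter names such as `ollama_base_url` are illustrative assumptions here; consult the Master List of All Params in Config for the authoritative set.

```python
# Example Mem0 config selecting Ollama as the LLM provider.
config = {
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.1:8b",                       # any locally pulled, tool-capable model
            "temperature": 0.1,
            "max_tokens": 2000,
            "ollama_base_url": "http://localhost:11434",  # default Ollama endpoint
        },
    },
}

# Typical usage (requires mem0 installed and an Ollama server running):
#   from mem0 import Memory
#   m = Memory.from_config(config)
#   m.add("I prefer vegetarian food.", user_id="alice")
```

Because Ollama serves models on your own machine, no API key is needed; Mem0 talks to the local endpoint directly.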
Config
All available parameters for the ollama config are listed in the Master List of All Params in Config.