Mem0 includes built-in support for various popular large language models (LLMs). Memory can use the LLM you provide, tailoring it to your specific needs.

Usage

To use an LLM, you must provide a configuration to customize its usage. If no configuration is supplied, a default configuration is applied and OpenAI is used as the LLM.

For a comprehensive list of available parameters for LLM configuration, please refer to Config.

To view all supported LLMs, see Supported LLMs.
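As a minimal sketch, a configuration is a nested dictionary keyed by component. The provider name, model, and parameter values below are illustrative assumptions, not a verified API surface:

```python
# A minimal LLM configuration sketch (model name and parameters are illustrative).
config = {
    "llm": {
        "provider": "openai",       # which LLM backend to use
        "config": {
            "model": "gpt-4o",      # assumed model identifier
            "temperature": 0.1,     # lower temperature for consistent memory extraction
            "max_tokens": 2000,
        },
    }
}

# The configuration is then passed when creating the Memory instance
# (requires mem0 to be installed and an OpenAI API key to be set):
# from mem0 import Memory
# m = Memory.from_config(config)
# m.add("I prefer vegetarian food", user_id="alice")
```

If the `llm` key is omitted entirely, the default configuration described above applies.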

Structured vs Unstructured Outputs

Mem0 supports two output formats for OpenAI LLMs, each with its own strengths and use cases:

Structured Outputs

Structured outputs come from models that follow OpenAI’s structured outputs format:

  • Optimized for: Returning structured responses (e.g., JSON objects)
  • Benefits: Precise, easily parseable data
  • Ideal for: Data extraction, form filling, API responses
  • Learn more: OpenAI Structured Outputs Guide

Unstructured Outputs

Unstructured outputs correspond to OpenAI’s standard, free-form text model:

  • Flexibility: Returns open-ended, natural language responses
  • Customization: Use the response_format parameter to guide output
  • Trade-off: Less efficient than structured outputs for specific data needs
  • Best for: Creative writing, explanations, general conversation
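For unstructured outputs, the standard provider is used and the response_format parameter mentioned above can optionally guide the output. Whether response_format is passed through to the underlying OpenAI call is an assumption here; treat this as a sketch:

```python
# Standard free-form provider; parameter values are illustrative.
unstructured_config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.7,  # higher temperature suits open-ended, conversational responses
            # Optionally guide the output shape (assumed pass-through to OpenAI):
            # "response_format": {"type": "json_object"},
        },
    }
}
```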

Choose the format that best suits your application’s requirements for optimal performance and usability.