Mastra
This guide shows how to use Mastra's agent system with Mem0 as the memory backend through custom tools, enabling agents to remember and recall information across conversations.
Overview
In this guide, we’ll create a Mastra agent that:
- Uses Mem0 to store information using a memory tool
- Retrieves relevant memories using a search tool
- Provides personalized responses based on past interactions
- Maintains context across conversations and sessions
Setup and Configuration
Install the required libraries:
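A minimal setup for a Node.js/TypeScript project might look like the following; the package list assumes Mastra's core package, the Mem0 integration package, the Vercel AI SDK OpenAI provider for the model, and Zod for tool schemas, and may differ in your project:

```bash
npm install @mastra/core @mastra/mem0 @ai-sdk/openai zod
```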
Set up your environment variables:
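Assuming the integration reads the Mem0 API key from `MEM0_API_KEY` and that an OpenAI model backs the agent, the environment might be configured like this (adjust the variable names to your providers):

```bash
export MEM0_API_KEY=your-mem0-api-key
export OPENAI_API_KEY=your-openai-api-key
```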
Initialize Mem0 Integration
Import required modules and set up the Mem0 integration:
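A sketch of the integration setup, assuming `@mastra/mem0` exposes a `Mem0Integration` class configured with an API key and a user identifier; the exact config field names (such as `user_id`) are assumptions, so verify them against the package's types:

```typescript
// integrations/index.ts
import { Mem0Integration } from '@mastra/mem0';

// One Mem0Integration instance scoped to a single user keeps that user's
// memories separate from everyone else's.
export const mem0 = new Mem0Integration({
  config: {
    apiKey: process.env.MEM0_API_KEY || '',
    user_id: 'alice', // assumed field name: unique identifier for this user's memory space
  },
});
```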
Create Memory Tools
Set up tools for memorizing and remembering information:
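A sketch of the two memory tools, assuming Mastra's `createTool` helper with Zod schemas and integration methods named `createMemory` and `searchMemory`; verify those method names against the `@mastra/mem0` package before relying on them:

```typescript
// tools/index.ts
import { createTool } from '@mastra/core/tools';
import { z } from 'zod';
import { mem0 } from '../integrations';

// Tool the agent calls to look up previously saved information.
export const mem0RememberTool = createTool({
  id: 'Mem0-remember',
  description:
    'Recall information you previously saved with the Mem0-memorize tool.',
  inputSchema: z.object({
    question: z.string().describe('Question used to search saved memories.'),
  }),
  outputSchema: z.object({
    answer: z.string().describe('The remembered answer'),
  }),
  execute: async ({ context }) => {
    // searchMemory is assumed to run a semantic search over the user's memories.
    const memory = await mem0.searchMemory(context.question);
    return { answer: memory };
  },
});

// Tool the agent calls to store new information.
export const mem0MemorizeTool = createTool({
  id: 'Mem0-memorize',
  description:
    'Save information to Mem0 so it can be recalled later with the Mem0-remember tool.',
  inputSchema: z.object({
    statement: z.string().describe('A statement to save into memory'),
  }),
  execute: async ({ context }) => {
    // Fire-and-forget save so the agent's response is not blocked while Mem0 persists it.
    void mem0.createMemory(context.statement).then(() => {
      console.log(`Memory saved: "${context.statement}"`);
    });
    return { success: true };
  },
});
```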
Create Mastra Agent
Initialize an agent with memory tools and clear instructions:
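A sketch of the agent definition, assuming the Vercel AI SDK's OpenAI provider for the model (any model supported by Mastra works) and the tool module from the previous step:

```typescript
// agents/index.ts
import { openai } from '@ai-sdk/openai';
import { Agent } from '@mastra/core/agent';
import { mem0MemorizeTool, mem0RememberTool } from '../tools';

export const mem0Agent = new Agent({
  name: 'Mem0 Agent',
  instructions: `
    You are a helpful assistant that can memorize and recall facts using Mem0.
    Use the Mem0-memorize tool to save information worth remembering, and use
    the Mem0-remember tool to look up saved information before answering
    questions about the user.
  `,
  model: openai('gpt-4o'),
  tools: { mem0RememberTool, mem0MemorizeTool },
});
```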
Key Features
- Tool-based Memory Control: The agent decides when to save and retrieve information using specific tools
- Semantic Search: Mem0 finds relevant memories based on semantic similarity, not just exact matches
- User-specific Memory Spaces: Each user_id maintains separate memory contexts
- Asynchronous Saving: Memories are saved in the background to reduce response latency
- Cross-conversation Persistence: Memories persist across different conversation threads (see the usage sketch after this list)
- Transparent Operations: Memory operations are visible through tool usage
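As referenced above, here is a hypothetical usage sketch exercising cross-conversation persistence: the first call should lead the agent to invoke the memorize tool, and a later, separate call should recall the stored fact via the remember tool. The file paths and prompts are illustrative only:

```typescript
import { mem0Agent } from './agents';

async function main() {
  // First conversation: the agent is expected to call Mem0-memorize.
  const first = await mem0Agent.generate(
    'Please remember that I am vegetarian and I live in Berlin.',
  );
  console.log(first.text);

  // A later, separate conversation: the agent is expected to call Mem0-remember.
  const second = await mem0Agent.generate(
    'Can you suggest a dinner option that fits my diet?',
  );
  console.log(second.text);
}

main().catch(console.error);
```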
Conclusion
By integrating Mastra with Mem0, you can build intelligent agents that learn and remember information across conversations. The tool-based approach provides transparency and control over memory operations, making it easy to create personalized and context-aware AI experiences.
Help
- For more details on Mastra, visit the Mastra documentation.
- For Mem0 documentation, refer to the Mem0 Platform.
- If you need further assistance, please feel free to reach out to us.