How Mem0 Searches Memory
Mem0’s search operation lets agents ask natural-language questions and get back the memories that matter most. Like a smart librarian, it finds exactly what you need from everything you’ve stored.
Why it matters
- Retrieves the right facts without rebuilding prompts from scratch.
- Supports both managed Platform and OSS so you can test locally and deploy at scale.
- Keeps results relevant with filters, rerankers, and thresholds.
Key terms
- Query – Natural-language question or statement you pass to search.
- Filters – JSON logic (AND/OR, comparison operators) that narrows results by user, categories, dates, etc.
- top_k / threshold – Controls how many memories return and the minimum similarity score.
- Rerank – Optional second pass that boosts precision when a reranker is configured.
Architecture

Architecture diagram illustrating the memory search process.
1. Query processing: Mem0 cleans and enriches your natural-language query so the downstream embedding search is accurate.
2. Vector search: Embeddings locate the closest memories using cosine similarity across your scoped dataset.
3. Filtering & reranking: Logical filters narrow candidates; rerankers or thresholds fine-tune ordering.
4. Results delivery: Formatted memories (with metadata and timestamps) return to your agent or calling service.
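The four steps can be sketched end to end. This is an illustrative toy (hand-rolled embeddings and an in-memory store), not Mem0's actual implementation:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# The vector store from step 2: each memory pairs text with a (toy) embedding
store = [
    {"memory": "Alice prefers vegetarian food", "user_id": "alice", "vec": [0.9, 0.1, 0.0]},
    {"memory": "Alice is allergic to peanuts",  "user_id": "alice", "vec": [0.8, 0.3, 0.1]},
    {"memory": "Bob likes hiking",              "user_id": "bob",   "vec": [0.0, 0.2, 0.9]},
]

def search(query_vec, user_id, top_k=2, threshold=0.0):
    # Step 3: filter to the scoped dataset, then score by cosine similarity
    candidates = [m for m in store if m["user_id"] == user_id]
    scored = [{"memory": m["memory"], "score": cosine(query_vec, m["vec"])}
              for m in candidates]
    # Step 4: keep results above the threshold, best first, top_k at most
    scored = [s for s in scored if s["score"] >= threshold]
    return sorted(scored, key=lambda s: s["score"], reverse=True)[:top_k]

results = search([1.0, 0.2, 0.0], user_id="alice", top_k=2, threshold=0.5)
```

Here both of Alice's memories clear the threshold, Bob's memory is filtered out by scoping, and the closest match sorts first.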
How does it work?
Search converts your natural-language question into a vector embedding, then finds memories with similar embeddings in your database. The results are ranked by similarity score and can be further refined with filters or reranking.
When should you use it?
- Context retrieval - When your agent needs past context to generate better responses
- Personalization - To recall user preferences, history, or past interactions
- Fact checking - To verify information against stored memories before responding
- Decision support - When agents need relevant background information to make decisions
Platform vs OSS usage
| Capability | Mem0 Platform | Mem0 OSS |
|---|---|---|
| user_id usage | In filters={"user_id": "alice"} for search/get_all | As parameter user_id="alice" for all operations |
| Filter syntax | Logical operators (AND, OR, comparisons) with field-level access | Basic field filters, extend via Python hooks |
| Reranking | Toggle rerank=True with managed reranker catalog | Requires configuring local or third-party rerankers |
| Thresholds | Request-level configuration (threshold, top_k) | Controlled via SDK parameters |
| Response metadata | Includes confidence scores, timestamps, dashboard visibility | Determined by your storage backend |
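The AND/OR filter semantics in the table can be illustrated with a tiny evaluator. This is a toy that mimics the logic for equality filters only, not Mem0's filter engine:

```python
def matches(memory, node):
    # Recursively evaluate an AND/OR filter tree against one memory document
    if "AND" in node:
        return all(matches(memory, child) for child in node["AND"])
    if "OR" in node:
        return any(matches(memory, child) for child in node["OR"])
    # Leaf node: {field: expected_value}; "*" acts as a wildcard
    field, expected = next(iter(node.items()))
    return expected == "*" or memory.get(field) == expected

memories = [
    {"memory": "Likes jazz",   "user_id": "alice", "categories": "music"},
    {"memory": "Owns a cat",   "user_id": "alice", "categories": "pets"},
    {"memory": "Plays guitar", "user_id": "bob",   "categories": "music"},
]

# Alice's memories that are about music or pets
flt = {"AND": [{"user_id": "alice"},
               {"OR": [{"categories": "music"}, {"categories": "pets"}]}]}
hits = [m["memory"] for m in memories if matches(m, flt)]
```

Bob's memory fails the outer AND even though its category matches the inner OR, which is the behavior nested filters give you on the Platform.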
Search with Mem0 Platform
Search with Mem0 Open Source
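With both products the call is a single search invocation; the difference is where scoping lives. A sketch of the two call shapes, following the table above (the exact argument names are assumptions to verify against the SDK reference):

```python
query = "What food does Alice like?"

# Platform style: scoping and logic go in a `filters` document
platform_kwargs = {
    "query": query,
    "filters": {"AND": [{"user_id": "alice"}]},
    "top_k": 5,
    "rerank": True,
}
# e.g. client.search(**platform_kwargs) on a configured MemoryClient

# OSS style: scoping is a plain keyword argument on Memory.search
oss_kwargs = {
    "query": query,
    "user_id": "alice",
    "limit": 5,
}
# e.g. memory.search(**oss_kwargs) on a configured Memory instance
```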
Expect an array of memory documents. Platform responses include vectors, metadata, and timestamps; OSS returns your stored schema.
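Whatever the backend, downstream code typically thresholds and orders the returned documents before building a prompt. A sketch against a hypothetical response (field names like `memory` and `score` are assumptions about the shape):

```python
# Hypothetical search response: a list of memory documents with scores
results = [
    {"id": "mem-1", "memory": "Prefers window seats", "score": 0.91,
     "created_at": "2024-05-01T12:00:00Z"},
    {"id": "mem-2", "memory": "Flew to Tokyo last spring", "score": 0.44,
     "created_at": "2024-03-10T09:30:00Z"},
]

# Keep only confident matches and order them best-first before prompting
relevant = sorted(
    (r for r in results if r["score"] >= 0.5),
    key=lambda r: r["score"],
    reverse=True,
)
context = "\n".join(f"- {r['memory']}" for r in relevant)
```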
Filter patterns
Filters help narrow down search results. A common use case is filtering by session context, for example restricting Platform results to a specific user_id and run_id.
Tips for better search
- Use natural language: Mem0 understands intent, so describe what you’re looking for naturally
- Scope with user ID: Always provide user_id to scope search to relevant memories
  - Platform API: Use filters={"user_id": "alice"}
  - OSS: Use user_id="alice" as a parameter
- Combine filters: Use AND/OR logic to create precise queries (Platform)
- Consider wildcard filters: Use wildcard filters (e.g., run_id: "*") for broader matches
- Tune parameters: Adjust top_k for result count and threshold for the relevance cutoff
- Enable reranking: Use rerank=True (the default) when you have a reranker configured
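The reranking tip can be illustrated with a toy second pass: a first pass retrieves candidates by vector score, then a stand-in reranker reorders them by lexical overlap with the query (real rerankers use a cross-encoder model, not word overlap):

```python
def rerank(query, candidates):
    # Second pass: reorder the candidate set by word overlap with the query.
    q_words = set(query.lower().split())
    def overlap(candidate):
        return len(q_words & set(candidate["memory"].lower().split()))
    return sorted(candidates, key=overlap, reverse=True)

# First-pass results, ordered by vector similarity score
candidates = [
    {"memory": "Enjoys long walks on the beach", "score": 0.82},
    {"memory": "Prefers oat milk in coffee",     "score": 0.80},
]

reranked = rerank("what milk does she put in coffee", candidates)
```

The second candidate wins the rerank despite its lower vector score, which is the precision boost the second pass is for.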
More Details
For the full list of filter logic, comparison operators, and optional search parameters, see the Search Memory API Reference.
Put it into practice
- Revisit the Add Memory guide to ensure you capture the context you expect to retrieve.
- Configure rerankers and filters in Advanced Retrieval for higher precision.
See it live
- Support Inbox with Mem0 demonstrates scoped search with rerankers.
- Tavily Search with Mem0 shows hybrid search in action.