Set your Mem0 Platform API key as an environment variable. You can replace <your-mem0-api-key> with your actual API key:
You can obtain your Mem0 Platform API key from the Mem0 Platform.
```python
from dotenv import load_dotenv
import os

load_dotenv()
# os.environ["MEM0_API_KEY"] = "<your-mem0-api-key>"
```
Import the necessary modules and create a Mem0Memory instance:
```python
from llama_index.memory.mem0 import Mem0Memory

context = {"user_id": "alice"}

memory_from_client = Mem0Memory.from_client(
    context=context,
    search_msg_limit=4,  # optional, default is 5
)
```
Context is used to identify the user, agent, or conversation in Mem0. At least one context field must be passed to the Mem0Memory constructor. It can be any of the following:
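For illustration, a context dict might take any of the following shapes. The field names `user_id`, `agent_id`, and `run_id` are assumptions based on Mem0's client API; check the Mem0 documentation for the exact set of supported fields:

```python
# Any one of these fields scopes which memories are stored and retrieved.
# Field names are assumed from Mem0's client API.
user_context = {"user_id": "alice"}           # memories scoped to a user
agent_context = {"agent_id": "assistant-1"}   # memories scoped to an agent
run_context = {"run_id": "session-42"}        # memories scoped to one conversation

print(user_context, agent_context, run_context)
```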
search_msg_limit is optional and defaults to 5. It controls how many messages from the chat history are used for memory retrieval from Mem0. Using more messages gives the retrieval more context, but it also increases retrieval time and may surface irrelevant results.
search_msg_limit is different from limit: limit is the number of messages retrieved from Mem0 and is used in search.
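As a rough sketch of the idea (not Mem0's actual implementation), search_msg_limit caps how much recent chat history is used to build the retrieval query:

```python
# Hypothetical chat history for illustration; with search_msg_limit=4,
# only the last 4 messages would feed the memory-retrieval query.
chat_history = [
    "Hi, my name is Alice",
    "I live in San Francisco",
    "I love hiking on weekends",
    "I'm vegetarian",
    "What should I cook tonight?",
]
search_msg_limit = 4
retrieval_input = chat_history[-search_msg_limit:]
print(retrieval_input)
```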
Use SimpleChatEngine to start a chat with an agent that has memory.
```python
from llama_index.core.chat_engine import SimpleChatEngine

agent = SimpleChatEngine.from_defaults(
    llm=llm,
    memory=memory_from_client,  # or memory_from_config
)

# Start the chat
response = agent.chat("Hi, My name is Alice")
print(response)
```
Now we will learn how to use Mem0 with FunctionCalling and ReAct agents. First, initialize the tools:
```python
from llama_index.core.tools import FunctionTool


def call_fn(name: str):
    """Call the provided name.

    Args:
        name: str (Name of the person)
    """
    print(f"Calling... {name}")


def email_fn(name: str):
    """Email the provided name.

    Args:
        name: str (Name of the person)
    """
    print(f"Emailing... {name}")


call_tool = FunctionTool.from_defaults(fn=call_fn)
email_tool = FunctionTool.from_defaults(fn=email_fn)
```
```python
from llama_index.core.agent import FunctionCallingAgent

agent = FunctionCallingAgent.from_tools(
    [call_tool, email_tool],
    llm=llm,
    memory=memory_from_client,  # or memory_from_config
    verbose=True,
)

# Start the chat
response = agent.chat("Hi, My name is Alice")
print(response)
```
```python
from llama_index.core.agent import ReActAgent

agent = ReActAgent.from_tools(
    [call_tool, email_tool],
    llm=llm,
    memory=memory_from_client,  # or memory_from_config
    verbose=True,
)

# Start the chat
response = agent.chat("Hi, My name is Alice")
print(response)
```
By integrating LlamaIndex with Mem0, you can build a personalized agent that maintains context across interactions and provides tailored recommendations and assistance.