Azure AI Search (formerly known as “Azure Cognitive Search”) provides secure information retrieval at scale over user-owned content in traditional and generative AI search applications.

Usage

import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "sk-xx"   #this key is used for embedding purpose

config = {
    "vector_store": {
        "provider": "azure_ai_search",
        "config": {
            "service_name": "ai-search-test",
            "api_key": "*****",
            "collection_name": "mem0", 
            "embedding_model_dims": 1536 ,
            "use_compression": False
        }
    }
}

m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller movies? They can be quite engaging."},
    {"role": "user", "content": "I’m not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]
m.add(messages, user_id="alice", metadata={"category": "movies"})

Config

Let’s see the available parameters for the azure_ai_search config:

| Parameter | Description | Default Value |
| --- | --- | --- |
| service_name | Azure AI Search service name | None |
| api_key | API key of the Azure AI Search service | None |
| collection_name | The name of the collection/index to store the vectors; it will be created automatically if it does not exist | mem0 |
| embedding_model_dims | Dimensions of the embedding model | 1536 |
| use_compression | Use scalar quantization vector compression | False |
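For larger indexes, scalar quantization can reduce vector storage size at a small cost in recall. The config below is a sketch of the same setup with compression enabled; the service name and API key are placeholders.

config = {
    "vector_store": {
        "provider": "azure_ai_search",
        "config": {
            "service_name": "ai-search-test",   # placeholder service name
            "api_key": "*****",                 # placeholder API key
            "collection_name": "mem0",
            "embedding_model_dims": 1536,
            "use_compression": True             # enable scalar quantization compression
        }
    }
}

m = Memory.from_config(config)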