`config` is defined as a Python dictionary with two main keys:

- `llm`: Specifies the LLM provider and its configuration
  - `provider`: The name of the LLM provider (e.g., "openai", "groq")
  - `config`: A nested dictionary containing provider-specific settings

Settings passed in the `config` dictionary will override corresponding environment variables (e.g., `OPENAI_API_KEY`, `OPENAI_BASE_URL`), which in turn override default values.
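The structure above can be sketched as a plain dictionary. This is a minimal, illustrative example; the provider name, model, and parameter values here are assumptions chosen for demonstration, not required settings.

```python
# Minimal sketch of a config dictionary with the two main keys
# described above: "provider" names the LLM, and the nested
# "config" holds provider-specific settings.
config = {
    "llm": {
        "provider": "openai",          # e.g., "openai", "groq"
        "config": {
            "model": "gpt-4o-mini",    # illustrative model name
            "temperature": 0.1,
            "max_tokens": 2000,
            # "api_key" omitted: the OPENAI_API_KEY environment
            # variable is used as a fallback when it is not set here.
        },
    }
}
```

Because explicit `config` values take precedence, setting `api_key` here would override `OPENAI_API_KEY` from the environment.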
| Parameter | Description | Provider |
|---|---|---|
| model | Model to use | All |
| temperature | Temperature of the model | All |
| api_key | API key to use | All |
| max_tokens | Maximum number of tokens to generate | All |
| top_p | Probability threshold for nucleus sampling | All |
| top_k | Number of highest-probability tokens to keep | All |
| http_client_proxies | Proxy server settings for the HTTP client | AzureOpenAI |
| models | List of models | Openrouter |
| route | Routing strategy | Openrouter |
| openrouter_base_url | Base URL for Openrouter API | Openrouter |
| site_url | Site URL | Openrouter |
| app_name | Application name | Openrouter |
| ollama_base_url | Base URL for Ollama API | Ollama |
| openai_base_url | Base URL for OpenAI API | OpenAI |
| azure_kwargs | Azure LLM args for initialization | AzureOpenAI |
| deepseek_base_url | Base URL for DeepSeek API | DeepSeek |
| xai_base_url | Base URL for XAI API | XAI |
| sarvam_base_url | Base URL for Sarvam API | Sarvam |
| reasoning_effort | Reasoning level (low, medium, high) | Sarvam |
| frequency_penalty | Penalize frequent tokens (-2.0 to 2.0) | Sarvam |
| presence_penalty | Penalize existing tokens (-2.0 to 2.0) | Sarvam |
| seed | Seed for deterministic sampling | Sarvam |
| stop | Stop sequences (max 4) | Sarvam |
| lmstudio_base_url | Base URL for LM Studio API | LM Studio |
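Provider-specific parameters from the table go in the same nested `config` dictionary as the common ones. A hedged sketch using Ollama as the example provider; the model name and base URL below are placeholder assumptions, not defaults the library guarantees.

```python
# Sketch: combining common parameters (model, top_p) with a
# provider-specific one (ollama_base_url) from the table above.
config = {
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3",                          # placeholder model name
            "top_p": 0.9,                               # common parameter
            "ollama_base_url": "http://localhost:11434",  # Ollama-specific
        },
    }
}
```

Unrecognized or misspelled keys are not shared across providers, so an Ollama-only key such as `ollama_base_url` has no effect when `provider` is set to something else.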