Overview

Zeus supports flexible model configuration. Users can connect any OpenAI-compatible model using their own API Key, or use Zeus's preset models.

Configuration Sources

Model configuration is resolved in the following priority order: user custom configuration > request parameters > system default configuration.
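The priority chain above can be sketched as a simple layered merge. This is an illustrative sketch, not Zeus's actual code; the function name resolve_llm_config and the default values are hypothetical:

```python
# Hypothetical system defaults; real values come from Zeus's config file.
SYSTEM_DEFAULTS = {"model": "gpt-4o", "temperature": 0.7}

def resolve_llm_config(user_config=None, request_params=None):
    """Merge configuration sources; later updates overwrite earlier ones,
    so the last layer applied has the highest priority."""
    config = dict(SYSTEM_DEFAULTS)       # lowest priority: system defaults
    config.update(request_params or {})  # middle priority: request parameters
    config.update(user_config or {})     # highest priority: user custom config
    return config

# The user's custom model wins over the request parameter;
# the request's temperature wins over the system default.
cfg = resolve_llm_config(
    user_config={"model": "claude-3.5-sonnet"},
    request_params={"model": "gpt-4-turbo", "temperature": 0.2},
)
```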

LLMManager

The LLM model manager is responsible for creating model instances. _init_model() receives the LLM configuration passed from the frontend (Base URL, API Key, model name, temperature, etc.) and creates a ChatOpenAI instance. It first checks whether a user custom model configuration (with its own API Key and Base URL) exists; if not, it reads the system default configuration file and selects the model and key for the requested provider. Any OpenAI-compatible API is supported (OpenAI, Anthropic via a proxy, local models, etc.).

Model Profiles

The system includes 200+ built-in model context window configurations, used for SummarizationMiddleware trigger threshold calculation:
Model               max_input_tokens   Summarization Trigger (85%)
gpt-4o              128,000            108,800
claude-3.5-sonnet   200,000            170,000
gpt-4-turbo         128,000            108,800
Default             64,000             54,400
When a model is not in the predefined list, default configuration is used.
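The lookup-with-fallback behavior can be sketched as follows. The names MODEL_PROFILES and summarization_trigger are hypothetical; the token counts and the 85% ratio come from the table above:

```python
# A few of the built-in context window profiles (max_input_tokens).
MODEL_PROFILES = {
    "gpt-4o": 128_000,
    "claude-3.5-sonnet": 200_000,
    "gpt-4-turbo": 128_000,
}
DEFAULT_MAX_INPUT_TOKENS = 64_000  # used when the model is not in the list
TRIGGER_RATIO = 0.85               # SummarizationMiddleware fires at 85%

def summarization_trigger(model_name: str) -> int:
    """Return the token count at which summarization is triggered."""
    max_tokens = MODEL_PROFILES.get(model_name, DEFAULT_MAX_INPUT_TOKENS)
    return int(max_tokens * TRIGGER_RATIO)

print(summarization_trigger("gpt-4o"))         # → 108800
print(summarization_trigger("unknown-model"))  # → 54400 (default profile)
```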