Overview
Zeus supports flexible model configuration. Users can connect any OpenAI-compatible model using their own API Key, or use Zeus preset models.

Configuration Sources
Model configuration is selected by the following priority: User custom configuration > Request parameters > System default configuration.
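
A minimal sketch of this priority resolution, assuming hypothetical dictionary keys (api_key, model); the actual selection logic in Zeus may differ:

```python
def resolve_llm_config(user_config: dict | None,
                       request_config: dict | None,
                       system_default: dict) -> dict:
    """Pick the model configuration by priority:
    user custom > request parameters > system default."""
    # A user custom configuration with its own API Key wins.
    if user_config and user_config.get("api_key"):
        return user_config
    # Otherwise fall back to whatever the request specifies.
    if request_config and request_config.get("model"):
        return request_config
    # Finally, use the system default configuration file.
    return system_default
```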
LLMManager

The LLM model manager is responsible for creating model instances. _init_model() receives the LLM configuration passed from the frontend (including Base URL, API Key, model name, temperature, etc.) and creates a ChatOpenAI instance.
It first checks whether a user custom model configuration exists (with the user's own API Key and Base URL). If not, it reads the system default configuration file and selects the corresponding model and key based on the provider. Any OpenAI-compatible API is supported (OpenAI, Anthropic via a proxy, local models, etc.).
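
A minimal sketch of what _init_model() might look like. The ChatOpenAI constructor arguments (base_url, api_key, model, temperature) are real langchain_openai parameters; the llm_config keys are assumptions for illustration:

```python
from langchain_openai import ChatOpenAI

def _init_model(llm_config: dict) -> ChatOpenAI:
    """Create a ChatOpenAI instance from the configuration
    passed by the frontend (or the resolved default)."""
    return ChatOpenAI(
        base_url=llm_config["base_url"],   # any OpenAI-compatible endpoint
        api_key=llm_config["api_key"],
        model=llm_config["model"],
        temperature=llm_config.get("temperature", 0.7),
    )
```

Because only the Base URL and API Key change, the same code path serves OpenAI, proxied Anthropic endpoints, and locally hosted models.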
Model Profiles
The system includes 200+ built-in model context window configurations, used for the SummarizationMiddleware trigger threshold calculation (see the sketch after the table):
| Model | max_input_tokens | Summarization Trigger (85%) |
|---|---|---|
| gpt-4o | 128,000 | 108,800 |
| claude-3.5-sonnet | 200,000 | 170,000 |
| gpt-4-turbo | 128,000 | 108,800 |
| Default | 64,000 | 54,400 |
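
A sketch of how the trigger threshold could be derived from these profiles. The values mirror the table above; the dictionary name and helper function are hypothetical:

```python
# Built-in context window sizes (max_input_tokens) for known models.
MODEL_PROFILES = {
    "gpt-4o": 128_000,
    "claude-3.5-sonnet": 200_000,
    "gpt-4-turbo": 128_000,
}
DEFAULT_MAX_INPUT_TOKENS = 64_000   # fallback for unknown models
SUMMARIZATION_RATIO = 0.85          # summarize once 85% of the window is used

def summarization_threshold(model_name: str) -> int:
    """Return the token count at which SummarizationMiddleware triggers."""
    max_input = MODEL_PROFILES.get(model_name, DEFAULT_MAX_INPUT_TOKENS)
    return int(max_input * SUMMARIZATION_RATIO)

# e.g. summarization_threshold("gpt-4o") -> 108800
```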