Mirror of https://github.com/danny-avila/LibreChat.git, synced 2025-12-17 17:00:15 +01:00
* chore: update @librechat/agents to version 2.1.9
* feat: xAI standalone provider for agents
* chore: bump librechat-data-provider version to 0.7.6997
* fix: reorder import statements and enhance user listing output
* fix: Update Docker Compose commands to support v2 syntax with fallback
* 🔧 fix: drop `reasoning_effort` for o1-preview/mini models (see the parameter-filtering sketch after this commit list)
* chore: requireLocalAuth logging
* fix: edge case artifact message editing logic to handle `new` conversation IDs
* fix: remove `temperature` from model options in OpenAIClient if o1-mini/preview
* fix: update type annotation for fetchPromisesMap to use Promise<string[]> instead of string[]
* feat: anthropic model fetching
* fix: update model name to use EModelEndpoint.openAI in fetchModels and fetchOpenAIModels
* fix: add error handling to modelController for loadModels
* fix: add error handling and logging for model fetching in loadDefaultModels (see the sketch after the file listing below)
* ci: update getAnthropicModels tests to be asynchronous
* feat: add user ID to model options in OpenAI and custom endpoint initialization
---------
Co-authored-by: Andrei Berceanu <andreicberceanu@gmail.com>
Co-authored-by: KiGamji <maloyh44@gmail.com>
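
The two o1-related fixes above (dropping `reasoning_effort` and removing `temperature` for o1-preview/mini) both come down to stripping request parameters those models reject before the request is sent. Below is a minimal sketch of that pattern in plain Node.js; the function name `sanitizeModelOptions` and the options shape are illustrative assumptions, not LibreChat's actual code:

```js
// Illustrative only: o1-preview and o1-mini reject `temperature` and
// `reasoning_effort`, so strip them before building the request.
// `sanitizeModelOptions` and `modelOptions` are hypothetical names,
// not LibreChat's real API.
function sanitizeModelOptions(modelOptions) {
  const { model = '' } = modelOptions;
  const isO1PreviewOrMini = /\bo1-(preview|mini)\b/.test(model);

  if (!isO1PreviewOrMini) {
    return modelOptions;
  }

  // Copy so the caller's object is left untouched.
  const sanitized = { ...modelOptions };
  delete sanitized.temperature;
  delete sanitized.reasoning_effort;
  return sanitized;
}

// Logs: { model: 'o1-mini', max_tokens: 1024 }
console.log(sanitizeModelOptions({
  model: 'o1-mini',
  temperature: 0.7,
  reasoning_effort: 'high',
  max_tokens: 1024,
}));
```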
| File |
|---|
| EndpointService.js |
| getCustomConfig.js |
| getEndpointsConfig.js |
| handleRateLimits.js |
| index.js |
| ldap.js |
| loadAsyncEndpoints.js |
| loadConfigEndpoints.js |
| loadConfigModels.js |
| loadConfigModels.spec.js |
| loadCustomConfig.js |
| loadCustomConfig.spec.js |
| loadDefaultEConfig.js |
| loadDefaultModels.js |
| loadOverrideConfig.js |
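
Among the files listed above, `loadDefaultModels.js` is the one the "add error handling and logging for model fetching" commit touches. A minimal sketch of that pattern under assumed names (`fetchOpenAIModels` and `logger` are placeholders, not the module's real exports):

```js
// Hypothetical sketch of wrapping model fetching in try/catch with logging,
// so a failed provider request degrades to an empty list instead of
// crashing the config load. Names are illustrative, not LibreChat's code.
const logger = console;

async function fetchOpenAIModels() {
  // Placeholder for a real request against the provider's /models endpoint.
  return ['gpt-4o', 'gpt-4o-mini'];
}

async function loadDefaultModels() {
  try {
    const openAI = await fetchOpenAIModels();
    return { openAI };
  } catch (error) {
    // Log the failure and fall back to an empty model list.
    logger.error('Failed to fetch default models', error);
    return { openAI: [] };
  }
}

loadDefaultModels().then((models) => logger.info(models));
```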