Mirror of https://github.com/danny-avila/LibreChat.git (synced 2025-12-17 00:40:14 +01:00)
feat: OpenRouter Support & Improve Model Fetching (#936)
* chore(ChatGPTClient.js): add support for OpenRouter API
* chore(OpenAIClient.js): add support for OpenRouter API
* chore: comment out token debugging
* chore: add back streamResult assignment
* chore: remove double condition/assignment from merging
* refactor(routes/endpoints): move logic to controller/services
* feat: add openrouter model fetching
* chore: remove unused endpointsConfig in cleanupPreset function
* refactor: separate models concern from endpointsConfig
* refactor(data-provider): add TModels type and make TEndpointsConfig adaptable to new endpoint keys
* refactor: complete models endpoint service in data-provider
* refactor: onMutate for refreshToken and login, invalidate models query
* feat: complete models endpoint logic for frontend
* chore: remove requireJwtAuth from /api/endpoints and /api/models as it is not implemented yet
* fix: endpoint will not be overwritten and will instead use the active value
* feat: openrouter support for plugins
* chore(EndpointOptionsDialog): remove unused recoil value
* refactor(schemas/parseConvo): handle secondaryModels by using the first of the defined secondary models (the last selected one is listed first), or default to the convo's secondary model value
* refactor: remove hooks from store and move them to hooks
* refactor(switchToConversation): make switchToConversation use the latest recoil state, which is necessary to get the most up-to-date models list; replace the wrapper function
* refactor(getDefaultConversation): factor the logic out into 3 pieces to reduce complexity
* fix: backend tests
* feat: optimistic update by calling newConvo when models are fetched
* feat: openrouter support for titling convos
* feat: cache models fetch
* chore: add missing dep to AuthContext useEffect
* chore: fix useTimeout types
* chore: delete old getDefaultConvo file
* chore: remove newConvo logic from Root, remove console log from api models caching
* chore: ensure bun is used for building in the b:client script
* fix: default endpoint will no longer default to null on a completely fresh login (no localStorage/cookies)
* chore: add openrouter docs to free_ai_apis.md and .env.example
* chore: remove openrouter console logs
* feat: add debugging env variable for Plugins
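The "feat: add openrouter model fetching" and "feat: cache models fetch" items above come down to querying OpenRouter's public model list and memoizing the result. Below is a minimal sketch of that idea, assuming Node 18+ (global fetch) and that the /models response follows the OpenAI-style `{ data: [...] }` shape OpenRouter returns; the cache layout and TTL are illustrative, not the project's actual implementation.

```typescript
// Sketch: fetch OpenRouter's model list and keep an in-memory copy so repeated
// lookups do not hit the upstream API every time. Response shape is assumed to
// be the OpenAI-style `{ data: [{ id: string }] }` that OpenRouter returns.
type ModelCache = { models: string[]; fetchedAt: number };

const TTL_MS = 30 * 60 * 1000; // refresh at most every 30 minutes (arbitrary choice)
let cache: ModelCache | null = null;

async function getOpenRouterModels(): Promise<string[]> {
  if (cache && Date.now() - cache.fetchedAt < TTL_MS) {
    return cache.models;
  }
  const res = await fetch('https://openrouter.ai/api/v1/models', {
    headers: { Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}` },
  });
  if (!res.ok) {
    // On failure, fall back to whatever was cached before (or an empty list).
    return cache?.models ?? [];
  }
  const body = (await res.json()) as { data: { id: string }[] };
  cache = { models: body.data.map((m) => m.id), fetchedAt: Date.now() };
  return cache.models;
}
```

Serving from the in-memory copy keeps repeated model lookups from hitting the upstream API on every request, while a failed refresh degrades to the last known list instead of throwing.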
Parent: ccb46164c0
Commit: fd70e21732
58 changed files with 809 additions and 523 deletions
.env.example (15 additions)
@@ -77,6 +77,19 @@ OPENAI_API_KEY=user_provided
 # https://github.com/waylaidwanderer/node-chatgpt-api#using-a-reverse-proxy
 # OPENAI_REVERSE_PROXY=
 
+##########################
+# OpenRouter (overrides OpenAI and Plugins Endpoints):
+##########################
+
+# OpenRouter is a legitimate proxy service to a multitude of LLMs, both closed and open source, including:
+# OpenAI models, Anthropic models, Meta's Llama models, pygmalionai/mythalion-13b
+# and many more open source models. Newer integrations are usually discounted, too!
+
+# Note: this overrides the OpenAI and Plugins Endpoints.
+# See ./docs/install/free_ai_apis.md for more info.
+
+# OPENROUTER_API_KEY=
+
 ##########################
 # AZURE Endpoint:
 ##########################
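The note in the hunk above says a single OPENROUTER_API_KEY overrides both the OpenAI and Plugins endpoints. Since OpenRouter exposes an OpenAI-compatible API, that override can be as small as swapping the key and base URL whenever the variable is set. Here is a rough sketch of that resolution step, using an illustrative EndpointConfig shape and function name rather than LibreChat's actual config objects:

```typescript
// Sketch: when OPENROUTER_API_KEY is set, point the OpenAI-compatible client at
// OpenRouter instead of api.openai.com; otherwise keep the normal OpenAI settings.
// The EndpointConfig shape and function name are illustrative, not LibreChat's code.
interface EndpointConfig {
  apiKey?: string;
  reverseProxyUrl?: string;
}

function resolveOpenAICompatibleConfig(): EndpointConfig {
  const openRouterKey = process.env.OPENROUTER_API_KEY;
  if (openRouterKey) {
    // OpenRouter speaks the OpenAI wire format, so overriding the key and the
    // base URL is enough to reroute both the OpenAI and Plugins endpoints.
    return { apiKey: openRouterKey, reverseProxyUrl: 'https://openrouter.ai/api/v1' };
  }
  return {
    apiKey: process.env.OPENAI_API_KEY,
    reverseProxyUrl: process.env.OPENAI_REVERSE_PROXY,
  };
}
```

Centralizing the check in one resolver keeps the fallback to OPENAI_API_KEY / OPENAI_REVERSE_PROXY intact when no OpenRouter key is present.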
@@ -156,6 +169,8 @@ BINGAI_TOKEN=user_provided
 # Leave it blank to use internal settings.
 # PLUGIN_MODELS=gpt-3.5-turbo,gpt-3.5-turbo-16k,gpt-3.5-turbo-0301,gpt-4,gpt-4-0314,gpt-4-0613
 
+DEBUG_PLUGINS=true # Set to false or comment out to disable debug mode for plugins
+
 # For securely storing credentials, you need a fixed key and IV. You can set them here for prod and dev environments
 # If you don't set them, the app will crash on startup.
 # You need a 32-byte key (64 characters in hex) and 16-byte IV (32 characters in hex)
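The last comment lines in the hunk above ask for a 32-byte key (64 hex characters) and a 16-byte IV (32 hex characters) for credential encryption. One way to generate values of the right length is Node's built-in crypto module; the printed names below are placeholders, so copy the values into whichever variables the app's .env expects:

```typescript
// Sketch: generate a 32-byte key (64 hex characters) and a 16-byte IV
// (32 hex characters), as required by the comments above. The printed
// names are placeholders; copy the values into the variables the app expects.
import { randomBytes } from 'crypto';

const key = randomBytes(32).toString('hex'); // 64 hex characters
const iv = randomBytes(16).toString('hex'); // 32 hex characters

console.log(`KEY=${key}`);
console.log(`IV=${iv}`);
```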