Mirror of https://github.com/danny-avila/LibreChat.git (synced 2025-12-16 16:30:15 +01:00)
feat: OpenRouter Support & Improve Model Fetching (#936)
* chore(ChatGPTClient.js): add support for OpenRouter API
* chore(OpenAIClient.js): add support for OpenRouter API
* chore: comment out token debugging
* chore: add back streamResult assignment
* chore: remove double condition/assignment from merging
* refactor(routes/endpoints): move logic into controller/services
* feat: add OpenRouter model fetching
* chore: remove unused endpointsConfig in cleanupPreset function
* refactor: separate models concern from endpointsConfig
* refactor(data-provider): add TModels type and make TEndpointsConfig adaptable to new endpoint keys
* refactor: complete models endpoint service in data-provider
* refactor: onMutate for refreshToken and login, invalidate models query
* feat: complete models endpoint logic for frontend
* chore: remove requireJwtAuth from /api/endpoints and /api/models as not implemented yet
* fix: endpoint will not be overwritten and instead uses the active value
* feat: OpenRouter support for Plugins
* chore(EndpointOptionsDialog): remove unused recoil value
* refactor(schemas/parseConvo): handle secondaryModels by using the first of the defined secondary models (the last selected one is listed first), or default to the convo's secondary model value
* refactor: remove hooks from store and move them to hooks
* refactor(switchToConversation): use the latest recoil state, which is necessary to get the most up-to-date models list; replace the wrapper function
* refactor(getDefaultConversation): factor logic out into 3 pieces to reduce complexity
* fix: backend tests
* feat: optimistic update by calling newConvo when models are fetched
* feat: OpenRouter support for titling convos
* feat: cache models fetch
* chore: add missing dep to AuthContext useEffect
* chore: fix useTimeout types
* chore: delete old getDefaultConvo file
* chore: remove newConvo logic from Root, remove console log from API models caching
* chore: ensure bun is used for building in the b:client script
* fix: default endpoint will no longer default to null on a completely fresh login (no localStorage/cookies)
* chore: add OpenRouter docs to free_ai_apis.md and .env.example
* chore: remove OpenRouter console logs
* feat: add debugging env variable for Plugins
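Among these changes, the model-fetching feature pulls the list of available models from OpenRouter rather than hard-coding it. As a rough illustration only (the endpoint path comes from OpenRouter's public API, not from this commit), the equivalent request looks like this:

```bash
# List the models OpenRouter currently serves (OpenAI-compatible REST endpoint).
# Uses the same OPENROUTER_API_KEY described in the docs below.
curl -s https://openrouter.ai/api/v1/models \
  -H "Authorization: Bearer $OPENROUTER_API_KEY"
```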
This commit is contained in:
parent ccb46164c0
commit fd70e21732
58 changed files with 809 additions and 523 deletions
free_ai_apis.md
@@ -1,8 +1,34 @@
# Free AI APIs
-There are APIs offering free access to AI APIs via reverse proxy, and one of the major players, compatible with LibreChat, is NagaAI.
+There are APIs offering free/free-trial access to AI APIs via reverse proxy.
-Feel free to check out the others, but I haven't personally tested them: [Free AI APIs](https://github.com/NovaOSS/free-ai-apis)
+Here is a well-maintained public list of [Free AI APIs](https://github.com/NovaOSS/free-ai-apis) that may or may not be compatible with LibreChat.
### [OpenRouter](https://openrouter.ai/) (preferred)
While not completely free, you get free trial credits when you [sign up for OpenRouter](https://openrouter.ai/), a legitimate proxy service for a multitude of LLMs, both closed-source and open-source, including:
- OpenAI models (great if you are barred from their API for whatever reason)
- Anthropic Claude models (same as above)
- Meta's Llama models
- pygmalionai/mythalion-13b
- and many more open source models. Newer integrations are usually discounted, too!
OpenRouter is so great that I decided to integrate it into the project as a standalone feature.
**Setup:**
- Sign up for [OpenRouter](https://openrouter.ai/) and create an API key. You should name the key and set a usage limit as well.
- Set the environment variable `OPENROUTER_API_KEY` in your `.env` file to the key you just created, as shown in the sketch after this list.
- Restart your LibreChat server and use the OpenAI or Plugins endpoints.
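A minimal `.env` sketch for these steps (the key value is a placeholder, not a real credential):

```bash
# .env (sketch) -- enables OpenRouter for the OpenAI and Plugins endpoints
# Replace the placeholder with the key you created on openrouter.ai
OPENROUTER_API_KEY=sk-or-your-key-here
```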
**Notes:**
- [TODO] **In the future, you will be able to set up OpenRouter from the frontend as well.**
- Setting `OPENROUTER_API_KEY` will override the official OpenAI API or your reverse proxy settings for both the Plugins and OpenAI endpoints.
- On initial setup, you may need to refresh your page twice to see all of OpenRouter's supported models populate automatically.
- Plugins: Functions Agent works with OpenRouter when using OpenAI models.
- Plugins: Turn functions off to try plugins with non-OpenAI models (ChatGPT plugins will not work and others may not work as expected).
- Plugins: If you have Azure configured and want to use OpenRouter, make sure `PLUGINS_USE_AZURE` is not set in your `.env` file (see the sketch below).
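A sketch of the relevant `.env` entries for that case, assuming Azure credentials stay in place but Plugins should route through OpenRouter (the value shown for `PLUGINS_USE_AZURE` is illustrative):

```bash
# .env (sketch) -- keep any Azure credentials, but leave PLUGINS_USE_AZURE unset
# PLUGINS_USE_AZURE=true   # commented out / removed so Plugins use OpenRouter
OPENROUTER_API_KEY=sk-or-your-key-here
```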
> ⚠️ OpenRouter is in a category of its own and is highly recommended over the "free" services below. NagaAI and other "free" API proxies tend to have intermittent issues, data leaks, and/or problems with the guidelines of the platforms they advertise on. Use the services below at your own risk.
### NagaAI