Commit graph

18 commits

Neelesh Kumar
463ca5d613
📄 docs: Update apipie fetch.py in ai_endpoints.md (#2547)
* Update apipie fetch.py in ai_endpoints.md

Made the Python code more Pythonic

* fix bug that caused duplicate model_ids
2024-04-27 18:37:33 -04:00
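
For context on what that fetch.py snippet does, here is a minimal Python sketch of the same idea: fetch the APIpie model list and de-duplicate the model IDs before using them. The endpoint URL and response shape are assumptions for illustration, not taken from the commit.

```python
import requests

# Assumed models endpoint and response shape; adjust to the real APIpie API.
response = requests.get("https://apipie.ai/models")
response.raise_for_status()

# dict.fromkeys keeps the first occurrence of each id and preserves order,
# which avoids the duplicate model_ids the follow-up fix addressed.
model_ids = list(dict.fromkeys(model["id"] for model in response.json()))
print(model_ids)
```
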
Danny Avila
63ef15ab63
🦙 feat: Fetch list of Ollama Models (#2565)
* 🦙 feat: Fetch list of Ollama Models

* style: better Tag text styling for light mode
2024-04-27 18:27:04 -04:00
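
For reference, the model list this feature fetches is exposed by Ollama's own API; a minimal sketch, assuming a default local Ollama install:

```python
import requests

# Ollama lists locally available models at GET /api/tags (default port 11434).
response = requests.get("http://localhost:11434/api/tags")
response.raise_for_status()

model_names = [model["name"] for model in response.json()["models"]]
print(model_names)  # e.g. ["llama3:latest", "mistral:latest"]
```
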
Danny Avila
099aa9dead
feat: Stop Sequences for Conversations & Presets (#2536)
* feat: `stop` conversation parameter

* feat: Tag primitive

* feat: dynamic tags

* refactor: update tag styling

* feat: add stop sequences to OpenAI settings

* fix(Presentation): prevent `SidePanel` re-renders that make the side panel flicker

* refactor: use stop placeholder

* feat: type and schema update for `stop` and `TPreset` in generation-param-related types

* refactor: pass conversation to dynamic settings

* refactor(OpenAIClient): remove default handling for `modelOptions.stop`

* docs: fix Google AI Setup formatting

* feat: current_model

* docs: WIP update

* fix(ChatRoute): prevent default preset override before `hasSetConversation.current` becomes true by including latest conversation state as template

* docs: update docs with more info on `stop`

* chore: bump config_version

* refactor: CURRENT_MODEL handling
2024-04-25 11:40:17 -04:00
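
As background on the `stop` parameter this feature exposes: OpenAI-compatible chat completion APIs accept a string or a list of up to four stop sequences at which generation halts. A minimal sketch with the OpenAI Python client; the model name and stop sequences are only examples.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Count upward from 1, one number per line."}],
    stop=["5"],  # generation halts before the stop sequence is emitted
)
print(completion.choices[0].message.content)
```
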
Fuegovic
ca9a0fe629
🥧 feat: APIpie support (#2524)
2024-04-24 20:32:18 -04:00
Danny Avila
c96f067689
🔧 fix: Resolve Proper Dependencies to fix Application Error (#2488)
* chore: bump data-provider

* feat: script to check recent dependency updates

* fix: override vite/rollup version for vite build fix
- also remove unused vite-plugin-html
- add vite build to file output command

* chore: bump rollup override to last known working version (v4.16.0 is breaking)

* chore(vite): increase file size cache for workbox

* fix: resolve `openai` to the last known version supporting the latest Assistants v1 features and the default header

* chore: update openrouter examples
2024-04-22 12:52:30 -04:00
Fuegovic
4196a86fa9
🦙 doc update: llama3 (#2470)
* docs: update breaking_changes.md

* docs: update ai_endpoints.md -> llama3 for Ollama and groq

* librechat.yaml: update groq models

* Update breaking_changes.md

logs location

* Update breaking_changes.md

---------

Co-authored-by: Danny Avila <danny@librechat.ai>
2024-04-19 21:40:12 -04:00
David LaPorte
476767355b
🚅 docs(ai_endpoints): Reflect correct LiteLLM baseURL when using docker-compose (#2324)
Added a note on the LiteLLM baseURL to reflect docker-compose usage
2024-04-05 19:35:34 -04:00
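
The gist of that note, as a hedged sketch: when LibreChat and LiteLLM run in the same docker-compose network, the baseURL must use the LiteLLM service name rather than localhost. The service name and port below are typical compose values, not taken from the commit.

```python
import requests

# Inside the compose network, the LiteLLM proxy is reached by service name.
LITELLM_BASE_URL = "http://litellm:4000/v1"
# From the host machine it would instead be something like:
# LITELLM_BASE_URL = "http://localhost:4000/v1"

# If the proxy is started with a master key, an Authorization header is also needed.
models = requests.get(f"{LITELLM_BASE_URL}/models").json()
print(models)
```
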
Danny Avila
fb80af05be
🧠 fix(Cohere): map to expected SDK params (#2329)
2024-04-05 16:45:18 -04:00
Danny Avila
cd7f3a51e1
🧠 feat: Cohere support as Custom Endpoint (#2328)
* chore: bump cohere-ai, fix firebase vulnerabilities by going down versions

* feat: cohere rates and context windows

* feat(createCoherePayload): transform openai payload for cohere compatibility

* feat: cohere backend support

* refactor(UnknownIcon): optimize icon render and add cohere

* docs: add cohere to Compatible AI Endpoints

* Update ai_endpoints.md
2024-04-05 15:19:41 -04:00
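
To illustrate the kind of translation `createCoherePayload` performs (without claiming this is the commit's implementation): an OpenAI-style message list maps onto Cohere's `message` plus `chat_history`, and `stop` becomes `stop_sequences`. Field and role names reflect Cohere's v1 chat API as I understand it; treat this as an approximation.

```python
def create_cohere_payload(messages, stop=None, max_tokens=None):
    """Approximate OpenAI -> Cohere (v1 chat API) payload mapping."""
    role_map = {"user": "USER", "assistant": "CHATBOT", "system": "SYSTEM"}
    *history, last = messages  # Cohere takes the latest turn separately
    return {
        "message": last["content"],
        "chat_history": [
            {"role": role_map.get(m["role"], "USER"), "message": m["content"]}
            for m in history
        ],
        "stop_sequences": stop or [],
        "max_tokens": max_tokens,
    }


payload = create_cohere_payload(
    [
        {"role": "user", "content": "Hi"},
        {"role": "assistant", "content": "Hello!"},
        {"role": "user", "content": "Tell me a joke."},
    ],
    stop=["\n\n"],
)
```
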
Till Zoppke
ed17e17a73
📖 docs: Note on 'host.docker.internal' for Ollama Config (#2274)
* docs: update the URL to access Ollama and add a comment on 'host.docker.internal'

* Update ai_endpoints.md

---------

Co-authored-by: Danny Avila <danacordially@gmail.com>
2024-04-02 03:25:15 -04:00
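
A small Python sketch of the point behind that note (the container check is a heuristic for illustration, not LibreChat code): inside a Docker container, localhost refers to the container itself, so an Ollama server running on the host must be addressed as host.docker.internal.

```python
import os
import requests

# Crude "am I running inside Docker?" heuristic, for illustration only.
in_container = os.path.exists("/.dockerenv")
ollama_host = "host.docker.internal" if in_container else "localhost"

models = requests.get(f"http://{ollama_host}:11434/api/tags").json()["models"]
print([model["name"] for model in models])
```
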
Danny Avila
7f83a060a0
🔍 chore: Clean Up Documentation (#2217)
* fix(initializeClient.spec.js): remove condition failing test on local installations

* docs: remove comments and invalid HTML, as required by the embeddings generator, and add new documentation guidelines
2024-03-26 13:40:00 -04:00
Hermes Trismegistus
ed64c76053
📖 docs: Update ShuttleAI Fibonacci Image (#2160)
2024-03-21 22:41:58 -04:00
Hermes Trismegistus
1ee2c32a67
🚀 feat: Add ShuttleAI as Known Endpoint (#2152)
Added new Official Known Endpoint (ShuttleAI)
2024-03-21 09:17:57 -04:00
Fuegovic
db870e55c3
🔖 chore: update groq models (#2031)
2024-03-09 08:32:08 -05:00
bsu3338
78f52859c4
📚 docs: Separate LiteLLM and Ollama Documentation (#1948)
* Separate LiteLLM and Ollama Documentation

* Clarify Ollama Setup

* Fix litellm config
2024-03-02 12:42:02 -05:00
Fuegovic
53ae2d7bfb
🤖 feat: add multiple known endpoints (#1917)
* feat: add known endpoints

* docs: add known endpoints

* update ai_endpoints.md

remove the groq icon from the example

* Update ai_endpoints.md

---------

Co-authored-by: Danny Avila <messagedaniel@protonmail.com>
2024-02-28 08:46:21 -05:00
Danny Avila
c37d5568bf
🍞 fix: Minor fixes and improved Bun support (#1916)
* fix(bun): fix bun compatibility to allow gzip header: https://github.com/oven-sh/bun/issues/267#issuecomment-1854460357

* chore: update custom config examples

* fix(OpenAIClient.chatCompletion): remove the redundant call to stream.controller.abort(), since `break` already aborts the request; dropping the extra call prevents abort errors

* chore: bump bun.lockb

* fix: remove result-thinking class when message is no longer streaming

* fix(bun): improve Bun support by forcing use of the old method in the Bun environment; also update the old methods with the new customizable params

* fix(ci): pass tests
2024-02-27 17:51:16 -05:00
Danny Avila
5d887492ea
🤖 docs: Add Groq and other Compatible AI Endpoints (#1915)
* chore: bump bun dependencies

* feat: make `groq` a known endpoint

* docs: compatible ai endpoints

* Update ai_endpoints.md

* Update ai_endpoints.md
2024-02-27 13:42:10 -05:00
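
For context on why groq can be a known endpoint at all: Groq serves an OpenAI-compatible API, so pointing a standard OpenAI client at its base URL is enough. A minimal sketch; the model name is an example from that period and may need updating.

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)
reply = client.chat.completions.create(
    model="llama3-8b-8192",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(reply.choices[0].message.content)
```
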