LibreChat/api/server/services/Endpoints/gptPlugins
Latest commit 6ba7f60eec by Danny Avila
🪙 feat: Configure Max Context and Output Tokens (#2648)
* chore: make frequent 'error' log into 'debug' log
* feat: add maxContextTokens as a conversation field
* refactor(settings): increase popover height
* feat: add DynamicInputNumber and maxContextTokens to all endpoints that support it (frontend), fix schema
* feat: maxContextTokens handling (backend)
* style: revert popover height
* feat: max tokens
* fix: Ollama Vision firebase compatibility
* fix: Ollama Vision, use message_file_map to determine multimodal request
* refactor: bring back MobileNav and improve title styling
2024-05-09 13:27:13 -04:00
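
For context, below is a minimal sketch of what the maxContextTokens / max output tokens handling described in the commit above could look like when building client options. The function name (buildTokenOptions), field names (contextLimit, outputLimit), and fallback value are illustrative assumptions, not LibreChat's actual buildOptions implementation.

```js
// Illustrative sketch only; names and defaults are assumptions,
// not LibreChat's actual buildOptions implementation.
const FALLBACK_CONTEXT_TOKENS = 4095;

/**
 * Clamp user-supplied token settings against the model's known limits
 * so a conversation-level override can never exceed what the model supports.
 */
function buildTokenOptions({ maxContextTokens, maxOutputTokens }, modelLimits = {}) {
  const { contextLimit = FALLBACK_CONTEXT_TOKENS, outputLimit } = modelLimits;

  const options = {
    // Use the conversation override if present, otherwise the model default.
    maxContextTokens: Math.min(maxContextTokens ?? contextLimit, contextLimit),
  };

  if (maxOutputTokens != null) {
    options.maxOutputTokens =
      outputLimit != null ? Math.min(maxOutputTokens, outputLimit) : maxOutputTokens;
  }

  return options;
}

// Example: a conversation requesting a 16k context on an ~8k model is clamped to the model limit.
console.log(
  buildTokenOptions(
    { maxContextTokens: 16000, maxOutputTokens: 1024 },
    { contextLimit: 8191, outputLimit: 4096 },
  ),
);
// -> { maxContextTokens: 8191, maxOutputTokens: 1024 }
```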
buildOptions.js 🪙 feat: Configure Max Context and Output Tokens (#2648) 2024-05-09 13:27:13 -04:00
index.js feat(Google): Support all Text/Chat Models, Response streaming, PaLM -> Google 🤖 (#1316) 2023-12-10 14:54:13 -05:00
initializeClient.js 🧑‍💻 refactor: Display Client-facing Errors (#2476) 2024-04-21 08:31:54 -04:00
initializeClient.spec.js 🧑‍💻 refactor: Display Client-facing Errors (#2476) 2024-04-21 08:31:54 -04:00