LibreChat/api/server/services/Endpoints

Latest commit 5d40d0a37a by Danny Avila:
⚙️ feat: Adjust Rate of Stream Progress (#3244)
* chore: bump data-provider and add MESSAGES CacheKey

* refactor: avoid saving messages while streaming, save partial text to cache instead

* fix(ci): processChunks

* chore: log aborted requests for debugging

* feat: set stream rate for token processing

* chore: specify default stream rate

* fix(ci): Update AppService.js to use optional chaining for endpointLocals assignment

* refactor: abstract the error handler

* feat: streamRate for assistants; refactor: update default rate for token processing

* refactor: update error handling in assistants/errors.js
Committed 2024-07-17 10:47:17 -04:00
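
The "avoid saving messages while streaming" and "MESSAGES CacheKey" notes describe buffering partial text in a cache and persisting the message once the stream finishes, instead of writing to the database on every chunk. A minimal sketch of that idea, with assumed names (`partialTextCache`, `onTokenChunk`, `onStreamComplete`, `saveMessage`) that are illustrative rather than LibreChat's actual API:

```js
// Sketch only: keep partial text in memory keyed by messageId while streaming,
// then do a single database write when the stream completes.
const partialTextCache = new Map(); // stand-in for a MESSAGES cache key store

function onTokenChunk(messageId, text) {
  const current = partialTextCache.get(messageId) ?? '';
  partialTextCache.set(messageId, current + text);
}

async function onStreamComplete(messageId, saveMessage) {
  const fullText = partialTextCache.get(messageId) ?? '';
  partialTextCache.delete(messageId);
  await saveMessage({ messageId, text: fullText }); // one write instead of one per chunk
}
```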
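The stream-rate commits pace how quickly token chunks are processed and flushed to the client. A minimal sketch of that technique follows; `processChunks` is a name taken from the commit log, but this signature, the `DEFAULT_STREAM_RATE` value, and the usage line are assumptions, not LibreChat's implementation:

```js
// Sketch only: throttle token-chunk processing with a configurable per-chunk delay.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const DEFAULT_STREAM_RATE = 1; // hypothetical default, in milliseconds between chunks

async function processChunks(chunks, onProgress, streamRate = DEFAULT_STREAM_RATE) {
  for (const chunk of chunks) {
    onProgress(chunk);       // e.g. write the partial text to the response stream
    await sleep(streamRate); // pace the stream instead of flushing as fast as possible
  }
}

// Usage: a larger streamRate smooths client-side rendering of long completions.
// await processChunks(tokens, (t) => res.write(`data: ${t}\n\n`), 25);
```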
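The AppService.js fix switches the endpointLocals assignment to optional chaining so a missing config section no longer throws. An illustrative sketch with an assumed config shape and helper name:

```js
// Sketch only: the real variable names and config shape in AppService.js may differ.
function getEndpointLocals(config, endpointName) {
  // Without optional chaining, a missing `endpoints` object would throw a TypeError.
  // With it, the expression short-circuits to undefined and falls back to {}.
  return config?.endpoints?.[endpointName] ?? {};
}

// Example: no config supplied, still returns a safe default.
console.log(getEndpointLocals(undefined, 'openAI')); // {}
```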
Directory        Last commit                                                Date
anthropic        ⚙️ feat: Adjust Rate of Stream Progress (#3244)            2024-07-17 10:47:17 -04:00
assistants       🤖 feat: OpenAI Assistants v2 (initial support) (#2781)    2024-05-19 12:56:55 -04:00
azureAssistants  🤖 feat: OpenAI Assistants v2 (initial support) (#2781)    2024-05-19 12:56:55 -04:00
custom           ⚙️ feat: Adjust Rate of Stream Progress (#3244)            2024-07-17 10:47:17 -04:00
google           ⚙️ feat: Adjust Rate of Stream Progress (#3244)            2024-07-17 10:47:17 -04:00
gptPlugins       ⚙️ feat: Adjust Rate of Stream Progress (#3244)            2024-07-17 10:47:17 -04:00
openAI           ⚙️ feat: Adjust Rate of Stream Progress (#3244)            2024-07-17 10:47:17 -04:00