* chore: rename dir from `assistant` to plural
* feat: `assistants` field for azure config, spread options in AppService
* refactor: rename constructAzureURL's azure param to `azureOptions` (sketch after this list)
* chore: bump openai and bun
* chore(loadDefaultModels): change naming of assistant -> assistants
* feat: load azure settings with correct baseURL for assistants' initializeClient
* refactor: add `assistants` flags to groups and model configs, add mapGroupToAzureConfig (see the group-mapping sketch after this list)
* feat(loadConfigEndpoints): initialize assistants endpoint if azure flag `assistants` is enabled
* feat(AppService): determine assistant models on startup, throw Error if none
* refactor(useDeleteAssistantMutation): send model along with assistant id for delete mutations
* feat: support listing and deleting assistants with azure
* feat: add model query to assistant avatar upload
* feat: add azure support for retrieveRun method
* refactor: update OpenAIClient initialization
* chore: update README
* fix(ci): tests passing
* refactor(uploadOpenAIFile): improve logging and use more efficient REST API method
* refactor(useFileHandling): add model to metadata to target Azure region compatible with current model
* chore(files): add azure naming pattern for valid file id recognition
* fix(assistants): initialize openai with first available assistant model if none provided
* refactor(uploadOpenAIFile): add content type for azure, initialize FormData before azure options
* refactor(sleep): move sleep function out of Runs and into `~/server/utils` (sketch after this list)
* fix(azureOpenAI/assistants): make sure to only overwrite models with assistant models if `assistants` flag is enabled
* refactor(uploadOpenAIFile): revert to old method
* chore(uploadOpenAIFile): use enum for file purpose
* docs: update azureOpenAI guide with more info and examples
* feat: enable/disable assistant capabilities and specify retrieval models
* refactor: optional chain conditional statement in loadConfigModels.js
* docs: add assistants examples
* chore: update librechat.example.yaml
* docs(azure): update note of file upload behavior in Azure OpenAI Assistants
* chore: update docs and add descriptive message about assistant errors
* fix: prevent message submission with invalid assistant or if files loading
* style: update Landing icon & text when assistant is not selected
* chore: bump librechat-data-provider to 0.4.8
* fix(assistants/azure): assign req.body.model for proper azure init to abort runs
* chore: bump browserslist-db@latest
* refactor(EndpointService): simplify with `generateConfig`, utilize optional baseURL for OpenAI-based endpoints, use `isUserProvided` helper fn wherever needed (see the config sketch after this list)
* refactor(custom/initializeClient): use standardized naming for common variables
* feat: user-provided baseURL for OpenAI-based endpoints
* refactor(custom/initializeClient): re-order operations
* fix: `KnownEndpoints` enum definition and add `FetchTokenConfig`, bump data-provider
* refactor(custom): use a tokenKey dependent on userProvided conditions for caching and fetching endpointTokenConfig; anticipate token rates from the custom config (see the cache-key sketch after this list)
* refactor(custom): ensure endpointTokenConfig is only read from cache when it qualifies for fetching
* fix(ci): update tests for initializeClient based on `userProvideURL` changes
* fix(EndpointService): correct baseURL env var for assistants: `ASSISTANTS_BASE_URL`
* fix: avoid unnecessary run cancellation on `res.close()` when `response.run` is already completed
* feat(assistants): user provided URL option
* ci: update tests and add test for `assistants` endpoint
* chore: leaner condition for request closing
* chore: more descriptive error message prompting users to provide keys again
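
A few of the changes above are easier to picture with rough sketches. First, the renamed `azureOptions` parameter of `constructAzureURL`: the sketch below is a minimal guess at how templated placeholders could be filled in, assuming placeholder and option field names that are not confirmed by the project.

```js
// Illustrative only: substitute azure option values into a templated baseURL.
// Placeholder names and option fields are assumptions for this sketch.
function constructAzureURL({ baseURL, azureOptions = {} }) {
  return baseURL
    .replace('${INSTANCE_NAME}', azureOptions.azureOpenAIApiInstanceName ?? '')
    .replace('${DEPLOYMENT_NAME}', azureOptions.azureOpenAIApiDeploymentName ?? '');
}

console.log(
  constructAzureURL({
    baseURL: 'https://${INSTANCE_NAME}.openai.azure.com/openai/deployments/${DEPLOYMENT_NAME}',
    azureOptions: {
      azureOpenAIApiInstanceName: 'my-instance',
      azureOpenAIApiDeploymentName: 'gpt-4-turbo',
    },
  }),
);
// -> https://my-instance.openai.azure.com/openai/deployments/gpt-4-turbo
```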
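
For the `assistants` group flag and the startup check in AppService, a group that opts in could be resolved roughly as follows; the config shape mirrors the azure group format, but the function name and return shape are illustrative, not the actual `mapGroupToAzureConfig`.

```js
// Illustrative: pick the azure group that enables the assistants capability
// and expose its credentials plus model names. The real mapGroupToAzureConfig
// may differ in name, inputs, and return shape.
const azureConfig = {
  groups: [
    {
      group: 'eastus',
      apiKey: '${EASTUS_API_KEY}',
      instanceName: 'my-eastus-instance',
      assistants: true,
      models: { 'gpt-4-turbo': { deploymentName: 'gpt-4-turbo' } },
    },
  ],
};

function resolveAssistantsGroup({ groups }) {
  const group = groups.find((g) => g.assistants === true);
  if (!group) {
    // Mirrors the startup behavior above: fail fast when no assistant models exist.
    throw new Error('No azure group is configured with `assistants: true`.');
  }
  return {
    apiKey: group.apiKey,
    instanceName: group.instanceName,
    modelNames: Object.keys(group.models),
  };
}

console.log(resolveAssistantsGroup(azureConfig));
```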
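
The relocated `sleep` helper is presumably just a promise wrapper of this shape (the exact export from `~/server/utils` is an assumption):

```js
// Promise-based sleep, usable by Runs polling and anything else on the server.
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

module.exports = { sleep };

// usage: await sleep(500); // e.g. between run-status polls
```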
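
For the EndpointService refactor, here is one way `generateConfig` and `isUserProvided` could fit together for OpenAI-based endpoints, including an optional assistants baseURL from `ASSISTANTS_BASE_URL`; field names such as `userProvide`/`userProvideURL` and the API-key env var in the comment are assumptions drawn from the commit wording, not confirmed signatures.

```js
// Illustrative: treat the literal value 'user_provided' as "supplied by the
// user at runtime", and only bake a baseURL into the config when it is fixed.
function isUserProvided(value) {
  return value === 'user_provided';
}

function generateConfig(apiKey, baseURL) {
  if (!apiKey) {
    return null; // endpoint not configured
  }
  return {
    userProvide: isUserProvided(apiKey),
    ...(baseURL && { userProvideURL: isUserProvided(baseURL) }),
    ...(baseURL && !isUserProvided(baseURL) && { baseURL }),
  };
}

// e.g. for the assistants endpoint (env var names assumed):
// generateConfig(process.env.ASSISTANTS_API_KEY, process.env.ASSISTANTS_BASE_URL);
```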
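
Finally, the user-scoped caching idea behind the `endpointTokenConfig` changes might reduce to a key choice like the one below; the key format and flag names are illustrative.

```js
// Illustrative: scope the cache key to the user whenever the key or baseURL is
// user-provided, so fetched token configs are never shared across users.
function getTokenConfigCacheKey({ endpoint, userId, userProvidesKey, userProvidesURL }) {
  return userProvidesKey || userProvidesURL ? `${endpoint}:${userId}` : endpoint;
}

// getTokenConfigCacheKey({ endpoint: 'mistral', userId: 'abc123', userProvidesKey: true });
// -> 'mistral:abc123'
```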