🅰️ feat: Azure AI Studio, Models as a Service Support (#1902)

* feat(data-provider): add Azure serverless inference handling through librechat.yaml

* feat(azureOpenAI): serverless inference handling in api

* docs: update docs with new azureOpenAI endpoint config fields and serverless inference endpoint setup

* chore: remove unnecessary checks for apiKey, as the schema does not allow apiKey to be undefined

* ci(azureOpenAI): update tests for serverless configurations
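For context, serverless groups are configured under the `azureOpenAI` endpoint in `librechat.yaml`. A minimal sketch, assuming the field names from the schema in this commit; the group name, environment variable, endpoint URL, and model name are placeholders:

```yaml
endpoints:
  azureOpenAI:
    groups:
      - group: "serverless-example"        # placeholder group name
        apiKey: "${AZURE_SERVERLESS_KEY}"  # assumed env var reference
        serverless: true                   # marks this group as a serverless (Models as a Service) endpoint
        baseURL: "https://example.region.inference.ai.azure.com/v1/chat/completions"
        models:
          mistral-large: true              # placeholder model name
```

Note that a serverless group supplies a `baseURL` instead of the `instanceName`/`deploymentName` pair used by regular Azure OpenAI deployments, which is why those fields become optional in this change.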
Danny Avila 2024-02-26 19:10:29 -05:00 committed by GitHub
parent 6d6b3c9c1d
commit 08d4b3cc8a
9 changed files with 460 additions and 26 deletions


```diff
@@ -19,8 +19,12 @@ export type TAzureModelConfig = z.infer<typeof modelConfigSchema>;
 export const azureBaseSchema = z.object({
   apiKey: z.string(),
-  instanceName: z.string(),
+  serverless: z.boolean().optional(),
+  instanceName: z.string().optional(),
   deploymentName: z.string().optional(),
   addParams: z.record(z.any()).optional(),
   dropParams: z.array(z.string()).optional(),
   forcePrompt: z.boolean().optional(),
   version: z.string().optional(),
+  baseURL: z.string().optional(),
+  additionalHeaders: z.record(z.any()).optional(),
```