Mirror of https://github.com/danny-avila/LibreChat.git, synced 2025-12-26 13:18:51 +01:00
🅰️ feat: Azure AI Studio, Models as a Service Support (#1902)
* feat(data-provider): add Azure serverless inference handling through librechat.yaml
* feat(azureOpenAI): serverless inference handling in api
* docs: update docs with new azureOpenAI endpoint config fields and serverless inference endpoint setup
* chore: remove unnecessary checks for apiKey as schema would not allow apiKey to be undefined
* ci(azureOpenAI): update tests for serverless configurations
This commit is contained in:
parent
6d6b3c9c1d
commit
08d4b3cc8a
9 changed files with 460 additions and 26 deletions
@@ -843,6 +843,47 @@ endpoints:

- **Note**: It's recommended to use a custom env. variable reference for the value of this field, as shown in the example.
- **Note**: The `api-key` header value is sent on every request.
#### **serverless**:

> Indicates the use of a serverless inference endpoint for Azure OpenAI chat completions.

- Type: Boolean
- **Optional**
- **Description**: When set to `true`, specifies that the group is configured to use serverless inference endpoints as an Azure "Models as a Service" model.
- **Example**: `serverless: true`
- **Note**: [More info here](./azure_openai.md#serverless-inference-endpoints)
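Put together with the surrounding fields, a serverless group might look like the following sketch. The group name, env. variable, and baseURL are illustrative placeholders, not values from this commit; substitute your own deployment's inference URL:

```yaml
endpoints:
  azureOpenAI:
    groups:
      - group: "serverless-example"             # placeholder group name
        apiKey: "${AZURE_MAAS_API_KEY}"         # placeholder env. variable reference
        # placeholder serverless inference URL; use your deployment's own endpoint
        baseURL: "https://YOUR-DEPLOYMENT.REGION.inference.ai.azure.com/v1/chat/completions"
        serverless: true
        models:
          mistral-large: true
```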

#### **addParams**:

> Adds additional parameters to requests.

- Type: Object/Dictionary
- **Description**: Adds/Overrides parameters. Useful for specifying API-specific options.
- **Example**:

```yaml
addParams:
  safe_prompt: true
```
#### **dropParams**:

> Removes [default parameters](#default-parameters) from requests.

- Type: Array/List of Strings
- **Description**: Excludes specified [default parameters](#default-parameters). Useful for APIs that do not accept or recognize certain parameters.
- **Example**: `dropParams: ["stop", "user", "frequency_penalty", "presence_penalty"]`
- **Note**: For a list of default parameters sent with every request, see the ["Default Parameters"](#default-parameters) section below.
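The flow-style example above can equivalently be written in YAML block style, which may be easier to read and diff for longer lists (both forms parse to the same array):

```yaml
dropParams:
  - "stop"
  - "user"
  - "frequency_penalty"
  - "presence_penalty"
```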

#### **forcePrompt**:

> If `true`, sends a `prompt` parameter instead of `messages`.

- Type: Boolean
- Example: `forcePrompt: false`
- **Note**: This combines all messages into a single text payload, [following OpenAI format](https://github.com/pvicente/openai-python/blob/main/chatml.md), and uses the `/completions` endpoint of your baseURL rather than `/chat/completions`.
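As a minimal sketch, a group enabling this behavior could look like the following (group name and env. variable are hypothetical placeholders):

```yaml
- group: "legacy-completions"   # placeholder group name
  apiKey: "${AZURE_API_KEY}"    # placeholder env. variable reference
  forcePrompt: true             # messages are merged into one `prompt` string sent to /completions
```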

#### **models**:

> Configuration for individual models within a group.