🐳 : Further Docker build Cleanup & Docs Update (#1502)

* refactor: post-cleanup changes:
- add more unnecessary paths to .dockerignore
- remove librechat.yaml from main compose file (prevents from being required)
- do not create librechat.yaml during build (does nothing)

* docs: make config file instructions easier to read, more info throughout other docs

* docs: add custom config to menu

* Update custom_config.md

* Update docker_compose_install.md
Danny Avila 2024-01-06 11:59:08 -05:00 committed by GitHub
parent 5d7869d3d5
commit 3183d6b678
12 changed files with 181 additions and 37 deletions


@@ -1,7 +1,17 @@
node_modules
**/.circleci
**/.editorconfig
**/.dockerignore
**/.git
**/.DS_Store
**/.vscode
**/node_modules
# Specific patterns to ignore
data-node
meili_data*
librechat*
Dockerfile*
docs
# Ignore all hidden files
.*


@@ -6,10 +6,10 @@ WORKDIR /app
# Allow mounting of these files, which have no default
# values.
RUN touch .env librechat.yaml
RUN touch .env
# Install call deps - Install curl for health check
RUN apk --no-cache add curl && \
npm ci
npm ci
# React client build
ENV NODE_OPTIONS="--max-old-space-size=2048"


@@ -8,6 +8,11 @@ version: '3.4'
# # SAVE THIS FILE AS 'docker-compose.override.yaml'
# # AND USE THE 'docker-compose build' & 'docker-compose up -d' COMMANDS AS YOU WOULD NORMALLY DO
# # USE LIBRECHAT CONFIG FILE
# api:
# volumes:
# - ./librechat.yaml:/app/librechat.yaml
# # BUILD FROM LATEST IMAGE
# api:
# image: ghcr.io/danny-avila/librechat-dev:latest


@@ -25,7 +25,6 @@ services:
volumes:
- ./.env:/app/.env
- ./images:/app/client/public/images
- ./librechat.yaml:/app/librechat.yaml
mongodb:
container_name: chat-mongodb
image: mongo


@@ -48,6 +48,8 @@ Using the default environment values from [/.env.example](https://github.com/dan
This guide will walk you through setting up each Endpoint as needed.
For **custom endpoint** configuration, such as adding [Mistral AI](https://docs.mistral.ai/platform/client/) or [Openrouter](https://openrouter.ai/) refer to the **[librechat.yaml configuration guide](./custom_config.md)**.
**Reminder: If you use docker, you should [rebuild the docker image (here's how)](dotenv.md) each time you update your credentials**
*Note: Configuring pre-made Endpoint/model/conversation settings as singular options for your users is a planned feature. See the related discussion here: [System-wide custom model settings (lightweight GPTs) #1291](https://github.com/danny-avila/LibreChat/discussions/1291)*


@@ -1,15 +1,29 @@
---
title: 🖥️ Custom Endpoints & Config
description: Comprehensive guide for configuring the `librechat.yaml` file AKA the LibreChat Config file. This document is your one-stop resource for understanding and customizing endpoints & other integrations.
weight: -10
---
# LibreChat Configuration Guide
This document provides detailed instructions for configuring the `librechat.yaml` file used by LibreChat.
Welcome to the guide for configuring the **librechat.yaml** file in LibreChat.
In future updates, some of the configurations from [your `.env` file](./dotenv.md) will migrate here.
This file enables the integration of custom AI endpoints, allowing you to connect with any AI provider compliant with OpenAI API standards.
Further customization of the current configurations is also planned.
This includes providers like [Mistral AI](https://docs.mistral.ai/platform/client/), as well as reverse proxies that facilitate access to OpenAI servers, adding them alongside existing endpoints like Anthropic.
![image](https://github.com/danny-avila/LibreChat/assets/110412045/fd0d2307-008f-4e1d-b75b-4f141070ce71)
Future updates will streamline configuration further by migrating some settings from [your `.env` file](./dotenv.md) to `librechat.yaml`.
Stay tuned for ongoing enhancements to customize your LibreChat instance!
# Table of Contents
1. [Intro](#librechat-configuration-guide)
- [Configuration Overview](#configuration-overview)
- [Setup](#setup)
- [Docker Setup](#docker-setup)
- [Config Structure](#config-structure)
- [1. Version](#1-version)
- [2. Cache Settings](#2-cache-settings)
- [3. Endpoints](#3-endpoints)
@@ -19,10 +33,39 @@ Further customization of the current configurations is also planned.
- [Breakdown of Default Params](#breakdown-of-default-params)
- [Example Config](#example-config)
## Configuration Overview
## Setup
**The `librechat.yaml` file should be placed in the root of the project where the .env file is located.**
The `librechat.yaml` file contains several key sections.
You can copy the [example config file](#example-config) as a good starting point while reading the rest of the guide.
The example config file has some options ready to go for Mistral AI and Openrouter.
## Docker Setup
For Docker, you need to use an [override file](./docker_override.md), named `docker-compose.override.yml`, to ensure the config file works for you.
- First, make sure your containers stop running with `docker-compose down`
- Create (or edit your existing) `docker-compose.override.yml` at the root of the project:
```yaml
# For more details on the override file, see the Docker Override Guide:
# https://docs.librechat.ai/install/configuration/docker_override.html
version: '3.4'
services:
api:
volumes:
- ./librechat.yaml:/app/librechat.yaml
```
- Start Docker again, and you should see your config file settings applied
```bash
docker-compose up # no need to rebuild
```
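The override steps above can also be scripted; a minimal sketch (file name and contents taken directly from the snippet above) that writes the override file and sanity-checks the mount before starting Docker:

```shell
# Sketch: create docker-compose.override.yml with the librechat.yaml mount,
# then confirm the mount line is present before running `docker-compose up`.
cat > docker-compose.override.yml <<'EOF'
version: '3.4'
services:
  api:
    volumes:
      - ./librechat.yaml:/app/librechat.yaml
EOF

# Count the lines referencing librechat.yaml (expect one: the volume mount)
grep -c "librechat.yaml" docker-compose.override.yml
```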
## Config Structure
**Note:** Fields not specifically mentioned as required are optional.
@@ -48,36 +91,61 @@ The `librechat.yaml` file contains several key sections.
- **Description**: Each object in the array represents a unique endpoint configuration.
- **Required**
#### Endpoint Object Structure
## Endpoint Object Structure
Each endpoint in the `custom` array should have the following structure:
- **name**: A unique name for the endpoint.
```yaml
# Example Endpoint Object Structure
endpoints:
custom:
- name: "Mistral"
apiKey: "${YOUR_ENV_VAR_KEY}"
baseURL: "https://api.mistral.ai/v1"
models:
default: ["mistral-tiny", "mistral-small", "mistral-medium"]
titleConvo: true
titleModel: "mistral-tiny"
summarize: false
summaryModel: "mistral-tiny"
forcePrompt: false
modelDisplayLabel: "Mistral"
addParams:
safe_mode: true
dropParams: ["stop", "temperature", "top_p"]
```
### **name**:
> A unique name for the endpoint.
- Type: String
- Example: `name: "Mistral"`
- **Required**
- **Note**: Will be used as the "title" in the Endpoints Selector
- **apiKey**: Your API key for the service. Can reference an environment variable, or allow user to provide the value.
### **apiKey**:
> Your API key for the service. Can reference an environment variable, or allow user to provide the value.
- Type: String (apiKey | `"user_provided"`)
- **Example**: `apiKey: "${MISTRAL_API_KEY}"` | `apiKey: "your_api_key"` | `apiKey: "user_provided"`
- Example: `apiKey: "${MISTRAL_API_KEY}"` | `apiKey: "your_api_key"` | `apiKey: "user_provided"`
- **Required**
- **Note**: It's highly recommended to use an environment variable reference for this field, i.e. `${YOUR_VARIABLE}`
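As a hedged illustration (the key name `MISTRAL_API_KEY` is assumed from the examples above), the `.env` entry and its reference in `librechat.yaml` pair up like this:

```yaml
# In your .env file (shown here as a comment):
#   MISTRAL_API_KEY=your-actual-key
#
# In librechat.yaml, "${MISTRAL_API_KEY}" is resolved from the environment:
endpoints:
  custom:
    - name: "Mistral"
      apiKey: "${MISTRAL_API_KEY}"
      baseURL: "https://api.mistral.ai/v1"
      models:
        default: ["mistral-tiny"]
```

This keeps the secret out of the config file itself, so `librechat.yaml` can be shared or committed without leaking credentials.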
- **baseURL**: Base URL for the API. Can reference an environment variable, or allow user to provide the value.
### **baseURL**:
> Base URL for the API. Can reference an environment variable, or allow user to provide the value.
- Type: String (baseURL | `"user_provided"`)
- **Example**: `baseURL: "https://api.mistral.ai/v1"` | `baseURL: "${MISTRAL_BASE_URL}"` | `baseURL: "user_provided"`
- Example: `baseURL: "https://api.mistral.ai/v1"` | `baseURL: "${MISTRAL_BASE_URL}"` | `baseURL: "user_provided"`
- **Required**
- **Note**: It's highly recommended to use an environment variable reference for this field, i.e. `${YOUR_VARIABLE}`
- **iconURL**: The URL to use as the Endpoint Icon.
### **iconURL**:
> The URL to use as the Endpoint Icon.
- Type: String
- Example: `iconURL: https://github.com/danny-avila/LibreChat/raw/main/docs/assets/LibreChat.svg`
- **Note**: The following are "known endpoints" (case-insensitive), which have icons provided for them. If your endpoint `name` matches these, you should omit this field:
- **Note**: The following are "known endpoints" (case-insensitive), which have icons provided for them. If your endpoint `name` matches the following names, you should omit this field:
- "Mistral"
- "OpenRouter"
- **models**: Configuration for models.
- **Required**
### **models**:
> Configuration for models.
- **Required**
- **default**: An array of strings indicating the default models to use. At least one value is required.
- Type: Array of Strings
- Example: `default: ["mistral-tiny", "mistral-small", "mistral-medium"]`
@@ -87,36 +155,45 @@ Each endpoint in the `custom` array should have the following structure:
- Example: `fetch: true`
- **Note**: May cause slowdowns during initial use of the app if the response is delayed. Defaults to `false`.
- **titleConvo**: Enables title conversation when set to `true`.
### **titleConvo**:
> Enables title conversation when set to `true`.
- Type: Boolean
- Example: `titleConvo: true`
- **titleMethod**: Chooses between "completion" or "functions" for title method.
### **titleMethod**:
> Chooses between "completion" or "functions" for title method.
- Type: String (`"completion"` | `"functions"`)
- Example: `titleMethod: "completion"`
- **Note**: Defaults to "completion" if omitted.
- **titleModel**: Specifies the model to use for titles.
### **titleModel**:
> Specifies the model to use for titles.
- Type: String
- Example: `titleModel: "mistral-tiny"`
- **Note**: Defaults to "gpt-3.5-turbo" if omitted. May cause issues if "gpt-3.5-turbo" is not available.
- **summarize**: Enables summarization when set to `true`.
### **summarize**:
> Enables summarization when set to `true`.
- Type: Boolean
- Example: `summarize: false`
- **Note**: This feature requires an OpenAI Functions compatible API
- **summaryModel**: Specifies the model to use if summarization is enabled.
### **summaryModel**:
> Specifies the model to use if summarization is enabled.
- Type: String
- Example: `summaryModel: "mistral-tiny"`
- **Note**: Defaults to "gpt-3.5-turbo" if omitted. May cause issues if "gpt-3.5-turbo" is not available.
- **forcePrompt**: If `true`, sends a `prompt` parameter instead of `messages`.
### **forcePrompt**:
> If `true`, sends a `prompt` parameter instead of `messages`.
- Type: Boolean
- Example: `forcePrompt: false`
- **Note**: This combines all messages into a single text payload, [following OpenAI format](https://github.com/pvicente/openai-python/blob/main/chatml.md), and uses the `/completions` endpoint of your baseURL rather than `/chat/completions`.
- **Note**: This combines all messages into a single text payload, [following OpenAI format](https://github.com/pvicente/openai-python/blob/main/chatml.md), and
uses the `/completions` endpoint of your baseURL rather than `/chat/completions`.
- **modelDisplayLabel**: The label displayed in messages next to the Icon for the current AI model.
### **modelDisplayLabel**:
> The label displayed in messages next to the Icon for the current AI model.
- Type: String
- Example: `modelDisplayLabel: "Mistral"`
- **Note**: The display order is:
@@ -124,7 +201,8 @@ Each endpoint in the `custom` array should have the following structure:
- 2. Label derived from the model name (if applicable)
- 3. This value, `modelDisplayLabel`, is used if the above are not specified. Defaults to "AI".
- **addParams**: Adds additional parameters to requests.
### **addParams**:
> Adds additional parameters to requests.
- Type: Object/Dictionary
- **Description**: Adds/Overrides parameters. Useful for specifying API-specific options.
- **Example**:
@@ -133,12 +211,12 @@ Each endpoint in the `custom` array should have the following structure:
safe_mode: true
```
- **dropParams**: Removes default parameters from requests.
### **dropParams**:
> Removes [default parameters](#default-parameters) from requests.
- Type: Array/List of Strings
- **Description**: Excludes specified default parameters. Useful for APIs that do not accept or recognize certain parameters.
- **Description**: Excludes specified [default parameters](#default-parameters). Useful for APIs that do not accept or recognize certain parameters.
- **Example**: `dropParams: ["stop", "temperature", "top_p"]`
- **Note**: For a list of default parameters sent with every request, see the "Default Parameters" Section below.
- **Note**: For a list of default parameters sent with every request, see the ["Default Parameters"](#default-parameters) Section below.
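Putting `addParams` and `dropParams` together (values reused from the examples above), a single custom endpoint can inject `safe_mode` while dropping defaults the API rejects:

```yaml
endpoints:
  custom:
    - name: "Mistral"
      apiKey: "${MISTRAL_API_KEY}"
      baseURL: "https://api.mistral.ai/v1"
      models:
        default: ["mistral-tiny"]
      # Added to (or overriding) every request payload
      addParams:
        safe_mode: true
      # Stripped from every request payload
      dropParams: ["stop", "temperature", "top_p"]
```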
## Additional Notes
- Ensure that all URLs and keys are correctly specified to avoid connectivity issues.


@@ -34,7 +34,18 @@ Open your `docker-compose.override.yml` file with vscode or any text editor.
Make your desired changes by uncommenting the relevant sections and customizing them as needed.
For example, if you want to use a prebuilt image for the `api` service and expose MongoDB's port, your `docker-compose.override.yml` might look like this:
For example, if you want to make sure Docker can use your `librechat.yaml` file for [custom configuration](./custom_config.md), it would look like this:
```yaml
version: '3.4'
services:
api:
volumes:
- ./librechat.yaml:/app/librechat.yaml
```
Or, if you want to use a prebuilt image for the `api` service and expose MongoDB's port, your `docker-compose.override.yml` might look like this:
```yaml
version: '3.4'


@@ -1,7 +1,7 @@
---
title: ⚙️ Environment Variables
description: Comprehensive guide for configuring your application's environment with the `.env` file. This document is your one-stop resource for understanding and customizing the environment variables that will shape your application's behavior in different contexts.
weight: -10
weight: -11
---
# .env File Configuration


@@ -7,7 +7,7 @@ weight: 2
# Configuration
* ⚙️ [Environment Variables](./dotenv.md)
* 🖥️ [Custom Config & Endpoints](./configuration/custom_config.md)
* 🖥️ [Custom Endpoints & Config](./custom_config.md)
* 🐋 [Docker Compose Override](./docker_override.md)
---
* 🤖 [AI Setup](./ai_setup.md)


@@ -17,7 +17,7 @@ weight: 1
## **[Configuration](./configuration/index.md)**
* ⚙️ [Environment Variables](./configuration/dotenv.md)
* 🖥️ [Custom Config & Endpoints](./configuration/custom_config.md)
* 🖥️ [Custom Endpoints & Config](./configuration/custom_config.md)
* 🐋 [Docker Compose Override](./configuration/docker_override.md)
* 🤖 [AI Setup](./configuration/ai_setup.md)
* 🚅 [LiteLLM](./configuration/litellm.md)


@@ -222,4 +222,40 @@ podman stop librechat && systemctl --user start container-librechat
---
## Integrating the Configuration File in Podman Setup
When using Podman for setting up LibreChat, you can also integrate the [`librechat.yaml` configuration file](../configuration/custom_config.md).
This file allows you to define specific settings and AI endpoints, such as Mistral AI, tailoring the application to your needs.
After creating your `.env` file as detailed in the previous steps, follow these instructions to integrate the `librechat.yaml` configuration:
1. Place your `librechat.yaml` file in your project's root directory.
2. Modify the Podman run command for the LibreChat container to include a volume argument that maps the `librechat.yaml` file inside the container. This can be done by adding the following line to your Podman run command:
```bash
-v "./librechat.yaml:/app/librechat.yaml"
```
For example, the modified Podman run command for starting LibreChat will look like this:
```bash
podman run \
--name="librechat" \
--network=librechat \
--env-file="./.env" \
-v "./librechat.yaml:/app/librechat.yaml" \
-p 3080:3080 \
--detach \
librechat:local;
```
By mapping the `librechat.yaml` file into the container, Podman ensures that your custom configurations are applied to LibreChat, enabling a tailored AI experience.
Ensure that the `librechat.yaml` file is correctly formatted and contains valid settings.
Any errors in this file might affect the functionality of LibreChat. For more information on configuring `librechat.yaml`, refer to the [configuration guide](../configuration/custom_config.md).
---
>⚠️ Note: If you're having trouble, before creating a new issue, please search for similar ones in the [#issues thread on our Discord](https://discord.gg/weqZFtD9C4) or the [troubleshooting discussion](https://github.com/danny-avila/LibreChat/discussions/categories/troubleshooting) on our Discussions page. If you don't find a relevant issue, feel free to create a new one and provide as much detail as possible.


@@ -34,6 +34,9 @@ Before running LibreChat with Docker, you need to configure some settings:
#### [AI Setup](../configuration/ai_setup.md) (Required)
At least one AI endpoint should be setup for use.
#### [Custom Endpoints & Configuration](../configuration/custom_config.md#docker-setup) (Optional)
Allows you to customize AI endpoints, such as Mistral AI, and other settings to suit your specific needs.
#### [Manage Your MongoDB Database](../../features/manage_your_database.md) (Optional)
Safely access and manage your MongoDB database using Mongo Express