🪪mkdocs: social cards (#1428)

* mkdocs plugins: add a plugin for social cards and a plugin that allows excluding a folder

* docs: fix hyperlinks

* mkdocs: social cards (descriptions) for 'contributions' and 'deployment' guides

* mkdocs: social cards (descriptions) for all 'index.md'

* mkdocs: social cards (descriptions) for 'features' and 'plugins'

* mkdocs: social cards (descriptions) for 'general_info'

* mkdocs: social cards (descriptions) for 'configuration'

* mkdocs: social cards (descriptions) for 'installation'

* mkdocs: minor fixes

* update librechat.svg

* update how_to_contribute.md

add reference to the official GitHub documentation
Fuegovic 2023-12-28 17:10:06 -05:00 committed by GitHub
parent 18cd02d44e
commit bce4f41fae
62 changed files with 393 additions and 329 deletions


@ -1,5 +1,6 @@
---
title: 😈 Bing Jailbreak
description: Quick overview of the Bing jailbreak and Sydney's system message
weight: -3
---
@ -31,6 +32,6 @@ using internet slang often. Answer using the same language as the user."
## References
For more info on the Bing Jailbreak and general jailbreaking guidelines:
https://github.com/waylaidwanderer/node-chatgpt-api
[https://github.com/waylaidwanderer/node-chatgpt-api](https://github.com/waylaidwanderer/node-chatgpt-api)
https://www.make-safe-ai.com/is-bing-chat-safe/
[https://www.make-safe-ai.com/is-bing-chat-safe/](https://www.make-safe-ai.com/is-bing-chat-safe/)


@ -1,5 +1,6 @@
---
title: Features
description: "✨ In-depth guides about various LibreChat features: plugins, presets, automated moderation, logging..."
weight: 2
---


@ -1,5 +1,6 @@
---
title: 🪵 Logging System
description: This doc explains how to use the logging feature of LibreChat, which saves error and debug logs in the `/api/logs` folder. You can use these logs to troubleshoot issues, monitor your server, and report bugs. You can also disable debug logs if you want to save space.
weight: -5
---
@ -9,9 +10,9 @@ LibreChat has central logging built into its backend (api).
Log files are saved in `/api/logs`. Error logs are saved by default. Debug logs are enabled by default but can be turned off if not desired.
This allows you to monitor your server through external tools that inspect log files, such as [the ELK stack](https://aws.amazon.com/what-is/elk-stack/).
This allows you to monitor your server through external tools that inspect log files, such as **[the ELK stack](https://aws.amazon.com/what-is/elk-stack/)**.
Debug logs are essential for developer work and fixing issues. If you encounter any problems running LibreChat, reproduce as close as possible, and [report the issue](https://github.com/danny-avila/LibreChat/issues) with your logs found in `./api/logs/debug-%DATE%.log`.
Debug logs are essential for developer work and fixing issues. If you encounter any problems running LibreChat, reproduce the problem as closely as possible and **[report the issue](https://github.com/danny-avila/LibreChat/issues)** with the logs found in `./api/logs/debug-%DATE%.log`.
Error logs are also saved in the same location: `./api/logs/error-%DATE%.log`. If you have Meilisearch configured, there is a separate log file for it as well.
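When troubleshooting, it can help to watch these files live. A minimal sketch using standard shell tools (the date-stamped filenames follow the patterns above):

```bash
# Follow today's debug log as it is written (run from the project root)
tail -f ./api/logs/debug-*.log

# Search the error logs for a particular message
grep -i "rate limit" ./api/logs/error-*.log
```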


@ -1,5 +1,6 @@
---
title: 🍃 Manage Your Database
description: How to install and configure Mongo Express to securely access and manage your MongoDB database in Docker.
weight: -6
---
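As a rough idea of what that guide sets up, here is a hedged sketch using the official `mongo-express` image; the Docker network name and the `mongodb` hostname are assumptions and must match your own LibreChat compose setup:

```bash
# Run Mongo Express next to LibreChat's MongoDB container (adjust network and host names to your setup)
docker run -d --name mongo-express -p 8081:8081 \
  --network librechat_default \
  -e ME_CONFIG_MONGODB_URL="mongodb://mongodb:27017" \
  -e ME_CONFIG_BASICAUTH_USERNAME="admin" \
  -e ME_CONFIG_BASICAUTH_PASSWORD="change-me" \
  mongo-express

# Then browse to http://localhost:8081 and sign in with the basic-auth credentials above
```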


@ -1,5 +1,6 @@
---
title: 🔨 Automated Moderation
description: The Automated Moderation System uses a scoring mechanism to track user violations. As users commit actions like excessive logins, registrations, or messaging, they accumulate violation scores. Upon reaching a set threshold, the user and their IP are temporarily banned. This system ensures platform security by monitoring and penalizing rapid or suspicious activities.
weight: -8
---
## Automated Moderation System (optional)
@ -33,7 +34,7 @@ The project's current rate limiters are as follows (see below under setup for de
### Setup
The following are all of the related env variables to make use of and configure the mod system. Note this is also found in the [/.env.example](/.env.example) file, to be set in your own `.env` file.
The following are all of the related env variables for using and configuring the mod system. Note that these are also found in the [/.env.example](https://github.com/danny-avila/LibreChat/blob/main/.env.example) file and should be set in your own `.env` file.
```bash
BAN_VIOLATIONS=true # Whether or not to enable banning users for violations (they will still be logged)
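# The related ban settings below are illustrative; check /.env.example for the authoritative list and defaults
BAN_INTERVAL=20       # a ban is applied each time a user's violation score crosses a multiple of this interval
BAN_DURATION=7200000  # how long a ban lasts, in milliseconds (2 hours in this example)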


@ -1,5 +1,6 @@
---
title: 📦 PandoraNext
description: How to deploy PandoraNext to enable the `CHATGPT_REVERSE_PROXY` for use with LibreChat.
weight: -4
---
@ -18,63 +19,64 @@ You can use it locally in docker or deploy it on the web for remote access.
For local deployment using Docker, the steps are as follows:
### 1. **Clone or Download the Repository:**
Get the latest release from the [PandoraNext GitHub repository](https://github.com/pandora-next/deploy).
```bash
git clone https://github.com/pandora-next/deploy.git
```
### 2. Get your PandoraNext license id here: [PandoraNext Dashboard](https://dash.pandoranext.com/)
### 2. Get your PandoraNext `License ID`
Visit the **[PandoraNext Dashboard](https://dash.pandoranext.com/)** to get your `License ID`.
### 3. **Configure `config.json`:**
Within the cloned repository, in the `data` folder, edit `config.json`. Specify your `license_id` and `proxy_api_prefix`. For the `proxy_api_prefix`, use at least 8 characters, avoid characters that can't be used in a URL and make sure it's unique.
Here's the `config.json` for your reference:
```json
{
"bind": "0.0.0.0:8181",
"tls": {
"enabled": false,
"cert_file": "",
"key_file": ""
},
"timeout": 600,
"proxy_url": "",
"license_id": "",
"public_share": false,
"site_password": "",
"setup_password": "",
"server_tokens": true,
"proxy_api_prefix": "",
"isolated_conv_title": "*",
"captcha": {
"provider": "",
"site_key": "",
"site_secret": "",
"site_login": false,
"setup_login": false,
"oai_username": false,
"oai_password": false
},
"whitelist": null
}
```
### 4. **Set Up the LibreChat `.env` File:**
In the `.env` file within your LibreChat directory, you'll need to set the `CHATGPT_REVERSE_PROXY` variable:
```bash
CHATGPT_REVERSE_PROXY=http://host.docker.internal:8181/your_proxy_api_prefix_here/backend-api/conversation
```
- Replace `your_proxy_api_prefix_here` with the actual proxy API prefix.
### 5. **Start Docker Containers:**
From the PandoraNext directory, run the following command to launch the Docker containers:
```bash
docker-compose up -d
```
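Once the containers are up, a quick reachability check from the host can confirm PandoraNext is listening before you wire it into LibreChat (a sketch; `8181` is the port from the `bind` value in `config.json` above):

```bash
# Any HTTP response here means the service is up; a connection error means it is not listening yet
curl -I http://localhost:8181/

# If something looks wrong, inspect the container logs
docker-compose logs -f
```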
---
@ -82,85 +84,86 @@ For local deployment using Docker, the steps are as follows:
To deploy PandoraNext online by duplicating the Hugging Face Space, follow these steps:
### 1. Get your PandoraNext license id here: [PandoraNext Dashboard](https://dash.pandoranext.com/)
### 1. Get your PandoraNext `License ID`
Visit the **[PandoraNext Dashboard](https://dash.pandoranext.com/)** to get your `License ID`.
### 2. **Configure `config.json`:**
Edit the following `config.json`. Specify your `license_id` and `proxy_api_prefix`. For the `proxy_api_prefix`, use at least 8 characters, avoid characters that can't be used in a URL and make sure it's unique.
Here's the `config.json` for your reference:
```json
{
"bind": "0.0.0.0:8181",
"tls": {
"enabled": false,
"cert_file": "",
"key_file": ""
},
"timeout": 600,
"proxy_url": "",
"license_id": "",
"public_share": false,
"site_password": "",
"setup_password": "",
"server_tokens": true,
"proxy_api_prefix": "",
"isolated_conv_title": "*",
"captcha": {
"provider": "",
"site_key": "",
"site_secret": "",
"site_login": false,
"setup_login": false,
"oai_username": false,
"oai_password": false
},
"whitelist": null
}
```
### 3. **Hugging Face Space:**
Visit the [PandoraNext LibreChat Space](https://huggingface.co/spaces/LibreChat/PandoraNext) on Hugging Face.
### 4. **Duplicate the Space:**
Utilize the available options to duplicate or fork the space into your own Hugging Face account.
### 5. **Fill the required secrets**
When asked for the `SECRETS`,
- for `CONFIG_JSON` use the whole content of the `config.json` you just modified,
- for `TOKENS_JSON` use the following default `token.json`:
```json
{
"test-1": {
"token": "access token / session token / refresh token",
"shared": true,
"show_user_info": false
},
"test-2": {
"token": "access token / session token / refresh token",
"shared": true,
"show_user_info": true,
"plus": true
},
"test2": {
"token": "access token / session token / refresh token / share token",
"password": "12345"
}
}
```
### 6. **Configure LibreChat:**
In the .env file (or secrets settings if you host LibreChat on Hugging Face), set the `CHATGPT_REVERSE_PROXY` variable using the following format:
```bash
CHATGPT_REVERSE_PROXY=http://your_server_domain.com/your_proxy_api_prefix_here/backend-api/conversation
```
- Replace `your_server_domain.com` with the domain of your deployed space.
- You can use this format: `https://username-pandoranext.hf.space` (replace `username` with your Hugging Face username)
- Replace `your_proxy_api_prefix_here` with the `proxy_api_prefix` you have set in your `config.json`.
- The resulting URL should look similar to:
`https://username-pandoranext.hf.space/your_proxy_api_prefix_here/backend-api/conversation`
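To sanity-check the resulting URL before putting it in your `.env`, you can request it directly (a sketch using the same placeholders as above); an error status such as 404 or 405 still confirms the domain and prefix resolve, whereas a timeout usually means the domain is wrong:

```bash
curl -i "https://username-pandoranext.hf.space/your_proxy_api_prefix_here/backend-api/conversation"
```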
## Final Notes


@ -1,5 +1,6 @@
---
title: ⚡ Azure AI Search
description: How to configure Azure AI Search for answers to your questions with assistance from GPT.
weight: -4
---
# Azure AI Search Plugin
@ -34,7 +35,7 @@ This is the authentication key to use when utilizing the search endpoint. Please
## Create or log in to your account on Azure Portal
**1.** Visit [https://azure.microsoft.com/en-us/](https://azure.microsoft.com/en-us/) and click on `Get started` or `Try Azure for Free` to create an account and sign in.
**1.** Visit **[https://azure.microsoft.com/en-us/](https://azure.microsoft.com/en-us/)** and click on `Get started` or `Try Azure for Free` to create an account and sign in.
**2.** Choose pay per use or Azure Free with $200.
@ -76,7 +77,7 @@ Now select the free option or select your preferred option (may incur charges).
![image](https://github.com/itzraiss/images/blob/main/Captura%20de%20tela%202023-11-26%20152107.png)
**2.** Follow the Microsoft tutorial.[https://learn.microsoft.com/en-us/azure/search/search-get-started-portal](https://learn.microsoft.com/en-us/azure/search/search-get-started-portal), after finishing, save the name given to the index somewhere.
**2.** Follow the Microsoft tutorial: **[https://learn.microsoft.com/en-us/azure/search/search-get-started-portal](https://learn.microsoft.com/en-us/azure/search/search-get-started-portal)**. After finishing, save the name given to the index somewhere.
**3.** Now you have your `AZURE_AI_SEARCH_INDEX_NAME`, copy and save it in a local safe place.
@ -117,7 +118,7 @@ The following are configuration values that are not required but can be specifie
If there are concerns that the search result data may be too large and exceed the prompt size, consider reducing the size of the search result data by using `AZURE_AI_SEARCH_SEARCH_OPTION_TOP` and `AZURE_AI_SEARCH_SEARCH_OPTION_SELECT`.
For details on each parameter, please refer to the following document:
https://learn.microsoft.com/en-us/rest/api/searchservice/search-documents
**[https://learn.microsoft.com/en-us/rest/api/searchservice/search-documents](https://learn.microsoft.com/en-us/rest/api/searchservice/search-documents)**
```env
AZURE_AI_SEARCH_API_VERSION=2023-10-01-Preview
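# Optional tuning (names from the section above; the values here are examples only):
# limit the number of results and the returned fields to keep search output within the prompt size
AZURE_AI_SEARCH_SEARCH_OPTION_TOP=5
AZURE_AI_SEARCH_SEARCH_OPTION_SELECT=title,content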


@ -1,16 +1,17 @@
---
title: 🧑‍💼 Official ChatGPT Plugins
description: How to add official OpenAI Plugins to LibreChat
weight: -8
---
# Using official ChatGPT Plugins / OpenAPI specs
ChatGPT plugins are API integrations for OpenAI models that extend their capabilities. They are structured around three key components: an API, an **OpenAPI specification** (spec for short), and a JSON **Plugin Manifest** file.
To learn more about them, or how to make your own, read here: [ChatGPT Plugins: Getting Started](https://platform.openai.com/docs/plugins/getting-started).
To learn more about them, or how to make your own, read here: **[ChatGPT Plugins: Getting Started](https://platform.openai.com/docs/plugins/getting-started)**
Thanks to the introduction of [OpenAI Functions](https://openai.com/blog/function-calling-and-other-api-updates) and their utilization in [Langchain](https://js.langchain.com/docs/modules/chains/openai_functions/openapi), it's now possible to directly use OpenAI Plugins through LibreChat, without building any custom langchain tools. The main use case we gain from integrating them to LibreChat is to allow use of plugins with gpt-3.5 models, and without ChatGPT Plus. They also find a great use case when you want to limit your own private API's interactions with chat.openai.com and their servers in favor of a self-hosted LibreChat instance.
Thanks to the introduction of **[OpenAI Functions](https://openai.com/blog/function-calling-and-other-api-updates)** and their utilization in **[Langchain](https://js.langchain.com/docs/modules/chains/openai_functions/openapi)**, it's now possible to directly use OpenAI Plugins through LibreChat, without building any custom langchain tools. The main use case we gain from integrating them to LibreChat is to allow use of plugins with gpt-3.5 models, and without ChatGPT Plus. They also find a great use case when you want to limit your own private API's interactions with chat.openai.com and their servers in favor of a self-hosted LibreChat instance.
### Table of Contents
<!-- ### Table of Contents
- [Using official ChatGPT Plugins / OpenAPI specs](#using-official-chatgpt-plugins--openapi-specs)
- [Table of Contents](#table-of-contents)
- [Intro](#intro)
@ -23,7 +24,7 @@ Thanks to the introduction of [OpenAI Functions](https://openai.com/blog/functio
- [Custom OpenAPI Spec files](#custom-openapi-spec-files)
- [Plugins with Authentication](#plugins-with-authentication)
- [Showcase](#showcase)
- [Disclaimers](#disclaimers)
- [Disclaimers](#disclaimers) -->
## Intro
@ -32,11 +33,11 @@ Before continuing, it's important to fully distinguish what a Manifest file is v
### **[Plugin Manifest File:](https://platform.openai.com/docs/plugins/getting-started/plugin-manifest)**
- Usually hosted on the APIs domain as `https://example.com/.well-known/ai-plugin.json`
- The manifest file is required for LLMs to connect with your plugin. If there is no file found, the plugin cannot be installed.
- Has required properties, and will error if they are missing. Check what they are in the [OpenAI Docs](https://platform.openai.com/docs/plugins/getting-started/plugin-manifest)
- Has required properties, and will error if they are missing. Check what they are in the **[OpenAI Docs](https://platform.openai.com/docs/plugins/getting-started/plugin-manifest)**
- Has optional properties, specific to LibreChat, that will enable them to work consistently, or for customizing headers/params made by every API call (see below)
### **[OpenAPI Spec](https://platform.openai.com/docs/plugins/getting-started/openapi-definition)**
- The OpenAPI specification is used to document the API that the plugin will interact with. It is a [universal format](https://www.openapis.org/) meant to standardize API definitions.
- The OpenAPI specification is used to document the API that the plugin will interact with. It is a **[universal format](https://www.openapis.org/)** meant to standardize API definitions.
- Referenced by the Manifest file in its `api.url` property
- Usually as `https://example.com/openapi.yaml` or `.../swagger.yaml`
- Can be a .yaml or .json file
@ -53,7 +54,7 @@ Download the Plugin manifest file, or copy the raw JSON data into a new file, an
`api\app\clients\tools\.well-known`
You should see multiple manifest files that have been tested, or edited, to work with LibreChat. ~~I've renamed them by their `name_for_model` property and it's recommended, but not required, that you do the same.~~ As of v0.5.8, It's **required** to name the manifest JSON file after its `name_for_model` property should you add one yourself.
You should see multiple manifest files that have been tested, or edited, to work with LibreChat. As of v0.5.8, it's **required** to name the manifest JSON file after its `name_for_model` property should you add one yourself.
After doing so, start/re-start the project server and they should now load in the Plugin store.
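For example, adding one yourself could look like the following (a sketch; the source URL and the `example_plugin` filename are placeholders, and the filename must match the manifest's actual `name_for_model`):

```bash
# Download the plugin's manifest into the .well-known directory,
# naming the file after its name_for_model property (required as of v0.5.8)
curl -o api/app/clients/tools/.well-known/example_plugin.json \
  https://example.com/.well-known/ai-plugin.json

# Restart the project server afterwards so the plugin shows up in the Plugin store
```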
@ -152,7 +153,7 @@ curl -H "Authorization: Bearer ffc5226d1af346c08a98dee7deec9f76" https://example
As of now, LibreChat only supports plugins using Bearer Authentication, like in the example above.
If your plugin requires authentication, it's necessary to have these fields filled in your manifest file according to [OpenAI definitions](https://platform.openai.com/docs/plugins/getting-started/plugin-manifest), which for Bearer Authentication must follow the schema above.
If your plugin requires authentication, it's necessary to have these fields filled in your manifest file according to **[OpenAI definitions](https://platform.openai.com/docs/plugins/getting-started/plugin-manifest)**, which for Bearer Authentication must follow the schema above.
Important: Some ChatGPT plugins may use Bearer auth but have either stale verification tokens in their manifest or only support calls from OpenAI's servers. Web Pilot is an example of the latter; thankfully, it has a required header field that allows non-OpenAI origination. See above for editing headers.
@ -168,16 +169,16 @@ Important: Some ChatGPT plugins may use Bearer Auth., but have either stale veri
## Disclaimers
Use of ChatGPT Plugins is only possible with official OpenAI models and their use of [Functions](https://platform.openai.com/docs/api-reference/chat/create#chat/create-functions). If you are accessing OpenAI models via reverse proxy through some 3rd party service, function calling may not be supported.
Use of ChatGPT Plugins is only possible with official OpenAI models and their use of **[Functions](https://platform.openai.com/docs/api-reference/chat/create#chat/create-functions)**. If you are accessing OpenAI models via reverse proxy through some 3rd party service, function calling may not be supported.
This implementation depends on the [LangChain OpenAPI Chain](https://js.langchain.com/docs/modules/chains/openai_functions/openapi) and general improvements to its use here will have to be made to the LangChainJS library.
This implementation depends on the **[LangChain OpenAPI Chain](https://js.langchain.com/docs/modules/chains/openai_functions/openapi)** and general improvements to its use here will have to be made to the LangChainJS library.
Custom Langchain Tools are preferred over ChatGPT Plugins/OpenAPI specs as this can be more token-efficient, especially with OpenAI Functions. A better alternative may be to make a Langchain tool modelled after an OpenAPI spec, for which I'll make a guide soon.
LibreChat's implementation is not 1:1 with ChatGPT's, as OpenAI has a robust, exclusive, and restricted authentication pipeline with its models & specific plugins, which are not as limited by context windows and token usage. Furthermore, some of their hosted plugins requiring authentication will not work, especially those with OAuth or stale verification tokens, and some may not be handled by the LLM in the same manner, especially those requiring multi-step API calls.
Some plugins may detect that the API call does not originate from OpenAI's servers, will either be defunct outside of chat.openai.com or need special handling, and/or editing of their manifest/spec files. This is not to say plugin use will not improve and more closely mirror how ChatGPT handles plugins, but there is still work to this end. In short, some will work perfectly while others may not work at all.
Some plugins may detect that the API call does not originate from OpenAI's servers, will either be defunct outside of **[chat.openai.com](https://chat.openai.com/)** or need special handling, and/or editing of their manifest/spec files. This is not to say plugin use will not improve and more closely mirror how ChatGPT handles plugins, but there is still work to this end. In short, some will work perfectly while others may not work at all.
The use of ChatGPT Plugins with LibreChat does not violate OpenAI's [Terms of Service](https://openai.com/policies/terms-of-use). According to their [Service Terms](https://openai.com/policies/service-terms) and [Usage Policies](https://openai.com/policies/usage-policies), the host, in this case OpenAI, is not responsible for the plugins hosted on their site and their usage outside of their platform, chat.openai.com. Furthermore, there is no explicit mention of restrictions on accessing data that is not directly displayed to the user. Therefore, accessing the payload of their plugins for display purposes is not in violation of their Terms of Service.
The use of ChatGPT Plugins with LibreChat does not violate OpenAI's **[Terms of Service](https://openai.com/policies/terms-of-use)**. According to their **[Service Terms](https://openai.com/policies/service-terms)** and **[Usage Policies](https://openai.com/policies/usage-policies)**, the host, in this case OpenAI, is not responsible for the plugins hosted on their site and their usage outside of their platform, **[chat.openai.com](https://chat.openai.com/)**. Furthermore, there is no explicit mention of restrictions on accessing data that is not directly displayed to the user. Therefore, accessing the payload of their plugins for display purposes is not in violation of their Terms of Service.
Please note that the ChatGPT Plugins integration is currently in an alpha state, and you may encounter errors. Although preliminary testing has been conducted, not all plugins have been thoroughly tested, and you may find that some I haven't added will not work for any one of the reasons I've mentioned above. Some of the errors may be caused by the plugin itself, and will also not work on https://chat.openai.com/. If you encounter any errors, double checking if they work on the official site is advisable before reporting them as a GitHub issue. I can only speak for the ones I tested and included, and the date of inclusion.
Please note that the ChatGPT Plugins integration is currently in an alpha state, and you may encounter errors. Although preliminary testing has been conducted, not all plugins have been thoroughly tested, and you may find that some I haven't added will not work for any one of the reasons I've mentioned above. Some of the errors may be caused by the plugin itself, and will also not work on **[chat.openai.com](https://chat.openai.com/)**. If you encounter any errors, double checking if they work on the official site is advisable before reporting them as a GitHub issue. I can only speak for the ones I tested and included, and the date of inclusion.


@ -1,5 +1,6 @@
---
title: 🔎 Google Search
description: How to set up and use the Google Search Plugin, which allows you to query Google with GPT's help.
weight: -7
---
@ -10,9 +11,9 @@ GOOGLE_API_KEY="...."
GOOGLE_CSE_ID="...."
```
You first need to create a programmable search engine and get the search engine ID: https://developers.google.com/custom-search/docs/tutorial/creatingcse
You first need to create a programmable search engine and get the search engine ID: **[https://developers.google.com/custom-search/docs/tutorial/creatingcse](https://developers.google.com/custom-search/docs/tutorial/creatingcse)**
Then you can get the API key, click the "Get a key" button on this page: https://developers.google.com/custom-search/v1/introduction
Then get the API key by clicking the "Get a key" button on this page: **[https://developers.google.com/custom-search/v1/introduction](https://developers.google.com/custom-search/v1/introduction)**
<!-- You can limit the max price that is charged for a single search request by setting `MAX_SEARCH_PRICE` in your `.env` file. -->


@ -1,5 +1,6 @@
---
title: Plugins
description: 🔌 All about plugins, how to make them, how to use the official ChatGPT plugins, and how to configure custom plugins.
weight: -10
---


@ -1,5 +1,6 @@
---
title: 🔌 Introduction
description: This doc introduces the plugins endpoint, which enables you to use different LLMs and tools with more flexibility and control. You can change your settings and plugins on the fly, and use plugins to access various sources of information and assistance.
weight: -10
---
# Plugins Endpoint
@ -9,7 +10,7 @@ weight: -10
The plugins endpoint opens the door to prompting LLMs in new ways other than traditional input/output prompting.
The first step is using chain-of-thought prompting & ["agency"](https://zapier.com/blog/ai-agent/) for using plugins/tools in a fashion mimicing the official ChatGPT Plugins feature.
The first step is using chain-of-thought prompting & **["agency"](https://zapier.com/blog/ai-agent/)** for using plugins/tools in a fashion mimicking the official ChatGPT Plugins feature.
More than this, you can use this endpoint for changing your conversation settings mid-conversation. Unlike the official ChatGPT site and all other endpoints, you can switch models, presets, and settings mid-convo, even when you have no plugins selected. This is useful if you first want a creative response from GPT-4, and then a deterministic, lower cost response from GPT-3. Soon, you will be able to use Google, HuggingFace, local models, all in this or a similar endpoint in the same modular manner.
@ -52,8 +53,8 @@ Clicking on **"Show Agent Settings"** will allow you to modify parameters for th
- **[Stable Diffusion](./stable_diffusion.md)**
- **[Wolfram](./wolfram.md)**
- **DALL-E** - same setup as above; you just need an OpenAI key. It is kept distinct from the API key you use for chats, but the same key can be used
- **Zapier** - You need a Zapier account. Get your [API key from here](https://nla.zapier.com/credentials/) after you've made an account
- Create allowed actions - Follow step 3 in this [getting start guide](https://nla.zapier.com/start/) from Zapier
- **Zapier** - You need a Zapier account. Get your **[API key from here](https://nla.zapier.com/credentials/)** after you've made an account
- Create allowed actions - Follow step 3 in this **[Start Here guide](https://nla.zapier.com/start/)** from Zapier
- ⚠️ NOTE: Zapier is known to be finicky with certain actions. I found that writing email drafts is probably the best use of it
- there are improvements that can be made to override the official NLA integration and that is TBD
- **Browser/Scraper** - This is not to be confused with 'browsing' on chat.openai.com (which is technically a plugin suite or multiple plugins)
@ -62,7 +63,7 @@ Clicking on **"Show Agent Settings"** will allow you to modify parameters for th
- A better solution for 'browsing' is planned, but there's no guarantee when
- This plugin is best used in combination with google so it doesn't hallucinate webpages to visit
- **Serpapi** - an alternative to Google search but not as performant in my opinion
- You can get an API key here: https://serpapi.com/dashboard
- You can get an API key here: **[https://serpapi.com/dashboard](https://serpapi.com/dashboard)**
- For free tier, you are limited to 100 queries/month
- With Google, you are limited to 100/day for free, which is a better deal; any queries beyond that may cost you a few pennies


@ -1,5 +1,6 @@
---
title: 🛠️ Make Your Own
description: This doc shows you how to create custom plugins for LibreChat by extending the LangChain `Tool` class. You will learn how to use different APIs and functions with your plugins, and how to integrate them with the LangChain framework.
weight: -9
---
# Making your own Plugin
@ -8,11 +9,11 @@ Creating custom plugins for this project involves extending the `Tool` class fro
**Note:** I will use the word plugin interchangeably with tool, as the latter is specific to LangChain, and we are mainly conforming to the library.
You are essentially creating DynamicTools in LangChain speak. See the [LangChainJS docs](https://js.langchain.com/docs/modules/agents/tools/dynamic) for more info.
You are essentially creating DynamicTools in LangChain speak. See the **[LangChainJS docs](https://js.langchain.com/docs/modules/agents/tools/dynamic)** for more info.
This guide will walk you through the process of creating your own custom plugins, using the `StableDiffusionAPI` and `WolframAlphaAPI` tools as examples.
When using the Functions Agent (the default mode for plugins), tools are converted to [OpenAI functions](https://openai.com/blog/function-calling-and-other-api-updates); in any case, plugins/tools are invoked conditionally based on the LLM generating a specific format that we parse.
When using the Functions Agent (the default mode for plugins), tools are converted to **[OpenAI functions](https://openai.com/blog/function-calling-and-other-api-updates)**; in any case, plugins/tools are invoked conditionally based on the LLM generating a specific format that we parse.
The most common implementation of a plugin is to make an API call based on the natural language input from the AI, but there is virtually no limit in programmatic use case.
@ -47,7 +48,7 @@ Remember, the key to creating a custom plugin is to extend the `Tool` class and
**Multi-Input Plugins**
If you would like to make a plugin that would benefit from multiple inputs from the LLM, instead of a singular input string as we will review, you need to make a LangChain [StructuredTool](https://blog.langchain.dev/structured-tools/) instead. A detailed guide for this is in progress, but for now, you can look at how I've made StructuredTools in this directory: `api\app\clients\tools\structured\`. This guide is foundational to understanding StructuredTools, and it's recommended you continue reading to better understand LangChain tools first. The blog linked above is also helpful once you've read through this guide.
If you would like to make a plugin that would benefit from multiple inputs from the LLM, instead of a singular input string as we will review, you need to make a LangChain **[StructuredTool](https://blog.langchain.dev/structured-tools/)** instead. A detailed guide for this is in progress, but for now, you can look at how I've made StructuredTools in this directory: `api\app\clients\tools\structured\`. This guide is foundational to understanding StructuredTools, and it's recommended you continue reading to better understand LangChain tools first. The blog linked above is also helpful once you've read through this guide.
---
@ -131,7 +132,7 @@ class StableDiffusionAPI extends Tool {
The `_call` method is where the main functionality of your plugin is implemented. This method is called when the language model decides to use your plugin. It should take an `input` parameter and return a result.
> In a basic Tool, the LLM will generate one string value as an input. If your plugin requires multiple inputs from the LLM, read the [StructuredTools](#StructuredTools) section.
> In a basic Tool, the LLM will generate one string value as an input. If your plugin requires multiple inputs from the LLM, read the **[StructuredTools](#StructuredTools)** section.
```javascript
class StableDiffusionAPI extends Tool {
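  // Illustrative sketch only; not the project's actual implementation.
  // The LLM generates a single string, received here as `input`, and whatever
  // string we return is handed back to the agent as the tool's observation.
  // Assumes `this.url` is set in the constructor and a global `fetch` (Node 18+).
  async _call(input) {
    const response = await fetch(this.url, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ prompt: input }),
    });
    const data = await response.json();
    return JSON.stringify(data); // always return a string to the agent
  }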


@ -1,17 +1,18 @@
---
title: 🖌️ Stable Diffusion
description: How to set up and configure the Stable Diffusion plugin
weight: -6
---
# Stable Diffusion Plugin
To use Stable Diffusion with this project, you will either need to download and install [stable-diffusion-webui](https://github.com/AUTOMATIC1111/stable-diffusion-webui) or, for a dockerized deployment, you can also use [stable-diffusion-webui-docker](https://github.com/AbdBarho/stable-diffusion-webui-docker)
To use Stable Diffusion with this project, you will either need to download and install **[AUTOMATIC1111 - Stable Diffusion WebUI](https://github.com/AUTOMATIC1111/stable-diffusion-webui)** or, for a dockerized deployment, you can also use **[stable-diffusion-webui-docker](https://github.com/AbdBarho/stable-diffusion-webui-docker)**
With the docker deployment you can skip steps 2 and 3; use the setup instructions from their repository instead.
- Note: you need a compatible GPU ("CPU-only" is possible but very slow). Nvidia is recommended, but there is no clear resource on incompatible GPUs. Any decent GPU should work.
## 1. Follow download and installation instructions from [stable-diffusion-webui readme](https://github.com/AUTOMATIC1111/stable-diffusion-webui)
## 1. Follow download and installation instructions from **[stable-diffusion-webui readme](https://github.com/AUTOMATIC1111/stable-diffusion-webui)**
## 2. Edit your run script settings
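On Linux/macOS this typically means launching the WebUI with its API enabled, a sketch of which follows (based on the WebUI's own conventions; Windows users set the same flag in `webui-user.bat`):

```bash
# In webui-user.sh: enable the built-in API, which listens on http://127.0.0.1:7860 by default
export COMMANDLINE_ARGS="--api"
```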


@ -1,5 +1,6 @@
---
title: 🧠 Wolfram|Alpha
description: How to set up and configure the Wolfram Alpha plugin
weight: -5
---
@ -9,18 +10,23 @@ An AppID must be supplied in all calls to the Wolfram|Alpha API.
- Note: Wolfram API calls are limited to 100 calls/day and 2000/month for regular users.
## 1. Make an account at <a href='http://products.wolframalpha.com/api/'>Wolfram|Alpha</a>
## 2. Go to the <a href='https://developer.wolframalpha.com/portal/myapps/'>Developer Portal</a> click on "Get an AppID".
## 3. Configure it in LibreChat
### Select the plugins endpoint
### Make an account
- Visit **[products.wolframalpha.com/api/](https://products.wolframalpha.com/api/)** to create your account.
### Get your AppID
- Go to the **[Developer Portal](https://developer.wolframalpha.com/portal/myapps/)** and click on `Get an AppID`.
### Configure it in LibreChat
- Select the plugins endpoint
![plugins_endpoint](https://github.com/danny-avila/LibreChat/assets/32828263/7db788a5-2173-4115-b34b-43ea132dae69)
### Open the Plugin store
- Open the Plugin store
![plugin_store](https://github.com/danny-avila/LibreChat/assets/32828263/12a51feb-c030-4cf0-8429-16360270988d)
### Install Wolfram and Provide your AppID
- Install Wolfram and Provide your AppID
![wolfram-1](https://github.com/danny-avila/LibreChat/assets/32828263/bd165497-d529-441d-8372-a68db19adc3f)
- Alternatively: you (the admin) can set the value in `\.env` to bypass the prompt: `WOLFRAM_APP_ID=your_app_id`
> Alternatively: you (the admin) can set the value in `\.env` to bypass the prompt: `WOLFRAM_APP_ID=your_app_id`
## 5. Select the plugin and enjoy!
### Select the plugin and enjoy!
![wolfram-2](https://github.com/danny-avila/LibreChat/assets/32828263/2825e961-6c46-4728-96cd-1012a0862943)


@ -1,5 +1,6 @@
---
title: 🔖 Presets
description: The "presets" feature in our app is a powerful tool that allows users to save and load predefined settings for their conversations. Users can import and export these presets as JSON files, set a default preset, and share them with others on Discord.
weight: -9
---
# Guide to Using the "Presets" Feature


@ -1,5 +1,6 @@
---
title: ✨ Third-Party Tools and Contributions
description: Collection of third-party tools provided by the community
weight: -2
---


@ -1,5 +1,6 @@
---
title: 🪙 Token Usage
description: This doc covers how to track and control your token usage for the OpenAI/Plugins endpoints in LibreChat. You will learn how to view your transactions, enable user balances, and add credits to your account.
weight: -7
---
# Token Usage
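A hedged sketch of the workflow that guide walks through; the variable and script names below reflect the project's docs at the time and should be verified against your version's `.env.example` and `package.json`:

```bash
# In your .env: enable balance tracking for the OpenAI/Plugins endpoints
CHECK_BALANCE=true

# Credit a user with 10,000 token credits (run from the project root)
npm run add-balance user@example.com 10000
```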