🥷 docs: Ninja - ChatGPT-browser reverse proxy (#1697)
* 🥷 docs: Ninja ChatGPT-browser reverse proxy
* 🥷 docs: breaking changes
parent a9220375d3
commit 2b4870892a
6 changed files with 165 additions and 2 deletions
.env.example (10 changes)
@@ -44,7 +44,7 @@ DEBUG_CONSOLE=false
# Endpoints #
#===================================================#

# ENDPOINTS=openAI,azureOpenAI,bingAI,google,gptPlugins,anthropic
# ENDPOINTS=openAI,azureOpenAI,bingAI,chatGPTBrowser,google,gptPlugins,anthropic

PROXY=
@@ -80,6 +80,14 @@ AZURE_USE_MODEL_AS_DEPLOYMENT_NAME=TRUE
BINGAI_TOKEN=user_provided
# BINGAI_HOST=https://cn.bing.com

#============#
# ChatGPT #
#============#

CHATGPT_TOKEN=
CHATGPT_MODELS=text-davinci-002-render-sha
# CHATGPT_REVERSE_PROXY=

#============#
# Google #
#============#
@@ -26,6 +26,7 @@ weight: 2
* 🔥 [Firebase CDN](./firebase.md)
* 🍃 [Manage Your Database](./manage_your_database.md)
* 🪵 [Logging System](./logging_system.md)
* 🥷 [Ninja (ChatGPT reverse proxy)](./ninja.md)
* 😈 [Bing Jailbreak](./bing_jailbreak.md)

---
docs/features/ninja.md (new file, 105 lines)
@@ -0,0 +1,105 @@
---
title: 🥷 Ninja (ChatGPT reverse proxy)
description: How to deploy Ninja and enable the `CHATGPT_REVERSE_PROXY` for use with LibreChat.
weight: -3
---

# Ninja Deployment Guide

If you're looking to use the ChatGPT Endpoint in LibreChat **(not to be confused with [OpenAI's Official API](../install/configuration/ai_setup.md#openai))**, setting up a reverse proxy is essential. Ninja offers a solution for this purpose, and this guide will walk you through deploying Ninja to enable the `CHATGPT_REVERSE_PROXY` for use with LibreChat. See their official GitHub for more info: [https://github.com/gngpp/ninja](https://github.com/gngpp/ninja)

> Using this method you will only be able to use `text-davinci-002-render-sha` with Ninja in LibreChat. Other models offered with a `plus` subscription will not work.

You can use it locally in Docker or deploy it on the web for remote access.

---

## Deploy Locally Using Docker:

For local deployment using Docker, the steps are as follows:

### 1. **Create a Ninja folder and a docker-compose.yml file inside it:**
- Edit the docker-compose file like this:

```yaml
version: '3.4'

services:
  ninja:
    image: gngpp/ninja:latest
    container_name: ninja
    restart: unless-stopped
    command: run
    ports:
      - "7999:7999"
```

### 2. **Set Up the LibreChat `.env` File:**
In the `.env` file within your LibreChat directory, you'll need to set the `CHATGPT_REVERSE_PROXY` variable:

```bash
CHATGPT_REVERSE_PROXY=http://host.docker.internal:7999/backend-api/conversation
```
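
If LibreChat itself is not running in Docker, `host.docker.internal` is not needed; a minimal sketch of the alternative value, assuming Ninja is published on port 7999 as in the compose file above:

```bash
# Sketch: LibreChat running directly on the host (not in Docker).
# Assumes Ninja publishes port 7999 as configured above.
CHATGPT_REVERSE_PROXY=http://localhost:7999/backend-api/conversation
```
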
### 3. **Start Docker Containers:**
From the Ninja directory, run the following command to launch the Docker containers:

```bash
docker-compose up -d
```
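
Before pointing LibreChat at the proxy, you can sanity-check that it is reachable; a quick sketch, assuming the default port mapping above:

```bash
# Confirm the container is up and inspect its recent logs.
docker ps --filter name=ninja
docker logs --tail 20 ninja

# Any HTTP status line (even 4xx) means the proxy is reachable;
# a connection error means it is not.
curl -I http://localhost:7999/backend-api/conversation
```
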
---

## Alternate Docker Method:

You can add it to the LibreChat override file if you prefer.

### 1. **Edit or Create the override file:**
In the LibreChat folder, find the `docker-compose.override.yml` file. (If you haven't created it yet, you can either rename `docker-compose.override.yml.example` to `docker-compose.override.yml` or create a new one.)

The override file should contain this:

```yaml
version: '3.4'

services:

  ninja:
    image: gngpp/ninja:latest
    container_name: ninja
    restart: unless-stopped
    command: run
    ports:
      - "7999:7999"
```

### 2. **Set Up the LibreChat `.env` File:**
In the `.env` file within your LibreChat directory, you'll need to set the `CHATGPT_REVERSE_PROXY` variable:

```bash
CHATGPT_REVERSE_PROXY=http://host.docker.internal:7999/backend-api/conversation
```
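
Because the override places Ninja on the same Docker Compose network as LibreChat's `api` service, the compose service name can also be used instead of `host.docker.internal` (a sketch under that assumption), and starting the LibreChat stack will start Ninja along with it:

```bash
# Optional: reach Ninja by its compose service name from the api container.
# Assumes both services share the default LibreChat compose network.
CHATGPT_REVERSE_PROXY=http://ninja:7999/backend-api/conversation

# From the LibreChat folder, (re)start the stack; the override starts Ninja too.
docker-compose up -d
```
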
---

## Deploy Online on Hugging Face:

To deploy Ninja online by duplicating the Hugging Face Space, follow these steps:

### 1. **Hugging Face Space:**
Visit the [Ninja LibreChat Space](https://huggingface.co/spaces/LibreChat/Ninja) on Hugging Face.

### 2. **Duplicate the Space:**
Use the available options to duplicate or fork the space into your own Hugging Face account.

### 3. **Configure LibreChat:**
In the `.env` file (or secrets settings if you host LibreChat on Hugging Face), set the `CHATGPT_REVERSE_PROXY` variable using the following format:

```bash
CHATGPT_REVERSE_PROXY=http://your_hf_space_url.com/backend-api/conversation
```

- Replace `your_hf_space_url.com` with the domain of your deployed space.
- Note: you can use this format: `https://your_username-ninja.hf.space` (replace `your_username` with your Hugging Face username).
- The resulting URL should look similar to:
`https://your_username-ninja.hf.space/backend-api/conversation`
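
To check that the deployed Space responds before configuring LibreChat, a quick sketch (replace the hypothetical URL with your own Space domain):

```bash
# Any HTTP status (even 4xx) confirms the Space is reachable; replace with your own URL.
curl -I https://your_username-ninja.hf.space/backend-api/conversation
```
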
@@ -9,6 +9,12 @@ weight: -10
**If you experience any issues after updating, we recommend clearing your browser cache and cookies.**
Certain changes in the updates may impact cookies, leading to unexpected behaviors if not cleared properly.

## January 31st 2024
- A new method to use the ChatGPT endpoint is now documented. It uses "Ninja".
- For more info:
    - [Ninja Deployment Guide](../features/ninja.md)
    - [Ninja GitHub repo](https://github.com/gngpp/ninja/tree/main)

## January 30th 2024
- Since PandoraNext has shut down, the ChatGPTBrowser endpoint is no longer available in LibreChat.
- For more info:
@@ -446,6 +446,27 @@ OpenRouter is integrated to the LibreChat by overriding the OpenAI endpoint.

**Important:** Stability for Unofficial APIs is not guaranteed. Access methods to these APIs are hacky, prone to errors and patching, and are marked lowest in priority in LibreChat's development.

### ChatGPTBrowser

**Backend Access to https://chat.openai.com/api**

This is not to be confused with [OpenAI's Official API](#openai)!

> Note that this is disabled by default and requires additional configuration to work.
> Also, using this may have your data exposed to 3rd parties if using a proxy, and OpenAI may flag your account.
> See: [ChatGPT Reverse Proxy](../../features/ninja.md)

To get your access token for ChatGPT Browser Access, you need to:

- Go to **[https://chat.openai.com](https://chat.openai.com)**
- Create an account or log in with your existing one
- Visit **[https://chat.openai.com/api/auth/session](https://chat.openai.com/api/auth/session)**
- Copy the value of the "accessToken" field and save it in `./.env` as `CHATGPT_TOKEN` (see the example below)
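
For example, the resulting line in `./.env` would look like this (the token value is a placeholder; real access tokens are long JWT-style strings):

```bash
# Placeholder value; paste the accessToken copied from /api/auth/session.
CHATGPT_TOKEN=eyJhbGciOiJSUzI1NiIs...
```
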
Warning: There may be a chance of your account being banned if you deploy the app to multiple users with this method. Use at your own risk.

---

### BingAI
I recommend using Microsoft Edge for this:
@@ -135,7 +135,7 @@ In this section you can configure the endpoints and models selection, their API
- `PROXY` is to be used by all endpoints (leave blank by default)

```bash
ENDPOINTS=openAI,azureOpenAI,bingAI,google,gptPlugins,anthropic
ENDPOINTS=openAI,azureOpenAI,bingAI,chatGPTBrowser,google,gptPlugins,anthropic
PROXY=
```
@@ -245,6 +245,28 @@ BINGAI_TOKEN=user_provided
BINGAI_HOST=
```

### ChatGPT
See: [ChatGPT Free Access token](../configuration/ai_setup.md#chatgptbrowser)

> **Warning**: To use this endpoint you'll have to set up your own reverse proxy. Here is the installation guide to deploy your own (based on [Ninja](https://github.com/gngpp/ninja)): **[Ninja Deployment Guide](../../features/ninja.md)**

```bash
CHATGPT_REVERSE_PROXY=<YOUR-REVERSE-PROXY>
```

> **Note:** If you're a GPT plus user you can try adding `gpt-4`, `gpt-4-plugins`, `gpt-4-code-interpreter`, and `gpt-4-browsing` to the list above and use those models; **however, the view/display portion of these features is not supported**, though you can use the underlying models, which have a higher token context.

> This method **might only work** with `text-davinci-002-render-sha` and **might stop working** at any moment.

- Leave `CHATGPT_TOKEN=` blank to disable this endpoint
- Set `CHATGPT_TOKEN=` to "user_provided" to allow users to provide their own API key from the WebUI
- It is not recommended to provide your token in the `.env` file since it expires often and sharing it could get you banned.

```bash
CHATGPT_TOKEN=
CHATGPT_MODELS=text-davinci-002-render-sha
```
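
For example, a Plus subscriber might extend the model list as described in the note above (a sketch; the display features of these models still won't render in LibreChat):

```bash
# Sketch for GPT Plus users; underlying models only, UI features are not supported.
CHATGPT_TOKEN=user_provided
CHATGPT_MODELS=text-davinci-002-render-sha,gpt-4,gpt-4-browsing
```
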
### Google
Follow these instructions to set up the [Google Endpoint](./ai_setup.md#google)