feat: update env example.

feat: support OPENAI_REVERSE_PROXY
feat: support setting available models in the env file

fix: refactor ChatGPT Browser send logic

fix: wrong usage of responseMessage in title generation
BREAKING: some env parameters have been changed!
Wentao Lyu 2023-04-05 21:21:02 +08:00
parent a5202f84cc
commit 22b9524ad3
19 changed files with 259 additions and 197 deletions

@ -1,81 +1,82 @@
### Local
- **Install the prerequisites**
- **Download chatgpt-clone**
- Download the latest release here: https://github.com/danny-avila/chatgpt-clone/releases/
- Or by clicking on the green Code button at the top of the page and selecting "Download ZIP"
- Or (recommended if you have Git installed) pull the latest release from the main branch
- If you downloaded a zip file, extract the content in "C:/chatgpt-clone/"
- **IMPORTANT: If you install the files somewhere else, modify the instructions accordingly**
- **To enable the Conversation search feature:**
- IF YOU DON'T WANT THIS FEATURE, YOU CAN SKIP THIS STEP
- Download the latest Meilisearch release from: https://github.com/meilisearch/meilisearch/releases
- Copy it to "C:/chatgpt-clone/"
- Rename the file to "meilisearch.exe"
- Open it by double clicking on it
- Copy the generated Master Key and save it somewhere (You will need it later)
- **Download and Install Node.js**
- Navigate to https://nodejs.org/en/download and download the latest Node.js version for your OS (the Node.js installer includes the NPM package manager)
- **Create a MongoDB database**
- Navigate to https://www.mongodb.com/ and Sign In or Create an account
- Create a new project
- Build a Database using the free plan and name the cluster (example: chatgpt-clone)
- Use the "Username and Password" method for authentication
- Add your current IP to the access list
- Then in the Database Deployment tab click on Connect
- In "Choose a connection method" select "Connect your application"
- Driver = Node.js / Version = 4.1 or later
- Copy the connection string and save it somewhere (you will need it later)
- **Get your OpenAI API key** here: https://platform.openai.com/account/api-keys and save it somewhere safe (you will need it later)
- **Get your Bing Access Token**
- Using MS Edge, navigate to bing.com
- Make sure you are logged in
- Open the DevTools by pressing F12 on your keyboard
- Click on the tab "Application" (On the left of the DevTools)
- Expand the "Cookies" (Under "Storage")
- Copy the value of the "_U" cookie and save it somewhere (you will need it later)
- **Create the ".env" File** You will need all your credentials, (API keys, access tokens, and Mongo Connection String, MeileSearch Master Key)
- Open "C:/chatgpt-clone/api/.env.example" in a text editor
- At this line **MONGO_URI="mongodb://127.0.0.1:27017/chatgpt-clone"**
Replace mongodb://127.0.0.1:27017/chatgpt-clone with the MondoDB connection string you saved earlier, **remove "&w=majority" at the end**
- It should look something like this: "MONGO_URI="mongodb+srv://username:password@chatgpt-clone.lfbcwz3.mongodb.net/?retryWrites=true"
- At this line **OPENAI_KEY=** you need to add your openai API key
- Add your Bing token to this line **BING_TOKEN=** (needed for BingChat & Sydney)
- If you want to enable Search, **SEARCH=TRUE** if you do not want to enable search **SEARCH=FALSE**
- Add your previously saved MeiliSearch Master key to this line **MEILI_MASTER_KEY=** (the key is needed if search is enabled even on local install or you may encounter errors)
- Save the file as **"C:/chatgpt-clone/api/.env"**
- Open "C:/chatgpt-clone/api/.env.example" in a text editor
- At this line **MONGO_URI="mongodb://127.0.0.1:27017/chatgpt-clone"**
Replace mongodb://127.0.0.1:27017/chatgpt-clone with the MongoDB connection string you saved earlier, **remove "&w=majority" at the end**
- It should look something like this: "MONGO_URI="mongodb+srv://username:password@chatgpt-clone.lfbcwz3.mongodb.net/?retryWrites=true"
- At this line **OPENAI_KEY=** you need to add your OpenAI API key
- Add your Bing token to this line **BINGAI_TOKEN=** (needed for BingChat & Sydney)
- If you want to enable Search, set **SEARCH=TRUE**; if you do not want to enable search, set **SEARCH=FALSE**
- Add your previously saved MeiliSearch master key to this line **MEILI_MASTER_KEY=** (the key is needed if search is enabled, even on a local install, or you may encounter errors)
- Save the file as **"C:/chatgpt-clone/api/.env"**
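
For reference, a minimal sketch of what the finished .env might look like (every value below is a placeholder; substitute your own credentials):
```
MONGO_URI="mongodb+srv://username:password@chatgpt-clone.lfbcwz3.mongodb.net/?retryWrites=true"
OPENAI_KEY=your_openai_api_key
BINGAI_TOKEN=your_bing_U_cookie_value
SEARCH=TRUE
MEILI_MASTER_KEY=your_meilisearch_master_key
```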
**DO THIS ONCE AFTER EVERY UPDATE**
- **Run** `npm ci` in the "C:/chatgpt-clone/api" directory
- **Run** `npm ci` in the "C:/chatgpt-clone/client" directory
- **Run** `npm run build` in the "C:/chatgpt-clone/client"
**DO THIS EVERY TIME YOU WANT TO START CHATGPT-CLONE**
- **Run** `"meilisearch --master-key put_your_meilesearch_Master_Key_here"` in the "C:/chatgpt-clone" directory (Only if SEARCH=TRUE)
- **Run** `npm start` in the "C:/chatgpt-clone/api" directory
- **Run** `npm start` in the "C:/chatgpt-clone/api" directory
- **Visit** http://localhost:3080 (default port) & enjoy
OPTIONAL BUT RECOMMENDED
- **Make a batch file to automate the starting process**
- Open a text editor
- Paste the following code in a new document
- Put your MeiliSearch master key instead of "your_master_key_goes_here"
- Save the file as "C:/chatgpt-clone/chatgpt-clone.bat"
- You can make a shortcut of this batch file and put it anywhere
```
REM The meilisearch executable needs to be at the root of the chatgpt-clone directory
REM Replace your_master_key_goes_here with your own MeiliSearch master key
start "MeiliSearch" cmd /k "meilisearch --master-key your_master_key_goes_here"
start "ChatGPT-Clone" cmd /k "cd api && npm start"
```

@ -15,34 +15,68 @@ NODE_ENV=development
# Change this to your MongoDB URI if different and I recommend appending chatgpt-clone
MONGO_URI="mongodb://127.0.0.1:27017/chatgpt-clone"
# API key configuration.
# Leave blank if you don't want them.
#############################
# Endpoint OpenAI:
#############################
# Access key from OpenAI platform
# Leave it blank to disable this endpoint
OPENAI_KEY=
# Default ChatGPT API Model, options: 'gpt-4', 'text-davinci-003', 'gpt-3.5-turbo', 'gpt-3.5-turbo-0301'
# you will have errors if you don't have access to a model like 'gpt-4', defaults to turbo if left empty/excluded.
DEFAULT_API_GPT=gpt-3.5-turbo
# Identify the available models, separated by commas (no spaces)
# Leave it blank to use internal settings.
# OPENAI_MODELS=gpt-4,text-davinci-003,gpt-3.5-turbo,gpt-3.5-turbo-0301
# _U Cookies Value from bing.com
BING_TOKEN=
# Reverse proxy setting for OpenAI
# https://github.com/waylaidwanderer/node-chatgpt-api#using-a-reverse-proxy
# OPENAI_REVERSE_PROXY=<YOUR REVERSE PROXY>
#############################
# Endpoint BingAI (Also jailbreak Sydney):
#############################
# BingAI token: the "_U" cookie value from bing.com
# Leave it and BINGAI_USER_TOKEN blank to disable this endpoint.
BINGAI_TOKEN=
# BingAI user-defined token
# Allows users to set their own token from the client
# Uncomment this to enable this feature.
# (Not implemented yet.)
# BINGAI_USER_TOKEN=1
#############################
# Endpoint chatGPT:
#############################
# ChatGPT Browser Client (free but use at your own risk)
# Access token from https://chat.openai.com/api/auth/session
# Exposes your access token to a 3rd party
# Exposes your access token to CHATGPT_REVERSE_PROXY
# Leave it blank to disable this endpoint
CHATGPT_TOKEN=
# If you have access to other models on the official site, you can use them here.
# Defaults to 'text-davinci-002-render-sha' if left empty.
# options: gpt-4, text-davinci-002-render, text-davinci-002-render-paid, or text-davinci-002-render-sha
# You cannot use a model that your account does not have access to. You can check
# which ones you have access to by opening DevTools and going to the Network tab.
# Refresh the page and look at the response body for https://chat.openai.com/backend-api/models.
BROWSER_MODEL=
# Identify the available models, separated by commas (no spaces)
# Leave it blank to use internal settings.
# CHATGPT_MODELS=text-davinci-002-render-sha,text-davinci-002-render-paid,gpt-4
# Reverse proxy setting for OpenAI
# https://github.com/waylaidwanderer/node-chatgpt-api#using-a-reverse-proxy
# By default it will use the proxy recommended by node-chatgpt-api (a third-party server)
# CHATGPT_REVERSE_PROXY=<YOUR REVERSE PROXY>
#############################
# Search:
#############################
# ENABLING SEARCH MESSAGES/CONVOS
# Requires installation of free self-hosted Meilisearch or Paid Remote Plan (Remote not tested)
# The easiest setup for this is through docker-compose, which takes care of it for you.
# SEARCH=TRUE
SEARCH=TRUE
# SEARCH=1
SEARCH=1
# REQUIRED FOR SEARCH: MeiliSearch Host, mainly for api server to connect to the search server.
# must replace '0.0.0.0' with 'meilisearch' if serving meilisearch with docker-compose
@ -63,8 +97,11 @@ MEILI_HTTP_ADDR='0.0.0.0:7700' # <-- local/remote
MEILI_MASTER_KEY=JKMW-hGc7v_D1FkJVdbRSDNFLZcUv3S75yrxXP0SmcU # <-- ready made secure key for docker-compose
#############################
# User System
# global enable/disable the sample user system.
#############################
# Enable the user system.
# This is not a ready-to-use user system.
# Don't use it unless you can write your own code.
# ENABLE_USER_SYSTEM= # <-- make sure you don't comment this back in if you're not using your own user system
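
To restrict which models each endpoint exposes, the commented OPENAI_MODELS and CHATGPT_MODELS lines above can be uncommented and set, for example (the model names here are only illustrative; list models your keys actually have access to, comma-separated with no spaces):
```
OPENAI_MODELS=gpt-3.5-turbo,gpt-4
CHATGPT_MODELS=text-davinci-002-render-sha,gpt-4
```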

@ -22,7 +22,7 @@ const askBing = async ({
const bingAIClient = new BingAIClient({
// "_U" cookie from bing.com
userToken: process.env.BING_TOKEN,
userToken: process.env.BINGAI_TOKEN,
// If the above doesn't work, provide all your cookies as a string instead
// cookies: '',
debug: false,

@ -1,12 +1,6 @@
require('dotenv').config();
const { KeyvFile } = require('keyv-file');
const modelMap = new Map([
['Default (GPT-3.5)', 'text-davinci-002-render-sha'],
['Legacy (GPT-3.5)', 'text-davinci-002-render-paid'],
['GPT-4', 'gpt-4']
]);
const browserClient = async ({
text,
parentMessageId,
@ -21,11 +15,11 @@ const browserClient = async ({
};
const clientOptions = {
// Warning: This will expose your access token to a third party. Consider the risks before using this.
// Warning: This will expose your access token to a third party. Consider the risks before using this.
reverseProxyUrl: process.env.CHATGPT_REVERSE_PROXY || 'https://bypass.churchless.tech/api/conversation',
// Access token from https://chat.openai.com/api/auth/session
accessToken: process.env.CHATGPT_TOKEN,
model: modelMap.get(model),
model: model,
// debug: true
proxy: process.env.PROXY || null
};

@ -22,6 +22,9 @@ const askClient = async ({
};
const clientOptions = {
// Warning: This will expose your access token to a third party. Consider the risks before using this.
reverseProxyUrl: process.env.OPENAI_REVERSE_PROXY || null,
modelOptions: {
model: model,
temperature,
@ -29,6 +32,7 @@ const askClient = async ({
presence_penalty,
frequency_penalty
},
chatGptLabel,
promptPrefix,
proxy: process.env.PROXY || null,

@ -1,40 +0,0 @@
require('dotenv').config();
const { KeyvFile } = require('keyv-file');
const askSydney = async ({ text, onProgress, convo }) => {
const { BingAIClient } = (await import('@waylaidwanderer/chatgpt-api'));
const sydneyClient = new BingAIClient({
// "_U" cookie from bing.com
userToken: process.env.BING_TOKEN,
// If the above doesn't work, provide all your cookies as a string instead
// cookies: '',
debug: false,
cache: { store: new KeyvFile({ filename: './data/cache.json' }) }
});
let options = {
jailbreakConversationId: true,
onProgress,
};
if (convo.jailbreakConversationId) {
options = { ...options, jailbreakConversationId: convo.jailbreakConversationId, parentMessageId: convo.parentMessageId };
}
if (convo.toneStyle) {
options.toneStyle = convo.toneStyle;
}
console.log('sydney options', options);
const res = await sydneyClient.sendMessage(text, options
);
return res;
// for reference:
// https://github.com/waylaidwanderer/node-chatgpt-api/blob/main/demos/use-bing-client.js
};
module.exports = { askSydney };

@ -1,7 +1,6 @@
const { askClient } = require('./clients/chatgpt-client');
const { browserClient } = require('./clients/chatgpt-browser');
const { askBing } = require('./clients/bingai');
const { askSydney } = require('./clients/sydney');
const titleConvo = require('./titleConvo');
const getCitations = require('../lib/parse/getCitations');
const citeText = require('../lib/parse/citeText');
@ -10,7 +9,6 @@ module.exports = {
askClient,
browserClient,
askBing,
askSydney,
titleConvo,
getCitations,
citeText

@ -1,6 +1,7 @@
const express = require('express');
const crypto = require('crypto');
const router = express.Router();
const { getChatGPTBrowserModels } = require('../endpoints');
const { titleConvo, browserClient } = require('../../../app/');
const { saveMessage, getConvoTitle, saveConvo, updateConvo, getConvo } = require('../../../models');
const { handleError, sendMessage, createOnProgress, handleText } = require('./handlers');
@ -18,6 +19,7 @@ router.post('/', async (req, res) => {
// build user message
const conversationId = oldConversationId || crypto.randomUUID();
const isNewConversation = !oldConversationId;
const userMessageId = crypto.randomUUID();
const userParentMessageId = parentMessageId || '00000000-0000-0000-0000-000000000000';
const userMessage = {
@ -34,6 +36,10 @@ router.post('/', async (req, res) => {
model: req.body?.model || 'text-davinci-002-render-sha'
};
const availableModels = getChatGPTBrowserModels();
if (availableModels.find(model => model === endpointOption.model) === undefined)
return handleError(res, { text: 'Illegal request: model' });
console.log('ask log', {
userMessage,
endpointOption,
@ -52,6 +58,7 @@ router.post('/', async (req, res) => {
// eslint-disable-next-line no-use-before-define
return await ask({
isNewConversation,
userMessage,
endpointOption,
conversationId,
@ -63,6 +70,7 @@ router.post('/', async (req, res) => {
});
const ask = async ({
isNewConversation,
userMessage,
endpointOption,
conversationId,
@ -71,9 +79,7 @@ const ask = async ({
req,
res
}) => {
const { text, parentMessageId: userParentMessageId, messageId: userMessageId } = userMessage;
const client = browserClient;
let { text, parentMessageId: userParentMessageId, messageId: userMessageId } = userMessage;
res.writeHead(200, {
Connection: 'keep-alive',
@ -89,7 +95,7 @@ const ask = async ({
const progressCallback = createOnProgress();
const abortController = new AbortController();
res.on('close', () => abortController.abort());
let gptResponse = await client({
let response = await browserClient({
text,
parentMessageId: userParentMessageId,
conversationId,
@ -98,50 +104,60 @@ const ask = async ({
abortController
});
gptResponse.text = gptResponse.response;
console.log('CLIENT RESPONSE', gptResponse);
console.log('CLIENT RESPONSE', response);
if (!gptResponse.parentMessageId) {
gptResponse.parentMessageId = overrideParentMessageId || userMessageId;
delete gptResponse.response;
// STEP1 generate response message
response.text = response.response || '**ChatGPT refused to answer.**';
let responseMessage = {
conversationId: response.conversationId,
messageId: response.messageId,
parentMessageId: overrideParentMessageId || response.parentMessageId || userMessageId,
text: await handleText(response),
sender: endpointOption?.chatGptLabel || 'ChatGPT'
};
await saveMessage(responseMessage);
// STEP2 update the conversation
conversationId = responseMessage.conversationId || conversationId;
// First update conversationId if needed
let conversationUpdate = { conversationId, endpoint: 'chatGPTBrowser' };
if (conversationId != responseMessage.conversationId && isNewConversation)
conversationUpdate = {
...conversationUpdate,
conversationId: conversationId,
newConversationId: responseMessage.conversationId || conversationId
};
conversationId = responseMessage.conversationId || conversationId;
await saveConvo(req?.session?.user?.username, conversationUpdate);
// STEP3 update the user message
userMessage.conversationId = conversationId;
userMessage.messageId = responseMessage.parentMessageId;
// If response has parentMessageId, the fake userMessage.messageId should be updated to the real one.
if (!overrideParentMessageId) {
const oldUserMessageId = userMessageId;
await saveMessage({ ...userMessage, messageId: oldUserMessageId, newMessageId: userMessage.messageId });
}
gptResponse.sender = 'ChatGPT';
// gptResponse.model = model;
gptResponse.text = await handleText(gptResponse);
// if (convo.chatGptLabel?.length > 0 && model === 'chatgptCustom') {
// gptResponse.chatGptLabel = convo.chatGptLabel;
// }
// if (convo.promptPrefix?.length > 0 && model === 'chatgptCustom') {
// gptResponse.promptPrefix = convo.promptPrefix;
// }
gptResponse.parentMessageId = overrideParentMessageId || userMessageId;
if (userParentMessageId.startsWith('000')) {
await saveMessage({ ...userMessage, conversationId: gptResponse.conversationId });
}
await saveMessage(gptResponse);
await updateConvo(req?.session?.user?.username, {
...gptResponse,
oldConvoId: conversationId
});
userMessageId = userMessage.messageId;
sendMessage(res, {
title: await getConvoTitle(req?.session?.user?.username, conversationId),
final: true,
conversation: await getConvo(req?.session?.user?.username, conversationId),
requestMessage: userMessage,
responseMessage: gptResponse
responseMessage: responseMessage
});
res.end();
if (userParentMessageId == '00000000-0000-0000-0000-000000000000') {
const title = await titleConvo({ endpoint: endpointOption?.endpoint, text, response: gptResponse });
const title = await titleConvo({ endpoint: endpointOption?.endpoint, text, response: responseMessage });
await updateConvo(req?.session?.user?.username, {
conversationId: gptResponse.conversationId,
conversationId: conversationId,
title
});
}

@ -1,6 +1,7 @@
const express = require('express');
const crypto = require('crypto');
const router = express.Router();
const { getOpenAIModels } = require('../endpoints');
const { titleConvo, askClient } = require('../../../app/');
const { saveMessage, getConvoTitle, saveConvo, updateConvo, getConvo } = require('../../../models');
const { handleError, sendMessage, createOnProgress, handleText } = require('./handlers');
@ -40,6 +41,10 @@ router.post('/', async (req, res) => {
frequency_penalty: req.body?.frequency_penalty || 0
};
const availableModels = getOpenAIModels();
if (availableModels.find(model => model === endpointOption.model) === undefined)
return handleError(res, { text: 'Illegal request: model' });
console.log('ask log', {
userMessage,
endpointOption,
@ -150,7 +155,7 @@ const ask = async ({
res.end();
if (userParentMessageId == '00000000-0000-0000-0000-000000000000') {
const title = await titleConvo({ endpoint: endpointOption?.endpoint, text, response });
const title = await titleConvo({ endpoint: endpointOption?.endpoint, text, response: responseMessage });
await updateConvo(req?.session?.user?.username, {
conversationId: conversationId,
title

@ -1,17 +1,27 @@
const express = require('express');
const router = express.Router();
const getOpenAIModels = () => {
let models = ['gpt-4', 'text-davinci-003', 'gpt-3.5-turbo', 'gpt-3.5-turbo-0301'];
if (process.env.OPENAI_MODELS) models = String(process.env.OPENAI_MODELS).split(',');
return models;
};
const getChatGPTBrowserModels = () => {
let models = ['text-davinci-002-render-sha', 'text-davinci-002-render-paid', 'gpt-4'];
if (process.env.CHATGPT_MODELS) models = String(process.env.CHATGPT_MODELS).split(',');
return models;
};
router.get('/', function (req, res) {
const azureOpenAI = !!process.env.AZURE_OPENAI_KEY;
const openAI = process.env.OPENAI_KEY
? { availableModels: ['gpt-4', 'text-davinci-003', 'gpt-3.5-turbo', 'gpt-3.5-turbo-0301'] }
: false;
const bingAI = !!process.env.BING_TOKEN;
const chatGPTBrowser = process.env.OPENAI_KEY
? { availableModels: ['Default (GPT-3.5)', 'Legacy (GPT-3.5)', 'GPT-4'] }
: false;
const openAI = process.env.OPENAI_KEY ? { availableModels: getOpenAIModels() } : false;
const bingAI = !!process.env.BINGAI_TOKEN;
const chatGPTBrowser = process.env.CHATGPT_TOKEN ? { availableModels: getChatGPTBrowserModels() } : false;
res.send(JSON.stringify({ azureOpenAI, openAI, bingAI, chatGPTBrowser }));
});
module.exports = router;
module.exports = { router, getOpenAIModels, getChatGPTBrowserModels };
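
For context, a minimal sketch (not part of this commit) of how a client could consume the response assembled above; the `/api/endpoints` path and the use of axios are assumptions based on how the rest of the client code is wired:
```
// Hypothetical client-side helper, for illustration only.
// Assumes the router above is mounted at /api/endpoints.
import axios from 'axios';

async function fetchEndpointsFilter() {
  const { data } = await axios.get('/api/endpoints', { withCredentials: true });
  // azureOpenAI and bingAI are booleans; openAI and chatGPTBrowser are
  // either false or { availableModels: [...] } as built in the route above.
  const openAIModels = data.openAI ? data.openAI.availableModels : [];
  const browserModels = data.chatGPTBrowser ? data.chatGPTBrowser.availableModels : [];
  return { data, openAIModels, browserModels };
}
```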

@ -6,7 +6,7 @@ const prompts = require('./prompts');
const search = require('./search');
const tokenizer = require('./tokenizer');
const me = require('./me');
const endpoints = require('./endpoints');
const { router: endpoints } = require('./endpoints');
const { router: auth, authenticatedOr401, authenticatedOrRedirect } = require('./auth');
module.exports = {

@ -20,13 +20,18 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
const setPresets = useSetRecoilState(store.presets);
const availableEndpoints = useRecoilValue(store.availableEndpoints);
const endpointsFilter = useRecoilValue(store.endpointsFilter);
const setOption = param => newValue => {
let update = {};
update[param] = newValue;
setPreset(prevState =>
cleanupPreset({
...prevState,
...update
preset: {
...prevState,
...update
},
endpointsFilter
})
);
};
@ -38,7 +43,7 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
axios({
method: 'post',
url: '/api/presets',
data: cleanupPreset(preset),
data: cleanupPreset({ preset, endpointsFilter }),
withCredentials: true
}).then(res => {
setPresets(res?.data);
@ -47,7 +52,7 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
const exportPreset = () => {
exportFromJSON({
data: cleanupPreset(preset),
data: cleanupPreset({ preset, endpointsFilter }),
fileName: `${preset?.title}.json`,
exportType: exportFromJSON.types.json
});

@ -1,4 +1,5 @@
import React, { useEffect, useState } from 'react';
import { useRecoilValue } from 'recoil';
import exportFromJSON from 'export-from-json';
import DialogTemplate from '../ui/DialogTemplate.jsx';
import { Dialog, DialogButton } from '../ui/Dialog.tsx';
@ -7,12 +8,15 @@ import cleanupPreset from '~/utils/cleanupPreset';
import Settings from './Settings';
import store from '~/store';
// A preset dialog to show readonly preset values.
const EndpointOptionsDialog = ({ open, onOpenChange, preset: _preset, title }) => {
// const [title, setTitle] = useState('My Preset');
const [preset, setPreset] = useState(_preset);
const [saveAsDialogShow, setSaveAsDialogShow] = useState(false);
const endpointsFilter = useRecoilValue(store.endpointsFilter);
const setOption = param => newValue => {
let update = {};
@ -29,7 +33,7 @@ const EndpointOptionsDialog = ({ open, onOpenChange, preset: _preset, title }) =
const exportPreset = () => {
exportFromJSON({
data: cleanupPreset(preset),
data: cleanupPreset({ preset, endpointsFilter }),
fileName: `${preset?.title}.json`,
exportType: exportFromJSON.types.json
});

@ -1,5 +1,5 @@
import React, { useEffect, useState } from 'react';
import { useSetRecoilState } from 'recoil';
import { useSetRecoilState, useRecoilValue } from 'recoil';
import axios from 'axios';
import DialogTemplate from '../ui/DialogTemplate';
import { Dialog } from '../ui/Dialog.tsx';
@ -13,14 +13,18 @@ import store from '~/store';
const SaveAsPresetDialog = ({ open, onOpenChange, preset }) => {
const [title, setTitle] = useState(preset?.title || 'My Preset');
const setPresets = useSetRecoilState(store.presets);
const endpointsFilter = useRecoilValue(store.endpointsFilter);
const defaultTextProps =
'rounded-md border border-gray-300 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.10)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-400 dark:bg-gray-700 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
const submitPreset = () => {
const _preset = cleanupPreset({
...preset,
title
preset: {
...preset,
title
},
endpointsFilter
});
axios({

@ -21,6 +21,12 @@ function ChatGPTOptions() {
const models = endpointsConfig?.['chatGPTBrowser']?.['availableModels'] || [];
// const modelMap = new Map([
// ['Default (GPT-3.5)', 'text-davinci-002-render-sha'],
// ['Legacy (GPT-3.5)', 'text-davinci-002-render-paid'],
// ['GPT-4', 'gpt-4']
// ]);
const setOption = param => newValue => {
let update = {};
update[param] = newValue;
@ -43,7 +49,7 @@ function ChatGPTOptions() {
showLabel={false}
className={cn(
cardStyle,
'min-w-48 z-50 flex h-[40px] w-48 flex-none items-center justify-center px-4 ring-0 hover:cursor-pointer hover:bg-slate-50 focus:ring-0 focus:ring-offset-0 data-[state=open]:bg-slate-50 dark:bg-gray-700 dark:hover:bg-gray-600 dark:data-[state=open]:bg-gray-600'
'z-50 flex h-[40px] w-[260px] min-w-[260px] flex-none items-center justify-center px-4 ring-0 hover:cursor-pointer hover:bg-slate-50 focus:ring-0 focus:ring-offset-0 data-[state=open]:bg-slate-50 dark:bg-gray-700 dark:hover:bg-gray-600 dark:data-[state=open]:bg-gray-600'
)}
/>
</div>

@ -1,9 +1,9 @@
import React from 'react';
import { useSetRecoilState } from 'recoil';
import { FileUp } from 'lucide-react';
import store from '~/store';
import axios from 'axios';
import cleanupPreset from '~/utils/cleanupPreset.js';
import { useRecoilValue } from 'recoil';
import store from '~/store';
// async function fetchPresets(callback) {
// try {
@ -21,6 +21,7 @@ import cleanupPreset from '~/utils/cleanupPreset.js';
const FileUpload = ({ onFileSelected }) => {
// const setPresets = useSetRecoilState(store.presets);
const endpointsFilter = useRecoilValue(store.endpointsFilter);
const handleFileChange = event => {
const file = event.target.files[0];
@ -29,7 +30,7 @@ const FileUpload = ({ onFileSelected }) => {
const reader = new FileReader();
reader.onload = e => {
const jsonData = JSON.parse(e.target.result);
onFileSelected({ ...cleanupPreset(jsonData), presetId: null });
onFileSelected({ ...cleanupPreset({ preset: jsonData, endpointsFilter }), presetId: null });
};
reader.readAsText(file);
};

@ -1,4 +1,4 @@
const cleanupPreset = _preset => {
const cleanupPreset = ({ preset: _preset, endpointsFilter = {} }) => {
const { endpoint } = _preset;
let preset = {};
@ -6,7 +6,7 @@ const cleanupPreset = _preset => {
preset = {
endpoint,
presetId: _preset?.presetId ?? null,
model: _preset?.model ?? 'gpt-3.5-turbo',
model: _preset?.model ?? endpointsFilter[endpoint]?.availableModels?.[0] ?? 'gpt-3.5-turbo',
chatGptLabel: _preset?.chatGptLabel ?? null,
promptPrefix: _preset?.promptPrefix ?? null,
temperature: _preset?.temperature ?? 1,
@ -25,11 +25,12 @@ const cleanupPreset = _preset => {
toneStyle: _preset?.toneStyle ?? 'fast',
title: _preset?.title ?? 'New Preset'
};
} else if (endpoint === 'chatGPTBrowser') {
} else if (endpoint === 'chatGPT') {
preset = {
endpoint,
presetId: _preset?.presetId ?? null,
model: _preset?.model ?? 'Default (GPT-3.5)',
model:
_preset?.model ?? endpointsFilter[endpoint]?.availableModels?.[0] ?? 'text-davinci-002-render-sha',
title: _preset?.title ?? 'New Preset'
};
} else if (endpoint === null) {
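
To illustrate the new cleanupPreset signature used throughout the client above, here is a hypothetical call; the preset values and the endpointsFilter shape are made-up examples, not data from this commit:
```
// Illustrative usage of the new { preset, endpointsFilter } signature.
import cleanupPreset from '~/utils/cleanupPreset.js';

const endpointsFilter = {
  openAI: { availableModels: ['gpt-3.5-turbo', 'gpt-4'] }
};

const cleaned = cleanupPreset({
  preset: { endpoint: 'openAI', title: 'My Preset' }, // no model set on purpose
  endpointsFilter
});
// cleaned.model falls back to endpointsFilter.openAI.availableModels[0]
// ('gpt-3.5-turbo') instead of a hard-coded default.
```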

@ -1,9 +1,15 @@
const buildDefaultConversation = ({ conversation, endpoint, lastConversationSetup = {} }) => {
const buildDefaultConversation = ({
conversation,
endpoint,
endpointsFilter = {},
lastConversationSetup = {}
}) => {
if (endpoint === 'azureOpenAI' || endpoint === 'openAI') {
conversation = {
...conversation,
endpoint,
model: lastConversationSetup?.model ?? 'gpt-3.5-turbo',
model:
lastConversationSetup?.model ?? endpointsFilter[endpoint]?.availableModels?.[0] ?? 'gpt-3.5-turbo',
chatGptLabel: lastConversationSetup?.chatGptLabel ?? null,
promptPrefix: lastConversationSetup?.promptPrefix ?? null,
temperature: lastConversationSetup?.temperature ?? 1,
@ -28,7 +34,10 @@ const buildDefaultConversation = ({ conversation, endpoint, lastConversationSetu
conversation = {
...conversation,
endpoint,
model: lastConversationSetup?.model ?? 'Default (GPT-3.5)'
model:
lastConversationSetup?.model ??
endpointsFilter[endpoint]?.availableModels?.[0] ??
'text-davinci-002-render-sha'
};
} else if (endpoint === null) {
conversation = {
@ -56,7 +65,8 @@ const getDefaultConversation = ({ conversation, prevConversation, endpointsFilte
conversation = buildDefaultConversation({
conversation,
endpoint,
lastConversationSetup: preset
lastConversationSetup: preset,
endpointsFilter
});
return conversation;
} else {
@ -72,7 +82,8 @@ const getDefaultConversation = ({ conversation, prevConversation, endpointsFilte
// conversation = buildDefaultConversation({
// conversation,
// endpoint,
// lastConversationSetup: prevConversation
// lastConversationSetup: prevConversation,
// endpointsFilter
// });
// return conversation;
// }
@ -84,7 +95,7 @@ const getDefaultConversation = ({ conversation, prevConversation, endpointsFilte
const { endpoint = null } = lastConversationSetup;
if (endpointsFilter?.[endpoint]) {
conversation = buildDefaultConversation({ conversation, endpoint });
conversation = buildDefaultConversation({ conversation, endpoint, endpointsFilter });
return conversation;
}
} catch (error) {}
@ -93,10 +104,10 @@ const getDefaultConversation = ({ conversation, prevConversation, endpointsFilte
const endpoint = ['openAI', 'azureOpenAI', 'bingAI', 'chatGPTBrowser'].find(e => endpointsFilter?.[e]);
if (endpoint) {
conversation = buildDefaultConversation({ conversation, endpoint });
conversation = buildDefaultConversation({ conversation, endpoint, endpointsFilter });
return conversation;
} else {
conversation = buildDefaultConversation({ conversation, endpoint: null });
conversation = buildDefaultConversation({ conversation, endpoint: null, endpointsFilter });
return conversation;
}
};

@ -7,6 +7,7 @@ const useMessageHandler = () => {
const currentConversation = useRecoilValue(store.conversation) || {};
const setSubmission = useSetRecoilState(store.submission);
const isSubmitting = useRecoilValue(store.isSubmitting);
const endpointsFilter = useRecoilValue(store.endpointsFilter);
const latestMessage = useRecoilValue(store.latestMessage);
@ -27,7 +28,8 @@ const useMessageHandler = () => {
if (endpoint === 'azureOpenAI' || endpoint === 'openAI') {
endpointOption = {
endpoint,
model: currentConversation?.model ?? 'gpt-3.5-turbo',
model:
currentConversation?.model ?? endpointsFilter[endpoint]?.availableModels?.[0] ?? 'gpt-3.5-turbo',
chatGptLabel: currentConversation?.chatGptLabel ?? null,
promptPrefix: currentConversation?.promptPrefix ?? null,
temperature: currentConversation?.temperature ?? 1,
@ -52,7 +54,10 @@ const useMessageHandler = () => {
} else if (endpoint === 'chatGPTBrowser') {
endpointOption = {
endpoint,
model: currentConversation?.model ?? 'Default (GPT-3.5)'
model:
currentConversation?.model ??
endpointsFilter[endpoint]?.availableModels?.[0] ??
'text-davinci-002-render-sha'
};
responseSender = 'ChatGPT';
} else if (endpoint === null) {