LibreChat/client/src/utils/buildDefaultConvo.ts

import { parseConvo, EModelEndpoint } from 'librechat-data-provider';
import type { TConversation } from 'librechat-data-provider';
import getLocalStorageItems from './getLocalStorageItems';
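
/**
 * Builds the default conversation object for a given endpoint by merging the
 * current conversation, the last conversation setup, and values persisted in
 * localStorage (last selected model, tools, and Bing settings), then parsing
 * the result against the endpoint's schema via `parseConvo`.
 */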
const buildDefaultConvo = ({
  conversation,
  endpoint,
  models,
  lastConversationSetup,
}: {
  conversation: TConversation;
  endpoint: EModelEndpoint;
  models: string[];
  // TODO: fix this type as we should allow undefined
  lastConversationSetup: TConversation;
}) => {
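  // Read last-used selections (model, tools, Bing settings) persisted in localStorage.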
  const { lastSelectedModel, lastSelectedTools, lastBingSettings } = getLocalStorageItems();
  const { jailbreak, toneStyle } = lastBingSettings;
  const endpointType = lastConversationSetup?.endpointType ?? conversation?.endpointType;
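
  // Without a resolved endpoint there is nothing to parse; return the base
  // conversation with whatever endpoint/endpointType values were derived.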
  if (!endpoint) {
    return {
      ...conversation,
      endpointType,
      endpoint,
    };
  }
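
  // Prefer the model from the last conversation setup, falling back to the
  // last model selected for this endpoint in localStorage. The secondary
  // (agent) model only applies to the gptPlugins endpoint.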
  const availableModels = models;
  const model = lastConversationSetup?.model ?? lastSelectedModel?.[endpoint];
  const secondaryModel =
    endpoint === EModelEndpoint.gptPlugins
      ? lastConversationSetup?.agentOptions?.model ?? lastSelectedModel?.secondaryModel
      : null;
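
  // Place the preferred model first so parseConvo treats it as the default
  // when it is among the available models; otherwise fall back to the full list.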
  let possibleModels: string[], secondaryModels: string[];
  if (availableModels.includes(model)) {
    possibleModels = [model, ...availableModels];
  } else {
    possibleModels = [...availableModels];
  }

  if (secondaryModel && availableModels.includes(secondaryModel)) {
    secondaryModels = [secondaryModel, ...availableModels];
  } else {
    secondaryModels = [...availableModels];
  }
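
  // Parse and normalize the last conversation setup against the endpoint's
  // schema, constraining model fields to the possible values built above.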
  const convo = parseConvo({
    endpoint,
    endpointType,
    conversation: lastConversationSetup,
    possibleValues: {
      models: possibleModels,
      secondaryModels,
    },
  });
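
  // Layer the parsed values over the base conversation, keeping the resolved
  // endpoint and endpointType.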
  const defaultConvo = {
    ...conversation,
    ...convo,
    endpointType,
    endpoint,
  };
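
  // Carry over tools from the last setup (or localStorage) and restore
  // Bing-specific settings when they were previously saved.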
  defaultConvo.tools = lastConversationSetup?.tools ?? lastSelectedTools ?? defaultConvo.tools;
  defaultConvo.jailbreak = jailbreak ?? defaultConvo.jailbreak;
  defaultConvo.toneStyle = toneStyle ?? defaultConvo.toneStyle;

  return defaultConvo;
};

export default buildDefaultConvo;
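
// Illustrative usage (not from this file): building a fresh conversation once
// models have loaded. `newConversation`, `lastConvoSetup`, and `modelsByEndpoint`
// are hypothetical names for values provided by the caller.
//
// const convo = buildDefaultConvo({
//   conversation: newConversation,
//   endpoint: EModelEndpoint.openAI,
//   models: modelsByEndpoint[EModelEndpoint.openAI] ?? [],
//   lastConversationSetup: lastConvoSetup,
// });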