🤖 feat: OpenAI Assistants v2 (initial support) (#2781)

* 🤖 Assistants V2 Support: Part 1

- Separated Azure Assistants to its own endpoint
- File Search / Vector Store integration is incomplete, but it can be toggled and its storage used from the playground
- Code Interpreter resource files can be added but not deleted
- GPT-4o is supported
- Many improvements to the Assistants Endpoint overall
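
A rough sketch of the endpoint split described above. The member names appear in this PR's diffs; the string values and enum shape are assumed, and the real EModelEndpoint in librechat-data-provider has more members:

// Sketch only: Azure Assistants is now addressed as its own endpoint,
// separate from the OpenAI assistants endpoint.
enum EModelEndpointSketch {
  assistants = 'assistants',
  azureAssistants = 'azureAssistants',
  openAI = 'openAI',
  azureOpenAI = 'azureOpenAI',
}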

data-provider v2 changes

copy existing route as v1

chore: rename new endpoint to reduce comparison operations and add new azure filesource

api: add azureAssistants part 1

force use of version for assistants/assistantsAzure

chore: switch name back to azureAssistants

refactor type version: string | number

Ensure assistants endpoints have version set
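
A minimal sketch of the version handling described by the two notes above; the names below are illustrative, not the actual data-provider definitions:

// Hypothetical shape: every assistants endpoint is expected to carry an API version.
type AssistantsVersion = string | number;

interface AssistantsEndpointConfigSketch {
  endpoint: string;
  version: AssistantsVersion; // e.g. 1 or 2, or a date-style string for Azure
}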

fix: isArchived type issue in ConversationListParams

refactor: update assistants mutations/queries with endpoint/version definitions, update Assistants Map structure
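
The map rework mentioned here keys assistants by endpoint first, then by assistant ID, matching the assistantMap?.[endpoint]?.[assistant_id] lookups in the diffs below. A minimal sketch of that shape; the actual type is TAssistantsMap in librechat-data-provider:

import type { Assistant } from 'librechat-data-provider';

// Sketch of the nested shape: endpoint -> assistant_id -> Assistant.
type AssistantsMapSketch = Record<string, Record<string, Assistant>>;

// Lookup mirrors the pattern used in the component diffs below.
function findAssistant(map: AssistantsMapSketch, endpoint?: string, assistantId?: string) {
  return map[endpoint ?? '']?.[assistantId ?? ''];
}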

chore: FilePreview component ExtendedFile type assertion

feat: isAssistantsEndpoint helper
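
The diffs below replace endpoint === EModelEndpoint.assistants checks with this helper so the Azure assistants endpoint is covered too. A hedged sketch of what such a check could look like; the real implementation lives in librechat-data-provider and may differ:

import { EModelEndpoint } from 'librechat-data-provider';

// Sketch: treat both the OpenAI and Azure assistants endpoints as "assistants".
function isAssistantsEndpointSketch(endpoint?: string | null): boolean {
  if (!endpoint) {
    return false;
  }
  return ([EModelEndpoint.assistants, EModelEndpoint.azureAssistants] as string[]).includes(endpoint);
}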

chore: remove unused useGenerations

chore(buildTree): type issue

chore(Advanced): type issue (unused component, maybe in future)

first pass for multi-assistant endpoint rewrite

fix(listAssistants): pass params correctly

feat: list separate assistants by endpoint

fix(useTextarea): access assistantMap correctly

fix: assistant endpoint switching, resetting ID

fix: broken during rewrite, selecting assistant mention

fix: set/invalidate assistants endpoint query data correctly

feat: Fix issue with assistant ID not being reset correctly

getOpenAIClient helper function
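
The helper mentioned here centralizes client creation so routes, actions, and chat share one code path. A hedged sketch of the idea only: the name comes from this PR, but the signature and options below are illustrative, not the project's actual API:

import OpenAI from 'openai';

// Illustrative only: the real helper also handles user-provided keys, proxies, versioning, etc.
function getOpenAIClientSketch(endpoint: string, apiKey: string, baseURL?: string): OpenAI {
  if (endpoint === 'azureAssistants') {
    // Azure expects an api-version query param and an api-key header
    // (see the defaultHeaders fix further down).
    return new OpenAI({
      apiKey,
      baseURL,
      defaultQuery: { 'api-version': '2024-03-01-preview' }, // placeholder version string
      defaultHeaders: { 'api-key': apiKey },
    });
  }
  return new OpenAI({ apiKey });
}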

feat: add toast for assistant deletion

fix: issue with deleting assistants right after creation on azure

fix: assistant patching

refactor: actions to use getOpenAIClient

refactor: consolidate logic into helpers file

fix: issue where conversation data was not initially available

v1 chat support

refactor(spendTokens): only early return if completionTokens isNaN
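
A minimal sketch of the narrowed guard (parameter names are illustrative): token spend is now skipped only when the completion count is NaN, so a legitimate 0 is still recorded.

// Sketch of the refactored early return; the real spendTokens records far more context.
async function spendTokensSketch(promptTokens: number, completionTokens: number): Promise<void> {
  if (Number.isNaN(completionTokens)) {
    return; // only bail out when the value is genuinely not a number
  }
  // ...record prompt and completion token usage here
}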

fix(OpenAIClient): ensure spendTokens has all necessary params

refactor: route/controller logic

fix(assistants/initializeClient): use defaultHeaders field

fix: sanitize default operation id

chore: bump openai package

first pass v2 action service

feat: retroactive domain parsing for actions added via v1

feat: delete db records of actions/assistants on openai assistant deletion

chore: remove vision tools from v2 assistants

feat: v2 upload and delete assistant vision images

WIP first pass, thread attachments

fix: show assistant vision files (save local/firebase copy)

continue v2 image handling

fix: annotations

fix: refine annotations

show analyze as error if no longer submitting before progress reaches 1, and show file_search as a retrieval tool
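
Paraphrasing the first condition as a sketch (names assumed, not the actual component code):

// Hypothetical: the "analyze" (code interpreter) step is shown as errored when the run
// stopped submitting before its progress reached 1.
function showAnalyzeAsError(isSubmitting: boolean, progress: number): boolean {
  return !isSubmitting && progress < 1;
}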

fix: abort run, undefined endpoint issue

refactor: consolidate capabilities logic and anticipate versioning

frontend version 2 changes

fix: query selection and filter

add endpoint to unknown filepath

add file IDs to resource; deletion still in progress

enable/disable file search

remove version log

* 🤖 Assistants V2 Support: Part 2

🎹 fix: Autocompletion Chrome Bug on Action API Key Input

chore: remove `useOriginNavigate`

chore: set correct OpenAI Storage Source

fix: azure file deletions, instantiate clients by source for deletion

update code interpreter files info

feat: deleteResourceFileId

chore: increase poll interval as azure easily rate limits

fix: openai file deletions, TODO: evaluate rejected deletion settled promises to determine which to delete from db records
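
The TODO here points at inspecting settled deletion promises so only files whose remote deletion succeeded are removed from the DB. A hedged sketch of that idea; the helper names are hypothetical:

// Hypothetical sketch of the TODO: keep DB records for deletions that were rejected upstream.
async function deleteFilesSketch(
  fileIds: string[],
  deleteRemote: (fileId: string) => Promise<void>,
  deleteDbRecords: (fileIds: string[]) => Promise<void>,
): Promise<void> {
  const results = await Promise.allSettled(fileIds.map((id) => deleteRemote(id)));
  const deletedIds = fileIds.filter((_, i) => results[i].status === 'fulfilled');
  if (deletedIds.length > 0) {
    await deleteDbRecords(deletedIds);
  }
}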

file source icons

update table file filters

chore: file search info and versioning

fix: retrieval update with necessary tool_resources if specified

fix(useMentions): add optional chaining in case listMap value is undefined

fix: force assistant avatar roundedness

fix: azure assistants, check correct flag

chore: bump data-provider

* fix: merge conflict

* ci: fix backend tests due to new updates

* chore: update .env.example

* meilisearch improvements

* localization updates

* chore: update comparisons

* feat: add additional metadata: endpoint, author ID

* chore: azureAssistants ENDPOINTS exclusion warning
Danny Avila 2024-05-19 12:56:55 -04:00 committed by GitHub
parent af8bcb08d6
commit 1a452121fa
158 changed files with 4184 additions and 1204 deletions

View file

@@ -1,8 +1,13 @@
import { EModelEndpoint } from 'librechat-data-provider';
import type { Assistant, TConversation, TEndpointsConfig, TPreset } from 'librechat-data-provider';
import { isAssistantsEndpoint } from 'librechat-data-provider';
import type {
TAssistantsMap,
TConversation,
TEndpointsConfig,
TPreset,
} from 'librechat-data-provider';
import { getEndpointField, getIconKey, getIconEndpoint } from '~/utils';
import { icons } from '~/components/Chat/Menus/Endpoints/Icons';
import ConvoIconURL from '~/components/Endpoints/ConvoIconURL';
import { getEndpointField, getIconKey, getIconEndpoint } from '~/utils';
export default function ConvoIcon({
conversation,
@@ -15,7 +20,7 @@ export default function ConvoIcon({
}: {
conversation: TConversation | TPreset | null;
endpointsConfig: TEndpointsConfig;
assistantMap: Record<string, Assistant>;
assistantMap: TAssistantsMap;
containerClassName?: string;
context?: 'message' | 'nav' | 'landing' | 'menu-item';
className?: string;
@@ -25,7 +30,7 @@
let endpoint = conversation?.endpoint;
endpoint = getIconEndpoint({ endpointsConfig, iconURL, endpoint });
const assistant =
endpoint === EModelEndpoint.assistants && assistantMap?.[conversation?.assistant_id ?? ''];
isAssistantsEndpoint(endpoint) && assistantMap?.[endpoint]?.[conversation?.assistant_id ?? ''];
const assistantName = (assistant && assistant?.name) || '';
const avatar = (assistant && (assistant?.metadata?.avatar as string)) || '';

View file

@@ -1,5 +1,10 @@
import { EModelEndpoint } from 'librechat-data-provider';
import type { Assistant, TConversation, TEndpointsConfig, TPreset } from 'librechat-data-provider';
import { isAssistantsEndpoint } from 'librechat-data-provider';
import type {
TConversation,
TEndpointsConfig,
TPreset,
TAssistantsMap,
} from 'librechat-data-provider';
import ConvoIconURL from '~/components/Endpoints/ConvoIconURL';
import MinimalIcon from '~/components/Endpoints/MinimalIcon';
import { getEndpointField, getIconEndpoint } from '~/utils';
@@ -15,7 +20,7 @@ export default function EndpointIcon({
endpointsConfig: TEndpointsConfig;
containerClassName?: string;
context?: 'message' | 'nav' | 'landing' | 'menu-item';
assistantMap?: Record<string, Assistant>;
assistantMap?: TAssistantsMap;
className?: string;
size?: number;
}) {
@@ -27,7 +32,7 @@
const endpointIconURL = getEndpointField(endpointsConfig, endpoint, 'iconURL');
const assistant =
endpoint === EModelEndpoint.assistants && assistantMap?.[conversation?.assistant_id ?? ''];
isAssistantsEndpoint(endpoint) && assistantMap?.[endpoint]?.[conversation?.assistant_id ?? ''];
const assistantAvatar = (assistant && (assistant?.metadata?.avatar as string)) || '';
const assistantName = (assistant && assistant?.name) || '';

View file

@@ -1,4 +1,4 @@
import { EModelEndpoint } from 'librechat-data-provider';
import { EModelEndpoint, isAssistantsEndpoint } from 'librechat-data-provider';
import UnknownIcon from '~/components/Chat/Menus/Endpoints/UnknownIcon';
import {
Plugin,
@@ -27,35 +27,38 @@ const MessageEndpointIcon: React.FC<IconProps> = (props) => {
assistantName,
} = props;
const assistantsIcon = {
icon: props.iconURL ? (
<div className="relative flex h-6 w-6 items-center justify-center">
<div
title={assistantName}
style={{
width: size,
height: size,
}}
className={cn('overflow-hidden rounded-full', props.className ?? '')}
>
<img
className="shadow-stroke h-full w-full object-cover"
src={props.iconURL}
alt={assistantName}
style={{ height: '80', width: '80' }}
/>
</div>
</div>
) : (
<div className="h-6 w-6">
<div className="shadow-stroke flex h-6 w-6 items-center justify-center overflow-hidden rounded-full">
<AssistantIcon className="h-2/3 w-2/3 text-gray-400" />
</div>
</div>
),
name: endpoint,
};
const endpointIcons = {
[EModelEndpoint.assistants]: {
icon: props.iconURL ? (
<div className="relative flex h-6 w-6 items-center justify-center">
<div
title={assistantName}
style={{
width: size,
height: size,
}}
className={cn('overflow-hidden rounded-full', props.className ?? '')}
>
<img
className="shadow-stroke h-full w-full object-cover"
src={props.iconURL}
alt={assistantName}
style={{ height: '80', width: '80' }}
/>
</div>
</div>
) : (
<div className="h-6 w-6">
<div className="shadow-stroke flex h-6 w-6 items-center justify-center overflow-hidden rounded-full">
<AssistantIcon className="h-2/3 w-2/3 text-gray-400" />
</div>
</div>
),
name: endpoint,
},
[EModelEndpoint.assistants]: assistantsIcon,
[EModelEndpoint.azureAssistants]: assistantsIcon,
[EModelEndpoint.azureOpenAI]: {
icon: <AzureMinimalIcon size={size * 0.5555555555555556} />,
bg: 'linear-gradient(0.375turn, #61bde2, #4389d0)',
@@ -136,7 +139,7 @@ const MessageEndpointIcon: React.FC<IconProps> = (props) => {
({ icon, bg, name } = endpointIcons[iconURL]);
}
if (endpoint === EModelEndpoint.assistants) {
if (isAssistantsEndpoint(endpoint)) {
return icon;
}

View file

@@ -15,7 +15,7 @@ import { cn } from '~/utils';
import { IconProps } from '~/common';
const MinimalIcon: React.FC<IconProps> = (props) => {
const { size = 30, error } = props;
const { size = 30, iconClassName, error } = props;
let endpoint = 'default'; // Default value for endpoint
@@ -25,10 +25,13 @@ const MinimalIcon: React.FC<IconProps> = (props) => {
const endpointIcons = {
[EModelEndpoint.azureOpenAI]: {
icon: <AzureMinimalIcon />,
icon: <AzureMinimalIcon className={iconClassName} />,
name: props.chatGptLabel || 'ChatGPT',
},
[EModelEndpoint.openAI]: {
icon: <OpenAIMinimalIcon className={iconClassName} />,
name: props.chatGptLabel || 'ChatGPT',
},
[EModelEndpoint.openAI]: { icon: <OpenAIMinimalIcon />, name: props.chatGptLabel || 'ChatGPT' },
[EModelEndpoint.gptPlugins]: { icon: <MinimalPlugin />, name: 'Plugins' },
[EModelEndpoint.google]: { icon: <GoogleMinimalIcon />, name: props.modelLabel || 'Google' },
[EModelEndpoint.anthropic]: {
@@ -42,6 +45,7 @@ const MinimalIcon: React.FC<IconProps> = (props) => {
[EModelEndpoint.bingAI]: { icon: <BingAIMinimalIcon />, name: 'BingAI' },
[EModelEndpoint.chatGPTBrowser]: { icon: <LightningIcon />, name: 'ChatGPT' },
[EModelEndpoint.assistants]: { icon: <Sparkles className="icon-sm" />, name: 'Assistant' },
[EModelEndpoint.azureAssistants]: { icon: <Sparkles className="icon-sm" />, name: 'Assistant' },
default: {
icon: (
<UnknownIcon

View file

@@ -1,5 +1,7 @@
import TextareaAutosize from 'react-textarea-autosize';
import { ImageDetail, imageDetailNumeric, imageDetailValue } from 'librechat-data-provider';
import type { ValueType } from '@rc-component/mini-decimal';
import type { TModelSelectProps } from '~/common';
import {
Input,
Label,
@@ -11,7 +13,6 @@ } from '~/components/ui';
} from '~/components/ui';
import { cn, defaultTextProps, optionText, removeFocusOutlines } from '~/utils/';
import { useLocalize, useDebouncedInput } from '~/hooks';
import type { TModelSelectProps } from '~/common';
import OptionHover from './OptionHover';
import { ESide } from '~/common';
@@ -127,7 +128,7 @@ export default function Settings({
id="temp-int"
disabled={readonly}
value={temperatureValue as number}
onChange={setTemperature}
onChange={setTemperature as (value: ValueType | null) => void}
max={2}
min={0}
step={0.01}

View file

@@ -1,12 +1,10 @@
import { useState, useMemo, useEffect } from 'react';
import TextareaAutosize from 'react-textarea-autosize';
import { defaultOrderQuery } from 'librechat-data-provider';
import type { TPreset } from 'librechat-data-provider';
import type { TModelSelectProps, Option } from '~/common';
import { Label, HoverCard, SelectDropDown, HoverCardTrigger } from '~/components/ui';
import { cn, defaultTextProps, removeFocusOutlines, mapAssistants } from '~/utils';
import { useLocalize, useDebouncedInput } from '~/hooks';
import { useListAssistantsQuery } from '~/data-provider';
import { useLocalize, useDebouncedInput, useAssistantListMap } from '~/hooks';
import OptionHover from './OptionHover';
import { ESide } from '~/common';
@@ -17,23 +15,21 @@ export default function Settings({ conversation, setOption, models, readonly }:
[localize],
);
const { data: assistants = [] } = useListAssistantsQuery(defaultOrderQuery, {
select: (res) =>
[
defaultOption,
...res.data.map(({ id, name }) => ({
label: name,
value: id,
})),
].filter(Boolean),
});
const { data: assistantMap = {} } = useListAssistantsQuery(defaultOrderQuery, {
select: (res) => mapAssistants(res.data),
});
const assistantListMap = useAssistantListMap((res) => mapAssistants(res.data));
const { model, endpoint, assistant_id, endpointType, promptPrefix, instructions } =
conversation ?? {};
const assistants = useMemo(() => {
return [
defaultOption,
...(assistantListMap[endpoint ?? ''] ?? []).map(({ id, name }) => ({
label: name,
value: id,
})),
].filter(Boolean);
}, [assistantListMap, endpoint, defaultOption]);
const [onPromptPrefixChange, promptPrefixValue] = useDebouncedInput({
setOption,
optionKey: 'promptPrefix',
@@ -47,11 +43,11 @@ export default function Settings({ conversation, setOption, models, readonly }:
const activeAssistant = useMemo(() => {
if (assistant_id) {
return assistantMap[assistant_id];
return assistantListMap[endpoint ?? '']?.[assistant_id];
}
return null;
}, [assistant_id, assistantMap]);
}, [assistant_id, assistantListMap, endpoint]);
const modelOptions = useMemo(() => {
return models.map((model) => ({
@@ -89,7 +85,7 @@ export default function Settings({ conversation, setOption, models, readonly }:
return;
}
const assistant = assistantMap[value];
const assistant = assistantListMap[endpoint ?? '']?.[value];
if (!assistant) {
setAssistantValue(defaultOption);
return;

View file

@@ -9,6 +9,7 @@ import OpenAISettings from './OpenAI';
const settings: { [key: string]: FC<TModelSelectProps> } = {
[EModelEndpoint.assistants]: AssistantsSettings,
[EModelEndpoint.azureAssistants]: AssistantsSettings,
[EModelEndpoint.openAI]: OpenAISettings,
[EModelEndpoint.custom]: OpenAISettings,
[EModelEndpoint.azureOpenAI]: OpenAISettings,