🤖 Assistants V2 Support: Part 1

- Separated Azure Assistants into its own endpoint
- File Search / Vector Store integration is incomplete, but it can be toggled and vector storage created from the playground can be used
- Code Interpreter resource files can be added but not deleted
- GPT-4o is supported
- Many improvements to the Assistants Endpoint overall

data-provider v2 changes

copy existing route as v1

chore: rename new endpoint to reduce comparison operations and add new Azure file source

api: add azureAssistants part 1

force use of version for assistants/assistantsAzure

chore: switch name back to azureAssistants

refactor type version: string | number

Ensure assistants endpoints have version set
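
For context on the version handling above: a rough sketch of what per-endpoint versions could look like. The default values and the helper name below are illustrative assumptions, not the actual data-provider exports.

```ts
// Hypothetical sketch: each assistants endpoint carries its own Assistants API version.
type AssistantsVersion = string | number;

// Assumed defaults for illustration only.
const defaultVersions: Record<string, AssistantsVersion> = {
  assistants: 2,
  azureAssistants: 1,
};

// Coerce to string so the version can be dropped into request headers/paths.
function getAssistantsVersion(endpoint: string, override?: AssistantsVersion): string {
  return String(override ?? defaultVersions[endpoint] ?? 2);
}
```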

fix: isArchived type issue in ConversationListParams

refactor: update assistants mutations/queries with endpoint/version definitions, update Assistants Map structure
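
The map is now keyed by endpoint first and assistant_id second (see the ChatForm diff further down). A trimmed-down sketch of the shape; the Assistant fields are reduced for illustration:

```ts
// Minimal stand-in for the full Assistant document.
interface Assistant {
  id: string;
  name?: string | null;
}

// v1 shape: Record<assistant_id, Assistant>
// v2 shape: endpoint -> assistant_id -> Assistant
type AssistantsMap = Record<string, Record<string, Assistant>>;

function getAssistant(
  map: AssistantsMap,
  endpoint?: string,
  assistantId?: string,
): Assistant | undefined {
  return map[endpoint ?? '']?.[assistantId ?? ''];
}
```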

chore: FilePreview component ExtendedFile type assertion

feat: isAssistantsEndpoint helper
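
The helper's body isn't shown in this diff; a minimal sketch, assuming it only needs to match the two assistants endpoints:

```ts
// Sketch only: treat both OpenAI and Azure assistants endpoints as "assistants"
// so callers stop comparing against EModelEndpoint.assistants directly.
const assistantsEndpoints = new Set(['assistants', 'azureAssistants']);

function isAssistantsEndpoint(endpoint?: string | null): boolean {
  return endpoint != null && assistantsEndpoints.has(endpoint);
}
```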

chore: remove unused useGenerations

chore(buildTree): type issue

chore(Advanced): type issue (unused component, maybe in future)

first pass for multi-assistant endpoint rewrite

fix(listAssistants): pass params correctly

feat: list separate assistants by endpoint
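
As the Mention diff below shows, each assistants endpoint now gets its own option list instead of one shared list. A sketch of the assumed shape (field names beyond the endpoint keys are illustrative):

```ts
// Assumed option shape for the @-mention menu.
interface MentionOption {
  value: string; // assistant_id
  label: string; // display name
  type: 'assistant';
}

// Keyed by endpoint, e.g. 'assistants' or 'azureAssistants'.
type AssistantListMap = Record<string, MentionOption[]>;

function getOptionsForEndpoint(listMap: AssistantListMap, endpoint: string): MentionOption[] {
  return listMap[endpoint] ?? [];
}
```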

fix(useTextarea): access assistantMap correctly

fix: assistant endpoint switching, resetting ID

fix: assistant mention selection, which broke during the rewrite

fix: set/invalidate assistants endpoint query data correctly

feat: Fix issue with assistant ID not being reset correctly

getOpenAIClient helper function
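
A rough sketch of what such a helper could look like, assuming it switches on the request's endpoint. The real initialization (Azure parameters, user-provided keys) lives in the API's client helpers and may differ:

```ts
import OpenAI from 'openai';

// Illustrative only: build a client with endpoint-appropriate default headers.
function getOpenAIClient({
  endpoint,
  apiKey,
  baseURL,
  version,
}: {
  endpoint: string;
  apiKey: string;
  baseURL?: string;
  version: string | number;
}): OpenAI {
  const defaultHeaders =
    endpoint === 'azureAssistants'
      ? { 'api-key': apiKey, 'OpenAI-Beta': `assistants=v${version}` }
      : { 'OpenAI-Beta': `assistants=v${version}` };

  return new OpenAI({ apiKey, baseURL, defaultHeaders });
}
```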

feat: add toast for assistant deletion

fix: issue deleting an assistant right after creating it on Azure

fix: assistant patching

refactor: actions to use getOpenAIClient

refactor: consolidate logic into helpers file

fix: issue where conversation data was not initially available

v1 chat support

refactor(spendTokens): only early return if completionTokens isNaN

fix(OpenAIClient): ensure spendTokens has all necessary params

refactor: route/controller logic

fix(assistants/initializeClient): use defaultHeaders field

fix: sanitize default operation id

chore: bump openai package

first pass v2 action service

feat: retroactive domain parsing for actions added via v1

feat: delete db records of actions/assistants on openai assistant deletion

chore: remove vision tools from v2 assistants

feat: v2 upload and delete assistant vision images

WIP first pass, thread attachments

fix: show assistant vision files (save local/firebase copy)

v2 image continue

fix: annotations

fix: refine annotations

show analyze as an error if it is no longer submitting before progress reaches 1, and show file_search as a retrieval tool

fix: undefined endpoint issue when aborting a run

refactor: consolidate capabilities logic and anticipate versioning

frontend version 2 changes

fix: query selection and filter

add endpoint to unknown filepath

add file IDs to the resource; deletion still in progress

enable/disable file search

remove version log
Danny Avila · 2024-05-14 13:36:33 -04:00
parent f0e8cca5df · commit 2bdbff5141
118 changed files with 3358 additions and 1039 deletions

ChatForm.tsx:

@@ -3,8 +3,8 @@ import { useForm } from 'react-hook-form';
 import { memo, useCallback, useRef, useMemo } from 'react';
 import {
   supportsFiles,
-  EModelEndpoint,
   mergeFileConfig,
+  isAssistantsEndpoint,
   fileConfig as defaultFileConfig,
 } from 'librechat-data-provider';
 import { useChatContext, useAssistantsMapContext } from '~/Providers';
@@ -74,8 +74,9 @@ const ChatForm = ({ index = 0 }) => {
   const endpointFileConfig = fileConfig.endpoints[endpoint ?? ''];
   const invalidAssistant = useMemo(
     () =>
-      conversation?.endpoint === EModelEndpoint.assistants &&
-      (!conversation?.assistant_id || !assistantMap?.[conversation?.assistant_id ?? '']),
+      isAssistantsEndpoint(conversation?.endpoint) &&
+      (!conversation?.assistant_id ||
+        !assistantMap?.[conversation?.endpoint ?? '']?.[conversation?.assistant_id ?? '']),
     [conversation?.assistant_id, conversation?.endpoint, assistantMap],
   );
   const disableInputs = useMemo(

FilePreview.tsx:

@@ -20,7 +20,7 @@ const FilePreview = ({
 }) => {
   const radius = 55; // Radius of the SVG circle
   const circumference = 2 * Math.PI * radius;
-  const progress = useProgress(file?.['progress'] ?? 1, 0.001, file?.size ?? 1);
+  const progress = useProgress(file?.['progress'] ?? 1, 0.001, (file as ExtendedFile)?.size ?? 1);
   console.log(progress);
 
   // Calculate the offset based on the loading progress

Mention.tsx:

@@ -17,7 +17,9 @@ export default function Mention({
 }) {
   const localize = useLocalize();
   const assistantMap = useAssistantsMapContext();
-  const { options, modelsConfig, assistants, onSelectMention } = useMentions({ assistantMap });
+  const { options, modelsConfig, assistantListMap, onSelectMention } = useMentions({
+    assistantMap,
+  });
 
   const [activeIndex, setActiveIndex] = useState(0);
   const timeoutRef = useRef<NodeJS.Timeout | null>(null);
@@ -47,7 +49,12 @@
     if (mention.type === 'endpoint' && mention.value === EModelEndpoint.assistants) {
       setSearchValue('');
-      setInputOptions(assistants);
+      setInputOptions(assistantListMap[EModelEndpoint.assistants]);
       setActiveIndex(0);
       inputRef.current?.focus();
+    } else if (mention.type === 'endpoint' && mention.value === EModelEndpoint.azureAssistants) {
+      setSearchValue('');
+      setInputOptions(assistantListMap[EModelEndpoint.azureAssistants]);
+      setActiveIndex(0);
+      inputRef.current?.focus();
     } else if (mention.type === 'endpoint') {