🤖 feat: OpenAI Assistants v2 (initial support) (#2781)

* 🤖 Assistants V2 Support: Part 1

- Separated Azure Assistants to its own endpoint
- File Search / Vector Store integration is incomplete, but it can be toggled and its storage used from the playground
- Code Interpreter resource files can be added but not deleted
- GPT-4o is supported
- Many improvements to the Assistants Endpoint overall

data-provider v2 changes

copy existing route as v1

chore: rename new endpoint to reduce comparison operations and add new azure filesource

api: add azureAssistants part 1

force use of version for assistants/assistantsAzure

chore: switch name back to azureAssistants

refactor type version: string | number

Ensure assistants endpoints have version set

fix: isArchived type issue in ConversationListParams

refactor: update assistants mutations/queries with endpoint/version definitions, update Assistants Map structure

chore: FilePreview component ExtendedFile type assertion

feat: isAssistantsEndpoint helper

chore: remove unused useGenerations

chore(buildTree): type issue

chore(Advanced): type issue (unused component, maybe in future)

first pass for multi-assistant endpoint rewrite

fix(listAssistants): pass params correctly

feat: list separate assistants by endpoint

fix(useTextarea): access assistantMap correctly

fix: assistant endpoint switching, resetting ID

fix: selecting an assistant mention (broken during rewrite)

fix: set/invalidate assistants endpoint query data correctly

feat: Fix issue with assistant ID not being reset correctly

getOpenAIClient helper function

feat: add toast for assistant deletion

fix: issue deleting assistants right after creation on azure

fix: assistant patching

refactor: actions to use getOpenAIClient

refactor: consolidate logic into helpers file

fix: issue where conversation data was not initially available

v1 chat support

refactor(spendTokens): only early return if completionTokens isNaN

fix(OpenAIClient): ensure spendTokens has all necessary params

refactor: route/controller logic

fix(assistants/initializeClient): use defaultHeaders field

fix: sanitize default operation id

chore: bump openai package

first pass v2 action service

feat: retroactive domain parsing for actions added via v1

feat: delete db records of actions/assistants on openai assistant deletion

chore: remove vision tools from v2 assistants

feat: v2 upload and delete assistant vision images

WIP first pass, thread attachments

fix: show assistant vision files (save local/firebase copy)

continue v2 image handling

fix: annotations

fix: refine annotations

show analyze as an error if no longer submitting before progress reaches 1, and show file_search as a retrieval tool

fix: abort run, undefined endpoint issue

refactor: consolidate capabilities logic and anticipate versioning

frontend version 2 changes

fix: query selection and filter

add endpoint to unknown filepath

add file ids to resource, deleting in progress

enable/disable file search

remove version log

* 🤖 Assistants V2 Support: Part 2

🎹 fix: Autocompletion Chrome Bug on Action API Key Input

chore: remove `useOriginNavigate`

chore: set correct OpenAI Storage Source

fix: azure file deletions, instantiate clients by source for deletion

update code interpreter files info

feat: deleteResourceFileId

chore: increase poll interval as azure easily rate limits

fix: openai file deletions; TODO: evaluate rejected promises among the settled deletions to determine which to delete from db records

file source icons

update table file filters

chore: file search info and versioning

fix: retrieval update with necessary tool_resources if specified

fix(useMentions): add optional chaining in case listMap value is undefined

fix: force assistant avatar roundedness

fix: azure assistants, check correct flag

chore: bump data-provider

* fix: merge conflict

* ci: fix backend tests due to new updates

* chore: update .env.example

* meilisearch improvements

* localization updates

* chore: update comparisons

* feat: add additional metadata: endpoint, author ID

* chore: azureAssistants ENDPOINTS exclusion warning
Danny Avila 2024-05-19 12:56:55 -04:00 committed by GitHub
parent af8bcb08d6
commit 1a452121fa
158 changed files with 4184 additions and 1204 deletions


@ -1,6 +1,6 @@
{
"name": "librechat-data-provider",
"version": "0.6.4",
"version": "0.6.5",
"description": "data services for librechat apps",
"main": "dist/index.js",
"module": "dist/index.es.js",


@ -212,6 +212,10 @@ export function resolveRef(
return schema as OpenAPIV3.SchemaObject;
}
function sanitizeOperationId(input: string) {
return input.replace(/[^a-zA-Z0-9_-]/g, '');
}
/** Function to convert OpenAPI spec to function signatures and request builders */
export function openapiToFunction(openapiSpec: OpenAPIV3.Document): {
functionSignatures: FunctionSignature[];
@ -231,7 +235,8 @@ export function openapiToFunction(openapiSpec: OpenAPIV3.Document): {
};
// Operation ID is used as the function name
const operationId = operationObj.operationId || `${method}_${path}`;
const defaultOperationId = `${method}_${path}`;
const operationId = operationObj.operationId || sanitizeOperationId(defaultOperationId);
const description = operationObj.summary || operationObj.description || '';
const parametersSchema: ParametersSchema = { type: 'object', properties: {}, required: [] };
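
Illustrative sketch (not part of the diff): how the sanitization above cleans a default operation ID built from a hypothetical method/path pair.

// Hypothetical GET /pets/{petId} operation without an explicit operationId:
const defaultOperationId = 'get_/pets/{petId}';
const operationId = defaultOperationId.replace(/[^a-zA-Z0-9_-]/g, '');
// operationId === 'get_petspetId' — only letters, digits, '_' and '-' survive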


@ -1,3 +1,5 @@
import type { AssistantsEndpoint } from './schemas';
export const user = () => '/api/user';
export const balance = () => '/api/balance';
@ -83,15 +85,32 @@ export const plugins = () => '/api/plugins';
export const config = () => '/api/config';
export const assistants = (id?: string, options?: Record<string, string>) => {
let url = '/api/assistants';
export const assistants = ({
path,
options,
version,
endpoint,
}: {
path?: string;
options?: object;
endpoint?: AssistantsEndpoint;
version: number | string;
}) => {
let url = `/api/assistants/v${version}`;
if (id) {
url += `/${id}`;
if (path) {
url += `/${path}`;
}
if (endpoint) {
options = {
...(options ?? {}),
endpoint,
};
}
if (options && Object.keys(options).length > 0) {
const queryParams = new URLSearchParams(options).toString();
const queryParams = new URLSearchParams(options as Record<string, string>).toString();
url += `?${queryParams}`;
}
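
Usage sketch (not part of the diff), assuming the `assistants` builder above and the `EModelEndpoint` enum are in scope; the values are illustrative.

const url = assistants({
  path: 'documents',
  endpoint: EModelEndpoint.azureAssistants, // 'azureAssistants'
  version: 1,
});
// url === '/api/assistants/v1/documents?endpoint=azureAssistants'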


@ -10,6 +10,8 @@ import { TModelsConfig } from './types';
export const defaultSocialLogins = ['google', 'facebook', 'openid', 'github', 'discord'];
export const defaultRetrievalModels = [
'gpt-4o',
'gpt-4o-2024-05-13',
'gpt-4-turbo-preview',
'gpt-3.5-turbo-0125',
'gpt-4-0125-preview',
@ -129,11 +131,17 @@ export enum Capabilities {
tools = 'tools',
}
export const defaultAssistantsVersion = {
[EModelEndpoint.assistants]: 2,
[EModelEndpoint.azureAssistants]: 1,
};
export const assistantEndpointSchema = z.object({
/* assistants specific */
disableBuilder: z.boolean().optional(),
pollIntervalMs: z.number().optional(),
timeoutMs: z.number().optional(),
version: z.union([z.string(), z.number()]).default(2),
supportedIds: z.array(z.string()).min(1).optional(),
excludedIds: z.array(z.string()).min(1).optional(),
retrievalModels: z.array(z.string()).min(1).optional().default(defaultRetrievalModels),
@ -287,6 +295,7 @@ export const configSchema = z.object({
endpoints: z
.object({
[EModelEndpoint.azureOpenAI]: azureEndpointSchema.optional(),
[EModelEndpoint.azureAssistants]: assistantEndpointSchema.optional(),
[EModelEndpoint.assistants]: assistantEndpointSchema.optional(),
custom: z.array(endpointSchema.partial()).optional(),
})
@ -324,6 +333,7 @@ export enum FetchTokenConfig {
export const defaultEndpoints: EModelEndpoint[] = [
EModelEndpoint.openAI,
EModelEndpoint.assistants,
EModelEndpoint.azureAssistants,
EModelEndpoint.azureOpenAI,
EModelEndpoint.bingAI,
EModelEndpoint.chatGPTBrowser,
@ -336,6 +346,7 @@ export const defaultEndpoints: EModelEndpoint[] = [
export const alternateName = {
[EModelEndpoint.openAI]: 'OpenAI',
[EModelEndpoint.assistants]: 'Assistants',
[EModelEndpoint.azureAssistants]: 'Azure Assistants',
[EModelEndpoint.azureOpenAI]: 'Azure OpenAI',
[EModelEndpoint.bingAI]: 'Bing',
[EModelEndpoint.chatGPTBrowser]: 'ChatGPT',
@ -345,24 +356,27 @@ export const alternateName = {
[EModelEndpoint.custom]: 'Custom',
};
const sharedOpenAIModels = [
'gpt-3.5-turbo',
'gpt-3.5-turbo-0125',
'gpt-4-turbo',
'gpt-4-turbo-2024-04-09',
'gpt-4-0125-preview',
'gpt-4-turbo-preview',
'gpt-4-1106-preview',
'gpt-3.5-turbo-1106',
'gpt-3.5-turbo-16k-0613',
'gpt-3.5-turbo-16k',
'gpt-4',
'gpt-4-0314',
'gpt-4-32k-0314',
'gpt-4-0613',
'gpt-3.5-turbo-0613',
];
export const defaultModels = {
[EModelEndpoint.assistants]: [
'gpt-3.5-turbo',
'gpt-3.5-turbo-0125',
'gpt-4-turbo',
'gpt-4-turbo-2024-04-09',
'gpt-4-0125-preview',
'gpt-4-turbo-preview',
'gpt-4-1106-preview',
'gpt-3.5-turbo-1106',
'gpt-3.5-turbo-16k-0613',
'gpt-3.5-turbo-16k',
'gpt-4',
'gpt-4-0314',
'gpt-4-32k-0314',
'gpt-4-0613',
'gpt-3.5-turbo-0613',
],
[EModelEndpoint.azureAssistants]: sharedOpenAIModels,
[EModelEndpoint.assistants]: ['gpt-4o', ...sharedOpenAIModels],
[EModelEndpoint.google]: [
'gemini-pro',
'gemini-pro-vision',
@ -391,25 +405,12 @@ export const defaultModels = {
],
[EModelEndpoint.openAI]: [
'gpt-4o',
'gpt-3.5-turbo-0125',
'gpt-4-turbo',
'gpt-4-turbo-2024-04-09',
'gpt-3.5-turbo-16k-0613',
'gpt-3.5-turbo-16k',
'gpt-4-turbo-preview',
'gpt-4-0125-preview',
'gpt-4-1106-preview',
'gpt-3.5-turbo',
'gpt-3.5-turbo-1106',
...sharedOpenAIModels,
'gpt-4-vision-preview',
'gpt-4',
'gpt-3.5-turbo-instruct-0914',
'gpt-3.5-turbo-0613',
'gpt-3.5-turbo-0301',
'gpt-3.5-turbo-instruct',
'gpt-4-0613',
'text-davinci-003',
'gpt-4-0314',
],
};
@ -440,7 +441,8 @@ export const EndpointURLs: { [key in EModelEndpoint]: string } = {
[EModelEndpoint.gptPlugins]: `/api/ask/${EModelEndpoint.gptPlugins}`,
[EModelEndpoint.azureOpenAI]: `/api/ask/${EModelEndpoint.azureOpenAI}`,
[EModelEndpoint.chatGPTBrowser]: `/api/ask/${EModelEndpoint.chatGPTBrowser}`,
[EModelEndpoint.assistants]: '/api/assistants/chat',
[EModelEndpoint.azureAssistants]: '/api/assistants/v1/chat',
[EModelEndpoint.assistants]: '/api/assistants/v2/chat',
};
export const modularEndpoints = new Set<EModelEndpoint | string>([
@ -458,6 +460,7 @@ export const supportsBalanceCheck = {
[EModelEndpoint.anthropic]: true,
[EModelEndpoint.gptPlugins]: true,
[EModelEndpoint.assistants]: true,
[EModelEndpoint.azureAssistants]: true,
[EModelEndpoint.azureOpenAI]: true,
};
@ -680,7 +683,7 @@ export enum Constants {
/** Key for the app's version. */
VERSION = 'v0.7.2',
/** Key for the Custom Config's version (librechat.yaml). */
CONFIG_VERSION = '1.1.0',
CONFIG_VERSION = '1.1.1',
/** Standard value for the first message's `parentMessageId` value, to indicate no parent exists. */
NO_PARENT = '00000000-0000-0000-0000-000000000000',
/** Fixed, encoded domain length for Azure OpenAI Assistants Function name parsing. */
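
Taken together, the new constants split the two assistants endpoints by API version. A minimal sketch, assuming these values are exported from the librechat-data-provider package root as elsewhere in the codebase:

import { EModelEndpoint, EndpointURLs, defaultAssistantsVersion } from 'librechat-data-provider';

const version = defaultAssistantsVersion[EModelEndpoint.azureAssistants]; // 1 (the `assistants` endpoint defaults to 2)
const chatRoute = EndpointURLs[EModelEndpoint.azureAssistants];           // '/api/assistants/v1/chat' ('/api/assistants/v2/chat' for `assistants`)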


@ -1,5 +1,5 @@
import type { TSubmission, TMessage, TEndpointOption } from './types';
import { tConvoUpdateSchema, EModelEndpoint } from './schemas';
import { tConvoUpdateSchema, EModelEndpoint, isAssistantsEndpoint } from './schemas';
import { EndpointURLs } from './config';
export default function createPayload(submission: TSubmission) {
@ -12,7 +12,7 @@ export default function createPayload(submission: TSubmission) {
let server = EndpointURLs[endpointType ?? endpoint];
if (isEdited && endpoint === EModelEndpoint.assistants) {
if (isEdited && isAssistantsEndpoint(endpoint)) {
server += '/modify';
} else if (isEdited) {
server = server.replace('/ask/', '/edit/');
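
A minimal trace of the new branch (illustrative only, mirroring the imports above; assumes an edited submission on the azureAssistants endpoint):

const endpoint = EModelEndpoint.azureAssistants;
const isEdited = true;
let server = EndpointURLs[endpoint];              // '/api/assistants/v1/chat'
if (isEdited && isAssistantsEndpoint(endpoint)) {
  server += '/modify';                            // '/api/assistants/v1/chat/modify'
}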


@ -166,39 +166,105 @@ export const getEndpointsConfigOverride = (): Promise<unknown | boolean> => {
/* Assistants */
export const createAssistant = (data: a.AssistantCreateParams): Promise<a.Assistant> => {
return request.post(endpoints.assistants(), data);
export const createAssistant = ({
version,
...data
}: a.AssistantCreateParams): Promise<a.Assistant> => {
return request.post(endpoints.assistants({ version }), data);
};
export const getAssistantById = (assistant_id: string): Promise<a.Assistant> => {
return request.get(endpoints.assistants(assistant_id));
export const getAssistantById = ({
endpoint,
assistant_id,
version,
}: {
endpoint: s.AssistantsEndpoint;
assistant_id: string;
version: number | string;
}): Promise<a.Assistant> => {
return request.get(
endpoints.assistants({
path: assistant_id,
endpoint,
version,
}),
);
};
export const updateAssistant = (
assistant_id: string,
data: a.AssistantUpdateParams,
): Promise<a.Assistant> => {
return request.patch(endpoints.assistants(assistant_id), data);
export const updateAssistant = ({
assistant_id,
data,
version,
}: {
assistant_id: string;
data: a.AssistantUpdateParams;
version: number | string;
}): Promise<a.Assistant> => {
return request.patch(
endpoints.assistants({
path: assistant_id,
version,
}),
data,
);
};
export const deleteAssistant = (assistant_id: string, model: string): Promise<void> => {
return request.delete(endpoints.assistants(assistant_id, { model }));
export const deleteAssistant = ({
assistant_id,
model,
endpoint,
version,
}: m.DeleteAssistantBody & { version: number | string }): Promise<void> => {
return request.delete(
endpoints.assistants({
path: assistant_id,
options: { model, endpoint },
version,
}),
);
};
export const listAssistants = (
params?: a.AssistantListParams,
params: a.AssistantListParams,
version: number | string,
): Promise<a.AssistantListResponse> => {
return request.get(endpoints.assistants(), { params });
return request.get(
endpoints.assistants({
version,
options: params,
}),
);
};
export function getAssistantDocs(): Promise<a.AssistantDocument[]> {
return request.get(endpoints.assistants('documents'));
export function getAssistantDocs({
endpoint,
version,
}: {
endpoint: s.AssistantsEndpoint;
version: number | string;
}): Promise<a.AssistantDocument[]> {
return request.get(
endpoints.assistants({
path: 'documents',
version,
endpoint,
}),
);
}
/* Tools */
export const getAvailableTools = (): Promise<s.TPlugin[]> => {
return request.get(`${endpoints.assistants()}/tools`);
export const getAvailableTools = (
version: number | string,
endpoint: s.AssistantsEndpoint,
): Promise<s.TPlugin[]> => {
return request.get(
endpoints.assistants({
path: 'tools',
endpoint,
version,
}),
);
};
/* Files */
@ -247,7 +313,11 @@ export const uploadAvatar = (data: FormData): Promise<f.AvatarUploadResponse> =>
export const uploadAssistantAvatar = (data: m.AssistantAvatarVariables): Promise<a.Assistant> => {
return request.postMultiPart(
endpoints.assistants(`avatar/${data.assistant_id}`, { model: data.model }),
endpoints.assistants({
path: `avatar/${data.assistant_id}`,
options: { model: data.model, endpoint: data.endpoint },
version: data.version,
}),
data.formData,
);
};
@ -264,28 +334,55 @@ export const getFileDownload = async (userId: string, file_id: string): Promise<
export const deleteFiles = async (
files: f.BatchFile[],
assistant_id?: string,
tool_resource?: a.EToolResources,
): Promise<f.DeleteFilesResponse> =>
request.deleteWithOptions(endpoints.files(), {
data: { files, assistant_id },
data: { files, assistant_id, tool_resource },
});
/* actions */
export const updateAction = (data: m.UpdateActionVariables): Promise<m.UpdateActionResponse> => {
const { assistant_id, ...body } = data;
return request.post(endpoints.assistants(`actions/${assistant_id}`), body);
const { assistant_id, version, ...body } = data;
return request.post(
endpoints.assistants({
path: `actions/${assistant_id}`,
version,
}),
body,
);
};
export function getActions(): Promise<a.Action[]> {
return request.get(endpoints.assistants('actions'));
export function getActions({
endpoint,
version,
}: {
endpoint: s.AssistantsEndpoint;
version: number | string;
}): Promise<a.Action[]> {
return request.get(
endpoints.assistants({
path: 'actions',
version,
endpoint,
}),
);
}
export const deleteAction = async (
assistant_id: string,
action_id: string,
model: string,
): Promise<void> =>
request.delete(endpoints.assistants(`actions/${assistant_id}/${action_id}/${model}`));
export const deleteAction = async ({
assistant_id,
action_id,
model,
version,
endpoint,
}: m.DeleteActionVariables & { version: number | string }): Promise<void> =>
request.delete(
endpoints.assistants({
path: `actions/${assistant_id}/${action_id}/${model}`,
version,
endpoint,
}),
);
/* conversations */

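Usage sketch for the reworked data-service calls (object parameters plus an explicit version). The IDs and model names below are placeholders, and the imports assume the package-root exports used elsewhere in the codebase:

import { EModelEndpoint, defaultAssistantsVersion, dataService } from 'librechat-data-provider';

async function example() {
  const endpoint = EModelEndpoint.azureAssistants;
  const version = defaultAssistantsVersion[endpoint]; // 1

  const list = await dataService.listAssistants({ order: 'desc', endpoint }, version);
  const docs = await dataService.getAssistantDocs({ endpoint, version });
  await dataService.deleteAssistant({ assistant_id: 'asst_123', model: 'gpt-4-turbo', endpoint, version });
  return { list, docs };
}
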

@ -7,6 +7,7 @@ export const supportsFiles = {
[EModelEndpoint.openAI]: true,
[EModelEndpoint.google]: true,
[EModelEndpoint.assistants]: true,
[EModelEndpoint.azureAssistants]: true,
[EModelEndpoint.azureOpenAI]: true,
[EModelEndpoint.anthropic]: true,
[EModelEndpoint.custom]: true,
@ -152,24 +153,28 @@ export const megabyte = 1024 * 1024;
/** Helper function to get megabytes value */
export const mbToBytes = (mb: number): number => mb * megabyte;
const defaultSizeLimit = mbToBytes(512);
const assistantsFileConfig = {
fileLimit: 10,
fileSizeLimit: defaultSizeLimit,
totalSizeLimit: defaultSizeLimit,
supportedMimeTypes,
disabled: false,
};
export const fileConfig = {
endpoints: {
[EModelEndpoint.assistants]: {
fileLimit: 10,
fileSizeLimit: mbToBytes(512),
totalSizeLimit: mbToBytes(512),
supportedMimeTypes,
disabled: false,
},
[EModelEndpoint.assistants]: assistantsFileConfig,
[EModelEndpoint.azureAssistants]: assistantsFileConfig,
default: {
fileLimit: 10,
fileSizeLimit: mbToBytes(512),
totalSizeLimit: mbToBytes(512),
fileSizeLimit: defaultSizeLimit,
totalSizeLimit: defaultSizeLimit,
supportedMimeTypes,
disabled: false,
},
},
serverFileSizeLimit: mbToBytes(512),
serverFileSizeLimit: defaultSizeLimit,
avatarSizeLimit: mbToBytes(2),
checkType: function (fileType: string, supportedTypes: RegExp[] = supportedMimeTypes) {
return supportedTypes.some((regex) => regex.test(fileType));


@ -38,6 +38,7 @@ const endpointSchemas: Record<EModelEndpoint, EndpointSchema> = {
[EModelEndpoint.chatGPTBrowser]: chatGPTBrowserSchema,
[EModelEndpoint.gptPlugins]: gptPluginsSchema,
[EModelEndpoint.assistants]: assistantSchema,
[EModelEndpoint.azureAssistants]: assistantSchema,
};
// const schemaCreators: Record<EModelEndpoint, (customSchema: DefaultSchemaValues) => EndpointSchema> = {
@ -49,6 +50,7 @@ export function getEnabledEndpoints() {
const defaultEndpoints: string[] = [
EModelEndpoint.openAI,
EModelEndpoint.assistants,
EModelEndpoint.azureAssistants,
EModelEndpoint.azureOpenAI,
EModelEndpoint.google,
EModelEndpoint.bingAI,
@ -273,6 +275,7 @@ const compactEndpointSchemas: Record<string, CompactEndpointSchema> = {
[EModelEndpoint.azureOpenAI]: compactOpenAISchema,
[EModelEndpoint.custom]: compactOpenAISchema,
[EModelEndpoint.assistants]: compactAssistantSchema,
[EModelEndpoint.azureAssistants]: compactAssistantSchema,
[EModelEndpoint.google]: compactGoogleSchema,
/* BingAI needs all fields */
[EModelEndpoint.bingAI]: bingAISchema,


@ -1,13 +1,11 @@
import {
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import type {
UseQueryOptions,
useQuery,
useMutation,
useQueryClient,
UseMutationResult,
QueryObserverResult,
} from '@tanstack/react-query';
import { defaultOrderQuery } from '../types/assistants';
import { initialModelsConfig, LocalStorageKeys } from '../config';
import { defaultOrderQuery } from '../types/assistants';
import * as dataService from '../data-service';
import * as m from '../types/mutations';
import { QueryKeys } from '../keys';
@ -154,8 +152,8 @@ export const useRevokeUserKeyMutation = (name: string): UseMutationResult<unknow
return useMutation(() => dataService.revokeUserKey(name), {
onSuccess: () => {
queryClient.invalidateQueries([QueryKeys.name, name]);
if (name === s.EModelEndpoint.assistants) {
queryClient.invalidateQueries([QueryKeys.assistants, defaultOrderQuery]);
if (s.isAssistantsEndpoint(name)) {
queryClient.invalidateQueries([QueryKeys.assistants, name, defaultOrderQuery]);
queryClient.invalidateQueries([QueryKeys.assistantDocs]);
queryClient.invalidateQueries([QueryKeys.assistants]);
queryClient.invalidateQueries([QueryKeys.assistant]);
@ -171,7 +169,16 @@ export const useRevokeAllUserKeysMutation = (): UseMutationResult<unknown> => {
return useMutation(() => dataService.revokeAllUserKeys(), {
onSuccess: () => {
queryClient.invalidateQueries([QueryKeys.name]);
queryClient.invalidateQueries([QueryKeys.assistants, defaultOrderQuery]);
queryClient.invalidateQueries([
QueryKeys.assistants,
s.EModelEndpoint.assistants,
defaultOrderQuery,
]);
queryClient.invalidateQueries([
QueryKeys.assistants,
s.EModelEndpoint.azureAssistants,
defaultOrderQuery,
]);
queryClient.invalidateQueries([QueryKeys.assistantDocs]);
queryClient.invalidateQueries([QueryKeys.assistants]);
queryClient.invalidateQueries([QueryKeys.assistant]);


@ -22,9 +22,19 @@ export enum EModelEndpoint {
gptPlugins = 'gptPlugins',
anthropic = 'anthropic',
assistants = 'assistants',
azureAssistants = 'azureAssistants',
custom = 'custom',
}
export type AssistantsEndpoint = EModelEndpoint.assistants | EModelEndpoint.azureAssistants;
export const isAssistantsEndpoint = (endpoint?: AssistantsEndpoint | null | string): boolean => {
if (!endpoint) {
return false;
}
return endpoint.toLowerCase().endsWith(EModelEndpoint.assistants);
};
export enum ImageDetail {
low = 'low',
auto = 'auto',

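The helper treats any endpoint whose name ends in 'assistants' (case-insensitive) as an Assistants-style endpoint; a quick illustration:

isAssistantsEndpoint(EModelEndpoint.assistants);      // true
isAssistantsEndpoint(EModelEndpoint.azureAssistants); // true ('azureassistants' ends with 'assistants')
isAssistantsEndpoint(EModelEndpoint.openAI);          // false
isAssistantsEndpoint(undefined);                      // false
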

@ -183,6 +183,7 @@ export type TConfig = {
plugins?: Record<string, string>;
name?: string;
iconURL?: string;
version?: string;
modelDisplayLabel?: string;
userProvide?: boolean | null;
userProvideURL?: boolean | null;


@ -1,4 +1,5 @@
import type { OpenAPIV3 } from 'openapi-types';
import type { AssistantsEndpoint } from 'src/schemas';
import type { TFile } from './files';
export type Schema = OpenAPIV3.SchemaObject & { description?: string };
@ -10,10 +11,16 @@ export type Metadata = {
export enum Tools {
code_interpreter = 'code_interpreter',
file_search = 'file_search',
retrieval = 'retrieval',
function = 'function',
}
export enum EToolResources {
code_interpreter = 'code_interpreter',
file_search = 'file_search',
}
export type Tool = {
[type: string]: Tools;
};
@ -27,6 +34,35 @@ export type FunctionTool = {
};
};
/**
* A set of resources that are used by the assistant's tools. The resources are
* specific to the type of tool. For example, the `code_interpreter` tool requires
* a list of file IDs, while the `file_search` tool requires a list of vector store
* IDs.
*/
export interface ToolResources {
code_interpreter?: CodeInterpreterResource;
file_search?: FileSearchResource;
}
export interface CodeInterpreterResource {
/**
* A list of [file](https://platform.openai.com/docs/api-reference/files) IDs made
* available to the `code_interpreter`` tool. There can be a maximum of 20 files
* associated with the tool.
*/
file_ids?: Array<string>;
}
export interface FileSearchResource {
/**
* The ID of the
* [vector store](https://platform.openai.com/docs/api-reference/vector-stores/object)
* attached to this assistant. There can be a maximum of 1 vector store attached to
* the assistant.
*/
vector_store_ids?: Array<string>;
}
export type Assistant = {
id: string;
created_at: number;
@ -38,8 +74,11 @@ export type Assistant = {
name: string | null;
object: string;
tools: FunctionTool[];
tool_resources?: ToolResources;
};
export type TAssistantsMap = Record<AssistantsEndpoint, Record<string, Assistant>>;
export type AssistantCreateParams = {
model: string;
description?: string | null;
@ -48,6 +87,8 @@ export type AssistantCreateParams = {
metadata?: Metadata | null;
name?: string | null;
tools?: Array<FunctionTool | string>;
endpoint: AssistantsEndpoint;
version: number | string;
};
export type AssistantUpdateParams = {
@ -58,6 +99,8 @@ export type AssistantUpdateParams = {
metadata?: Metadata | null;
name?: string | null;
tools?: Array<FunctionTool | string>;
tool_resources?: ToolResources;
endpoint: AssistantsEndpoint;
};
export type AssistantListParams = {
@ -65,6 +108,7 @@ export type AssistantListParams = {
before?: string | null;
after?: string | null;
order?: 'asc' | 'desc';
endpoint: AssistantsEndpoint;
};
export type AssistantListResponse = {
@ -123,12 +167,22 @@ export type RetrievalToolCall = {
type: 'retrieval'; // The type of tool call, always 'retrieval'.
};
/**
* Details of a Retrieval tool call the run step was involved in.
* Includes the tool call ID and the type of tool call.
*/
export type FileSearchToolCall = {
id: string; // The ID of the tool call object.
file_search: unknown; // An empty object for now.
type: 'file_search'; // The type of tool call, always 'file_search'.
};
/**
* Details of the tool calls involved in a run step.
* Can be associated with one of three types of tools: `code_interpreter`, `retrieval`, or `function`.
*/
export type ToolCallsStepDetails = {
tool_calls: Array<CodeToolCall | RetrievalToolCall | FunctionToolCall>; // An array of tool calls the run step was involved in.
tool_calls: Array<CodeToolCall | RetrievalToolCall | FileSearchToolCall | FunctionToolCall>; // An array of tool calls the run step was involved in.
type: 'tool_calls'; // Always 'tool_calls'.
};
@ -203,6 +257,7 @@ export enum StepTypes {
export enum ToolCallTypes {
FUNCTION = 'function',
RETRIEVAL = 'retrieval',
FILE_SEARCH = 'file_search',
CODE_INTERPRETER = 'code_interpreter',
}
@ -239,7 +294,14 @@ export type PartMetadata = {
action?: boolean;
};
export type ContentPart = (CodeToolCall | RetrievalToolCall | FunctionToolCall | ImageFile | Text) &
export type ContentPart = (
| CodeToolCall
| RetrievalToolCall
| FileSearchToolCall
| FunctionToolCall
| ImageFile
| Text
) &
PartMetadata;
export type TMessageContentParts =
@ -247,7 +309,8 @@ export type TMessageContentParts =
| { type: ContentTypes.TEXT; text: Text & PartMetadata }
| {
type: ContentTypes.TOOL_CALL;
tool_call: (CodeToolCall | RetrievalToolCall | FunctionToolCall) & PartMetadata;
tool_call: (CodeToolCall | RetrievalToolCall | FileSearchToolCall | FunctionToolCall) &
PartMetadata;
}
| { type: ContentTypes.IMAGE_FILE; image_file: ImageFile & PartMetadata };
@ -315,6 +378,7 @@ export type Action = {
type?: string;
settings?: Record<string, unknown>;
metadata: ActionMetadata;
version: number | string;
};
export type AssistantAvatar = {
@ -334,6 +398,7 @@ export type AssistantDocument = {
};
export enum FilePurpose {
Vision = 'vision',
FineTune = 'fine-tune',
FineTuneResults = 'fine-tune-results',
Assistants = 'assistants',

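Putting the new v2 types together, a hedged example of the `tool_resources` shape that `Assistant.tool_resources` and `AssistantUpdateParams.tool_resources` now accept (the IDs are placeholders):

const tool_resources: ToolResources = {
  code_interpreter: { file_ids: ['file-abc123'] },  // up to 20 files for this tool
  file_search: { vector_store_ids: ['vs_abc123'] }, // at most one attached vector store
};
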

@ -1,11 +1,17 @@
import { EToolResources } from './assistants';
export enum FileSources {
local = 'local',
firebase = 'firebase',
azure = 'azure',
openai = 'openai',
s3 = 's3',
vectordb = 'vectordb',
}
export const checkOpenAIStorage = (source: string) =>
source === FileSources.openai || source === FileSources.azure;
export enum FileContext {
avatar = 'avatar',
unknown = 'unknown',
@ -54,6 +60,7 @@ export type TFile = {
usage: number;
context?: FileContext;
source?: FileSources;
filterSource?: FileSources;
width?: number;
height?: number;
expiresAt?: string | Date;
@ -97,6 +104,7 @@ export type BatchFile = {
export type DeleteFilesBody = {
files: BatchFile[];
assistant_id?: string;
tool_resource?: EToolResources;
};
export type DeleteMutationOptions = {

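A short sketch of the two additions above: checking whether a file lives in OpenAI-compatible storage, and scoping a batch delete to a specific tool resource (the assistant ID is a placeholder):

checkOpenAIStorage(FileSources.azure); // true
checkOpenAIStorage(FileSources.local); // false

const body: DeleteFilesBody = {
  files: [],                                      // BatchFile[] to delete
  assistant_id: 'asst_123',
  tool_resource: EToolResources.code_interpreter,
};
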

@ -45,6 +45,8 @@ export type AssistantAvatarVariables = {
model: string;
formData: FormData;
postCreation?: boolean;
endpoint: types.AssistantsEndpoint;
version: number | string;
};
export type UpdateActionVariables = {
@ -53,6 +55,8 @@ export type UpdateActionVariables = {
metadata: ActionMetadata;
action_id?: string;
model: string;
endpoint: types.AssistantsEndpoint;
version: number | string;
};
export type UploadAssistantAvatarOptions = MutationOptions<Assistant, AssistantAvatarVariables>;
@ -66,7 +70,11 @@ export type UpdateAssistantVariables = {
export type UpdateAssistantMutationOptions = MutationOptions<Assistant, UpdateAssistantVariables>;
export type DeleteAssistantBody = { assistant_id: string; model: string };
export type DeleteAssistantBody = {
assistant_id: string;
model: string;
endpoint: types.AssistantsEndpoint;
};
export type DeleteAssistantMutationOptions = MutationOptions<
void,
@ -77,6 +85,7 @@ export type UpdateActionResponse = [AssistantDocument, Assistant, Action];
export type UpdateActionOptions = MutationOptions<UpdateActionResponse, UpdateActionVariables>;
export type DeleteActionVariables = {
endpoint: types.AssistantsEndpoint;
assistant_id: string;
action_id: string;
model: string;