mirror of
https://github.com/danny-avila/LibreChat.git
synced 2025-12-20 10:20:15 +01:00
✨ feat: Assistants API, General File Support, Side Panel, File Explorer (#1696)
* feat: assistant name/icon in Landing & Header * feat: assistname in textarea placeholder, and use `Assistant` as default name * feat: display non-image files in user messages * fix: only render files if files.length is > 0 * refactor(config -> file-config): move file related configuration values to separate module, add excel types * chore: spreadsheet file rendering * fix(Landing): dark mode style for Assistant Name * refactor: move progress incrementing to own hook, start smaller, cap near limit \(1\) * refactor(useContentHandler): add empty Text part if last part was completed tool or image * chore: add accordion trigger border styling for dark mode * feat: Assistant Builder model selection * chore: use Spinner when Assistant is mutating * fix(get/assistants): return correct response object `AssistantListResponse` * refactor(Spinner): pass size as prop * refactor: make assistant crud mutations optimistic, add types for options * chore: remove assistants route and view * chore: move assistant builder components to separate directory * feat(ContextButton): delete Assistant via context button/dialog, add localization * refactor: conditionally show use and context menu buttons, add localization for create assistant * feat: save side panel states to localStorage * style(SidePanel): improve avatar menu and assistant select styling for dark mode * refactor: make NavToggle reusable for either side (left or right), add SidePanel Toggle with ability to close it completely * fix: resize handle and navToggle behavior * fix(/avatar/:assistant_id): await `deleteFile` and assign unique name to uploaded image * WIP: file UI components from PR #576 * refactor(OpenAIMinimalIcon): pass className * feat: formatDate helper fn * feat: DataTableColumnHeader * feat: add row selection, formatted row values, number of rows selected * WIP: add files to Side panel temporarily * feat: `LB_QueueAsyncCall`: Leaky Bucket queue for external APIs, use in `processDeleteRequest` * fix(TFile): correct `source` type with `FileSources` * fix(useFileHandling): use `continue` instead of return when iterating multiple files, add file type to extendedFile * chore: add generic setter type * refactor(processDeleteRequest): settle promises to prevent rejections from processing deletions, log errors * feat: `useFileDeletion` to reuse file deletion logic * refactor(useFileDeletion): make `setFiles` an optional param and use object as param * feat: useDeleteFilesFromTable * feat: use real `files` data and add deletion action to data table * fix(Table): make headers sticky * feat: add dynamic filtering for columns; only show to user Host or OpenAI storage type * style(DropdownMenu): replace `slate` with `gray` * style(DataTable): apply dark mode themes and other misc styling * style(Columns): add color to OpenAI Storage option * refactor(FileContainer): make file preview reusable * refactor(Images): make image preview reusable * refactor(FilePreview): make file prop optional for FileIcon and FilePreview, fix relative style * feat(Columns): add file/image previews, set a minimum size to show for file size in bytes * WIP: File Panel with real files and formatted * feat: open files dialog from panel * style: file data table mobile and general column styling fixes * refactor(api/files): return files sorted by the most recently updated * refactor: provide fileMap through context to prevent re-selecting files to map in different areas; remove unused imports commented out in PanelColumns * refactor(ExtendFile): make File type optional, add 
`attached` to prevent attached files from being deleted on remove, make Message.files a partial TFile type * feat: attach files through file panel * refactor(useFileHandling): move files to the start of cache list when uploaded * refactor(useDeleteFilesMutation): delete files from cache when successfully deleted from server * fix(FileRow): handle possible edge case of duplication due to attaching recently uploaded file * style(SidePanel): make resize grip border transparent, remove unnecessary styling on close sidepanel button * feat: action utilities and tests * refactor(actions): add `ValidationResult` type and change wording for no server URL found * refactor(actions): check for empty server URL * fix(data-provider): revert tsconfig to fix type issue resolution * feat(client): first pass of actions input for assistants * refactor(FunctionSignature): change method to output object instead of string * refactor(models/Assistant): add actions field to schema, use searchParams object for methods, and add `getAssistant` * feat: post actions input first pass - create new Action document - add actions to Assistant DB document - create /action/:assistant_id POST route - pass more props down from PanelSwitcher, derive assistant_id from switcher - move privacy policy to ActionInput - reset data on input change/validation - add `useUpdateAction` - conform FunctionSignature type to FunctionTool - add action, assistant doc, update hook related types * refactor: optimize assistant/actions relationship - past domain in metadata as hostname and not a URL - include domain in tool name - add `getActions` for actions retrieval by user - add `getAssistants` for assistant docs retrieval by user - add `assistant_id` to Action schema - move actions to own module as a subroute to `api/assistants` - add `useGetActionsQuery` and `useGetAssistantDocsQuery` hooks - fix Action type def * feat: show assistant actions in assistant builder * feat: switch to actions on action click, editing action styling * fix: add Assistant state for builder panel to allow immediate selection of newly created assistants as well as retaining the current assistant when switching to a different panel within the builder * refactor(SidePanel/NavToggle): offset less from right when SidePanel is completely collapsed * chore: rename `processActions` -> `processRequiredActions` * chore: rename Assistant API Action to RequiredAction * refactor(actions): avoid nesting actual API params under generic `requestBody` to optimize LLM token usage * fix(handleTools): avoid calling `validTool` if not defined, add optional param to skip the loading of specs, which throws an error in the context of assistants * WIP: working first pass of toolCalls generated from openapi specs * WIP: first pass ToolCall styling * feat: programmatic iv encryption/decryption helpers * fix: correct ActionAuth types/enums, and define type for AuthForm * feat: encryption/decryption helpers for Action AuthMetadata * refactor(getActions): remove sensitive fields from query response * refactor(POST/actions): encrypt and remove sensitive fields from mutation response * fix(ActionService): change ESM import to CJS * feat: frontend auth handling for actions + optimistic update on action update/creation * refactor(actions): use the correct variables and types for setAuth method * refactor: POST /:assistant_id action can now handle updating an existing action, add `saved_auth_fields` to determine when user explicitly saves new auth creds. 
only send auth metadata if user explicitly saved fields * refactor(createActionTool): catch errors and send back meaningful error message, add flag to `getActions` to determine whether to retrieve sensitive values or not * refactor(ToolService): add `action` property to ToolCall PartMetadata to determine if the tool call was an action, fix parsing function name issue with actionDelimiter * fix(ActionRequest): use URL class to correctly join endpoint parts for `execute` call * feat: delete assistant actions * refactor: conditionally show Available actions * refactor: show `retrieval` and `code_interpreter` as Capabilities, swap `Switch` for `Checkbox` * chore: remove shadow-stroke from messages * WIP: first pass of Assistants Knowledge attachments * refactor: remove AssistantsProvider in favor of FormProvider, fix selectedAssistant re-render bug, map Assistant file_ids to files via fileMap, initialize Knowledge component with mapped files if any exist * fix: prevent deleting files on assistant file upload * chore: remove console.log * refactor(useUploadFileMutation): update files and assistants cache on upload * chore: disable oauth option as not supported yet * feat: cancel assistant runs * refactor: initialize OpenAI client with helper function, resolve all related circular dependencies * fix(DALL-E): initialization * fix(process): openai client initialization * fix: select an existing Assistant when the active one is deleted * chore: allow attaching files for assistant endpoint, send back relevant OpenAI error message when uploading, deconstruct openAI initialization correctly, add `message_file` to formData when a file is attached to the message but not the assistant * fix: add assistant_id on newConvo * fix(initializeClient): import fix * chore: swap setAssistant for setOption in useEffect * fix(DALL-E): add processFileURL to loadTools call * chore: add customConfig to debug logs * feat: delete threads on convo delete * chore: replace Assistants icon * chore: remove console.dir() in `abortRun` * feat(AssistantService): accumulate text values from run in openai.responseText * feat: titling for assistants endpoint * chore: move panel file components to appropriate directory, add file checks for attaching files, change icon for Attach Files * refactor: add localizations to tools, plugins, add condition for adding/remove user plugins so tool selections don't affect this value * chore: disable `import from url` action for now * chore: remove textMimeTypes from default fileConfig for now * fix: catch tool errors and send as outputs with error messages * fix: React warning about button as descendant of button * style: retrieval and cancelled icon * WIP: pass isSubmitting to Parts, use InProgressCall to display cancelled tool calls correctly, show domain/function name * fix(meilisearch): fix `postSaveHook` issue where indexing expects a mongo document, and join all text content parts for meili indexing * ci: fix dall-e tests * ci: fix client tests * fix: button types in actions panel * fix: plugin auth form persisting across tool selections * fix(ci): update AppService spec with `loadAndFormatTools` * fix(clearConvos): add id check earlier on * refactor(AssistantAvatar): set previewURL dynamically when emtadata.avatar changes * feat(assistants): addTitle cache setting * fix(useSSE): resolve rebase conflicts * fix: delete mutation * style(SidePanel): make grip visible on active and hover, invisible otherwise * ci: add data-provider tests to workflow, also update eslint/tsconfig to recognize 
specs, and add `text/csv` to fileConfig * fix: handle edge case where auth object is undefined, and log errors * refactor(actions): resolve schemas, add tests for resolving refs, import specs from separate file for tests * chore: remove comment * fix(ActionsInput): re-render bug when initializing states with action fields * fix(patch/assistant): filter undefined tools * chore: add logging for errors in assistants routes * fix(updateAssistant): map actions to functions to avoid overwriting * fix(actions): properly handle GET paths * fix(convos): unhandled delete thread exception * refactor(AssistantService): pass both thread_id and conversationId when sending intermediate assistant messages, remove `mapMessagesToSteps` from AssistantService * refactor(useSSE): replace all messages with runMessages and pass latestMessageId to abortRun; fix(checkMessageGaps): include tool calls when syncing messages * refactor(assistants/chat): invoke `createOnTextProgress` after thread creation * chore: add typing * style: sidepanel styling * style: action tool call domain styling * feat(assistants): default models, limit retrieval to certain models, add env variables to to env.example * feat: assistants api key in EndpointService * refactor: set assistant model to conversation on assistant switch * refactor: set assistant model to conversation on assistant select from panel * fix(retrieveAndProcessFile): catch attempt to download file with `assistant` purpose which is not allowed; add logging * feat: retrieval styling, handling, and logging * chore: rename ASSISTANTS_REVERSE_PROXY to ASSISTANTS_BASE_URL * feat: FileContext for file metadata * feat: context file mgmt and filtering * style(Select): hover/rounded changes * refactor: explicit conversation switch, endpoint dependent, through `useSelectAssistant`, which does not create new chat if current endpoint is assistant endpoint * fix(AssistantAvatar): make empty previewURL if no avatar present * refactor: side panel mobile styling * style: merge tool and action section, optimize mobile styling for action/tool buttons * fix: localStorage issues * fix(useSelectAssistant): invoke react query hook directly in select hook as Map was not being updated in time * style: light mode fixes * fix: prevent sidepanel nav styling from shifting layout up * refactor: change default layout (collapsed by default) * style: mobile optimization of DataTable * style: datatable * feat: client-side hide right-side panel * chore(useNewConvo): add partial typing for preset * fix(useSelectAssistant): pass correct model name by using template as preset * WIP: assistant presets * refactor(ToolService): add native solution for `TavilySearchResults` and log tool output errors * refactor: organize imports and use native TavilySearchResults * fix(TavilySearchResults): stringify result * fix(ToolCall): show tool call outputs when not an action * chore: rename Prompt Prefix to custom instructions (in user facing text only) * refactor(EditPresetDialog): Optimize setting title by debouncing, reset preset on dialog close to avoid state mixture * feat: add `presetOverride` to overwrite active conversation settings when saving a Preset (relevant for client side updates only) * feat: Assistant preset settings (client-side) * fix(Switcher): only set assistant_id and model if current endpoint is Assistants * feat: use `useDebouncedInput` for updating conversation settings, starting with EditPresetDialog title setting and Assistant instructions setting * feat(Assistants): add instructions field to 
settings * feat(chat/assistants): pass conversation settings to run body * wip: begin localization and only allow actions if the assistant is created * refactor(AssistantsPanel): knowledge localization, allow tools on creation * feat: experimental: allow 'priming' values before assistant is created, that would normally require an assistant_id to be defined * chore: trim console logs and make more meaningful * chore: toast messages * fix(ci): date test * feat: create file when uploading Assistant Avatar * feat: file upload rate limiting from custom config with dynamic file route initialization * refactor: use file upload limiters on post routes only * refactor(fileConfig): add endpoints field for endpoint specific fileconfigs, add mergeConfig function, add tests * refactor: fileConfig route, dynamic multer instances used on all '/' and '/images' POST routes, data service and query hook * feat: supportedMimeTypesSchema, test for array of regex * feat: configurable file config limits * chore: clarify assistants file knowledge prereq. * chore(useTextarea): default to localized 'Assistant' if assistant name is empty * feat: configurable file limits and toggle file upload per endpoint * fix(useUploadFileMutation): prevent updating assistant.files cache if file upload is a message_file attachment * fix(AssistantSelect): set last selected assistant only when timeout successfully runs * refactor(queries): disable assistant queries if assistants endpoint is not enabled * chore(Switcher): add localization * chore: pluralize `assistant` for `EModelEndpoint key and value * feat: show/hide assistant UI components based on endpoint availability; librechat.yaml config for disabling builder section and setting polling/timeout intervals * fix(compactEndpointSchemas): use EModelEndpoint for schema access * feat(runAssistant): use configured values from `librechat.yaml` for `pollIntervalMs` and `timeout` * fix: naming issue * wip: revert landing * 🎉 happy birthday LibreChat (#1768) * happy birthday LibreChat * Refactor endpoint condition in Landing component * Update birthday message in Eng.tsx * fix(/config): avoid nesting ternaries * refactor(/config): check birthday --------- Co-authored-by: Danny Avila <messagedaniel@protonmail.com> * fix: landing * fix: landing * fix(useMessageHelpers): hardcoded check to use EModelEndpoint instead * fix(ci): convo test revert to main * fix(assistants/chat): fix issue where assistant_id was being saved as model for convo * chore: added logging, promises racing to prevent longer timeouts, explicit setting of maxRetries and timeouts, robust catching of invalid abortRun params * refactor: use recoil state for `showStopButton` and only show for assistants endpoint after syncing conversation data * refactor: optimize abortRun strategy using localStorage, refactor `abortConversation` to use async/await and await the result, refactor how the abortKey cache is set for runs * fix(checkMessageGaps): assign `assistant_id` to synced messages if defined; prevents UI from showing blank assistant for cancelled messages * refactor: re-order sequence of chat route, only allow aborting messages after run is created, cancel abortRun if there was a cancelling error (likely due already cancelled in chat route), and add extra logging * chore(typedefs): add httpAgent type to OpenAIClient * refactor: use custom implementation of retrieving run with axios to allow for timing out run query * fix(waitForRun): handle timed out run retrieval query * refactor: update preset conditions: - presets will 
retain settings when a different endpoint is selected; for existing convos, either when modular or is assistant switch - no longer use `navigateToConvo` on preset select * fix: temporary calculator hack as expects string input when invoked * fix: cancel abortRun only when cancelling error is a result of the run already being cancelled * chore: remove use of `fileMaxSizeMB` and total counterpart (redundant) * docs: custom config documentation update * docs: assistants api setup and dotenv, new custom config fields * refactor(Switcher): make Assistant switcher sticky in SidePanel * chore(useSSE): remove console log of data and message index * refactor(AssistantPanel): button styling and add secondary select button to bottom of panel * refactor(OpenAIClient): allow passing conversationId to RunManager through titleConvo and initializeLLM to properly record title context tokens used in cases where conversationId was not defined by the client * feat(assistants): token tracking for assistant runs * chore(spendTokens): improve logging * feat: support/exclude specific assistant Ids * chore: add update `librechat.example.yaml`, optimize `AppService` handling, new tests for `AppService`, optimize missing/outdate config logging * chore: mount docker logs to root of project * chore: condense axios errors * chore: bump vite * chore: vite hot reload fix using latest version * chore(getOpenAIModels): sort instruct models to the end of models list * fix(assistants): user provided key * fix(assistants): user provided key, invalidate more queries on revoke --------- Co-authored-by: Marco Beretta <81851188+Berry-13@users.noreply.github.com>
This commit is contained in:
parent
cd2786441a
commit
ecd63eb9f1
316 changed files with 21873 additions and 6315 deletions
|
|
@ -39,7 +39,9 @@
|
|||
},
|
||||
"homepage": "https://github.com/danny-avila/LibreChat#readme",
|
||||
"dependencies": {
|
||||
"@types/js-yaml": "^4.0.9",
|
||||
"axios": "^1.3.4",
|
||||
"js-yaml": "^4.1.0",
|
||||
"openai": "4.11.1",
|
||||
"zod": "^3.22.4"
|
||||
},
|
||||
|
|
|
|||
475
packages/data-provider/specs/actions.spec.ts
Normal file
475
packages/data-provider/specs/actions.spec.ts
Normal file
|
|
@ -0,0 +1,475 @@
|
|||
import axios from 'axios';
|
||||
import { OpenAPIV3 } from 'openapi-types';
|
||||
import {
|
||||
resolveRef,
|
||||
ActionRequest,
|
||||
openapiToFunction,
|
||||
FunctionSignature,
|
||||
validateAndParseOpenAPISpec,
|
||||
} from '../src/actions';
|
||||
import { getWeatherOpenapiSpec, whimsicalOpenapiSpec, scholarAIOpenapiSpec } from './openapiSpecs';
|
||||
import { AuthorizationTypeEnum, AuthTypeEnum } from '../src/types/assistants';
|
||||
import type { FlowchartSchema } from './openapiSpecs';
|
||||
import type { ParametersSchema } from '../src/actions';
|
||||
|
||||
jest.mock('axios');
|
||||
const mockedAxios = axios as jest.Mocked<typeof axios>;
|
||||
|
||||
describe('FunctionSignature', () => {
|
||||
it('creates a function signature and converts to JSON tool', () => {
|
||||
const signature = new FunctionSignature('testFunction', 'A test function', {
|
||||
param1: { type: 'string' },
|
||||
} as unknown as ParametersSchema);
|
||||
expect(signature.name).toBe('testFunction');
|
||||
expect(signature.description).toBe('A test function');
|
||||
expect(signature.toObjectTool()).toEqual({
|
||||
type: 'function',
|
||||
function: {
|
||||
name: 'testFunction',
|
||||
description: 'A test function',
|
||||
parameters: {
|
||||
param1: { type: 'string' },
|
||||
},
|
||||
},
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('ActionRequest', () => {
|
||||
// Mocking responses for each method
|
||||
beforeEach(() => {
|
||||
mockedAxios.get.mockResolvedValue({ data: { success: true, method: 'GET' } });
|
||||
mockedAxios.post.mockResolvedValue({ data: { success: true, method: 'POST' } });
|
||||
mockedAxios.put.mockResolvedValue({ data: { success: true, method: 'PUT' } });
|
||||
mockedAxios.delete.mockResolvedValue({ data: { success: true, method: 'DELETE' } });
|
||||
mockedAxios.patch.mockResolvedValue({ data: { success: true, method: 'PATCH' } });
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
jest.clearAllMocks();
|
||||
});
|
||||
|
||||
it('should make a GET request', async () => {
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/test',
|
||||
'GET',
|
||||
'testOp',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await actionRequest.setParams({ param1: 'value1' });
|
||||
const response = await actionRequest.execute();
|
||||
expect(mockedAxios.get).toHaveBeenCalledWith('https://example.com/test', expect.anything());
|
||||
expect(response.data).toEqual({ success: true, method: 'GET' });
|
||||
});
|
||||
|
||||
describe('ActionRequest', () => {
|
||||
beforeEach(() => {
|
||||
mockedAxios.get.mockClear();
|
||||
mockedAxios.post.mockClear();
|
||||
mockedAxios.put.mockClear();
|
||||
mockedAxios.delete.mockClear();
|
||||
mockedAxios.patch.mockClear();
|
||||
});
|
||||
|
||||
it('handles GET requests', async () => {
|
||||
mockedAxios.get.mockResolvedValue({ data: { success: true } });
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/get',
|
||||
'GET',
|
||||
'testGet',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await actionRequest.setParams({ param: 'test' });
|
||||
const response = await actionRequest.execute();
|
||||
expect(mockedAxios.get).toHaveBeenCalled();
|
||||
expect(response.data.success).toBe(true);
|
||||
});
|
||||
|
||||
it('handles POST requests', async () => {
|
||||
mockedAxios.post.mockResolvedValue({ data: { success: true } });
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/post',
|
||||
'POST',
|
||||
'testPost',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await actionRequest.setParams({ param: 'test' });
|
||||
const response = await actionRequest.execute();
|
||||
expect(mockedAxios.post).toHaveBeenCalled();
|
||||
expect(response.data.success).toBe(true);
|
||||
});
|
||||
|
||||
it('handles PUT requests', async () => {
|
||||
mockedAxios.put.mockResolvedValue({ data: { success: true } });
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/put',
|
||||
'PUT',
|
||||
'testPut',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await actionRequest.setParams({ param: 'test' });
|
||||
const response = await actionRequest.execute();
|
||||
expect(mockedAxios.put).toHaveBeenCalled();
|
||||
expect(response.data.success).toBe(true);
|
||||
});
|
||||
|
||||
it('handles DELETE requests', async () => {
|
||||
mockedAxios.delete.mockResolvedValue({ data: { success: true } });
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/delete',
|
||||
'DELETE',
|
||||
'testDelete',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await actionRequest.setParams({ param: 'test' });
|
||||
const response = await actionRequest.execute();
|
||||
expect(mockedAxios.delete).toHaveBeenCalled();
|
||||
expect(response.data.success).toBe(true);
|
||||
});
|
||||
|
||||
it('handles PATCH requests', async () => {
|
||||
mockedAxios.patch.mockResolvedValue({ data: { success: true } });
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/patch',
|
||||
'PATCH',
|
||||
'testPatch',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await actionRequest.setParams({ param: 'test' });
|
||||
const response = await actionRequest.execute();
|
||||
expect(mockedAxios.patch).toHaveBeenCalled();
|
||||
expect(response.data.success).toBe(true);
|
||||
});
|
||||
|
||||
it('throws an error for unsupported HTTP methods', async () => {
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/invalid',
|
||||
'INVALID',
|
||||
'testInvalid',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await expect(actionRequest.execute()).rejects.toThrow('Unsupported HTTP method: INVALID');
|
||||
});
|
||||
});
|
||||
|
||||
it('throws an error for unsupported HTTP method', async () => {
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/test',
|
||||
'INVALID',
|
||||
'testOp',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
await expect(actionRequest.execute()).rejects.toThrow('Unsupported HTTP method: INVALID');
|
||||
});
|
||||
});
|
||||
|
||||
describe('Authentication Handling', () => {
|
||||
it('correctly sets Basic Auth header', async () => {
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/test',
|
||||
'GET',
|
||||
'testOp',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
|
||||
const api_key = 'user:pass';
|
||||
const encodedCredentials = Buffer.from('user:pass').toString('base64');
|
||||
|
||||
actionRequest.setAuth({
|
||||
auth: {
|
||||
type: AuthTypeEnum.ServiceHttp,
|
||||
authorization_type: AuthorizationTypeEnum.Basic,
|
||||
},
|
||||
api_key,
|
||||
});
|
||||
|
||||
await actionRequest.setParams({ param1: 'value1' });
|
||||
await actionRequest.execute();
|
||||
expect(mockedAxios.get).toHaveBeenCalledWith('https://example.com/test', {
|
||||
headers: expect.objectContaining({
|
||||
Authorization: `Basic ${encodedCredentials}`,
|
||||
}),
|
||||
params: expect.anything(),
|
||||
});
|
||||
});
|
||||
|
||||
it('correctly sets Bearer token', async () => {
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/test',
|
||||
'GET',
|
||||
'testOp',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
actionRequest.setAuth({
|
||||
auth: {
|
||||
type: AuthTypeEnum.ServiceHttp,
|
||||
authorization_type: AuthorizationTypeEnum.Bearer,
|
||||
},
|
||||
api_key: 'token123',
|
||||
});
|
||||
await actionRequest.setParams({ param1: 'value1' });
|
||||
await actionRequest.execute();
|
||||
expect(mockedAxios.get).toHaveBeenCalledWith('https://example.com/test', {
|
||||
headers: expect.objectContaining({
|
||||
Authorization: 'Bearer token123',
|
||||
}),
|
||||
params: expect.anything(),
|
||||
});
|
||||
});
|
||||
|
||||
it('correctly sets API Key', async () => {
|
||||
const actionRequest = new ActionRequest(
|
||||
'https://example.com',
|
||||
'/test',
|
||||
'GET',
|
||||
'testOp',
|
||||
false,
|
||||
'application/json',
|
||||
);
|
||||
// Updated to match ActionMetadata structure
|
||||
actionRequest.setAuth({
|
||||
auth: {
|
||||
type: AuthTypeEnum.ServiceHttp, // Assuming this is a valid enum or value for your context
|
||||
authorization_type: AuthorizationTypeEnum.Custom, // Assuming Custom means using a custom header
|
||||
custom_auth_header: 'X-API-KEY',
|
||||
},
|
||||
api_key: 'abc123',
|
||||
});
|
||||
await actionRequest.setParams({ param1: 'value1' });
|
||||
await actionRequest.execute();
|
||||
expect(mockedAxios.get).toHaveBeenCalledWith('https://example.com/test', {
|
||||
headers: expect.objectContaining({
|
||||
'X-API-KEY': 'abc123',
|
||||
}),
|
||||
params: expect.anything(),
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('resolveRef', () => {
|
||||
it('correctly resolves $ref references in the OpenAPI spec', () => {
|
||||
const openapiSpec = whimsicalOpenapiSpec;
|
||||
const flowchartRequestRef = (
|
||||
openapiSpec.paths['/ai.chatgpt.render-flowchart']?.post
|
||||
?.requestBody as OpenAPIV3.RequestBodyObject
|
||||
)?.content['application/json'].schema;
|
||||
expect(flowchartRequestRef).toBeDefined();
|
||||
const resolvedFlowchartRequest = resolveRef(
|
||||
flowchartRequestRef as OpenAPIV3.RequestBodyObject,
|
||||
openapiSpec.components,
|
||||
);
|
||||
|
||||
expect(resolvedFlowchartRequest).toBeDefined();
|
||||
expect(resolvedFlowchartRequest.type).toBe('object');
|
||||
const properties = resolvedFlowchartRequest.properties as FlowchartSchema;
|
||||
expect(properties).toBeDefined();
|
||||
expect(properties.mermaid).toBeDefined();
|
||||
expect(properties.mermaid.type).toBe('string');
|
||||
});
|
||||
});
|
||||
|
||||
describe('openapiToFunction', () => {
|
||||
it('converts OpenAPI spec to function signatures and request builders', () => {
|
||||
const { functionSignatures, requestBuilders } = openapiToFunction(getWeatherOpenapiSpec);
|
||||
expect(functionSignatures.length).toBe(1);
|
||||
expect(functionSignatures[0].name).toBe('GetCurrentWeather');
|
||||
|
||||
const parameters = functionSignatures[0].parameters as ParametersSchema & {
|
||||
properties: {
|
||||
location: {
|
||||
type: 'string';
|
||||
};
|
||||
locations: {
|
||||
type: 'array';
|
||||
items: {
|
||||
type: 'object';
|
||||
properties: {
|
||||
city: {
|
||||
type: 'string';
|
||||
};
|
||||
state: {
|
||||
type: 'string';
|
||||
};
|
||||
countryCode: {
|
||||
type: 'string';
|
||||
};
|
||||
time: {
|
||||
type: 'string';
|
||||
};
|
||||
};
|
||||
};
|
||||
};
|
||||
};
|
||||
};
|
||||
|
||||
expect(parameters).toBeDefined();
|
||||
expect(parameters.properties.locations).toBeDefined();
|
||||
expect(parameters.properties.locations.type).toBe('array');
|
||||
expect(parameters.properties.locations.items.type).toBe('object');
|
||||
|
||||
expect(parameters.properties.locations.items.properties.city.type).toBe('string');
|
||||
expect(parameters.properties.locations.items.properties.state.type).toBe('string');
|
||||
expect(parameters.properties.locations.items.properties.countryCode.type).toBe('string');
|
||||
expect(parameters.properties.locations.items.properties.time.type).toBe('string');
|
||||
|
||||
expect(requestBuilders).toHaveProperty('GetCurrentWeather');
|
||||
expect(requestBuilders.GetCurrentWeather).toBeInstanceOf(ActionRequest);
|
||||
});
|
||||
|
||||
describe('openapiToFunction with $ref resolution', () => {
|
||||
it('correctly converts OpenAPI spec to function signatures and request builders, resolving $ref references', () => {
|
||||
const { functionSignatures, requestBuilders } = openapiToFunction(whimsicalOpenapiSpec);
|
||||
|
||||
expect(functionSignatures.length).toBeGreaterThan(0);
|
||||
|
||||
const postRenderFlowchartSignature = functionSignatures.find(
|
||||
(sig) => sig.name === 'postRenderFlowchart',
|
||||
);
|
||||
expect(postRenderFlowchartSignature).toBeDefined();
|
||||
expect(postRenderFlowchartSignature?.name).toBe('postRenderFlowchart');
|
||||
expect(postRenderFlowchartSignature?.parameters).toBeDefined();
|
||||
|
||||
expect(requestBuilders).toHaveProperty('postRenderFlowchart');
|
||||
const postRenderFlowchartRequestBuilder = requestBuilders['postRenderFlowchart'];
|
||||
expect(postRenderFlowchartRequestBuilder).toBeDefined();
|
||||
expect(postRenderFlowchartRequestBuilder.method).toBe('post');
|
||||
expect(postRenderFlowchartRequestBuilder.path).toBe('/ai.chatgpt.render-flowchart');
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
const invalidServerURL = 'Could not find a valid URL in `servers`';
|
||||
|
||||
describe('validateAndParseOpenAPISpec', () => {
|
||||
it('validates a correct OpenAPI spec successfully', () => {
|
||||
const validSpec = JSON.stringify({
|
||||
openapi: '3.0.0',
|
||||
info: { title: 'Test API', version: '1.0.0' },
|
||||
servers: [{ url: 'https://test.api' }],
|
||||
paths: { '/test': {} },
|
||||
components: { schemas: {} },
|
||||
});
|
||||
|
||||
const result = validateAndParseOpenAPISpec(validSpec);
|
||||
expect(result.status).toBe(true);
|
||||
expect(result.message).toBe('OpenAPI spec is valid.');
|
||||
});
|
||||
|
||||
it('returns an error for spec with no servers', () => {
|
||||
const noServerSpec = JSON.stringify({
|
||||
openapi: '3.0.0',
|
||||
info: { title: 'Test API', version: '1.0.0' },
|
||||
paths: { '/test': {} },
|
||||
components: { schemas: {} },
|
||||
});
|
||||
|
||||
const result = validateAndParseOpenAPISpec(noServerSpec);
|
||||
expect(result.status).toBe(false);
|
||||
expect(result.message).toBe(invalidServerURL);
|
||||
});
|
||||
|
||||
it('returns an error for spec with empty server URL', () => {
|
||||
const emptyURLSpec = `{
|
||||
"openapi": "3.1.0",
|
||||
"info": {
|
||||
"title": "Untitled",
|
||||
"description": "Your OpenAPI specification",
|
||||
"version": "v1.0.0"
|
||||
},
|
||||
"servers": [
|
||||
{
|
||||
"url": ""
|
||||
}
|
||||
],
|
||||
"paths": {},
|
||||
"components": {
|
||||
"schemas": {}
|
||||
}
|
||||
}`;
|
||||
|
||||
const result = validateAndParseOpenAPISpec(emptyURLSpec);
|
||||
expect(result.status).toBe(false);
|
||||
expect(result.message).toBe(invalidServerURL);
|
||||
});
|
||||
|
||||
it('returns an error for spec with no paths', () => {
|
||||
const noPathsSpec = JSON.stringify({
|
||||
openapi: '3.0.0',
|
||||
info: { title: 'Test API', version: '1.0.0' },
|
||||
servers: [{ url: 'https://test.api' }],
|
||||
components: { schemas: {} },
|
||||
});
|
||||
|
||||
const result = validateAndParseOpenAPISpec(noPathsSpec);
|
||||
expect(result.status).toBe(false);
|
||||
expect(result.message).toBe('No paths found in the OpenAPI spec.');
|
||||
});
|
||||
|
||||
it('detects missing components in spec', () => {
|
||||
const missingComponentSpec = JSON.stringify({
|
||||
openapi: '3.0.0',
|
||||
info: { title: 'Test API', version: '1.0.0' },
|
||||
servers: [{ url: 'https://test.api' }],
|
||||
paths: {
|
||||
'/test': {
|
||||
get: {
|
||||
responses: {
|
||||
'200': {
|
||||
content: {
|
||||
'application/json': { schema: { $ref: '#/components/schemas/Missing' } },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
const result = validateAndParseOpenAPISpec(missingComponentSpec);
|
||||
expect(result.status).toBe(true);
|
||||
expect(result.message).toContain('reference to unknown component Missing');
|
||||
expect(result.spec).toBeDefined();
|
||||
});
|
||||
|
||||
it('handles invalid spec formats', () => {
|
||||
const invalidSpec = 'not a valid spec';
|
||||
|
||||
const result = validateAndParseOpenAPISpec(invalidSpec);
|
||||
expect(result.status).toBe(false);
|
||||
expect(result.message).toBe(invalidServerURL);
|
||||
});
|
||||
|
||||
it('handles YAML spec and correctly converts to Function Signatures', () => {
|
||||
const result = validateAndParseOpenAPISpec(scholarAIOpenapiSpec);
|
||||
expect(result.status).toBe(true);
|
||||
|
||||
const spec = result.spec;
|
||||
expect(spec).toBeDefined();
|
||||
|
||||
const { functionSignatures, requestBuilders } = openapiToFunction(spec as OpenAPIV3.Document);
|
||||
expect(functionSignatures.length).toBe(3);
|
||||
expect(requestBuilders).toHaveProperty('searchAbstracts');
|
||||
expect(requestBuilders).toHaveProperty('getFullText');
|
||||
expect(requestBuilders).toHaveProperty('saveCitation');
|
||||
});
|
||||
});
|
||||
181
packages/data-provider/specs/filetypes.spec.ts
Normal file
181
packages/data-provider/specs/filetypes.spec.ts
Normal file
|
|
@ -0,0 +1,181 @@
|
|||
import {
|
||||
fileConfig,
|
||||
fullMimeTypesList,
|
||||
codeInterpreterMimeTypesList,
|
||||
retrievalMimeTypesList,
|
||||
supportedMimeTypes,
|
||||
codeInterpreterMimeTypes,
|
||||
retrievalMimeTypes,
|
||||
excelFileTypes,
|
||||
excelMimeTypes,
|
||||
fileConfigSchema,
|
||||
mergeFileConfig,
|
||||
mbToBytes,
|
||||
} from '../src/file-config';
|
||||
|
||||
describe('MIME Type Regex Patterns', () => {
|
||||
const unsupportedMimeTypes = [
|
||||
'text/x-unknown',
|
||||
'application/unknown',
|
||||
'image/bmp',
|
||||
'image/svg',
|
||||
'audio/mp3',
|
||||
];
|
||||
|
||||
// Testing general supported MIME types
|
||||
fullMimeTypesList.forEach((mimeType) => {
|
||||
test(`"${mimeType}" should match one of the supported regex patterns in supportedMimeTypes`, () => {
|
||||
const matches = supportedMimeTypes.some((regex) => regex.test(mimeType));
|
||||
expect(matches).toBeTruthy();
|
||||
});
|
||||
});
|
||||
|
||||
// Testing unsupported MIME types
|
||||
unsupportedMimeTypes.forEach((mimeType) => {
|
||||
test(`"${mimeType}" should not match any of the supported regex patterns in supportedMimeTypes`, () => {
|
||||
const matches = supportedMimeTypes.some((regex) => regex.test(mimeType));
|
||||
expect(matches).toBeFalsy();
|
||||
});
|
||||
});
|
||||
|
||||
// Testing MIME types for Code Interpreter support
|
||||
codeInterpreterMimeTypesList.forEach((mimeType) => {
|
||||
test(`"${mimeType}" should be supported by codeInterpreterMimeTypes`, () => {
|
||||
const matches = codeInterpreterMimeTypes.some((regex) => regex.test(mimeType));
|
||||
expect(matches).toBeTruthy();
|
||||
});
|
||||
});
|
||||
|
||||
// Testing MIME types for Retrieval support
|
||||
retrievalMimeTypesList.forEach((mimeType) => {
|
||||
test(`"${mimeType}" should be supported by retrievalMimeTypes`, () => {
|
||||
const matches = retrievalMimeTypes.some((regex) => regex.test(mimeType));
|
||||
expect(matches).toBeTruthy();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('MIME Types Exclusive to Code Interpreter', () => {
|
||||
const exclusiveCodeInterpreterMimeTypes = codeInterpreterMimeTypesList.filter(
|
||||
(mimeType) => !retrievalMimeTypesList.includes(mimeType),
|
||||
);
|
||||
|
||||
exclusiveCodeInterpreterMimeTypes.forEach((mimeType) => {
|
||||
test(`"${mimeType}" should not be supported by retrievalMimeTypes`, () => {
|
||||
const isSupportedByRetrieval = retrievalMimeTypes.some((regex) => regex.test(mimeType));
|
||||
expect(isSupportedByRetrieval).toBeFalsy();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('Testing Excel MIME types', () => {
|
||||
excelFileTypes.forEach((mimeType) => {
|
||||
test(`"${mimeType}" should match one of the supported regex patterns in supportedMimeTypes`, () => {
|
||||
const matches = supportedMimeTypes.some((regex) => regex.test(mimeType));
|
||||
expect(matches).toBeTruthy();
|
||||
});
|
||||
});
|
||||
|
||||
test('Excel MIME types should match the regex pattern in excelMimeTypes', () => {
|
||||
const matches = excelFileTypes.every((mimeType) => excelMimeTypes.test(mimeType));
|
||||
expect(matches).toBeTruthy();
|
||||
});
|
||||
});
|
||||
|
||||
describe('Testing `fileConfig`', () => {
|
||||
describe('checkType function', () => {
|
||||
test('should return true for supported MIME types', () => {
|
||||
const fileTypes = ['text/csv', 'application/json', 'application/pdf', 'image/jpeg'];
|
||||
fileTypes.forEach((fileType) => {
|
||||
const isSupported = fileConfig.checkType(fileType);
|
||||
expect(isSupported).toBe(true);
|
||||
});
|
||||
});
|
||||
|
||||
test('should return false for unsupported MIME types', () => {
|
||||
const fileTypes = ['text/mamba', 'application/exe', 'no-image', ''];
|
||||
fileTypes.forEach((fileType) => {
|
||||
const isSupported = fileConfig.checkType(fileType);
|
||||
expect(isSupported).toBe(false);
|
||||
});
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
const dynamicConfigs = {
|
||||
minimalUpdate: {
|
||||
serverFileSizeLimit: 1024, // Increasing server file size limit
|
||||
},
|
||||
fullOverrideDefaultEndpoint: {
|
||||
endpoints: {
|
||||
default: {
|
||||
fileLimit: 15,
|
||||
fileSizeLimit: 30,
|
||||
totalSizeLimit: 60,
|
||||
supportedMimeTypes: ['^video/.*$'], // Changing to support video files
|
||||
},
|
||||
},
|
||||
},
|
||||
newEndpointAddition: {
|
||||
endpoints: {
|
||||
newEndpoint: {
|
||||
fileLimit: 5,
|
||||
fileSizeLimit: 10,
|
||||
totalSizeLimit: 20,
|
||||
supportedMimeTypes: ['^application/json$', '^application/xml$'],
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
describe('mergeFileConfig', () => {
|
||||
test('merges minimal update correctly', () => {
|
||||
const result = mergeFileConfig(dynamicConfigs.minimalUpdate);
|
||||
expect(result.serverFileSizeLimit).toEqual(mbToBytes(1024));
|
||||
const parsedResult = fileConfigSchema.safeParse(result);
|
||||
expect(parsedResult.success).toBeTruthy();
|
||||
});
|
||||
|
||||
test('overrides default endpoint with full new configuration', () => {
|
||||
const result = mergeFileConfig(dynamicConfigs.fullOverrideDefaultEndpoint);
|
||||
expect(result.endpoints.default.fileLimit).toEqual(15);
|
||||
expect(result.endpoints.default.supportedMimeTypes).toEqual(
|
||||
expect.arrayContaining([new RegExp('^video/.*$')]),
|
||||
);
|
||||
const parsedResult = fileConfigSchema.safeParse(result);
|
||||
expect(parsedResult.success).toBeTruthy();
|
||||
});
|
||||
|
||||
test('adds new endpoint configuration correctly', () => {
|
||||
const result = mergeFileConfig(dynamicConfigs.newEndpointAddition);
|
||||
expect(result.endpoints.newEndpoint).toBeDefined();
|
||||
expect(result.endpoints.newEndpoint.fileLimit).toEqual(5);
|
||||
expect(result.endpoints.newEndpoint.supportedMimeTypes).toEqual(
|
||||
expect.arrayContaining([new RegExp('^application/json$')]),
|
||||
);
|
||||
const parsedResult = fileConfigSchema.safeParse(result);
|
||||
expect(parsedResult.success).toBeTruthy();
|
||||
});
|
||||
|
||||
test('disables an endpoint and sets numeric fields to 0 and empties supportedMimeTypes', () => {
|
||||
const configWithDisabledEndpoint = {
|
||||
endpoints: {
|
||||
disabledEndpoint: {
|
||||
disabled: true,
|
||||
fileLimit: 15,
|
||||
fileSizeLimit: 30,
|
||||
totalSizeLimit: 60,
|
||||
supportedMimeTypes: ['^video/.*$'],
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const result = mergeFileConfig(configWithDisabledEndpoint);
|
||||
expect(result.endpoints.disabledEndpoint).toBeDefined();
|
||||
expect(result.endpoints.disabledEndpoint.disabled).toEqual(true);
|
||||
expect(result.endpoints.disabledEndpoint.fileLimit).toEqual(0);
|
||||
expect(result.endpoints.disabledEndpoint.fileSizeLimit).toEqual(0);
|
||||
expect(result.endpoints.disabledEndpoint.totalSizeLimit).toEqual(0);
|
||||
expect(result.endpoints.disabledEndpoint.supportedMimeTypes).toEqual([]);
|
||||
});
|
||||
});
|
||||
350
packages/data-provider/specs/openapiSpecs.ts
Normal file
350
packages/data-provider/specs/openapiSpecs.ts
Normal file
|
|
@ -0,0 +1,350 @@
|
|||
import { OpenAPIV3 } from 'openapi-types';
|
||||
|
||||
export type FlowchartSchema = {
|
||||
mermaid: {
|
||||
type: 'string';
|
||||
description: 'Flowchart to be rendered, in Mermaid syntax';
|
||||
};
|
||||
title: {
|
||||
type: 'string';
|
||||
description: 'Title of the flowchart';
|
||||
};
|
||||
};
|
||||
|
||||
export const getWeatherOpenapiSpec: OpenAPIV3.Document = {
|
||||
openapi: '3.1.0',
|
||||
info: {
|
||||
title: 'Get weather data',
|
||||
description: 'Retrieves current weather data for a location.',
|
||||
version: 'v1.0.0',
|
||||
},
|
||||
servers: [
|
||||
{
|
||||
url: 'https://weather.example.com',
|
||||
},
|
||||
],
|
||||
paths: {
|
||||
'/location': {
|
||||
get: {
|
||||
description: 'Get temperature for a specific location',
|
||||
operationId: 'GetCurrentWeather',
|
||||
parameters: [
|
||||
{
|
||||
name: 'location',
|
||||
in: 'query',
|
||||
description: 'The city and state to retrieve the weather for',
|
||||
required: true,
|
||||
schema: {
|
||||
type: 'string',
|
||||
},
|
||||
},
|
||||
],
|
||||
requestBody: {
|
||||
required: true,
|
||||
content: {
|
||||
'application/json': {
|
||||
schema: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
locations: {
|
||||
type: 'array',
|
||||
items: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
city: {
|
||||
type: 'string',
|
||||
example: 'San Francisco',
|
||||
},
|
||||
state: {
|
||||
type: 'string',
|
||||
example: 'CA',
|
||||
},
|
||||
countryCode: {
|
||||
type: 'string',
|
||||
description: 'ISO 3166-1 alpha-2 country code',
|
||||
example: 'US',
|
||||
},
|
||||
time: {
|
||||
type: 'string',
|
||||
description:
|
||||
'Optional time for which the weather is requested, in ISO 8601 format.',
|
||||
example: '2023-12-04T14:00:00Z',
|
||||
},
|
||||
},
|
||||
required: ['city', 'state', 'countryCode'],
|
||||
description:
|
||||
'Details of the location for which the weather data is requested.',
|
||||
},
|
||||
description: 'A list of locations to retrieve the weather for.',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
deprecated: false,
|
||||
responses: {},
|
||||
},
|
||||
},
|
||||
},
|
||||
components: {
|
||||
schemas: {},
|
||||
},
|
||||
};
|
||||
|
||||
export const whimsicalOpenapiSpec: OpenAPIV3.Document = {
|
||||
openapi: '3.0.0',
|
||||
info: {
|
||||
version: '1.0.0',
|
||||
title: 'Diagram to Image API',
|
||||
description: 'A simple API to generate flowchart, mindmap, or sequence diagram images.',
|
||||
},
|
||||
servers: [{ url: 'https://whimsical.com/api' }],
|
||||
paths: {
|
||||
'/ai.chatgpt.render-flowchart': {
|
||||
post: {
|
||||
operationId: 'postRenderFlowchart',
|
||||
// 'x-openai-isConsequential': false,
|
||||
summary: 'Renders a flowchart',
|
||||
description:
|
||||
'Accepts a string describing a flowchart and returns a URL to a rendered image',
|
||||
requestBody: {
|
||||
content: {
|
||||
'application/json': {
|
||||
schema: {
|
||||
$ref: '#/components/schemas/FlowchartRequest',
|
||||
},
|
||||
},
|
||||
},
|
||||
required: true,
|
||||
},
|
||||
responses: {
|
||||
'200': {
|
||||
description: 'URL to the rendered image',
|
||||
content: {
|
||||
'application/json': {
|
||||
schema: {
|
||||
$ref: '#/components/schemas/FlowchartResponse',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
components: {
|
||||
schemas: {
|
||||
FlowchartRequest: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
mermaid: {
|
||||
type: 'string',
|
||||
description: 'Flowchart to be rendered, in Mermaid syntax',
|
||||
},
|
||||
title: {
|
||||
type: 'string',
|
||||
description: 'Title of the flowchart',
|
||||
},
|
||||
},
|
||||
required: ['mermaid'],
|
||||
},
|
||||
FlowchartResponse: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
imageURL: {
|
||||
type: 'string',
|
||||
description: 'URL of the rendered image',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
export const scholarAIOpenapiSpec = `
|
||||
openapi: 3.0.1
|
||||
info:
|
||||
title: ScholarAI
|
||||
description: Allows the user to search facts and findings from scientific articles
|
||||
version: 'v1'
|
||||
servers:
|
||||
- url: https://scholar-ai.net
|
||||
paths:
|
||||
/api/abstracts:
|
||||
get:
|
||||
operationId: searchAbstracts
|
||||
summary: Get relevant paper abstracts by keywords search
|
||||
parameters:
|
||||
- name: keywords
|
||||
in: query
|
||||
description: Keywords of inquiry which should appear in article. Must be in English.
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
- name: sort
|
||||
in: query
|
||||
description: The sort order for results. Valid values are cited_by_count or publication_date. Excluding this value does a relevance based search.
|
||||
required: false
|
||||
schema:
|
||||
type: string
|
||||
enum:
|
||||
- cited_by_count
|
||||
- publication_date
|
||||
- name: query
|
||||
in: query
|
||||
description: The user query
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
- name: peer_reviewed_only
|
||||
in: query
|
||||
description: Whether to only return peer reviewed articles. Defaults to true, ChatGPT should cautiously suggest this value can be set to false
|
||||
required: false
|
||||
schema:
|
||||
type: string
|
||||
- name: start_year
|
||||
in: query
|
||||
description: The first year, inclusive, to include in the search range. Excluding this value will include all years.
|
||||
required: false
|
||||
schema:
|
||||
type: string
|
||||
- name: end_year
|
||||
in: query
|
||||
description: The last year, inclusive, to include in the search range. Excluding this value will include all years.
|
||||
required: false
|
||||
schema:
|
||||
type: string
|
||||
- name: offset
|
||||
in: query
|
||||
description: The offset of the first result to return. Defaults to 0.
|
||||
required: false
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
description: OK
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/searchAbstractsResponse'
|
||||
/api/fulltext:
|
||||
get:
|
||||
operationId: getFullText
|
||||
summary: Get full text of a paper by URL for PDF
|
||||
parameters:
|
||||
- name: pdf_url
|
||||
in: query
|
||||
description: URL for PDF
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
- name: chunk
|
||||
in: query
|
||||
description: chunk number to retrieve, defaults to 1
|
||||
required: false
|
||||
schema:
|
||||
type: number
|
||||
responses:
|
||||
"200":
|
||||
description: OK
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/getFullTextResponse'
|
||||
/api/save-citation:
|
||||
get:
|
||||
operationId: saveCitation
|
||||
summary: Save citation to reference manager
|
||||
parameters:
|
||||
- name: doi
|
||||
in: query
|
||||
description: Digital Object Identifier (DOI) of article
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
- name: zotero_user_id
|
||||
in: query
|
||||
description: Zotero User ID
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
- name: zotero_api_key
|
||||
in: query
|
||||
description: Zotero API Key
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
description: OK
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/saveCitationResponse'
|
||||
components:
|
||||
schemas:
|
||||
searchAbstractsResponse:
|
||||
type: object
|
||||
properties:
|
||||
next_offset:
|
||||
type: number
|
||||
description: The offset of the next page of results.
|
||||
total_num_results:
|
||||
type: number
|
||||
description: The total number of results.
|
||||
abstracts:
|
||||
type: array
|
||||
items:
|
||||
type: object
|
||||
properties:
|
||||
title:
|
||||
type: string
|
||||
abstract:
|
||||
type: string
|
||||
description: Summary of the context, methods, results, and conclusions of the paper.
|
||||
doi:
|
||||
type: string
|
||||
description: The DOI of the paper.
|
||||
landing_page_url:
|
||||
type: string
|
||||
description: Link to the paper on its open-access host.
|
||||
pdf_url:
|
||||
type: string
|
||||
description: Link to the paper PDF.
|
||||
publicationDate:
|
||||
type: string
|
||||
description: The date the paper was published in YYYY-MM-DD format.
|
||||
relevance:
|
||||
type: number
|
||||
description: The relevance of the paper to the search query. 1 is the most relevant.
|
||||
creators:
|
||||
type: array
|
||||
items:
|
||||
type: string
|
||||
description: The name of the creator.
|
||||
cited_by_count:
|
||||
type: number
|
||||
description: The number of citations of the article.
|
||||
description: The list of relevant abstracts.
|
||||
getFullTextResponse:
|
||||
type: object
|
||||
properties:
|
||||
full_text:
|
||||
type: string
|
||||
description: The full text of the paper.
|
||||
pdf_url:
|
||||
type: string
|
||||
description: The PDF URL of the paper.
|
||||
chunk:
|
||||
type: number
|
||||
description: The chunk of the paper.
|
||||
total_chunk_num:
|
||||
type: number
|
||||
description: The total chunks of the paper.
|
||||
saveCitationResponse:
|
||||
type: object
|
||||
properties:
|
||||
message:
|
||||
type: string
|
||||
description: Confirmation of successful save or error message.`;
|
||||
347
packages/data-provider/src/actions.ts
Normal file
347
packages/data-provider/src/actions.ts
Normal file
|
|
@ -0,0 +1,347 @@
|
|||
import axios from 'axios';
|
||||
import { URL } from 'url';
|
||||
import crypto from 'crypto';
|
||||
import { load } from 'js-yaml';
import type { FunctionTool, Schema, Reference, ActionMetadata } from './types/assistants';
import type { OpenAPIV3 } from 'openapi-types';
import { Tools, AuthTypeEnum, AuthorizationTypeEnum } from './types/assistants';

export type ParametersSchema = {
  type: string;
  properties: Record<string, Reference | Schema>;
  required: string[];
};

export type ApiKeyCredentials = {
  api_key: string;
  custom_auth_header?: string;
  authorization_type?: AuthorizationTypeEnum;
};

export type OAuthCredentials = {
  tokenUrl: string;
  clientId: string;
  clientSecret: string;
  scope: string;
};

export type Credentials = ApiKeyCredentials | OAuthCredentials;

export function sha1(input: string) {
  return crypto.createHash('sha1').update(input).digest('hex');
}

export function createURL(domain: string, path: string) {
  const myURL = new URL(path, domain);
  return myURL.toString();
}

export class FunctionSignature {
  name: string;
  description: string;
  parameters: ParametersSchema;

  constructor(name: string, description: string, parameters: ParametersSchema) {
    this.name = name;
    this.description = description;
    if (parameters.properties?.['requestBody']) {
      this.parameters = parameters.properties?.['requestBody'] as ParametersSchema;
    } else {
      this.parameters = parameters;
    }
  }

  toObjectTool(): FunctionTool {
    return {
      type: Tools.function,
      function: {
        name: this.name,
        description: this.description,
        parameters: this.parameters,
      },
    };
  }
}

export class ActionRequest {
  domain: string;
  path: string;
  method: string;
  operation: string;
  operationHash?: string;
  isConsequential: boolean;
  contentType: string;
  params?: object;

  constructor(
    domain: string,
    path: string,
    method: string,
    operation: string,
    isConsequential: boolean,
    contentType: string,
  ) {
    this.domain = domain;
    this.path = path;
    this.method = method;
    this.operation = operation;
    this.isConsequential = isConsequential;
    this.contentType = contentType;
  }

  private authHeaders: Record<string, string> = {};
  private authToken?: string;

  async setParams(params: object) {
    this.operationHash = sha1(JSON.stringify(params));
    this.params = params;
  }

  async setAuth(metadata: ActionMetadata) {
    if (!metadata.auth) {
      return;
    }

    const {
      type,
      /* API Key */
      authorization_type,
      custom_auth_header,
      /* OAuth */
      authorization_url,
      client_url,
      scope,
      token_exchange_method,
    } = metadata.auth;

    const {
      /* API Key */
      api_key,
      /* OAuth */
      oauth_client_id,
      oauth_client_secret,
    } = metadata;

    const isApiKey = api_key && type === AuthTypeEnum.ServiceHttp;
    const isOAuth =
      oauth_client_id &&
      oauth_client_secret &&
      type === AuthTypeEnum.OAuth &&
      authorization_url &&
      client_url &&
      scope &&
      token_exchange_method;

    if (isApiKey && authorization_type === AuthorizationTypeEnum.Basic) {
      const basicToken = Buffer.from(api_key).toString('base64');
      this.authHeaders['Authorization'] = `Basic ${basicToken}`;
    } else if (isApiKey && authorization_type === AuthorizationTypeEnum.Bearer) {
      this.authHeaders['Authorization'] = `Bearer ${api_key}`;
    } else if (
      isApiKey &&
      authorization_type === AuthorizationTypeEnum.Custom &&
      custom_auth_header
    ) {
      this.authHeaders[custom_auth_header] = api_key;
    } else if (isOAuth) {
      // TODO: WIP - OAuth support
      if (!this.authToken) {
        const tokenResponse = await axios.post(
          client_url,
          {
            client_id: oauth_client_id,
            client_secret: oauth_client_secret,
            scope: scope,
            grant_type: 'client_credentials',
          },
          {
            headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
          },
        );
        this.authToken = tokenResponse.data.access_token;
      }
      this.authHeaders['Authorization'] = `Bearer ${this.authToken}`;
    }
  }

  async execute() {
    const url = createURL(this.domain, this.path);
    const headers = {
      ...this.authHeaders,
      'Content-Type': this.contentType,
    };

    const method = this.method.toLowerCase();

    if (method === 'get') {
      return axios.get(url, { headers, params: this.params });
    } else if (method === 'post') {
      return axios.post(url, this.params, { headers });
    } else if (method === 'put') {
      return axios.put(url, this.params, { headers });
    } else if (method === 'delete') {
      return axios.delete(url, { headers, data: this.params });
    } else if (method === 'patch') {
      return axios.patch(url, this.params, { headers });
    } else {
      throw new Error(`Unsupported HTTP method: ${this.method}`);
    }
  }
}

export function resolveRef(
  schema: OpenAPIV3.SchemaObject | OpenAPIV3.ReferenceObject | OpenAPIV3.RequestBodyObject,
  components?: OpenAPIV3.ComponentsObject,
): OpenAPIV3.SchemaObject {
  if ('$ref' in schema && components) {
    const refPath = schema.$ref.replace(/^#\/components\/schemas\//, '');
    const resolvedSchema = components.schemas?.[refPath];
    if (!resolvedSchema) {
      throw new Error(`Reference ${schema.$ref} not found`);
    }
    return resolveRef(resolvedSchema, components);
  }
  return schema as OpenAPIV3.SchemaObject;
}

/** Function to convert OpenAPI spec to function signatures and request builders */
export function openapiToFunction(openapiSpec: OpenAPIV3.Document): {
  functionSignatures: FunctionSignature[];
  requestBuilders: Record<string, ActionRequest>;
} {
  const functionSignatures: FunctionSignature[] = [];
  const requestBuilders: Record<string, ActionRequest> = {};

  // Base URL from OpenAPI spec servers
  const baseUrl = openapiSpec.servers?.[0]?.url ?? '';

  // Iterate over each path and method in the OpenAPI spec
  for (const [path, methods] of Object.entries(openapiSpec.paths)) {
    for (const [method, operation] of Object.entries(methods as OpenAPIV3.PathsObject)) {
      const operationObj = operation as OpenAPIV3.OperationObject & {
        'x-openai-isConsequential'?: boolean;
      };

      // Operation ID is used as the function name
      const operationId = operationObj.operationId || `${method}_${path}`;
      const description = operationObj.summary || operationObj.description || '';

      const parametersSchema: ParametersSchema = { type: 'object', properties: {}, required: [] };

      if (operationObj.requestBody) {
        const requestBody = operationObj.requestBody as OpenAPIV3.RequestBodyObject;
        const content = requestBody.content;
        const contentType = Object.keys(content)[0];
        const schema = content[contentType]?.schema;
        const resolvedSchema = resolveRef(
          schema as OpenAPIV3.ReferenceObject | OpenAPIV3.SchemaObject,
          openapiSpec.components,
        );
        parametersSchema.properties['requestBody'] = resolvedSchema;
      }

      if (operationObj.parameters) {
        for (const param of operationObj.parameters) {
          const paramObj = param as OpenAPIV3.ParameterObject;
          const resolvedSchema = resolveRef(
            { ...paramObj.schema } as OpenAPIV3.ReferenceObject | OpenAPIV3.SchemaObject,
            openapiSpec.components,
          );
          parametersSchema.properties[paramObj.name] = resolvedSchema;
          if (paramObj.required) {
            parametersSchema.required.push(paramObj.name);
          }
          if (paramObj.description && !('$$ref' in parametersSchema.properties[paramObj.name])) {
            parametersSchema.properties[paramObj.name].description = paramObj.description;
          }
        }
      }

      const functionSignature = new FunctionSignature(operationId, description, parametersSchema);
      functionSignatures.push(functionSignature);

      const actionRequest = new ActionRequest(
        baseUrl,
        path,
        method,
        operationId,
        !!operationObj['x-openai-isConsequential'], // Custom extension for consequential actions
        operationObj.requestBody ? 'application/json' : 'application/x-www-form-urlencoded',
      );

      requestBuilders[operationId] = actionRequest;
    }
  }

  return { functionSignatures, requestBuilders };
}

export type ValidationResult = {
  status: boolean;
  message: string;
  spec?: OpenAPIV3.Document;
};

export function validateAndParseOpenAPISpec(specString: string): ValidationResult {
  try {
    let parsedSpec;
    try {
      parsedSpec = JSON.parse(specString);
    } catch {
      parsedSpec = load(specString);
    }

    // Check for servers
    if (
      !parsedSpec.servers ||
      !Array.isArray(parsedSpec.servers) ||
      parsedSpec.servers.length === 0
    ) {
      return { status: false, message: 'Could not find a valid URL in `servers`' };
    }

    if (!parsedSpec.servers[0].url) {
      return { status: false, message: 'Could not find a valid URL in `servers`' };
    }

    // Check for paths
    const paths = parsedSpec.paths;
    if (!paths || typeof paths !== 'object' || Object.keys(paths).length === 0) {
      return { status: false, message: 'No paths found in the OpenAPI spec.' };
    }

    const components = parsedSpec.components?.schemas || {};
    const messages = [];

    for (const [path, methods] of Object.entries(paths)) {
      for (const [httpMethod, operation] of Object.entries(methods as OpenAPIV3.PathItemObject)) {
        // Ensure operation is a valid operation object
        const { responses } = operation as OpenAPIV3.OperationObject;
        if (typeof operation === 'object' && responses) {
          for (const [statusCode, response] of Object.entries(responses)) {
            const content = (response as OpenAPIV3.ResponseObject).content;
            if (content && content['application/json'] && content['application/json'].schema) {
              const schema = content['application/json'].schema;
              if ('$ref' in schema && typeof schema.$ref === 'string') {
                const refName = schema.$ref.split('/').pop();
                if (refName && !components[refName]) {
                  messages.push(
                    `In context=('paths', '${path}', '${httpMethod}', '${statusCode}', 'response', 'content', 'application/json', 'schema'), reference to unknown component ${refName}; using empty schema`,
                  );
                }
              }
            }
          }
        }
      }
    }

    return {
      status: true,
      message: messages.join('\n') || 'OpenAPI spec is valid.',
      spec: parsedSpec,
    };
  } catch (error) {
    return { status: false, message: 'Error parsing OpenAPI spec.' };
  }
}
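A minimal usage sketch (not part of the PR) tying the helpers above together, assuming a hypothetical one-operation spec; the `getWeather` operation and `api.example.com` server URL are invented:

// Hypothetical spec, used only to exercise validateAndParseOpenAPISpec / openapiToFunction.
const rawSpec = JSON.stringify({
  openapi: '3.0.0',
  info: { title: 'Weather', version: '1.0.0' },
  servers: [{ url: 'https://api.example.com' }],
  paths: {
    '/weather': {
      get: {
        operationId: 'getWeather',
        summary: 'Get current weather',
        parameters: [{ name: 'city', in: 'query', required: true, schema: { type: 'string' } }],
        responses: { '200': { description: 'OK' } },
      },
    },
  },
});

const result = validateAndParseOpenAPISpec(rawSpec);
if (result.status && result.spec) {
  const { functionSignatures, requestBuilders } = openapiToFunction(result.spec);
  // FunctionTool[] suitable for an Assistant's `tools` array:
  console.log(functionSignatures.map((sig) => sig.toObjectTool()));
  // await requestBuilders['getWeather'].setParams({ city: 'Berlin' });
  // await requestBuilders['getWeather'].execute(); // GET https://api.example.com/weather?city=Berlin
}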
@ -1,9 +1,36 @@
|
|||
/* eslint-disable max-len */
|
||||
import { z } from 'zod';
|
||||
import { EModelEndpoint, eModelEndpointSchema } from './schemas';
|
||||
import { fileConfigSchema } from './file-config';
|
||||
import { FileSources } from './types/files';
|
||||
|
||||
export const fileSourceSchema = z.nativeEnum(FileSources);
|
||||
|
||||
export const assistantEndpointSchema = z.object({
|
||||
/* assistants specific */
|
||||
disableBuilder: z.boolean().optional(),
|
||||
pollIntervalMs: z.number().optional(),
|
||||
timeoutMs: z.number().optional(),
|
||||
supportedIds: z.array(z.string()).min(1).optional(),
|
||||
excludedIds: z.array(z.string()).min(1).optional(),
|
||||
/* general */
|
||||
apiKey: z.string().optional(),
|
||||
baseURL: z.string().optional(),
|
||||
models: z
|
||||
.object({
|
||||
default: z.array(z.string()).min(1),
|
||||
fetch: z.boolean().optional(),
|
||||
userIdQuery: z.boolean().optional(),
|
||||
})
|
||||
.optional(),
|
||||
titleConvo: z.boolean().optional(),
|
||||
titleMethod: z.union([z.literal('completion'), z.literal('functions')]).optional(),
|
||||
titleModel: z.string().optional(),
|
||||
headers: z.record(z.any()).optional(),
|
||||
});
|
||||
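As a hedged illustration of the shape this schema accepts (all values below are invented), an `assistants` endpoint entry as it might appear after parsing a custom config:

const assistantsEndpoint = assistantEndpointSchema.parse({
  disableBuilder: false,
  pollIntervalMs: 750,
  timeoutMs: 180000,
  supportedIds: ['asst_example123'],
  models: { default: ['gpt-4-1106-preview'], fetch: false },
  titleConvo: true,
  titleMethod: 'completion',
});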
|
||||
export type TAssistantEndpoint = z.infer<typeof assistantEndpointSchema>;
|
||||
|
||||
export const endpointSchema = z.object({
|
||||
name: z.string().refine((value) => !eModelEndpointSchema.safeParse(value).success, {
|
||||
message: `Value cannot be one of the default endpoint (EModelEndpoint) values: ${Object.values(
|
||||
|
|
@ -25,6 +52,19 @@ export const endpointSchema = z.object({
|
|||
forcePrompt: z.boolean().optional(),
|
||||
modelDisplayLabel: z.string().optional(),
|
||||
headers: z.record(z.any()).optional(),
|
||||
addParams: z.record(z.any()).optional(),
|
||||
dropParams: z.array(z.string()).optional(),
|
||||
});
|
||||
|
||||
export const rateLimitSchema = z.object({
|
||||
fileUploads: z
|
||||
.object({
|
||||
ipMax: z.number().optional(),
|
||||
ipWindowInMinutes: z.number().optional(),
|
||||
userMax: z.number().optional(),
|
||||
userWindowInMinutes: z.number().optional(),
|
||||
})
|
||||
.optional(),
|
||||
});
|
||||
|
||||
export const configSchema = z.object({
|
||||
|
|
@ -37,11 +77,17 @@ export const configSchema = z.object({
|
|||
allowedDomains: z.array(z.string()).optional(),
|
||||
})
|
||||
.optional(),
|
||||
rateLimits: rateLimitSchema.optional(),
|
||||
fileConfig: fileConfigSchema.optional(),
|
||||
endpoints: z
|
||||
.object({
|
||||
custom: z.array(endpointSchema.partial()),
|
||||
[EModelEndpoint.assistants]: assistantEndpointSchema.optional(),
|
||||
custom: z.array(endpointSchema.partial()).optional(),
|
||||
})
|
||||
.strict()
|
||||
.refine((data) => Object.keys(data).length > 0, {
|
||||
message: 'At least one `endpoints` field must be provided.',
|
||||
})
|
||||
.optional(),
|
||||
});
|
||||
|
||||
|
|
@ -54,7 +100,7 @@ export enum KnownEndpoints {
|
|||
|
||||
export const defaultEndpoints: EModelEndpoint[] = [
|
||||
EModelEndpoint.openAI,
|
||||
EModelEndpoint.assistant,
|
||||
EModelEndpoint.assistants,
|
||||
EModelEndpoint.azureOpenAI,
|
||||
EModelEndpoint.bingAI,
|
||||
EModelEndpoint.chatGPTBrowser,
|
||||
|
|
@ -66,7 +112,7 @@ export const defaultEndpoints: EModelEndpoint[] = [
|
|||
|
||||
export const alternateName = {
|
||||
[EModelEndpoint.openAI]: 'OpenAI',
|
||||
[EModelEndpoint.assistant]: 'Assistants',
|
||||
[EModelEndpoint.assistants]: 'Assistants',
|
||||
[EModelEndpoint.azureOpenAI]: 'Azure OpenAI',
|
||||
[EModelEndpoint.bingAI]: 'Bing',
|
||||
[EModelEndpoint.chatGPTBrowser]: 'ChatGPT',
|
||||
|
|
@ -77,6 +123,21 @@ export const alternateName = {
|
|||
};
|
||||
|
||||
export const defaultModels = {
|
||||
[EModelEndpoint.assistants]: [
|
||||
'gpt-3.5-turbo-0125',
|
||||
'gpt-4-0125-preview',
|
||||
'gpt-4-turbo-preview',
|
||||
'gpt-4-1106-preview',
|
||||
'gpt-3.5-turbo-1106',
|
||||
'gpt-3.5-turbo-16k-0613',
|
||||
'gpt-3.5-turbo-16k',
|
||||
'gpt-3.5-turbo',
|
||||
'gpt-4',
|
||||
'gpt-4-0314',
|
||||
'gpt-4-32k-0314',
|
||||
'gpt-4-0613',
|
||||
'gpt-3.5-turbo-0613',
|
||||
],
|
||||
[EModelEndpoint.google]: [
|
||||
'gemini-pro',
|
||||
'gemini-pro-vision',
|
||||
|
|
@ -121,6 +182,14 @@ export const defaultModels = {
|
|||
],
|
||||
};
|
||||
|
||||
export const supportsRetrieval = new Set([
|
||||
'gpt-3.5-turbo-0125',
|
||||
'gpt-4-0125-preview',
|
||||
'gpt-4-turbo-preview',
|
||||
'gpt-4-1106-preview',
|
||||
'gpt-3.5-turbo-1106',
|
||||
]);
|
||||
|
||||
export const EndpointURLs: { [key in EModelEndpoint]: string } = {
|
||||
[EModelEndpoint.openAI]: `/api/ask/${EModelEndpoint.openAI}`,
|
||||
[EModelEndpoint.bingAI]: `/api/ask/${EModelEndpoint.bingAI}`,
|
||||
|
|
@ -130,7 +199,7 @@ export const EndpointURLs: { [key in EModelEndpoint]: string } = {
|
|||
[EModelEndpoint.gptPlugins]: `/api/ask/${EModelEndpoint.gptPlugins}`,
|
||||
[EModelEndpoint.azureOpenAI]: `/api/ask/${EModelEndpoint.azureOpenAI}`,
|
||||
[EModelEndpoint.chatGPTBrowser]: `/api/ask/${EModelEndpoint.chatGPTBrowser}`,
|
||||
[EModelEndpoint.assistant]: '/api/assistants/chat',
|
||||
[EModelEndpoint.assistants]: '/api/assistants/chat',
|
||||
};
|
||||
|
||||
export const modularEndpoints = new Set<EModelEndpoint | string>([
|
||||
|
|
@ -142,14 +211,6 @@ export const modularEndpoints = new Set<EModelEndpoint | string>([
|
|||
EModelEndpoint.custom,
|
||||
]);
|
||||
|
||||
export const supportsFiles = {
|
||||
[EModelEndpoint.openAI]: true,
|
||||
[EModelEndpoint.google]: true,
|
||||
[EModelEndpoint.assistant]: true,
|
||||
[EModelEndpoint.azureOpenAI]: true,
|
||||
[EModelEndpoint.custom]: true,
|
||||
};
|
||||
|
||||
export const supportsBalanceCheck = {
|
||||
[EModelEndpoint.openAI]: true,
|
||||
[EModelEndpoint.azureOpenAI]: true,
|
||||
|
|
@ -159,6 +220,19 @@ export const supportsBalanceCheck = {
|
|||
|
||||
export const visionModels = ['gpt-4-vision', 'llava-13b', 'gemini-pro-vision'];
|
||||
|
||||
export function validateVisionModel(
|
||||
model: string | undefined,
|
||||
additionalModels: string[] | undefined = [],
|
||||
) {
|
||||
if (!model) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return visionModels.concat(additionalModels).some((visionModel) => model.includes(visionModel));
|
||||
}
|
||||
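For example (a quick sketch; the custom model name is invented):

validateVisionModel('gpt-4-vision-preview');                  // true, contains 'gpt-4-vision'
validateVisionModel('gpt-3.5-turbo');                         // false
validateVisionModel('my-vision-model', ['my-vision-model']);  // true, via additionalModels
validateVisionModel(undefined);                               // false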
|
||||
export const imageGenTools = new Set(['dalle', 'dall-e', 'stable-diffusion']);
|
||||
|
||||
/**
|
||||
* Enum for cache keys.
|
||||
*/
|
||||
|
|
@ -175,6 +249,11 @@ export enum CacheKeys {
|
|||
* Key for the title generation cache.
|
||||
*/
|
||||
GEN_TITLE = 'genTitle',
|
||||
/**
* Key for the tools cache.
|
||||
*/
|
||||
TOOLS = 'tools',
|
||||
/**
|
||||
* Key for the model config cache.
|
||||
*/
|
||||
|
|
@ -195,10 +274,18 @@ export enum CacheKeys {
|
|||
* Key for the custom config cache.
|
||||
*/
|
||||
CUSTOM_CONFIG = 'customConfig',
|
||||
/**
|
||||
* Key for accessing Abort Keys
|
||||
*/
|
||||
ABORT_KEYS = 'abortKeys',
|
||||
/**
|
||||
* Key for the override config cache.
|
||||
*/
|
||||
OVERRIDE_CONFIG = 'overrideConfig',
|
||||
/**
|
||||
* Key for accessing File Upload Violations (exceeding limit).
|
||||
*/
|
||||
FILE_UPLOAD_LIMIT = 'file_upload_limit',
|
||||
}
|
||||
|
||||
/**
|
||||
|
|
@ -260,3 +347,27 @@ export enum SettingsTabValues {
|
|||
*/
|
||||
ACCOUNT = 'account',
|
||||
}
|
||||
|
||||
/**
|
||||
* Enum for app-wide constants
|
||||
*/
|
||||
export enum Constants {
|
||||
/**
|
||||
* Key for the app's version.
|
||||
*/
|
||||
VERSION = 'v0.6.9',
|
||||
/**
|
||||
* Key for the Custom Config's version (librechat.yaml).
|
||||
*/
|
||||
CONFIG_VERSION = '1.0.3',
|
||||
/**
|
||||
* Standard value for the first message's `parentMessageId` value, to indicate no parent exists.
|
||||
*/
|
||||
NO_PARENT = '00000000-0000-0000-0000-000000000000',
|
||||
}
|
||||
|
||||
export const defaultOrderQuery: {
|
||||
order: 'asc';
|
||||
} = {
|
||||
order: 'asc',
|
||||
};
|
||||
|
|
|
|||
|
|
@@ -12,7 +12,7 @@ export default function createPayload(submission: TSubmission) {

   let server = EndpointURLs[endpointType ?? endpoint];

-  if (isEdited && endpoint === EModelEndpoint.assistant) {
+  if (isEdited && endpoint === EModelEndpoint.assistants) {
     server += '/modify';
   } else if (isEdited) {
     server = server.replace('/ask/', '/edit/');
@@ -32,9 +32,5 @@ export default function createPayload(submission: TSubmission) {
     conversationId,
   };

-  if (endpoint === EModelEndpoint.assistant) {
-    payload.messages = messages;
-  }
-
   return { server, payload };
 }
|
|
|
|||
|
|
@ -196,23 +196,57 @@ export const listAssistants = (
|
|||
return request.get(endpoints.assistants(), { params });
|
||||
};
|
||||
|
||||
/* Tools */
|
||||
|
||||
export const getAvailableTools = (): Promise<s.TPlugin[]> => {
|
||||
return request.get(`${endpoints.assistants()}/tools`);
|
||||
};
|
||||
|
||||
/* Files */
|
||||
|
||||
export const getFiles = (): Promise<f.TFile[]> => {
|
||||
return request.get(endpoints.files());
|
||||
};
|
||||
|
||||
export const getFileConfig = (): Promise<f.FileConfig> => {
|
||||
return request.get(`${endpoints.files()}/config`);
|
||||
};
|
||||
|
||||
export const uploadImage = (data: FormData): Promise<f.TFileUpload> => {
|
||||
return request.postMultiPart(endpoints.images(), data);
|
||||
};
|
||||
|
||||
export const uploadFile = (data: FormData): Promise<f.TFileUpload> => {
|
||||
return request.postMultiPart(endpoints.files(), data);
|
||||
};
|
||||
|
||||
export const uploadAvatar = (data: FormData): Promise<f.AvatarUploadResponse> => {
|
||||
return request.postMultiPart(endpoints.avatar(), data);
|
||||
};
|
||||
|
||||
export const deleteFiles = async (files: f.BatchFile[]): Promise<f.DeleteFilesResponse> =>
|
||||
export const uploadAssistantAvatar = (data: m.AssistantAvatarVariables): Promise<a.Assistant> => {
|
||||
return request.postMultiPart(endpoints.assistants(`avatar/${data.assistant_id}`), data.formData);
|
||||
};
|
||||
|
||||
export const updateAction = (data: m.UpdateActionVariables): Promise<m.UpdateActionResponse> => {
|
||||
const { assistant_id, ...body } = data;
|
||||
return request.post(endpoints.assistants(`actions/${assistant_id}`), body);
|
||||
};
|
||||
|
||||
export function getActions(): Promise<a.Action[]> {
|
||||
return request.get(endpoints.assistants('actions'));
|
||||
}
|
||||
|
||||
export function getAssistantDocs(): Promise<a.AssistantDocument[]> {
|
||||
return request.get(endpoints.assistants('documents'));
|
||||
}
|
||||
|
||||
export const deleteFiles = async (
|
||||
files: f.BatchFile[],
|
||||
assistant_id?: string,
|
||||
): Promise<f.DeleteFilesResponse> =>
|
||||
request.deleteWithOptions(endpoints.files(), {
|
||||
data: { files },
|
||||
data: { files, assistant_id },
|
||||
});
|
||||
|
||||
/* conversations */
|
||||
|
|
@ -237,3 +271,6 @@ export const listConversationsByQuery = (
|
|||
return request.get(endpoints.conversations(pageNumber));
|
||||
}
|
||||
};
|
||||
|
||||
export const deleteAction = async (assistant_id: string, action_id: string): Promise<void> =>
|
||||
request.delete(endpoints.assistants(`actions/${assistant_id}/${action_id}`));
|
||||
|
|
|
|||
264
packages/data-provider/src/file-config.ts
Normal file
|
|
@ -0,0 +1,264 @@
|
|||
/* eslint-disable max-len */
|
||||
import { z } from 'zod';
|
||||
import { EModelEndpoint } from './schemas';
|
||||
import type { FileConfig, EndpointFileConfig } from './types/files';
|
||||
|
||||
export const supportsFiles = {
|
||||
[EModelEndpoint.openAI]: true,
|
||||
[EModelEndpoint.google]: true,
|
||||
[EModelEndpoint.assistants]: true,
|
||||
[EModelEndpoint.azureOpenAI]: true,
|
||||
[EModelEndpoint.custom]: true,
|
||||
};
|
||||
|
||||
export const excelFileTypes = [
|
||||
'application/vnd.ms-excel',
|
||||
'application/msexcel',
|
||||
'application/x-msexcel',
|
||||
'application/x-ms-excel',
|
||||
'application/x-excel',
|
||||
'application/x-dos_ms_excel',
|
||||
'application/xls',
|
||||
'application/x-xls',
|
||||
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
|
||||
];
|
||||
|
||||
export const fullMimeTypesList = [
|
||||
'text/x-c',
|
||||
'text/x-c++',
|
||||
'application/csv',
|
||||
'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
|
||||
'text/html',
|
||||
'text/x-java',
|
||||
'application/json',
|
||||
'text/markdown',
|
||||
'application/pdf',
|
||||
'text/x-php',
|
||||
'application/vnd.openxmlformats-officedocument.presentationml.presentation',
|
||||
'text/x-python',
|
||||
'text/x-script.python',
|
||||
'text/x-ruby',
|
||||
'text/x-tex',
|
||||
'text/plain',
|
||||
'text/css',
|
||||
'image/jpeg',
|
||||
'text/javascript',
|
||||
'image/gif',
|
||||
'image/png',
|
||||
'application/x-tar',
|
||||
'application/typescript',
|
||||
'application/xml',
|
||||
'application/zip',
|
||||
...excelFileTypes,
|
||||
];
|
||||
|
||||
export const codeInterpreterMimeTypesList = [
|
||||
'text/x-c',
|
||||
'text/x-c++',
|
||||
'application/csv',
|
||||
'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
|
||||
'text/html',
|
||||
'text/x-java',
|
||||
'application/json',
|
||||
'text/markdown',
|
||||
'application/pdf',
|
||||
'text/x-php',
|
||||
'application/vnd.openxmlformats-officedocument.presentationml.presentation',
|
||||
'text/x-python',
|
||||
'text/x-script.python',
|
||||
'text/x-ruby',
|
||||
'text/x-tex',
|
||||
'text/plain',
|
||||
'text/css',
|
||||
'image/jpeg',
|
||||
'text/javascript',
|
||||
'image/gif',
|
||||
'image/png',
|
||||
'application/x-tar',
|
||||
'application/typescript',
|
||||
'application/xml',
|
||||
'application/zip',
|
||||
...excelFileTypes,
|
||||
];
|
||||
|
||||
export const retrievalMimeTypesList = [
|
||||
'text/x-c',
|
||||
'text/x-c++',
|
||||
'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
|
||||
'text/html',
|
||||
'text/x-java',
|
||||
'application/json',
|
||||
'text/markdown',
|
||||
'application/pdf',
|
||||
'text/x-php',
|
||||
'application/vnd.openxmlformats-officedocument.presentationml.presentation',
|
||||
'text/x-python',
|
||||
'text/x-script.python',
|
||||
'text/x-ruby',
|
||||
'text/x-tex',
|
||||
'text/plain',
|
||||
];
|
||||
|
||||
export const imageExtRegex = /\.(jpg|jpeg|png|gif|webp)$/i;
|
||||
|
||||
export const excelMimeTypes =
|
||||
/^application\/(vnd\.ms-excel|msexcel|x-msexcel|x-ms-excel|x-excel|x-dos_ms_excel|xls|x-xls|vnd\.openxmlformats-officedocument\.spreadsheetml\.sheet)$/;
|
||||
|
||||
export const textMimeTypes =
|
||||
/^(text\/(x-c|x-c\+\+|x-java|html|markdown|x-php|x-python|x-script\.python|x-ruby|x-tex|plain|css|javascript|csv))$/;
|
||||
|
||||
export const applicationMimeTypes =
|
||||
/^(application\/(csv|json|pdf|x-tar|typescript|vnd\.openxmlformats-officedocument\.(wordprocessingml\.document|presentationml\.presentation|spreadsheetml\.sheet)|xml|zip))$/;
|
||||
|
||||
export const imageMimeTypes = /^image\/(jpeg|gif|png|webp)$/;
|
||||
|
||||
export const supportedMimeTypes = [
|
||||
textMimeTypes,
|
||||
excelMimeTypes,
|
||||
applicationMimeTypes,
|
||||
imageMimeTypes,
|
||||
];
|
||||
|
||||
export const codeInterpreterMimeTypes = [
|
||||
textMimeTypes,
|
||||
excelMimeTypes,
|
||||
applicationMimeTypes,
|
||||
imageMimeTypes,
|
||||
];
|
||||
|
||||
export const retrievalMimeTypes = [
|
||||
/^(text\/(x-c|x-c\+\+|html|x-java|markdown|x-php|x-python|x-script\.python|x-ruby|x-tex|plain))$/,
|
||||
/^(application\/(json|pdf|vnd\.openxmlformats-officedocument\.(wordprocessingml\.document|presentationml\.presentation)))$/,
|
||||
];
|
||||
|
||||
export const megabyte = 1024 * 1024;
|
||||
/** Helper function to get megabytes value */
|
||||
export const mbToBytes = (mb: number): number => mb * megabyte;
|
||||
|
||||
export const fileConfig = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.assistants]: {
|
||||
fileLimit: 10,
|
||||
fileSizeLimit: mbToBytes(512),
|
||||
totalSizeLimit: mbToBytes(512),
|
||||
supportedMimeTypes,
|
||||
disabled: false,
|
||||
},
|
||||
default: {
|
||||
fileLimit: 10,
|
||||
fileSizeLimit: mbToBytes(20),
|
||||
totalSizeLimit: mbToBytes(25),
|
||||
supportedMimeTypes: [imageMimeTypes],
|
||||
disabled: false,
|
||||
},
|
||||
},
|
||||
serverFileSizeLimit: mbToBytes(512),
|
||||
avatarSizeLimit: mbToBytes(2),
|
||||
checkType: function (fileType: string, supportedTypes: RegExp[] = supportedMimeTypes) {
|
||||
return supportedTypes.some((regex) => regex.test(fileType));
|
||||
},
|
||||
};
|
||||
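For instance, a quick sketch of how `checkType` resolves against the pattern groups defined above:

fileConfig.checkType('application/pdf');              // true, matches applicationMimeTypes
fileConfig.checkType('image/png', [imageMimeTypes]);  // true, restricted to image types only
fileConfig.checkType('video/mp4');                    // false, no pattern group covers video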
|
||||
const supportedMimeTypesSchema = z
|
||||
.array(z.any())
|
||||
.optional()
|
||||
.refine(
|
||||
(mimeTypes) => {
|
||||
if (!mimeTypes) {
|
||||
return true;
|
||||
}
|
||||
return mimeTypes.every(
|
||||
(mimeType) => mimeType instanceof RegExp || typeof mimeType === 'string',
|
||||
);
|
||||
},
|
||||
{
|
||||
message: 'Each mimeType must be a string or a RegExp object.',
|
||||
},
|
||||
);
|
||||
|
||||
export const endpointFileConfigSchema = z.object({
|
||||
disabled: z.boolean().optional(),
|
||||
fileLimit: z.number().min(0).optional(),
|
||||
fileSizeLimit: z.number().min(0).optional(),
|
||||
totalSizeLimit: z.number().min(0).optional(),
|
||||
supportedMimeTypes: supportedMimeTypesSchema.optional(),
|
||||
});
|
||||
|
||||
export const fileConfigSchema = z.object({
|
||||
endpoints: z.record(endpointFileConfigSchema).optional(),
|
||||
serverFileSizeLimit: z.number().min(0).optional(),
|
||||
avatarSizeLimit: z.number().min(0).optional(),
|
||||
});
|
||||
|
||||
/** Helper function to safely convert string patterns to RegExp objects */
|
||||
export const convertStringsToRegex = (patterns: string[]): RegExp[] =>
|
||||
patterns.reduce((acc: RegExp[], pattern) => {
|
||||
try {
|
||||
const regex = new RegExp(pattern);
|
||||
acc.push(regex);
|
||||
} catch (error) {
|
||||
console.error(`Invalid regex pattern "${pattern}" skipped.`);
|
||||
}
|
||||
return acc;
|
||||
}, []);
|
||||
|
||||
export function mergeFileConfig(dynamic: z.infer<typeof fileConfigSchema> | undefined): FileConfig {
|
||||
const mergedConfig = fileConfig as FileConfig;
|
||||
if (!dynamic) {
|
||||
return mergedConfig;
|
||||
}
|
||||
|
||||
if (dynamic.serverFileSizeLimit !== undefined) {
|
||||
mergedConfig.serverFileSizeLimit = mbToBytes(dynamic.serverFileSizeLimit);
|
||||
}
|
||||
|
||||
if (dynamic.avatarSizeLimit !== undefined) {
|
||||
mergedConfig.avatarSizeLimit = mbToBytes(dynamic.avatarSizeLimit);
|
||||
}
|
||||
|
||||
if (!dynamic.endpoints) {
|
||||
return mergedConfig;
|
||||
}
|
||||
|
||||
for (const key in dynamic.endpoints) {
|
||||
const dynamicEndpoint = (dynamic.endpoints as Record<string, EndpointFileConfig>)[key];
|
||||
|
||||
if (!mergedConfig.endpoints[key]) {
|
||||
mergedConfig.endpoints[key] = {};
|
||||
}
|
||||
|
||||
const mergedEndpoint = mergedConfig.endpoints[key];
|
||||
|
||||
if (dynamicEndpoint.disabled === true) {
|
||||
mergedEndpoint.disabled = true;
|
||||
mergedEndpoint.fileLimit = 0;
|
||||
mergedEndpoint.fileSizeLimit = 0;
|
||||
mergedEndpoint.totalSizeLimit = 0;
|
||||
mergedEndpoint.supportedMimeTypes = [];
|
||||
continue;
|
||||
}
|
||||
|
||||
if (dynamicEndpoint.fileSizeLimit !== undefined) {
|
||||
mergedEndpoint.fileSizeLimit = mbToBytes(dynamicEndpoint.fileSizeLimit);
|
||||
}
|
||||
|
||||
if (dynamicEndpoint.totalSizeLimit !== undefined) {
|
||||
mergedEndpoint.totalSizeLimit = mbToBytes(dynamicEndpoint.totalSizeLimit);
|
||||
}
|
||||
|
||||
const configKeys = ['fileLimit'] as const;
|
||||
configKeys.forEach((field) => {
|
||||
if (dynamicEndpoint[field] !== undefined) {
|
||||
mergedEndpoint[field] = dynamicEndpoint[field];
|
||||
}
|
||||
});
|
||||
|
||||
if (dynamicEndpoint.supportedMimeTypes) {
|
||||
mergedEndpoint.supportedMimeTypes = convertStringsToRegex(
|
||||
dynamicEndpoint.supportedMimeTypes as unknown as string[],
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
return mergedConfig;
|
||||
}
|
||||
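A hedged sketch of the merge behavior, assuming dynamic values as they would arrive from a user config (sizes in MB, MIME types as strings):

const merged = mergeFileConfig({
  serverFileSizeLimit: 100,
  endpoints: {
    openAI: { fileLimit: 5, fileSizeLimit: 10, supportedMimeTypes: ['^image\\/(jpeg|png)$'] },
    google: { disabled: true },
  },
});
// merged.serverFileSizeLimit === 100 * megabyte
// merged.endpoints.openAI.fileSizeLimit === 10 * megabyte
// merged.endpoints.openAI.supportedMimeTypes is now RegExp[] (via convertStringsToRegex)
// merged.endpoints.google is zeroed out because `disabled` was set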
|
|
@@ -1,5 +1,6 @@
 /* config */
 export * from './config';
+export * from './file-config';
 /* schema helpers */
 export * from './parsers';
 /* types (exports schemas from `./types` as they contain needed in other defs) */
@@ -17,4 +18,5 @@ import * as dataService from './data-service';
 export { dataService };
 /* general helpers */
 export * from './sse';
+export * from './actions';
 export { default as createPayload } from './createPayload';
|
|
|
|||
|
|
@@ -18,13 +18,20 @@ export enum QueryKeys {
  assistant = 'assistant',
  endpointsConfigOverride = 'endpointsConfigOverride',
  files = 'files',
  fileConfig = 'fileConfig',
  tools = 'tools',
  actions = 'actions',
  assistantDocs = 'assistantDocs',
}

export enum MutationKeys {
  imageUpload = 'imageUpload',
  fileUpload = 'fileUpload',
  fileDelete = 'fileDelete',
  updatePreset = 'updatePreset',
  deletePreset = 'deletePreset',
  logoutUser = 'logoutUser',
  avatarUpload = 'avatarUpload',
  assistantAvatarUpload = 'assistantAvatarUpload',
  updateAction = 'updateAction',
  deleteAction = 'deleteAction',
}
|
|
|
|||
|
|
@ -35,7 +35,7 @@ const endpointSchemas: Record<EModelEndpoint, EndpointSchema> = {
|
|||
[EModelEndpoint.anthropic]: anthropicSchema,
|
||||
[EModelEndpoint.chatGPTBrowser]: chatGPTBrowserSchema,
|
||||
[EModelEndpoint.gptPlugins]: gptPluginsSchema,
|
||||
[EModelEndpoint.assistant]: assistantSchema,
|
||||
[EModelEndpoint.assistants]: assistantSchema,
|
||||
};
|
||||
|
||||
// const schemaCreators: Record<EModelEndpoint, (customSchema: DefaultSchemaValues) => EndpointSchema> = {
|
||||
|
|
@ -172,16 +172,16 @@ type CompactEndpointSchema =
|
|||
| typeof compactPluginsSchema;
|
||||
|
||||
const compactEndpointSchemas: Record<string, CompactEndpointSchema> = {
|
||||
openAI: compactOpenAISchema,
|
||||
azureOpenAI: compactOpenAISchema,
|
||||
custom: compactOpenAISchema,
|
||||
assistant: assistantSchema,
|
||||
google: compactGoogleSchema,
|
||||
[EModelEndpoint.openAI]: compactOpenAISchema,
|
||||
[EModelEndpoint.azureOpenAI]: compactOpenAISchema,
|
||||
[EModelEndpoint.custom]: compactOpenAISchema,
|
||||
[EModelEndpoint.assistants]: assistantSchema,
|
||||
[EModelEndpoint.google]: compactGoogleSchema,
|
||||
/* BingAI needs all fields */
|
||||
bingAI: bingAISchema,
|
||||
anthropic: compactAnthropicSchema,
|
||||
chatGPTBrowser: compactChatGPTSchema,
|
||||
gptPlugins: compactPluginsSchema,
|
||||
[EModelEndpoint.bingAI]: bingAISchema,
|
||||
[EModelEndpoint.anthropic]: compactAnthropicSchema,
|
||||
[EModelEndpoint.chatGPTBrowser]: compactChatGPTSchema,
|
||||
[EModelEndpoint.gptPlugins]: compactPluginsSchema,
|
||||
};
|
||||
|
||||
export const parseCompactConvo = ({
|
||||
|
|
|
|||
|
|
@ -1,138 +0,0 @@
|
|||
import { useQuery, useMutation, useQueryClient, useInfiniteQuery } from '@tanstack/react-query';
|
||||
import type {
|
||||
UseQueryOptions,
|
||||
UseMutationResult,
|
||||
QueryObserverResult,
|
||||
UseInfiniteQueryOptions,
|
||||
} from '@tanstack/react-query';
|
||||
import * as t from '../types/assistants';
|
||||
import * as dataService from '../data-service';
|
||||
import { QueryKeys } from '../keys';
|
||||
|
||||
/**
|
||||
* Hook for listing all assistants, with optional parameters provided for pagination and sorting
|
||||
*/
|
||||
export const useListAssistantsQuery = <TData = t.AssistantListResponse>(
|
||||
params?: t.AssistantListParams,
|
||||
config?: UseQueryOptions<t.AssistantListResponse, unknown, TData>,
|
||||
): QueryObserverResult<TData> => {
|
||||
return useQuery<t.AssistantListResponse, unknown, TData>(
|
||||
[QueryKeys.assistants, params],
|
||||
() => dataService.listAssistants(params),
|
||||
{
|
||||
// Example selector to sort them by created_at
|
||||
// select: (res) => {
|
||||
// return res.data.sort((a, b) => a.created_at - b.created_at);
|
||||
// },
|
||||
refetchOnWindowFocus: false,
|
||||
refetchOnReconnect: false,
|
||||
refetchOnMount: false,
|
||||
retry: false,
|
||||
...config,
|
||||
},
|
||||
);
|
||||
};
|
||||
|
||||
export const useListAssistantsInfiniteQuery = (
|
||||
params?: t.AssistantListParams,
|
||||
config?: UseInfiniteQueryOptions<t.AssistantListResponse, Error>,
|
||||
) => {
|
||||
return useInfiniteQuery<t.AssistantListResponse, Error>(
|
||||
['assistantsList', params],
|
||||
({ pageParam = '' }) => dataService.listAssistants({ ...params, after: pageParam }),
|
||||
{
|
||||
getNextPageParam: (lastPage) => {
|
||||
// lastPage is of type AssistantListResponse, you can use the has_more and last_id from it directly
|
||||
if (lastPage.has_more) {
|
||||
return lastPage.last_id;
|
||||
}
|
||||
return undefined;
|
||||
},
|
||||
...config,
|
||||
},
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Hook for creating a new assistant
|
||||
*/
|
||||
export const useCreateAssistantMutation = (): UseMutationResult<
|
||||
t.Assistant,
|
||||
Error,
|
||||
t.AssistantCreateParams
|
||||
> => {
|
||||
const queryClient = useQueryClient();
|
||||
return useMutation(
|
||||
(newAssistantData: t.AssistantCreateParams) => dataService.createAssistant(newAssistantData),
|
||||
{
|
||||
onSuccess: () => {
|
||||
// Invalidate and refetch assistants query to update list
|
||||
queryClient.invalidateQueries([QueryKeys.assistants]);
|
||||
},
|
||||
},
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Hook for retrieving details about a single assistant
|
||||
*/
|
||||
export const useGetAssistantByIdQuery = (
|
||||
assistant_id: string,
|
||||
config?: UseQueryOptions<t.Assistant>,
|
||||
): QueryObserverResult<t.Assistant> => {
|
||||
return useQuery<t.Assistant>(
|
||||
[QueryKeys.assistant, assistant_id],
|
||||
() => dataService.getAssistantById(assistant_id),
|
||||
{
|
||||
enabled: !!assistant_id, // Query will not execute until the assistant_id exists
|
||||
refetchOnWindowFocus: false,
|
||||
refetchOnReconnect: false,
|
||||
refetchOnMount: false,
|
||||
retry: false,
|
||||
...config,
|
||||
},
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Hook for updating an assistant
|
||||
*/
|
||||
export const useUpdateAssistantMutation = (): UseMutationResult<
|
||||
t.Assistant,
|
||||
Error,
|
||||
{ assistant_id: string; data: t.AssistantUpdateParams }
|
||||
> => {
|
||||
const queryClient = useQueryClient();
|
||||
return useMutation(
|
||||
({ assistant_id, data }: { assistant_id: string; data: t.AssistantUpdateParams }) =>
|
||||
dataService.updateAssistant(assistant_id, data),
|
||||
{
|
||||
onSuccess: (_, { assistant_id }) => {
|
||||
// Invalidate and refetch assistant details query
|
||||
queryClient.invalidateQueries([QueryKeys.assistant, assistant_id]);
|
||||
// Optionally invalidate and refetch list of assistants
|
||||
queryClient.invalidateQueries([QueryKeys.assistants]);
|
||||
},
|
||||
},
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Hook for deleting an assistant
|
||||
*/
|
||||
export const useDeleteAssistantMutation = (): UseMutationResult<
|
||||
void,
|
||||
Error,
|
||||
{ assistant_id: string }
|
||||
> => {
|
||||
const queryClient = useQueryClient();
|
||||
return useMutation(
|
||||
({ assistant_id }: { assistant_id: string }) => dataService.deleteAssistant(assistant_id),
|
||||
{
|
||||
onSuccess: () => {
|
||||
// Invalidate and refetch assistant list query
|
||||
queryClient.invalidateQueries([QueryKeys.assistants]);
|
||||
},
|
||||
},
|
||||
);
|
||||
};
|
||||
|
|
@ -1,2 +1 @@
|
|||
export * from './react-query-service';
|
||||
export * from './assistants';
|
||||
|
|
|
|||
|
|
@ -9,6 +9,7 @@ import {
|
|||
import * as t from '../types';
|
||||
import * as s from '../schemas';
|
||||
import * as m from '../types/mutations';
|
||||
import { defaultOrderQuery } from '../config';
|
||||
import * as dataService from '../data-service';
|
||||
import request from '../request';
|
||||
import { QueryKeys } from '../keys';
|
||||
|
|
@ -136,6 +137,14 @@ export const useRevokeUserKeyMutation = (name: string): UseMutationResult<unknow
|
|||
return useMutation(() => dataService.revokeUserKey(name), {
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries([QueryKeys.name]);
|
||||
if (name === s.EModelEndpoint.assistants) {
|
||||
queryClient.invalidateQueries([QueryKeys.assistants, defaultOrderQuery]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistantDocs]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistants]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistant]);
|
||||
queryClient.invalidateQueries([QueryKeys.actions]);
|
||||
queryClient.invalidateQueries([QueryKeys.tools]);
|
||||
}
|
||||
},
|
||||
});
|
||||
};
|
||||
|
|
@ -145,6 +154,12 @@ export const useRevokeAllUserKeysMutation = (): UseMutationResult<unknown> => {
|
|||
return useMutation(() => dataService.revokeAllUserKeys(), {
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries([QueryKeys.name]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistants, defaultOrderQuery]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistantDocs]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistants]);
|
||||
queryClient.invalidateQueries([QueryKeys.assistant]);
|
||||
queryClient.invalidateQueries([QueryKeys.actions]);
|
||||
queryClient.invalidateQueries([QueryKeys.tools]);
|
||||
},
|
||||
});
|
||||
};
|
||||
|
|
@ -277,7 +292,7 @@ export const useLoginUserMutation = (): UseMutationResult<
|
|||
localStorage.removeItem('lastSelectedModel');
|
||||
localStorage.removeItem('lastSelectedTools');
|
||||
localStorage.removeItem('filesToDelete');
|
||||
localStorage.removeItem('lastAssistant');
|
||||
// localStorage.removeItem('lastAssistant');
|
||||
},
|
||||
});
|
||||
};
|
||||
|
|
|
|||
|
|
@ -1,4 +1,8 @@
|
|||
import { z } from 'zod';
|
||||
import type { TMessageContentParts } from './types/assistants';
|
||||
import type { TFile } from './types/files';
|
||||
|
||||
export const isUUID = z.string().uuid();
|
||||
|
||||
export enum EModelEndpoint {
|
||||
azureOpenAI = 'azureOpenAI',
|
||||
|
|
@ -8,10 +12,22 @@ export enum EModelEndpoint {
|
|||
google = 'google',
|
||||
gptPlugins = 'gptPlugins',
|
||||
anthropic = 'anthropic',
|
||||
assistant = 'assistant',
|
||||
assistants = 'assistants',
|
||||
custom = 'custom',
|
||||
}
|
||||
|
||||
export const defaultAssistantFormValues = {
|
||||
assistant: '',
|
||||
id: '',
|
||||
name: '',
|
||||
description: '',
|
||||
instructions: '',
|
||||
model: 'gpt-3.5-turbo-1106',
|
||||
functions: [],
|
||||
code_interpreter: false,
|
||||
retrieval: false,
|
||||
};
|
||||
|
||||
export const endpointSettings = {
|
||||
[EModelEndpoint.google]: {
|
||||
model: {
|
||||
|
|
@ -153,21 +169,16 @@ export const tMessageSchema = z.object({
|
|||
unfinished: z.boolean().optional(),
|
||||
searchResult: z.boolean().optional(),
|
||||
finish_reason: z.string().optional(),
|
||||
/* assistant */
|
||||
thread_id: z.string().optional(),
|
||||
});
|
||||
|
||||
export type TMessage = z.input<typeof tMessageSchema> & {
|
||||
children?: TMessage[];
|
||||
plugin?: TResPlugin | null;
|
||||
plugins?: TResPlugin[];
|
||||
files?: {
|
||||
file_id: string;
|
||||
type?: string;
|
||||
filename?: string;
|
||||
preview?: string;
|
||||
filepath?: string;
|
||||
height?: number;
|
||||
width?: number;
|
||||
}[];
|
||||
content?: TMessageContentParts[];
|
||||
files?: Partial<TFile>[];
|
||||
};
|
||||
|
||||
export const tConversationSchema = z.object({
|
||||
|
|
@ -204,16 +215,17 @@ export const tConversationSchema = z.object({
|
|||
toneStyle: z.string().nullable().optional(),
|
||||
maxOutputTokens: z.number().optional(),
|
||||
agentOptions: tAgentOptionsSchema.nullable().optional(),
|
||||
file_ids: z.array(z.string()).optional(),
|
||||
/* vision */
|
||||
resendImages: z.boolean().optional(),
|
||||
imageDetail: eImageDetailSchema.optional(),
|
||||
/* assistant */
|
||||
assistant_id: z.string().optional(),
|
||||
thread_id: z.string().optional(),
|
||||
instructions: z.string().optional(),
|
||||
/** Used to overwrite active conversation settings when saving a Preset */
|
||||
presetOverride: z.record(z.unknown()).optional(),
|
||||
});
|
||||
|
||||
export type TConversation = z.infer<typeof tConversationSchema>;
|
||||
|
||||
export const tPresetSchema = tConversationSchema
|
||||
.omit({
|
||||
conversationId: true,
|
||||
|
|
@ -246,6 +258,10 @@ export const tPresetUpdateSchema = tConversationSchema.merge(
|
|||
|
||||
export type TPreset = z.infer<typeof tPresetSchema>;
|
||||
|
||||
export type TConversation = z.infer<typeof tConversationSchema> & {
|
||||
presetOverride?: Partial<TPreset>;
|
||||
};
|
||||
|
||||
// type DefaultSchemaValues = Partial<typeof google>;
|
||||
|
||||
export const openAISchema = tConversationSchema
|
||||
|
|
@ -470,7 +486,8 @@ export const assistantSchema = tConversationSchema
|
|||
.pick({
|
||||
model: true,
|
||||
assistant_id: true,
|
||||
thread_id: true,
|
||||
instructions: true,
|
||||
promptPrefix: true,
|
||||
})
|
||||
.transform(removeNullishValues)
|
||||
.catch(() => ({}));
|
||||
|
|
|
|||
|
|
@ -25,6 +25,8 @@ export type TEndpointOption = {
|
|||
modelLabel?: string | null;
|
||||
jailbreak?: boolean;
|
||||
key?: string | null;
|
||||
/* assistant */
|
||||
thread_id?: string;
|
||||
};
|
||||
|
||||
export type TSubmission = {
|
||||
|
|
@ -45,11 +47,13 @@ export type TPluginAction = {
|
|||
pluginKey: string;
|
||||
action: 'install' | 'uninstall';
|
||||
auth?: unknown;
|
||||
isAssistantTool?: boolean;
|
||||
};
|
||||
|
||||
export type GroupedConversations = [key: string, TConversation[]][];
|
||||
|
||||
export type TUpdateUserPlugins = {
|
||||
isAssistantTool?: boolean;
|
||||
pluginKey: string;
|
||||
action: string;
|
||||
auth?: unknown;
|
||||
|
|
@ -108,6 +112,7 @@ export type TUpdateConversationResponse = TConversation;
|
|||
|
||||
export type TDeleteConversationRequest = {
|
||||
conversationId?: string;
|
||||
thread_id?: string;
|
||||
source?: string;
|
||||
};
|
||||
|
||||
|
|
@ -140,6 +145,7 @@ export type TConfig = {
|
|||
modelDisplayLabel?: string;
|
||||
userProvide?: boolean | null;
|
||||
userProvideURL?: boolean | null;
|
||||
disableBuilder?: boolean;
|
||||
};
|
||||
|
||||
export type TEndpointsConfig =
|
||||
|
|
|
|||
|
|
@ -1,3 +1,8 @@
|
|||
import type { OpenAPIV3 } from 'openapi-types';
|
||||
|
||||
export type Schema = OpenAPIV3.SchemaObject & { description?: string };
|
||||
export type Reference = OpenAPIV3.ReferenceObject & { description?: string };
|
||||
|
||||
export type Metadata = {
|
||||
[key: string]: unknown;
|
||||
};
|
||||
|
|
@ -12,6 +17,15 @@ export type Tool = {
|
|||
[type: string]: Tools;
|
||||
};
|
||||
|
||||
export type FunctionTool = {
|
||||
type: Tools;
|
||||
function?: {
|
||||
description: string;
|
||||
name: string;
|
||||
parameters: Record<string, unknown>;
|
||||
};
|
||||
};
|
||||
|
||||
export type Assistant = {
|
||||
id: string;
|
||||
created_at: number;
|
||||
|
|
@ -22,7 +36,7 @@ export type Assistant = {
|
|||
model: string;
|
||||
name: string | null;
|
||||
object: string;
|
||||
tools: Tool[];
|
||||
tools: FunctionTool[];
|
||||
};
|
||||
|
||||
export type AssistantCreateParams = {
|
||||
|
|
@ -32,7 +46,7 @@ export type AssistantCreateParams = {
|
|||
instructions?: string | null;
|
||||
metadata?: Metadata | null;
|
||||
name?: string | null;
|
||||
tools?: Tool[];
|
||||
tools?: Array<FunctionTool | string>;
|
||||
};
|
||||
|
||||
export type AssistantUpdateParams = {
|
||||
|
|
@ -42,7 +56,7 @@ export type AssistantUpdateParams = {
|
|||
instructions?: string | null;
|
||||
metadata?: Metadata | null;
|
||||
name?: string | null;
|
||||
tools?: Tool[];
|
||||
tools?: Array<FunctionTool | string>;
|
||||
};
|
||||
|
||||
export type AssistantListParams = {
|
||||
|
|
@ -70,3 +84,241 @@ export type File = {
|
|||
object: string;
|
||||
purpose: 'fine-tune' | 'fine-tune-results' | 'assistants' | 'assistants_output';
|
||||
};
|
||||
|
||||
/**
|
||||
* Details of the Code Interpreter tool call the run step was involved in.
|
||||
* Includes the tool call ID, the code interpreter definition, and the type of tool call.
|
||||
*/
|
||||
export type CodeToolCall = {
|
||||
id: string; // The ID of the tool call.
|
||||
code_interpreter: {
|
||||
input: string; // The input to the Code Interpreter tool call.
|
||||
outputs: Array<Record<string, unknown>>; // The outputs from the Code Interpreter tool call.
|
||||
};
|
||||
type: 'code_interpreter'; // The type of tool call, always 'code_interpreter'.
|
||||
};
|
||||
|
||||
/**
|
||||
* Details of a Function tool call the run step was involved in.
|
||||
* Includes the tool call ID, the function definition, and the type of tool call.
|
||||
*/
|
||||
export type FunctionToolCall = {
|
||||
id: string; // The ID of the tool call object.
|
||||
function: {
|
||||
arguments: string; // The arguments passed to the function.
|
||||
name: string; // The name of the function.
|
||||
output: string | null; // The output of the function, null if not submitted.
|
||||
};
|
||||
type: 'function'; // The type of tool call, always 'function'.
|
||||
};
|
||||
|
||||
/**
|
||||
* Details of a Retrieval tool call the run step was involved in.
|
||||
* Includes the tool call ID and the type of tool call.
|
||||
*/
|
||||
export type RetrievalToolCall = {
|
||||
id: string; // The ID of the tool call object.
|
||||
retrieval: unknown; // An empty object for now.
|
||||
type: 'retrieval'; // The type of tool call, always 'retrieval'.
|
||||
};
|
||||
|
||||
/**
|
||||
* Details of the tool calls involved in a run step.
|
||||
* Can be associated with one of three types of tools: `code_interpreter`, `retrieval`, or `function`.
|
||||
*/
|
||||
export type ToolCallsStepDetails = {
|
||||
tool_calls: Array<CodeToolCall | RetrievalToolCall | FunctionToolCall>; // An array of tool calls the run step was involved in.
|
||||
type: 'tool_calls'; // Always 'tool_calls'.
|
||||
};
|
||||
|
||||
export type ImageFile = {
|
||||
/**
|
||||
* The [File](https://platform.openai.com/docs/api-reference/files) ID of the image
|
||||
* in the message content.
|
||||
*/
|
||||
file_id: string;
|
||||
filename: string;
|
||||
filepath: string;
|
||||
height: number;
|
||||
width: number;
|
||||
/**
|
||||
* Prompt used to generate the image if applicable.
|
||||
*/
|
||||
prompt?: string;
|
||||
/**
|
||||
* Additional metadata used to generate or about the image/tool_call.
|
||||
*/
|
||||
metadata?: Record<string, unknown>;
|
||||
};
|
||||
|
||||
// FileCitation.ts
|
||||
export type FileCitation = {
|
||||
end_index: number;
|
||||
file_citation: FileCitationDetails;
|
||||
start_index: number;
|
||||
text: string;
|
||||
type: 'file_citation';
|
||||
};
|
||||
|
||||
export type FileCitationDetails = {
|
||||
file_id: string;
|
||||
quote: string;
|
||||
};
|
||||
|
||||
export type FilePath = {
|
||||
end_index: number;
|
||||
file_path: FilePathDetails;
|
||||
start_index: number;
|
||||
text: string;
|
||||
type: 'file_path';
|
||||
};
|
||||
|
||||
export type FilePathDetails = {
|
||||
file_id: string;
|
||||
};
|
||||
|
||||
export type Text = {
|
||||
annotations?: Array<FileCitation | FilePath>;
|
||||
value: string;
|
||||
};
|
||||
|
||||
export enum ContentTypes {
|
||||
TEXT = 'text',
|
||||
TOOL_CALL = 'tool_call',
|
||||
IMAGE_FILE = 'image_file',
|
||||
}
|
||||
|
||||
export enum StepTypes {
|
||||
TOOL_CALLS = 'tool_calls',
|
||||
MESSAGE_CREATION = 'message_creation',
|
||||
}
|
||||
|
||||
export enum ToolCallTypes {
|
||||
FUNCTION = 'function',
|
||||
RETRIEVAL = 'retrieval',
|
||||
CODE_INTERPRETER = 'code_interpreter',
|
||||
}
|
||||
|
||||
export enum StepStatus {
|
||||
IN_PROGRESS = 'in_progress',
|
||||
CANCELLED = 'cancelled',
|
||||
FAILED = 'failed',
|
||||
COMPLETED = 'completed',
|
||||
EXPIRED = 'expired',
|
||||
}
|
||||
|
||||
export enum MessageContentTypes {
|
||||
TEXT = 'text',
|
||||
IMAGE_FILE = 'image_file',
|
||||
}
|
||||
|
||||
//enum for RunStatus
|
||||
// The status of the run: queued, in_progress, requires_action, cancelling, cancelled, failed, completed, or expired.
|
||||
export enum RunStatus {
|
||||
QUEUED = 'queued',
|
||||
IN_PROGRESS = 'in_progress',
|
||||
REQUIRES_ACTION = 'requires_action',
|
||||
CANCELLING = 'cancelling',
|
||||
CANCELLED = 'cancelled',
|
||||
FAILED = 'failed',
|
||||
COMPLETED = 'completed',
|
||||
EXPIRED = 'expired',
|
||||
}
|
||||
|
||||
export type PartMetadata = {
|
||||
progress?: number;
|
||||
asset_pointer?: string;
|
||||
status?: string;
|
||||
action?: boolean;
|
||||
};
|
||||
|
||||
export type ContentPart = (CodeToolCall | RetrievalToolCall | FunctionToolCall | ImageFile | Text) &
|
||||
PartMetadata;
|
||||
|
||||
export type TMessageContentParts =
|
||||
| { type: ContentTypes.TEXT; text: Text & PartMetadata }
|
||||
| {
|
||||
type: ContentTypes.TOOL_CALL;
|
||||
tool_call: (CodeToolCall | RetrievalToolCall | FunctionToolCall) & PartMetadata;
|
||||
}
|
||||
| { type: ContentTypes.IMAGE_FILE; image_file: ImageFile & PartMetadata };
|
||||
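A hedged sketch of a streamed assistant message's `content` array built from these parts (the file ID and filename are invented):

const content: TMessageContentParts[] = [
  { type: ContentTypes.TEXT, text: { value: 'Here is the chart you asked for.' } },
  {
    type: ContentTypes.IMAGE_FILE,
    image_file: {
      file_id: 'file_example123',
      filename: 'chart.png',
      filepath: '/images/chart.png',
      height: 512,
      width: 512,
    },
  },
];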
|
||||
export type TContentData = TMessageContentParts & {
|
||||
messageId: string;
|
||||
conversationId: string;
|
||||
userMessageId: string;
|
||||
thread_id: string;
|
||||
index: number;
|
||||
stream?: boolean;
|
||||
};
|
||||
|
||||
export const actionDelimiter = '_action_';
|
||||
|
||||
export enum AuthTypeEnum {
|
||||
ServiceHttp = 'service_http',
|
||||
OAuth = 'oauth',
|
||||
None = 'none',
|
||||
}
|
||||
|
||||
export enum AuthorizationTypeEnum {
|
||||
Bearer = 'bearer',
|
||||
Basic = 'basic',
|
||||
Custom = 'custom',
|
||||
}
|
||||
|
||||
export enum TokenExchangeMethodEnum {
|
||||
DefaultPost = 'default_post',
|
||||
BasicAuthHeader = 'basic_auth_header',
|
||||
}
|
||||
|
||||
export type ActionAuth = {
|
||||
authorization_type?: AuthorizationTypeEnum;
|
||||
custom_auth_header?: string;
|
||||
type?: AuthTypeEnum;
|
||||
authorization_content_type?: string;
|
||||
authorization_url?: string;
|
||||
client_url?: string;
|
||||
scope?: string;
|
||||
token_exchange_method?: TokenExchangeMethodEnum;
|
||||
};
|
||||
|
||||
export type ActionMetadata = {
|
||||
api_key?: string;
|
||||
auth?: ActionAuth;
|
||||
domain?: string;
|
||||
privacy_policy_url?: string;
|
||||
raw_spec?: string;
|
||||
oauth_client_id?: string;
|
||||
oauth_client_secret?: string;
|
||||
};
|
||||
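A minimal sketch (not from the PR itself) of how this metadata feeds ActionRequest.setAuth in actions.ts for a Bearer-token action; the domain and key value are placeholders:

const metadata: ActionMetadata = {
  domain: 'api.example.com',
  api_key: 'placeholder-api-key',
  auth: {
    type: AuthTypeEnum.ServiceHttp,
    authorization_type: AuthorizationTypeEnum.Bearer,
  },
};
// await actionRequest.setAuth(metadata);
// subsequent execute() calls then send: Authorization: Bearer placeholder-api-key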
|
||||
export type Action = {
|
||||
action_id: string;
|
||||
assistant_id: string;
|
||||
type?: string;
|
||||
settings?: Record<string, unknown>;
|
||||
metadata: ActionMetadata;
|
||||
};
|
||||
|
||||
export type AssistantAvatar = {
|
||||
filepath: string;
|
||||
source: string;
|
||||
};
|
||||
|
||||
export type AssistantDocument = {
|
||||
user: string;
|
||||
assistant_id: string;
|
||||
avatar?: AssistantAvatar;
|
||||
access_level?: number;
|
||||
file_ids?: string[];
|
||||
actions?: string[];
|
||||
createdAt?: Date;
|
||||
updatedAt?: Date;
|
||||
};
|
||||
|
||||
export enum FilePurpose {
|
||||
FineTune = 'fine-tune',
|
||||
FineTuneResults = 'fine-tune-results',
|
||||
Assistants = 'assistants',
|
||||
AssistantsOutput = 'assistants_output',
|
||||
}
|
||||
|
|
|
|||
|
|
@ -5,17 +5,54 @@ export enum FileSources {
|
|||
s3 = 's3',
|
||||
}
|
||||
|
||||
export enum FileContext {
|
||||
avatar = 'avatar',
|
||||
unknown = 'unknown',
|
||||
assistants = 'assistants',
|
||||
image_generation = 'image_generation',
|
||||
assistants_output = 'assistants_output',
|
||||
message_attachment = 'message_attachment',
|
||||
}
|
||||
|
||||
export type EndpointFileConfig = {
|
||||
disabled?: boolean;
|
||||
fileLimit?: number;
|
||||
fileSizeLimit?: number;
|
||||
totalSizeLimit?: number;
|
||||
supportedMimeTypes?: RegExp[];
|
||||
};
|
||||
|
||||
export type FileConfig = {
|
||||
endpoints: {
|
||||
[key: string]: EndpointFileConfig;
|
||||
};
|
||||
serverFileSizeLimit?: number;
|
||||
avatarSizeLimit?: number;
|
||||
checkType?: (fileType: string, supportedTypes: RegExp[]) => boolean;
|
||||
};
|
||||
|
||||
export type TFile = {
|
||||
message: string;
|
||||
_id?: string;
|
||||
__v?: number;
|
||||
user: string;
|
||||
conversationId?: string;
|
||||
message?: string;
|
||||
file_id: string;
|
||||
filepath: string;
|
||||
filename: string;
|
||||
type: string;
|
||||
size: number;
|
||||
temp_file_id?: string;
|
||||
bytes: number;
|
||||
filename: string;
|
||||
filepath: string;
|
||||
object: 'file';
|
||||
type: string;
|
||||
usage: number;
|
||||
context?: FileContext;
|
||||
source?: FileSources;
|
||||
height?: number;
|
||||
width?: number;
|
||||
height?: number;
|
||||
expiresAt?: string | Date;
|
||||
preview?: string;
|
||||
createdAt?: string | Date;
|
||||
updatedAt?: string | Date;
|
||||
};
|
||||
|
||||
export type TFileUpload = TFile & {
|
||||
|
|
@ -26,15 +63,10 @@ export type AvatarUploadResponse = {
|
|||
url: string;
|
||||
};
|
||||
|
||||
export type FileUploadBody = {
|
||||
formData: FormData;
|
||||
file_id: string;
|
||||
};
|
||||
|
||||
export type UploadMutationOptions = {
|
||||
onSuccess?: (data: TFileUpload, variables: FileUploadBody, context?: unknown) => void;
|
||||
onMutate?: (variables: FileUploadBody) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: FileUploadBody, context?: unknown) => void;
|
||||
onSuccess?: (data: TFileUpload, variables: FormData, context?: unknown) => void;
|
||||
onMutate?: (variables: FormData) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: FormData, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type UploadAvatarOptions = {
|
||||
|
|
@ -56,6 +88,7 @@ export type BatchFile = {
|
|||
|
||||
export type DeleteFilesBody = {
|
||||
files: BatchFile[];
|
||||
assistant_id?: string;
|
||||
};
|
||||
|
||||
export type DeleteMutationOptions = {
|
||||
|
|
|
|||
|
|
@ -1,4 +1,13 @@
|
|||
import { TPreset } from '../types';
|
||||
import { TPreset, TDeleteConversationResponse, TDeleteConversationRequest } from '../types';
|
||||
import {
|
||||
Assistant,
|
||||
AssistantCreateParams,
|
||||
AssistantUpdateParams,
|
||||
ActionMetadata,
|
||||
FunctionTool,
|
||||
AssistantDocument,
|
||||
Action,
|
||||
} from './assistants';
|
||||
|
||||
export type TGenTitleRequest = {
|
||||
conversationId: string;
|
||||
|
|
@ -34,3 +43,83 @@ export type LogoutOptions = {
|
|||
onMutate?: (variables: undefined) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: undefined, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type AssistantAvatarVariables = {
|
||||
assistant_id: string;
|
||||
formData: FormData;
|
||||
postCreation?: boolean;
|
||||
};
|
||||
|
||||
export type UpdateActionVariables = {
|
||||
assistant_id: string;
|
||||
functions: FunctionTool[];
|
||||
metadata: ActionMetadata;
|
||||
action_id?: string;
|
||||
};
|
||||
|
||||
export type UploadAssistantAvatarOptions = {
|
||||
onSuccess?: (data: Assistant, variables: AssistantAvatarVariables, context?: unknown) => void;
|
||||
onMutate?: (variables: AssistantAvatarVariables) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: AssistantAvatarVariables, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type CreateAssistantMutationOptions = {
|
||||
onSuccess?: (data: Assistant, variables: AssistantCreateParams, context?: unknown) => void;
|
||||
onMutate?: (variables: AssistantCreateParams) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: AssistantCreateParams, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type UpdateAssistantMutationOptions = {
|
||||
onSuccess?: (
|
||||
data: Assistant,
|
||||
variables: { assistant_id: string; data: AssistantUpdateParams },
|
||||
context?: unknown,
|
||||
) => void;
|
||||
onMutate?: (variables: {
|
||||
assistant_id: string;
|
||||
data: AssistantUpdateParams;
|
||||
}) => void | Promise<unknown>;
|
||||
onError?: (
|
||||
error: unknown,
|
||||
variables: { assistant_id: string; data: AssistantUpdateParams },
|
||||
context?: unknown,
|
||||
) => void;
|
||||
};
|
||||
|
||||
export type DeleteAssistantMutationOptions = {
|
||||
onSuccess?: (data: void, variables: { assistant_id: string }, context?: unknown) => void;
|
||||
onMutate?: (variables: { assistant_id: string }) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: { assistant_id: string }, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type UpdateActionResponse = [AssistantDocument, Assistant, Action];
|
||||
export type UpdateActionOptions = {
|
||||
onSuccess?: (
|
||||
data: UpdateActionResponse,
|
||||
variables: UpdateActionVariables,
|
||||
context?: unknown,
|
||||
) => void;
|
||||
onMutate?: (variables: UpdateActionVariables) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: UpdateActionVariables, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type DeleteActionVariables = {
|
||||
assistant_id: string;
|
||||
action_id: string;
|
||||
};
|
||||
|
||||
export type DeleteActionOptions = {
|
||||
onSuccess?: (data: void, variables: DeleteActionVariables, context?: unknown) => void;
|
||||
onMutate?: (variables: DeleteActionVariables) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: DeleteActionVariables, context?: unknown) => void;
|
||||
};
|
||||
|
||||
export type DeleteConversationOptions = {
|
||||
onSuccess?: (
|
||||
data: TDeleteConversationResponse,
|
||||
variables: TDeleteConversationRequest,
|
||||
context?: unknown,
|
||||
) => void;
|
||||
onMutate?: (variables: TDeleteConversationRequest) => void | Promise<unknown>;
|
||||
onError?: (error: unknown, variables: TDeleteConversationRequest, context?: unknown) => void;
|
||||
};
|
||||
|
|
|
|||
10 packages/data-provider/tsconfig.spec.json Normal file
@@ -0,0 +1,10 @@
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "noEmit": true,
    "outDir": "./dist/tests",
    "baseUrl": "."
  },
  "include": ["specs/**/*", "src/**/*"],
  "exclude": ["node_modules", "dist"]
}