Mirror of https://github.com/danny-avila/LibreChat.git (synced 2025-09-22 06:00:56 +02:00)

* chore: Update @modelcontextprotocol/sdk to version 1.12.3 in package.json and package-lock.json
  - Bump version of @modelcontextprotocol/sdk to 1.12.3 to incorporate recent updates.
  - Update dependencies for ajv and cross-spawn to their latest versions.
  - Add ajv as a new dependency in the sdk module.
  - Include json-schema-traverse as a new dependency in the sdk module.
* feat: @librechat/auth
* feat: Add crypto module exports to auth package
  - Introduced a new crypto module by creating index.ts in the crypto directory.
  - Updated the main index.ts of the auth package to export from the new crypto module.
* feat: Update package dependencies and build scripts for auth package
  - Added @librechat/auth as a dependency in package.json and package-lock.json.
  - Updated build scripts to include the auth package in both frontend and bun build processes.
  - Removed unused mongoose and openid-client dependencies from package-lock.json for cleaner dependency management.
* refactor: Migrate crypto utility functions to @librechat/auth
  - Replaced local crypto utility imports with the new @librechat/auth package across multiple files.
  - Removed the obsolete crypto.js file and its exports.
  - Updated relevant services and models to utilize the new encryption and decryption methods from @librechat/auth.
* feat: Enhance OAuth token handling and update dependencies in auth package
* chore: Remove Token model and TokenService due to restructuring of OAuth handling
  - Deleted the Token.js model and TokenService.js, which were responsible for managing OAuth tokens.
  - This change is part of a broader refactor to streamline OAuth token management and improve code organization.
* refactor: imports from '@librechat/auth' to '@librechat/api' and add OAuth token handling functionality
* refactor: Simplify logger usage in MCP and FlowStateManager classes
* chore: fix imports
* feat: Add OAuth configuration schema to MCP with token exchange method support
* feat: FIRST PASS Implement MCP OAuth flow with token management and error handling
  - Added a new route for handling OAuth callbacks and token retrieval.
  - Integrated OAuth token storage and retrieval mechanisms.
  - Enhanced MCP connection to support automatic OAuth flow initiation on 401 errors.
  - Implemented dynamic client registration and metadata discovery for OAuth.
  - Updated MCPManager to manage OAuth tokens and handle authentication requirements.
  - Introduced comprehensive logging for OAuth processes and error handling.
* refactor: Update MCPConnection and MCPManager to utilize new URL handling
  - Added a `url` property to MCPConnection for better URL management.
  - Refactored MCPManager to use the new `url` property instead of a deprecated method for OAuth handling.
  - Changed logging from info to debug level for flow manager and token methods initialization.
  - Improved comments for clarity on existing tokens and OAuth event listener setup.
* refactor: Improve connection timeout error messages in MCPConnection and MCPManager and use initTimeout for connection
  - Updated the connection timeout error messages to include the duration of the timeout.
  - Introduced a configurable `connectTimeout` variable in both MCPConnection and MCPManager for better flexibility.
* chore: cleanup MCP OAuth Token exchange handling; fix: erroneous use of flowsCache and remove verbose logs
* refactor: Update MCPManager and MCPTokenStorage to use TokenMethods for token management
  - Removed direct token storage handling in MCPManager and replaced it with TokenMethods for better abstraction.
  - Refactored MCPTokenStorage methods to accept parameters for token operations, enhancing flexibility and readability.
  - Improved logging messages related to token persistence and retrieval processes.
* refactor: Update MCP OAuth handling to use static methods and improve flow management
  - Refactored MCPOAuthHandler to utilize static methods for initiating and completing OAuth flows, enhancing clarity and reducing instance dependencies.
  - Updated MCPManager to pass flowManager explicitly to OAuth handling methods, improving flexibility in flow state management.
  - Enhanced comments and logging for better understanding of OAuth processes and flow state retrieval.
* refactor: Integrate token methods into createMCPTool for enhanced token management
* refactor: Change logging from info to debug level in MCPOAuthHandler for improved log management
* chore: clean up logging
* feat: first pass, auth URL from MCP OAuth flow
* chore: Improve logging format for OAuth authentication URL display
* chore: cleanup mcp manager comments
* feat: add connection reconnection logic in MCPManager
* refactor: reorganize token storage handling in MCP
  - Moved token storage logic from MCPManager to a new MCPTokenStorage class for better separation of concerns.
  - Updated imports to reflect the new token storage structure.
  - Enhanced methods for storing, retrieving, updating, and deleting OAuth tokens, improving overall token management.
* chore: update comment for SYSTEM_USER_ID in MCPManager for clarity
* feat: implement refresh token functionality in MCP
  - Added refresh token handling in MCPManager to support token renewal for both app-level and user-specific connections.
  - Introduced a refreshTokens function to facilitate token refresh logic.
  - Enhanced MCPTokenStorage to manage client information and refresh token processes.
  - Updated logging for better traceability during token operations.
* chore: cleanup @librechat/auth
* feat: implement MCP server initialization in a separate service
  - Added a new service to handle the initialization of MCP servers, improving code organization and readability.
  - Refactored the server startup logic to utilize the new initializeMCP function.
  - Removed redundant MCP initialization code from the main server file.
* fix: don't log auth url for user connections
* feat: enhance OAuth flow with success and error handling components
  - Updated OAuth callback routes to redirect to new success and error pages instead of sending status messages.
  - Introduced `OAuthSuccess` and `OAuthError` components to provide user feedback during authentication.
  - Added localization support for success and error messages in the translation files.
  - Implemented countdown functionality in the success component for a better user experience.
* fix: refresh token handling for user connections, add missing URL and methods
  - add standard enum for system user id and helper for determining app-level vs. user-level connections
* refactor: update token handling in MCPManager and MCPTokenStorage
* fix: improve error logging in OAuth authentication handler
* fix: concurrency issues for both login url emission and concurrency of oauth flows for shared flows (same user, same server, multiple calls for same server)
* fix: properly fail shared flows for concurrent server calls and prevent duplication of tokens
* chore: remove unused auth package directory from update configuration
* ci: fix mocks in samlStrategy tests
* ci: add mcpConfig to AppService test setup
* chore: remove obsolete MCP OAuth implementation documentation
* fix: update build script for API to use correct command
* chore: bump version of @librechat/api to 1.2.4
* fix: update abort signal handling in createMCPTool function
* fix: add optional clientInfo parameter to refreshTokensFunction metadata
* refactor: replace app.locals.availableTools with getCachedTools in multiple services and controllers for improved tool management
* fix: concurrent refresh token handling issue
* refactor: add signal parameter to getUserConnection method for improved abort handling
* chore: JSDoc typing for `loadEphemeralAgent`
* refactor: update isConnectionActive method to use destructured parameters for improved readability
* feat: implement caching for MCP tools to handle app-level disconnects for loading list of tools
* ci: fix agent test
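Several of the commits above describe the same mechanism from different angles: an MCP connection that hits a 401 starts an OAuth flow (metadata discovery plus dynamic client registration), the login URL is emitted for the user while app-level flows only log it, and the resulting tokens are persisted through injected TokenMethods so they can be refreshed later. The snippet below is only a rough sketch of that control flow; every name in it (connectWithOAuth, startOAuthFlow, waitForCompletion, createToken, notifyUser, the SYSTEM_USER_ID value) is an assumption for illustration, not LibreChat's actual API.

/**
 * Illustrative sketch only. Helper names and signatures here are assumptions,
 * not the real MCPManager/MCPOAuthHandler API; it just shows the shape of
 * "retry an MCP connection after completing an OAuth flow on a 401".
 */
const SYSTEM_USER_ID = 'system'; // assumed marker for app-level connections

async function connectWithOAuth({ userId, serverName, connection, flowManager, tokenMethods, notifyUser }) {
  try {
    await connection.connect();
    return connection;
  } catch (error) {
    const status = error?.code ?? error?.status;
    if (status !== 401) {
      throw error;
    }
    // Assumed helper: metadata discovery + dynamic client registration, returns a login URL
    const { flowId, authorizationUrl } = await flowManager.startOAuthFlow({ userId, serverName });
    if (userId !== SYSTEM_USER_ID) {
      // Per the commits, the auth URL is not logged for user connections;
      // here we assume it is surfaced to the user instead.
      notifyUser(userId, authorizationUrl);
    }
    // The OAuth callback route resolves the flow; persist tokens so they can be refreshed later
    const tokens = await flowManager.waitForCompletion(flowId);
    await tokenMethods.createToken({ userId, identifier: `mcp:${serverName}`, ...tokens });
    await connection.connect(); // retry with fresh tokens
    return connection;
  }
}

In the actual changeset the flow state is shared per user and server, so concurrent calls against the same server reuse one login URL instead of spawning duplicate flows or duplicating stored tokens.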
760 lines · 24 KiB · JavaScript
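The later commits also replace direct reads of app.locals.availableTools with getCachedTools/setCachedTools, which is why the AppService test below mocks ./Config with those two functions and asserts that startup-loaded tools are stored with { isGlobal: true }. As a minimal sketch of what such an accessor pair could look like, assuming a plain in-memory Map rather than whatever cache store the real ./Config module uses:

/* Hypothetical sketch: a tool cache behind setCachedTools/getCachedTools.
   The Map-based store and the key scheme are assumptions for illustration. */
const toolCache = new Map();

async function setCachedTools(tools, { isGlobal = false, userId } = {}) {
  toolCache.set(isGlobal ? 'global' : `user:${userId}`, tools);
}

async function getCachedTools({ includeGlobal = false, userId } = {}) {
  const globalTools = includeGlobal ? toolCache.get('global') : undefined;
  const userTools = userId ? toolCache.get(`user:${userId}`) : undefined;
  return { ...(globalTools ?? {}), ...(userTools ?? {}) };
}

module.exports = { setCachedTools, getCachedTools };

The test below only exercises the global path: tools loaded at startup are written with { isGlobal: true } and read back with { includeGlobal: true }.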
const {
  FileSources,
  EModelEndpoint,
  EImageOutputType,
  AgentCapabilities,
  defaultSocialLogins,
  validateAzureGroups,
  defaultAgentCapabilities,
  deprecatedAzureVariables,
  conflictingAzureVariables,
} = require('librechat-data-provider');

const AppService = require('./AppService');

jest.mock('./Config/loadCustomConfig', () => {
  return jest.fn(() =>
    Promise.resolve({
      registration: { socialLogins: ['testLogin'] },
      fileStrategy: 'testStrategy',
      balance: {
        enabled: true,
      },
    }),
  );
});
jest.mock('./Files/Firebase/initialize', () => ({
  initializeFirebase: jest.fn(),
}));
jest.mock('~/models', () => ({
  initializeRoles: jest.fn(),
}));
jest.mock('~/models/Role', () => ({
  updateAccessPermissions: jest.fn(),
}));
jest.mock('./Config', () => ({
  setCachedTools: jest.fn(),
  getCachedTools: jest.fn().mockResolvedValue({
    ExampleTool: {
      type: 'function',
      function: {
        description: 'Example tool function',
        name: 'exampleFunction',
        parameters: {
          type: 'object',
          properties: {
            param1: { type: 'string', description: 'An example parameter' },
          },
          required: ['param1'],
        },
      },
    },
  }),
}));
jest.mock('./ToolService', () => ({
  loadAndFormatTools: jest.fn().mockReturnValue({
    ExampleTool: {
      type: 'function',
      function: {
        description: 'Example tool function',
        name: 'exampleFunction',
        parameters: {
          type: 'object',
          properties: {
            param1: { type: 'string', description: 'An example parameter' },
          },
          required: ['param1'],
        },
      },
    },
  }),
}));
jest.mock('./start/turnstile', () => ({
  loadTurnstileConfig: jest.fn(() => ({
    siteKey: 'default-site-key',
    options: {},
  })),
}));

const azureGroups = [
  {
    group: 'librechat-westus',
    apiKey: '${WESTUS_API_KEY}',
    instanceName: 'librechat-westus',
    version: '2023-12-01-preview',
    models: {
      'gpt-4-vision-preview': {
        deploymentName: 'gpt-4-vision-preview',
        version: '2024-02-15-preview',
      },
      'gpt-3.5-turbo': {
        deploymentName: 'gpt-35-turbo',
      },
      'gpt-3.5-turbo-1106': {
        deploymentName: 'gpt-35-turbo-1106',
      },
      'gpt-4': {
        deploymentName: 'gpt-4',
      },
      'gpt-4-1106-preview': {
        deploymentName: 'gpt-4-1106-preview',
      },
    },
  },
  {
    group: 'librechat-eastus',
    apiKey: '${EASTUS_API_KEY}',
    instanceName: 'librechat-eastus',
    deploymentName: 'gpt-4-turbo',
    version: '2024-02-15-preview',
    models: {
      'gpt-4-turbo': true,
    },
  },
];

describe('AppService', () => {
  let app;
  const mockedTurnstileConfig = {
    siteKey: 'default-site-key',
    options: {},
  };

  beforeEach(() => {
    app = { locals: {} };
    process.env.CDN_PROVIDER = undefined;
  });

  it('should correctly assign process.env and app.locals based on custom config', async () => {
    await AppService(app);

    expect(process.env.CDN_PROVIDER).toEqual('testStrategy');

    expect(app.locals).toEqual({
      socialLogins: ['testLogin'],
      fileStrategy: 'testStrategy',
      interfaceConfig: expect.objectContaining({
        endpointsMenu: true,
        modelSelect: true,
        parameters: true,
        sidePanel: true,
        presets: true,
      }),
      mcpConfig: null,
      turnstileConfig: mockedTurnstileConfig,
      modelSpecs: undefined,
      paths: expect.anything(),
      ocr: expect.anything(),
      imageOutputType: expect.any(String),
      fileConfig: undefined,
      secureImageLinks: undefined,
      balance: { enabled: true },
      filteredTools: undefined,
      includedTools: undefined,
      webSearch: {
        cohereApiKey: '${COHERE_API_KEY}',
        firecrawlApiKey: '${FIRECRAWL_API_KEY}',
        firecrawlApiUrl: '${FIRECRAWL_API_URL}',
        jinaApiKey: '${JINA_API_KEY}',
        safeSearch: 1,
        serperApiKey: '${SERPER_API_KEY}',
      },
      memory: undefined,
      agents: {
        disableBuilder: false,
        capabilities: expect.arrayContaining([...defaultAgentCapabilities]),
      },
    });
  });

  it('should log a warning if the config version is outdated', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        version: '0.9.0', // An outdated version for this test
        registration: { socialLogins: ['testLogin'] },
        fileStrategy: 'testStrategy',
      }),
    );

    await AppService(app);

    const { logger } = require('~/config');
    expect(logger.info).toHaveBeenCalledWith(expect.stringContaining('Outdated Config version'));
  });

  it('should change the `imageOutputType` based on config value', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        version: '0.10.0',
        imageOutputType: EImageOutputType.WEBP,
      }),
    );

    await AppService(app);
    expect(app.locals.imageOutputType).toEqual(EImageOutputType.WEBP);
  });

  it('should default to `PNG` `imageOutputType` with no provided type', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        version: '0.10.0',
      }),
    );

    await AppService(app);
    expect(app.locals.imageOutputType).toEqual(EImageOutputType.PNG);
  });

  it('should default to `PNG` `imageOutputType` with no provided config', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve(undefined));

    await AppService(app);
    expect(app.locals.imageOutputType).toEqual(EImageOutputType.PNG);
  });

  it('should initialize Firebase when fileStrategy is firebase', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        fileStrategy: FileSources.firebase,
      }),
    );

    await AppService(app);

    const { initializeFirebase } = require('./Files/Firebase/initialize');
    expect(initializeFirebase).toHaveBeenCalled();

    expect(process.env.CDN_PROVIDER).toEqual(FileSources.firebase);
  });

  it('should load and format tools accurately with defined structure', async () => {
    const { loadAndFormatTools } = require('./ToolService');
    const { setCachedTools, getCachedTools } = require('./Config');

    await AppService(app);

    expect(loadAndFormatTools).toHaveBeenCalledWith({
      adminFilter: undefined,
      adminIncluded: undefined,
      directory: expect.anything(),
    });

    // Verify setCachedTools was called with the tools
    expect(setCachedTools).toHaveBeenCalledWith(
      {
        ExampleTool: {
          type: 'function',
          function: {
            description: 'Example tool function',
            name: 'exampleFunction',
            parameters: {
              type: 'object',
              properties: {
                param1: { type: 'string', description: 'An example parameter' },
              },
              required: ['param1'],
            },
          },
        },
      },
      { isGlobal: true },
    );

    // Verify we can retrieve the tools from cache
    const cachedTools = await getCachedTools({ includeGlobal: true });
    expect(cachedTools.ExampleTool).toBeDefined();
    expect(cachedTools.ExampleTool).toEqual({
      type: 'function',
      function: {
        description: 'Example tool function',
        name: 'exampleFunction',
        parameters: {
          type: 'object',
          properties: {
            param1: { type: 'string', description: 'An example parameter' },
          },
          required: ['param1'],
        },
      },
    });
  });

  it('should correctly configure Assistants endpoint based on custom config', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.assistants]: {
            disableBuilder: true,
            pollIntervalMs: 5000,
            timeoutMs: 30000,
            supportedIds: ['id1', 'id2'],
            privateAssistants: false,
          },
        },
      }),
    );

    await AppService(app);

    expect(app.locals).toHaveProperty(EModelEndpoint.assistants);
    expect(app.locals[EModelEndpoint.assistants]).toEqual(
      expect.objectContaining({
        disableBuilder: true,
        pollIntervalMs: 5000,
        timeoutMs: 30000,
        supportedIds: expect.arrayContaining(['id1', 'id2']),
        privateAssistants: false,
      }),
    );
  });

  it('should correctly configure Agents endpoint based on custom config', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.agents]: {
            disableBuilder: true,
            recursionLimit: 10,
            maxRecursionLimit: 20,
            allowedProviders: ['openai', 'anthropic'],
            capabilities: [AgentCapabilities.tools, AgentCapabilities.actions],
          },
        },
      }),
    );

    await AppService(app);

    expect(app.locals).toHaveProperty(EModelEndpoint.agents);
    expect(app.locals[EModelEndpoint.agents]).toEqual(
      expect.objectContaining({
        disableBuilder: true,
        recursionLimit: 10,
        maxRecursionLimit: 20,
        allowedProviders: expect.arrayContaining(['openai', 'anthropic']),
        capabilities: expect.arrayContaining([AgentCapabilities.tools, AgentCapabilities.actions]),
      }),
    );
  });

  it('should configure Agents endpoint with defaults when no config is provided', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve({}));

    await AppService(app);

    expect(app.locals).toHaveProperty(EModelEndpoint.agents);
    expect(app.locals[EModelEndpoint.agents]).toEqual(
      expect.objectContaining({
        disableBuilder: false,
        capabilities: expect.arrayContaining([...defaultAgentCapabilities]),
      }),
    );
  });

  it('should configure Agents endpoint with defaults when endpoints exist but agents is not defined', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.openAI]: {
            titleConvo: true,
          },
        },
      }),
    );

    await AppService(app);

    expect(app.locals).toHaveProperty(EModelEndpoint.agents);
    expect(app.locals[EModelEndpoint.agents]).toEqual(
      expect.objectContaining({
        disableBuilder: false,
        capabilities: expect.arrayContaining([...defaultAgentCapabilities]),
      }),
    );
  });

  it('should correctly configure minimum Azure OpenAI Assistant values', async () => {
    const assistantGroups = [azureGroups[0], { ...azureGroups[1], assistants: true }];
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.azureOpenAI]: {
            groups: assistantGroups,
            assistants: true,
          },
        },
      }),
    );

    process.env.WESTUS_API_KEY = 'westus-key';
    process.env.EASTUS_API_KEY = 'eastus-key';

    await AppService(app);
    expect(app.locals).toHaveProperty(EModelEndpoint.azureAssistants);
    expect(app.locals[EModelEndpoint.azureAssistants].capabilities.length).toEqual(3);
  });

  it('should correctly configure Azure OpenAI endpoint based on custom config', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.azureOpenAI]: {
            groups: azureGroups,
          },
        },
      }),
    );

    process.env.WESTUS_API_KEY = 'westus-key';
    process.env.EASTUS_API_KEY = 'eastus-key';

    await AppService(app);

    expect(app.locals).toHaveProperty(EModelEndpoint.azureOpenAI);
    const azureConfig = app.locals[EModelEndpoint.azureOpenAI];
    expect(azureConfig).toHaveProperty('modelNames');
    expect(azureConfig).toHaveProperty('modelGroupMap');
    expect(azureConfig).toHaveProperty('groupMap');

    const { modelNames, modelGroupMap, groupMap } = validateAzureGroups(azureGroups);
    expect(azureConfig.modelNames).toEqual(modelNames);
    expect(azureConfig.modelGroupMap).toEqual(modelGroupMap);
    expect(azureConfig.groupMap).toEqual(groupMap);
  });

  it('should not modify FILE_UPLOAD environment variables without rate limits', async () => {
    // Setup initial environment variables
    process.env.FILE_UPLOAD_IP_MAX = '10';
    process.env.FILE_UPLOAD_IP_WINDOW = '15';
    process.env.FILE_UPLOAD_USER_MAX = '5';
    process.env.FILE_UPLOAD_USER_WINDOW = '20';

    const initialEnv = { ...process.env };

    await AppService(app);

    // Expect environment variables to remain unchanged
    expect(process.env.FILE_UPLOAD_IP_MAX).toEqual(initialEnv.FILE_UPLOAD_IP_MAX);
    expect(process.env.FILE_UPLOAD_IP_WINDOW).toEqual(initialEnv.FILE_UPLOAD_IP_WINDOW);
    expect(process.env.FILE_UPLOAD_USER_MAX).toEqual(initialEnv.FILE_UPLOAD_USER_MAX);
    expect(process.env.FILE_UPLOAD_USER_WINDOW).toEqual(initialEnv.FILE_UPLOAD_USER_WINDOW);
  });

  it('should correctly set FILE_UPLOAD environment variables based on rate limits', async () => {
    // Define and mock a custom configuration with rate limits
    const rateLimitsConfig = {
      rateLimits: {
        fileUploads: {
          ipMax: '100',
          ipWindowInMinutes: '60',
          userMax: '50',
          userWindowInMinutes: '30',
        },
      },
    };

    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve(rateLimitsConfig),
    );

    await AppService(app);

    // Verify that process.env has been updated according to the rate limits config
    expect(process.env.FILE_UPLOAD_IP_MAX).toEqual('100');
    expect(process.env.FILE_UPLOAD_IP_WINDOW).toEqual('60');
    expect(process.env.FILE_UPLOAD_USER_MAX).toEqual('50');
    expect(process.env.FILE_UPLOAD_USER_WINDOW).toEqual('30');
  });

  it('should fallback to default FILE_UPLOAD environment variables when rate limits are unspecified', async () => {
    // Setup initial environment variables to non-default values
    process.env.FILE_UPLOAD_IP_MAX = 'initialMax';
    process.env.FILE_UPLOAD_IP_WINDOW = 'initialWindow';
    process.env.FILE_UPLOAD_USER_MAX = 'initialUserMax';
    process.env.FILE_UPLOAD_USER_WINDOW = 'initialUserWindow';

    await AppService(app);

    // Verify that process.env falls back to the initial values
    expect(process.env.FILE_UPLOAD_IP_MAX).toEqual('initialMax');
    expect(process.env.FILE_UPLOAD_IP_WINDOW).toEqual('initialWindow');
    expect(process.env.FILE_UPLOAD_USER_MAX).toEqual('initialUserMax');
    expect(process.env.FILE_UPLOAD_USER_WINDOW).toEqual('initialUserWindow');
  });

  it('should not modify IMPORT environment variables without rate limits', async () => {
    // Setup initial environment variables
    process.env.IMPORT_IP_MAX = '10';
    process.env.IMPORT_IP_WINDOW = '15';
    process.env.IMPORT_USER_MAX = '5';
    process.env.IMPORT_USER_WINDOW = '20';

    const initialEnv = { ...process.env };

    await AppService(app);

    // Expect environment variables to remain unchanged
    expect(process.env.IMPORT_IP_MAX).toEqual(initialEnv.IMPORT_IP_MAX);
    expect(process.env.IMPORT_IP_WINDOW).toEqual(initialEnv.IMPORT_IP_WINDOW);
    expect(process.env.IMPORT_USER_MAX).toEqual(initialEnv.IMPORT_USER_MAX);
    expect(process.env.IMPORT_USER_WINDOW).toEqual(initialEnv.IMPORT_USER_WINDOW);
  });

  it('should correctly set IMPORT environment variables based on rate limits', async () => {
    // Define and mock a custom configuration with rate limits
    const importLimitsConfig = {
      rateLimits: {
        conversationsImport: {
          ipMax: '150',
          ipWindowInMinutes: '60',
          userMax: '50',
          userWindowInMinutes: '30',
        },
      },
    };

    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve(importLimitsConfig),
    );

    await AppService(app);

    // Verify that process.env has been updated according to the rate limits config
    expect(process.env.IMPORT_IP_MAX).toEqual('150');
    expect(process.env.IMPORT_IP_WINDOW).toEqual('60');
    expect(process.env.IMPORT_USER_MAX).toEqual('50');
    expect(process.env.IMPORT_USER_WINDOW).toEqual('30');
  });

  it('should fallback to default IMPORT environment variables when rate limits are unspecified', async () => {
    // Setup initial environment variables to non-default values
    process.env.IMPORT_IP_MAX = 'initialMax';
    process.env.IMPORT_IP_WINDOW = 'initialWindow';
    process.env.IMPORT_USER_MAX = 'initialUserMax';
    process.env.IMPORT_USER_WINDOW = 'initialUserWindow';

    await AppService(app);

    // Verify that process.env falls back to the initial values
    expect(process.env.IMPORT_IP_MAX).toEqual('initialMax');
    expect(process.env.IMPORT_IP_WINDOW).toEqual('initialWindow');
    expect(process.env.IMPORT_USER_MAX).toEqual('initialUserMax');
    expect(process.env.IMPORT_USER_WINDOW).toEqual('initialUserWindow');
  });
});

describe('AppService updating app.locals and issuing warnings', () => {
  let app;
  let initialEnv;

  beforeEach(() => {
    // Store initial environment variables to restore them after each test
    initialEnv = { ...process.env };

    app = { locals: {} };
    process.env.CDN_PROVIDER = undefined;
  });

  afterEach(() => {
    // Restore initial environment variables
    process.env = { ...initialEnv };
  });

  it('should update app.locals with default values if loadCustomConfig returns undefined', async () => {
    // Mock loadCustomConfig to return undefined
    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve(undefined));

    await AppService(app);

    expect(app.locals).toBeDefined();
    expect(app.locals.paths).toBeDefined();
    expect(app.locals.fileStrategy).toEqual(FileSources.local);
    expect(app.locals.socialLogins).toEqual(defaultSocialLogins);
    expect(app.locals.balance).toEqual(
      expect.objectContaining({
        enabled: false,
        startBalance: undefined,
      }),
    );
  });

  it('should update app.locals with values from loadCustomConfig', async () => {
    // Mock loadCustomConfig to return a specific config object with a complete balance config
    const customConfig = {
      fileStrategy: 'firebase',
      registration: { socialLogins: ['testLogin'] },
      balance: {
        enabled: false,
        startBalance: 5000,
        autoRefillEnabled: true,
        refillIntervalValue: 15,
        refillIntervalUnit: 'hours',
        refillAmount: 5000,
      },
    };
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve(customConfig),
    );

    await AppService(app);

    expect(app.locals).toBeDefined();
    expect(app.locals.paths).toBeDefined();
    expect(app.locals.fileStrategy).toEqual(customConfig.fileStrategy);
    expect(app.locals.socialLogins).toEqual(customConfig.registration.socialLogins);
    expect(app.locals.balance).toEqual(customConfig.balance);
  });

  it('should apply the assistants endpoint configuration correctly to app.locals', async () => {
    const mockConfig = {
      endpoints: {
        assistants: {
          disableBuilder: true,
          pollIntervalMs: 5000,
          timeoutMs: 30000,
          supportedIds: ['id1', 'id2'],
        },
      },
    };
    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve(mockConfig));

    const app = { locals: {} };
    await AppService(app);

    expect(app.locals).toHaveProperty('assistants');
    const { assistants } = app.locals;
    expect(assistants.disableBuilder).toBe(true);
    expect(assistants.pollIntervalMs).toBe(5000);
    expect(assistants.timeoutMs).toBe(30000);
    expect(assistants.supportedIds).toEqual(['id1', 'id2']);
    expect(assistants.excludedIds).toBeUndefined();
  });

  it('should log a warning when both supportedIds and excludedIds are provided', async () => {
    const mockConfig = {
      endpoints: {
        assistants: {
          disableBuilder: false,
          pollIntervalMs: 3000,
          timeoutMs: 20000,
          supportedIds: ['id1', 'id2'],
          excludedIds: ['id3'],
        },
      },
    };
    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve(mockConfig));

    const app = { locals: {} };
    await require('./AppService')(app);

    const { logger } = require('~/config');
    expect(logger.warn).toHaveBeenCalledWith(
      expect.stringContaining(
        "The 'assistants' endpoint has both 'supportedIds' and 'excludedIds' defined.",
      ),
    );
  });

  it('should log a warning when privateAssistants and supportedIds or excludedIds are provided', async () => {
    const mockConfig = {
      endpoints: {
        assistants: {
          privateAssistants: true,
          supportedIds: ['id1'],
        },
      },
    };
    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve(mockConfig));

    const app = { locals: {} };
    await require('./AppService')(app);

    const { logger } = require('~/config');
    expect(logger.warn).toHaveBeenCalledWith(
      expect.stringContaining(
        "The 'assistants' endpoint has both 'privateAssistants' and 'supportedIds' or 'excludedIds' defined.",
      ),
    );
  });

  it('should issue expected warnings when loading Azure Groups with deprecated Environment Variables', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.azureOpenAI]: {
            groups: azureGroups,
          },
        },
      }),
    );

    deprecatedAzureVariables.forEach((varInfo) => {
      process.env[varInfo.key] = 'test';
    });

    const app = { locals: {} };
    await require('./AppService')(app);

    const { logger } = require('~/config');
    deprecatedAzureVariables.forEach(({ key, description }) => {
      expect(logger.warn).toHaveBeenCalledWith(
        `The \`${key}\` environment variable (related to ${description}) should not be used in combination with the \`azureOpenAI\` endpoint configuration, as you will experience conflicts and errors.`,
      );
    });
  });

  it('should issue expected warnings when loading conflicting Azure Envrionment Variables', async () => {
    require('./Config/loadCustomConfig').mockImplementationOnce(() =>
      Promise.resolve({
        endpoints: {
          [EModelEndpoint.azureOpenAI]: {
            groups: azureGroups,
          },
        },
      }),
    );

    conflictingAzureVariables.forEach((varInfo) => {
      process.env[varInfo.key] = 'test';
    });

    const app = { locals: {} };
    await require('./AppService')(app);

    const { logger } = require('~/config');
    conflictingAzureVariables.forEach(({ key }) => {
      expect(logger.warn).toHaveBeenCalledWith(
        `The \`${key}\` environment variable should not be used in combination with the \`azureOpenAI\` endpoint configuration, as you may experience with the defined placeholders for mapping to the current model grouping using the same name.`,
      );
    });
  });

  it('should not parse environment variable references in OCR config', async () => {
    // Mock custom configuration with env variable references in OCR config
    const mockConfig = {
      ocr: {
        apiKey: '${OCR_API_KEY_CUSTOM_VAR_NAME}',
        baseURL: '${OCR_BASEURL_CUSTOM_VAR_NAME}',
        strategy: 'mistral_ocr',
        mistralModel: 'mistral-medium',
      },
    };

    require('./Config/loadCustomConfig').mockImplementationOnce(() => Promise.resolve(mockConfig));

    // Set actual environment variables with different values
    process.env.OCR_API_KEY_CUSTOM_VAR_NAME = 'actual-api-key';
    process.env.OCR_BASEURL_CUSTOM_VAR_NAME = 'https://actual-ocr-url.com';

    // Initialize app
    const app = { locals: {} };
    await AppService(app);

    // Verify that the raw string references were preserved and not interpolated
    expect(app.locals.ocr).toBeDefined();
    expect(app.locals.ocr.apiKey).toEqual('${OCR_API_KEY_CUSTOM_VAR_NAME}');
    expect(app.locals.ocr.baseURL).toEqual('${OCR_BASEURL_CUSTOM_VAR_NAME}');
    expect(app.locals.ocr.strategy).toEqual('mistral_ocr');
    expect(app.locals.ocr.mistralModel).toEqual('mistral-medium');
  });
});