Mirror of https://github.com/danny-avila/LibreChat.git (synced 2025-12-24 04:10:15 +01:00)
🛜 refactor: Streamline App Config Usage (#9234)
* WIP: app.locals refactoring
WIP: appConfig
fix: update memory configuration retrieval to use getAppConfig based on user role
fix: update comment for AppConfig interface to clarify purpose
🏷️ refactor: Update tests to use getAppConfig for endpoint configurations
ci: Update AppService tests to initialize app config instead of app.locals
ci: Integrate getAppConfig into remaining tests
refactor: Update multer storage destination to use promise-based getAppConfig and improve error handling in tests
refactor: Rename initializeAppConfig to setAppConfig and update related tests
ci: Mock getAppConfig in various tests to provide default configurations
refactor: Update convertMCPToolsToPlugins to use mcpManager for server configuration and adjust related tests
chore: rename `Config/getAppConfig` -> `Config/app`
fix: streamline OpenAI image tools configuration by removing direct appConfig dependency and using function parameters
chore: correct parameter documentation for imageOutputType in ToolService.js
refactor: remove `getCustomConfig` dependency in config route
refactor: update domain validation to use appConfig for allowed domains
refactor: use appConfig registration property
chore: remove app parameter from AppService invocation
refactor: update AppConfig interface to correct registration and turnstile configurations
refactor: remove getCustomConfig dependency and use getAppConfig in PluginController, multer, and MCP services
refactor: replace getCustomConfig with getAppConfig in STTService, TTSService, and related files
refactor: replace getCustomConfig with getAppConfig in Conversation and Message models, update tempChatRetention functions to use AppConfig type
refactor: update getAppConfig calls in Conversation and Message models to include user role for temporary chat expiration
ci: update related tests
refactor: update getAppConfig call in getCustomConfigSpeech to include user role
fix: update appConfig usage to access allowedDomains from actions instead of registration
refactor: enhance AppConfig to include fileStrategies and update related file strategy logic
refactor: update imports to use normalizeEndpointName from @librechat/api and remove redundant definitions
chore: remove deprecated unused RunManager
refactor: get balance config primarily from appConfig
refactor: remove customConfig dependency for appConfig and streamline loadConfigModels logic
refactor: remove getCustomConfig usage and use app config in file citations
refactor: consolidate endpoint loading logic into loadEndpoints function
refactor: update appConfig access to use endpoints structure across various services
refactor: implement custom endpoints configuration and streamline endpoint loading logic
refactor: update getAppConfig call to include user role parameter
refactor: streamline endpoint configuration and enhance appConfig usage across services
refactor: replace getMCPAuthMap with getUserMCPAuthMap and remove unused getCustomConfig file
refactor: add type annotation for loadedEndpoints in loadEndpoints function
refactor: move /services/Files/images/parse to TS API
chore: add missing FILE_CITATIONS permission to IRole interface
refactor: restructure toolkits to TS API
refactor: separate manifest logic into its own module
refactor: consolidate tool loading logic into a new tools module for startup logic
refactor: move interface config logic to TS API
refactor: migrate checkEmailConfig to TypeScript and update imports
refactor: add FunctionTool interface and availableTools to AppConfig
refactor: decouple caching and DB operations from AppService, make part of consolidated `getAppConfig`
WIP: fix tests
* fix: rebase conflicts
* refactor: remove app.locals references
* refactor: replace getBalanceConfig with getAppConfig in various strategies and middleware
* refactor: replace appConfig?.balance with getBalanceConfig in various controllers and clients
* test: add balance configuration to titleConvo method in AgentClient tests
* chore: remove unused `openai-chat-tokens` package
* chore: remove unused imports in initializeMCPs.js
* refactor: update balance configuration to use getAppConfig instead of getBalanceConfig
* refactor: integrate configMiddleware for centralized configuration handling
* refactor: optimize email domain validation by removing unnecessary async calls
* refactor: simplify multer storage configuration by removing async calls
* refactor: reorder imports for better readability in user.js
* refactor: replace getAppConfig calls with req.config for improved performance
* chore: replace getAppConfig calls with req.config in tests for centralized configuration handling
* chore: remove unused override config
* refactor: add configMiddleware to endpoint route and replace getAppConfig with req.config
* chore: remove customConfig parameter from TTSService constructor
* refactor: pass appConfig from request to processFileCitations for improved configuration handling
* refactor: remove configMiddleware from endpoint route and retrieve appConfig directly in getEndpointsConfig if not in `req.config`
* test: add mockAppConfig to processFileCitations tests for improved configuration handling
* fix: pass req.config to hasCustomUserVars and call without await after synchronous refactor
* fix: type safety in useExportConversation
* refactor: retrieve appConfig using getAppConfig in PluginController and remove configMiddleware from plugins route, to avoid always retrieving when plugins are cached
* chore: change `MongoUser` typedef to `IUser`
* fix: Add `user` and `config` fields to ServerRequest and update JSDoc type annotations from Express.Request to ServerRequest
* fix: remove unused setAppConfig mock from Server configuration tests
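
Taken together, the commits above describe one migration: configuration that was previously read ad hoc from `app.locals` or fetched with `getCustomConfig` is now resolved through a consolidated `getAppConfig({ role })` and, on request paths, attached to `req.config` by the config middleware. The sketch below illustrates that pattern only; the middleware body, the `getAgentAllowedProviders` helper, and the simplified config shape are assumptions for illustration, not the exact project code.

```js
// Minimal sketch of the appConfig/req.config pattern (simplified; helper names and
// the exact config shape are assumptions, not the project's implementation).
const { getAppConfig } = require('~/server/services/Config');

/** Resolve the app config once per request and cache it on req.config. */
async function configMiddleware(req, res, next) {
  try {
    // Role-aware resolution, mirroring the getAppConfig({ role: req.user?.role }) calls in the diff.
    req.config = await getAppConfig({ role: req.user?.role });
    next();
  } catch (err) {
    next(err);
  }
}

/** Downstream handlers read req.config instead of app.locals or getCustomConfig(). */
function getAgentAllowedProviders(req) {
  const appConfig = req.config;
  return appConfig?.endpoints?.agents?.allowedProviders ?? [];
}

module.exports = { configMiddleware, getAgentAllowedProviders };
```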
parent e1ad235f17
commit 9a210971f5
210 changed files with 4102 additions and 3465 deletions
@@ -1,27 +0,0 @@
const { CacheKeys } = require('librechat-data-provider');
const { loadOverrideConfig } = require('~/server/services/Config');
const { getLogStores } = require('~/cache');

async function overrideController(req, res) {
  const cache = getLogStores(CacheKeys.CONFIG_STORE);
  let overrideConfig = await cache.get(CacheKeys.OVERRIDE_CONFIG);
  if (overrideConfig) {
    res.send(overrideConfig);
    return;
  } else if (overrideConfig === false) {
    res.send(false);
    return;
  }
  overrideConfig = await loadOverrideConfig();
  const { endpointsConfig, modelsConfig } = overrideConfig;
  if (endpointsConfig) {
    await cache.set(CacheKeys.ENDPOINT_CONFIG, endpointsConfig);
  }
  if (modelsConfig) {
    await cache.set(CacheKeys.MODELS_CONFIG, modelsConfig);
  }
  await cache.set(CacheKeys.OVERRIDE_CONFIG, overrideConfig);
  res.send(JSON.stringify(overrideConfig));
}

module.exports = overrideController;
@@ -7,14 +7,9 @@ const {
  convertMCPToolToPlugin,
  convertMCPToolsToPlugins,
} = require('@librechat/api');
const {
  getCachedTools,
  setCachedTools,
  mergeUserTools,
  getCustomConfig,
} = require('~/server/services/Config');
const { loadAndFormatTools } = require('~/server/services/ToolService');
const { getCachedTools, setCachedTools, mergeUserTools } = require('~/server/services/Config');
const { availableTools, toolkits } = require('~/app/clients/tools');
const { getAppConfig } = require('~/server/services/Config');
const { getMCPManager } = require('~/config');
const { getLogStores } = require('~/cache');

@@ -27,8 +22,9 @@ const getAvailablePluginsController = async (req, res) => {
      return;
    }

    const appConfig = await getAppConfig({ role: req.user?.role });
    /** @type {{ filteredTools: string[], includedTools: string[] }} */
    const { filteredTools = [], includedTools = [] } = req.app.locals;
    const { filteredTools = [], includedTools = [] } = appConfig;
    /** @type {import('@librechat/api').LCManifestTool[]} */
    const pluginManifest = availableTools;

@@ -74,13 +70,14 @@ const getAvailableTools = async (req, res) => {
      logger.warn('[getAvailableTools] User ID not found in request');
      return res.status(401).json({ message: 'Unauthorized' });
    }
    const customConfig = await getCustomConfig();
    const cache = getLogStores(CacheKeys.CONFIG_STORE);
    const cachedToolsArray = await cache.get(CacheKeys.TOOLS);
    const cachedUserTools = await getCachedTools({ userId });

    const mcpManager = getMCPManager();
    const userPlugins =
      cachedUserTools != null
        ? convertMCPToolsToPlugins({ functionTools: cachedUserTools, customConfig })
        ? convertMCPToolsToPlugins({ functionTools: cachedUserTools, mcpManager })
        : undefined;

    if (cachedToolsArray != null && userPlugins != null) {
@@ -93,28 +90,19 @@ const getAvailableTools = async (req, res) => {
    let toolDefinitions = await getCachedTools({ includeGlobal: true });
    let prelimCachedTools;

    // TODO: this is a temp fix until app config is refactored
    if (!toolDefinitions) {
      toolDefinitions = loadAndFormatTools({
        adminFilter: req.app.locals?.filteredTools,
        adminIncluded: req.app.locals?.includedTools,
        directory: req.app.locals?.paths.structuredTools,
      });
      prelimCachedTools = toolDefinitions;
    }

    /** @type {import('@librechat/api').LCManifestTool[]} */
    let pluginManifest = availableTools;
    if (customConfig?.mcpServers != null) {

    const appConfig = req.config ?? (await getAppConfig({ role: req.user?.role }));
    if (appConfig?.mcpConfig != null) {
      try {
        const mcpManager = getMCPManager();
        const mcpTools = await mcpManager.getAllToolFunctions(userId);
        prelimCachedTools = prelimCachedTools ?? {};
        for (const [toolKey, toolData] of Object.entries(mcpTools)) {
          const plugin = convertMCPToolToPlugin({
            toolKey,
            toolData,
            customConfig,
            mcpManager,
          });
          if (plugin) {
            pluginManifest.push(plugin);
@@ -161,7 +149,7 @@ const getAvailableTools = async (req, res) => {
      if (plugin.pluginKey.includes(Constants.mcp_delimiter)) {
        const parts = plugin.pluginKey.split(Constants.mcp_delimiter);
        const serverName = parts[parts.length - 1];
        const serverConfig = customConfig?.mcpServers?.[serverName];
        const serverConfig = appConfig?.mcpConfig?.[serverName];

        if (serverConfig?.customUserVars) {
          const customVarKeys = Object.keys(serverConfig.customUserVars);
@ -1,30 +1,31 @@
|
|||
const { Constants } = require('librechat-data-provider');
|
||||
const { getCustomConfig, getCachedTools } = require('~/server/services/Config');
|
||||
const { getCachedTools, getAppConfig } = require('~/server/services/Config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
|
||||
// Mock the dependencies
|
||||
jest.mock('@librechat/data-schemas', () => ({
|
||||
logger: {
|
||||
debug: jest.fn(),
|
||||
error: jest.fn(),
|
||||
warn: jest.fn(),
|
||||
},
|
||||
}));
|
||||
|
||||
jest.mock('~/server/services/Config', () => ({
|
||||
getCustomConfig: jest.fn(),
|
||||
getCachedTools: jest.fn(),
|
||||
getAppConfig: jest.fn().mockResolvedValue({
|
||||
filteredTools: [],
|
||||
includedTools: [],
|
||||
}),
|
||||
setCachedTools: jest.fn(),
|
||||
mergeUserTools: jest.fn(),
|
||||
}));
|
||||
|
||||
jest.mock('~/server/services/ToolService', () => ({
|
||||
getToolkitKey: jest.fn(),
|
||||
loadAndFormatTools: jest.fn(),
|
||||
}));
|
||||
// loadAndFormatTools mock removed - no longer used in PluginController
|
||||
|
||||
jest.mock('~/config', () => ({
|
||||
getMCPManager: jest.fn(() => ({
|
||||
loadAllManifestTools: jest.fn().mockResolvedValue([]),
|
||||
getAllToolFunctions: jest.fn().mockResolvedValue({}),
|
||||
getRawConfig: jest.fn().mockReturnValue({}),
|
||||
})),
|
||||
getFlowStateManager: jest.fn(),
|
||||
}));
|
||||
|
|
@ -39,7 +40,6 @@ jest.mock('~/cache', () => ({
|
|||
}));
|
||||
|
||||
const { getAvailableTools, getAvailablePluginsController } = require('./PluginController');
|
||||
const { loadAndFormatTools } = require('~/server/services/ToolService');
|
||||
|
||||
describe('PluginController', () => {
|
||||
let mockReq, mockRes, mockCache;
|
||||
|
|
@ -48,12 +48,9 @@ describe('PluginController', () => {
|
|||
jest.clearAllMocks();
|
||||
mockReq = {
|
||||
user: { id: 'test-user-id' },
|
||||
app: {
|
||||
locals: {
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
filteredTools: null,
|
||||
includedTools: null,
|
||||
},
|
||||
config: {
|
||||
filteredTools: [],
|
||||
includedTools: [],
|
||||
},
|
||||
};
|
||||
mockRes = { status: jest.fn().mockReturnThis(), json: jest.fn() };
|
||||
|
|
@ -63,13 +60,19 @@ describe('PluginController', () => {
|
|||
// Clear availableTools and toolkits arrays before each test
|
||||
require('~/app/clients/tools').availableTools.length = 0;
|
||||
require('~/app/clients/tools').toolkits.length = 0;
|
||||
|
||||
// Reset getCachedTools mock to ensure clean state
|
||||
getCachedTools.mockReset();
|
||||
|
||||
// Reset getAppConfig mock to ensure clean state with default values
|
||||
getAppConfig.mockReset();
|
||||
getAppConfig.mockResolvedValue({
|
||||
filteredTools: [],
|
||||
includedTools: [],
|
||||
});
|
||||
});
|
||||
|
||||
describe('getAvailablePluginsController', () => {
|
||||
beforeEach(() => {
|
||||
mockReq.app = { locals: { filteredTools: [], includedTools: [] } };
|
||||
});
|
||||
|
||||
it('should use filterUniquePlugins to remove duplicate plugins', async () => {
|
||||
// Add plugins with duplicates to availableTools
|
||||
const mockPlugins = [
|
||||
|
|
@ -82,10 +85,17 @@ describe('PluginController', () => {
|
|||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
|
||||
// Configure getAppConfig to return the expected config
|
||||
getAppConfig.mockResolvedValueOnce({
|
||||
filteredTools: [],
|
||||
includedTools: [],
|
||||
});
|
||||
|
||||
await getAvailablePluginsController(mockReq, mockRes);
|
||||
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
// The real filterUniquePlugins should have removed the duplicate
|
||||
expect(responseData).toHaveLength(2);
|
||||
expect(responseData[0].pluginKey).toBe('key1');
|
||||
expect(responseData[1].pluginKey).toBe('key2');
|
||||
|
|
@ -99,10 +109,16 @@ describe('PluginController', () => {
|
|||
require('~/app/clients/tools').availableTools.push(mockPlugin);
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
|
||||
// Configure getAppConfig to return the expected config
|
||||
getAppConfig.mockResolvedValueOnce({
|
||||
filteredTools: [],
|
||||
includedTools: [],
|
||||
});
|
||||
|
||||
await getAvailablePluginsController(mockReq, mockRes);
|
||||
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
// checkPluginAuth returns false, so authenticated property is not added
|
||||
// The real checkPluginAuth returns false for plugins without authConfig, so authenticated property is not added
|
||||
expect(responseData[0].authenticated).toBeUndefined();
|
||||
});
|
||||
|
||||
|
|
@ -126,9 +142,14 @@ describe('PluginController', () => {
|
|||
];
|
||||
|
||||
require('~/app/clients/tools').availableTools.push(...mockPlugins);
|
||||
mockReq.app.locals.includedTools = ['key1'];
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
|
||||
// Configure getAppConfig to return config with includedTools
|
||||
getAppConfig.mockResolvedValueOnce({
|
||||
filteredTools: [],
|
||||
includedTools: ['key1'],
|
||||
});
|
||||
|
||||
await getAvailablePluginsController(mockReq, mockRes);
|
||||
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
|
|
@ -152,20 +173,26 @@ describe('PluginController', () => {
|
|||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValueOnce(mockUserTools);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock second call to return tool definitions
|
||||
// Mock second call to return tool definitions (includeGlobal: true)
|
||||
getCachedTools.mockResolvedValueOnce(mockUserTools);
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
// convertMCPToolsToPlugins should have converted the tool
|
||||
expect(responseData).toBeDefined();
|
||||
expect(Array.isArray(responseData)).toBe(true);
|
||||
expect(responseData.length).toBeGreaterThan(0);
|
||||
const convertedTool = responseData.find(
|
||||
(tool) => tool.pluginKey === `tool1${Constants.mcp_delimiter}server1`,
|
||||
);
|
||||
expect(convertedTool).toBeDefined();
|
||||
// The real convertMCPToolsToPlugins extracts the name from the delimiter
|
||||
expect(convertedTool.name).toBe('tool1');
|
||||
});
|
||||
|
||||
|
|
@ -188,15 +215,20 @@ describe('PluginController', () => {
|
|||
|
||||
mockCache.get.mockResolvedValue(mockCachedPlugins);
|
||||
getCachedTools.mockResolvedValueOnce(mockUserTools);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock second call to return tool definitions
|
||||
getCachedTools.mockResolvedValueOnce(mockUserTools);
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
// Should have deduplicated tools with same pluginKey
|
||||
expect(Array.isArray(responseData)).toBe(true);
|
||||
// The real filterUniquePlugins should have deduplicated tools with same pluginKey
|
||||
const userToolCount = responseData.filter((tool) => tool.pluginKey === 'user-tool').length;
|
||||
expect(userToolCount).toBe(1);
|
||||
});
|
||||
|
|
@ -213,11 +245,15 @@ describe('PluginController', () => {
|
|||
require('~/app/clients/tools').availableTools.push(mockPlugin);
|
||||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
// First call returns null for user tools
|
||||
getCachedTools.mockResolvedValueOnce(null);
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock loadAndFormatTools to return tool definitions including our tool
|
||||
loadAndFormatTools.mockReturnValue({
|
||||
// Second call (with includeGlobal: true) returns the tool definitions
|
||||
getCachedTools.mockResolvedValueOnce({
|
||||
tool1: {
|
||||
type: 'function',
|
||||
function: {
|
||||
|
|
@ -235,7 +271,7 @@ describe('PluginController', () => {
|
|||
expect(Array.isArray(responseData)).toBe(true);
|
||||
const tool = responseData.find((t) => t.pluginKey === 'tool1');
|
||||
expect(tool).toBeDefined();
|
||||
// checkPluginAuth returns false, so authenticated property is not added
|
||||
// The real checkPluginAuth returns false for plugins without authConfig, so authenticated property is not added
|
||||
expect(tool.authenticated).toBeUndefined();
|
||||
});
|
||||
|
||||
|
|
@ -257,11 +293,15 @@ describe('PluginController', () => {
|
|||
});
|
||||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
// First call returns null for user tools
|
||||
getCachedTools.mockResolvedValueOnce(null);
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock loadAndFormatTools to return tool definitions
|
||||
loadAndFormatTools.mockReturnValue({
|
||||
// Second call (with includeGlobal: true) returns the tool definitions
|
||||
getCachedTools.mockResolvedValueOnce({
|
||||
toolkit1_function: {
|
||||
type: 'function',
|
||||
function: {
|
||||
|
|
@ -283,9 +323,8 @@ describe('PluginController', () => {
|
|||
});
|
||||
|
||||
describe('plugin.icon behavior', () => {
|
||||
const callGetAvailableToolsWithMCPServer = async (mcpServers) => {
|
||||
const callGetAvailableToolsWithMCPServer = async (serverConfig) => {
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue({ mcpServers });
|
||||
|
||||
const functionTools = {
|
||||
[`test-tool${Constants.mcp_delimiter}test-server`]: {
|
||||
|
|
@ -298,17 +337,24 @@ describe('PluginController', () => {
|
|||
},
|
||||
};
|
||||
|
||||
// Mock the MCP manager to return tools
|
||||
// Mock the MCP manager to return tools and server config
|
||||
const mockMCPManager = {
|
||||
getAllToolFunctions: jest.fn().mockResolvedValue(functionTools),
|
||||
getRawConfig: jest.fn().mockReturnValue(serverConfig),
|
||||
};
|
||||
require('~/config').getMCPManager.mockReturnValue(mockMCPManager);
|
||||
|
||||
// First call returns empty user tools
|
||||
getCachedTools.mockResolvedValueOnce({});
|
||||
|
||||
// Mock loadAndFormatTools to return empty object since these are MCP tools
|
||||
loadAndFormatTools.mockReturnValue({});
|
||||
// Mock getAppConfig to return the mcpConfig
|
||||
mockReq.config = {
|
||||
mcpConfig: {
|
||||
'test-server': serverConfig,
|
||||
},
|
||||
};
|
||||
|
||||
// Second call (with includeGlobal: true) returns the tool definitions
|
||||
getCachedTools.mockResolvedValueOnce(functionTools);
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
|
@ -319,28 +365,24 @@ describe('PluginController', () => {
|
|||
};
|
||||
|
||||
it('should set plugin.icon when iconPath is defined', async () => {
|
||||
const mcpServers = {
|
||||
'test-server': {
|
||||
iconPath: '/path/to/icon.png',
|
||||
},
|
||||
const serverConfig = {
|
||||
iconPath: '/path/to/icon.png',
|
||||
};
|
||||
const testTool = await callGetAvailableToolsWithMCPServer(mcpServers);
|
||||
const testTool = await callGetAvailableToolsWithMCPServer(serverConfig);
|
||||
expect(testTool.icon).toBe('/path/to/icon.png');
|
||||
});
|
||||
|
||||
it('should set plugin.icon to undefined when iconPath is not defined', async () => {
|
||||
const mcpServers = {
|
||||
'test-server': {},
|
||||
};
|
||||
const testTool = await callGetAvailableToolsWithMCPServer(mcpServers);
|
||||
const serverConfig = {};
|
||||
const testTool = await callGetAvailableToolsWithMCPServer(serverConfig);
|
||||
expect(testTool.icon).toBeUndefined();
|
||||
});
|
||||
});
|
||||
|
||||
describe('helper function integration', () => {
|
||||
it('should properly handle MCP tools with custom user variables', async () => {
|
||||
const customConfig = {
|
||||
mcpServers: {
|
||||
const appConfig = {
|
||||
mcpConfig: {
|
||||
'test-server': {
|
||||
customUserVars: {
|
||||
API_KEY: { title: 'API Key', description: 'Your API key' },
|
||||
|
|
@ -364,24 +406,28 @@ describe('PluginController', () => {
|
|||
// Mock the MCP manager to return tools
|
||||
const mockMCPManager = {
|
||||
getAllToolFunctions: jest.fn().mockResolvedValue(mcpToolFunctions),
|
||||
getRawConfig: jest.fn().mockReturnValue({
|
||||
customUserVars: {
|
||||
API_KEY: { title: 'API Key', description: 'Your API key' },
|
||||
},
|
||||
}),
|
||||
};
|
||||
require('~/config').getMCPManager.mockReturnValue(mockMCPManager);
|
||||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue(customConfig);
|
||||
mockReq.config = appConfig;
|
||||
|
||||
// First call returns user tools (empty in this case)
|
||||
getCachedTools.mockResolvedValueOnce({});
|
||||
|
||||
// Mock loadAndFormatTools to return empty object for MCP tools
|
||||
loadAndFormatTools.mockReturnValue({});
|
||||
|
||||
// Second call returns tool definitions including our MCP tool
|
||||
// Second call (with includeGlobal: true) returns tool definitions including our MCP tool
|
||||
getCachedTools.mockResolvedValueOnce(mcpToolFunctions);
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
expect(Array.isArray(responseData)).toBe(true);
|
||||
|
||||
// Find the MCP tool in the response
|
||||
const mcpTool = responseData.find(
|
||||
|
|
@ -417,24 +463,36 @@ describe('PluginController', () => {
|
|||
|
||||
it('should handle null cachedTools and cachedUserTools', async () => {
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
// First call returns null for user tools
|
||||
getCachedTools.mockResolvedValueOnce(null);
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock loadAndFormatTools to return empty object when getCachedTools returns null
|
||||
loadAndFormatTools.mockReturnValue({});
|
||||
// Mock MCP manager to return no tools
|
||||
const mockMCPManager = {
|
||||
getAllToolFunctions: jest.fn().mockResolvedValue({}),
|
||||
getRawConfig: jest.fn().mockReturnValue({}),
|
||||
};
|
||||
require('~/config').getMCPManager.mockReturnValue(mockMCPManager);
|
||||
|
||||
// Second call (with includeGlobal: true) returns empty object instead of null
|
||||
getCachedTools.mockResolvedValueOnce({});
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
// Should handle null values gracefully
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
expect(mockRes.json).toHaveBeenCalledWith([]);
|
||||
});
|
||||
|
||||
it('should handle when getCachedTools returns undefined', async () => {
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
|
||||
// Mock loadAndFormatTools to return empty object when getCachedTools returns undefined
|
||||
loadAndFormatTools.mockReturnValue({});
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock getCachedTools to return undefined for both calls
|
||||
getCachedTools.mockReset();
|
||||
|
|
@ -444,6 +502,7 @@ describe('PluginController', () => {
|
|||
|
||||
// Should handle undefined values gracefully
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
expect(mockRes.json).toHaveBeenCalledWith([]);
|
||||
});
|
||||
|
||||
it('should handle cachedToolsArray and userPlugins both being defined', async () => {
|
||||
|
|
@ -461,8 +520,18 @@ describe('PluginController', () => {
|
|||
};
|
||||
|
||||
mockCache.get.mockResolvedValue(cachedTools);
|
||||
getCachedTools.mockResolvedValue(userTools);
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValueOnce(userTools);
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// The controller expects a second call to getCachedTools
|
||||
getCachedTools.mockResolvedValueOnce({
|
||||
'cached-tool': { type: 'function', function: { name: 'cached-tool' } },
|
||||
[`user-tool${Constants.mcp_delimiter}server1`]:
|
||||
userTools[`user-tool${Constants.mcp_delimiter}server1`],
|
||||
});
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
|
|
@ -474,8 +543,20 @@ describe('PluginController', () => {
|
|||
|
||||
it('should handle empty toolDefinitions object', async () => {
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValueOnce({}).mockResolvedValueOnce({});
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
// Reset getCachedTools to ensure clean state
|
||||
getCachedTools.mockReset();
|
||||
getCachedTools.mockResolvedValue({});
|
||||
mockReq.config = {}; // No mcpConfig at all
|
||||
|
||||
// Ensure no plugins are available
|
||||
require('~/app/clients/tools').availableTools.length = 0;
|
||||
|
||||
// Reset MCP manager to default state
|
||||
const mockMCPManager = {
|
||||
getAllToolFunctions: jest.fn().mockResolvedValue({}),
|
||||
getRawConfig: jest.fn().mockReturnValue({}),
|
||||
};
|
||||
require('~/config').getMCPManager.mockReturnValue(mockMCPManager);
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
|
|
@ -484,8 +565,8 @@ describe('PluginController', () => {
|
|||
});
|
||||
|
||||
it('should handle MCP tools without customUserVars', async () => {
|
||||
const customConfig = {
|
||||
mcpServers: {
|
||||
const appConfig = {
|
||||
mcpConfig: {
|
||||
'test-server': {
|
||||
// No customUserVars defined
|
||||
},
|
||||
|
|
@ -494,30 +575,60 @@ describe('PluginController', () => {
|
|||
|
||||
const mockUserTools = {
|
||||
[`tool1${Constants.mcp_delimiter}test-server`]: {
|
||||
function: { name: 'tool1', description: 'Tool 1' },
|
||||
type: 'function',
|
||||
function: {
|
||||
name: `tool1${Constants.mcp_delimiter}test-server`,
|
||||
description: 'Tool 1',
|
||||
parameters: { type: 'object', properties: {} },
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
// Mock the MCP manager to return the tools
|
||||
const mockMCPManager = {
|
||||
getAllToolFunctions: jest.fn().mockResolvedValue(mockUserTools),
|
||||
getRawConfig: jest.fn().mockReturnValue({
|
||||
// No customUserVars defined
|
||||
}),
|
||||
};
|
||||
require('~/config').getMCPManager.mockReturnValue(mockMCPManager);
|
||||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCustomConfig.mockResolvedValue(customConfig);
|
||||
mockReq.config = appConfig;
|
||||
// First call returns empty user tools
|
||||
getCachedTools.mockResolvedValueOnce({});
|
||||
|
||||
// Second call (with includeGlobal: true) returns the tool definitions
|
||||
getCachedTools.mockResolvedValueOnce(mockUserTools);
|
||||
|
||||
getCachedTools.mockResolvedValueOnce({
|
||||
[`tool1${Constants.mcp_delimiter}test-server`]: true,
|
||||
});
|
||||
// Ensure no plugins in availableTools for clean test
|
||||
require('~/app/clients/tools').availableTools.length = 0;
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
const responseData = mockRes.json.mock.calls[0][0];
|
||||
expect(responseData[0].authenticated).toBe(true);
|
||||
// The actual implementation doesn't set authConfig on tools without customUserVars
|
||||
expect(responseData[0].authConfig).toEqual([]);
|
||||
expect(Array.isArray(responseData)).toBe(true);
|
||||
expect(responseData.length).toBeGreaterThan(0);
|
||||
|
||||
const mcpTool = responseData.find(
|
||||
(tool) => tool.pluginKey === `tool1${Constants.mcp_delimiter}test-server`,
|
||||
);
|
||||
|
||||
expect(mcpTool).toBeDefined();
|
||||
expect(mcpTool.authenticated).toBe(true);
|
||||
// The actual implementation sets authConfig to empty array when no customUserVars
|
||||
expect(mcpTool.authConfig).toEqual([]);
|
||||
});
|
||||
|
||||
it('should handle req.app.locals with undefined filteredTools and includedTools', async () => {
|
||||
mockReq.app = { locals: {} };
|
||||
it('should handle undefined filteredTools and includedTools', async () => {
|
||||
mockReq.config = {};
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
|
||||
// Configure getAppConfig to return config with undefined properties
|
||||
// The controller will use default values [] for filteredTools and includedTools
|
||||
getAppConfig.mockResolvedValueOnce({});
|
||||
|
||||
await getAvailablePluginsController(mockReq, mockRes);
|
||||
|
||||
expect(mockRes.status).toHaveBeenCalledWith(200);
|
||||
|
|
@ -532,27 +643,21 @@ describe('PluginController', () => {
|
|||
toolkit: true,
|
||||
};
|
||||
|
||||
// Ensure req.app.locals is properly mocked
|
||||
mockReq.app = {
|
||||
locals: {
|
||||
filteredTools: [],
|
||||
includedTools: [],
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
},
|
||||
};
|
||||
// No need to mock app.locals anymore as it's not used
|
||||
|
||||
// Add the toolkit to availableTools
|
||||
require('~/app/clients/tools').availableTools.push(mockToolkit);
|
||||
|
||||
mockCache.get.mockResolvedValue(null);
|
||||
getCachedTools.mockResolvedValue({});
|
||||
getCustomConfig.mockResolvedValue(null);
|
||||
// First call returns empty object
|
||||
getCachedTools.mockResolvedValueOnce({});
|
||||
mockReq.config = {
|
||||
mcpConfig: null,
|
||||
paths: { structuredTools: '/mock/path' },
|
||||
};
|
||||
|
||||
// Mock loadAndFormatTools to return an empty object when toolDefinitions is null
|
||||
loadAndFormatTools.mockReturnValue({});
|
||||
|
||||
// Mock getCachedTools second call to return null
|
||||
getCachedTools.mockResolvedValueOnce({}).mockResolvedValueOnce(null);
|
||||
// Second call (with includeGlobal: true) returns empty object to avoid null reference error
|
||||
getCachedTools.mockResolvedValueOnce({});
|
||||
|
||||
await getAvailableTools(mockReq, mockRes);
|
||||
|
||||
|
|
|
@@ -17,12 +17,14 @@ const { needsRefresh, getNewS3URL } = require('~/server/services/Files/S3/crud')
const { Tools, Constants, FileSources } = require('librechat-data-provider');
const { processDeleteRequest } = require('~/server/services/Files/process');
const { Transaction, Balance, User } = require('~/db/models');
const { getAppConfig } = require('~/server/services/Config');
const { deleteToolCalls } = require('~/models/ToolCall');
const { deleteAllSharedLinks } = require('~/models');
const { getMCPManager } = require('~/config');

const getUserController = async (req, res) => {
  /** @type {MongoUser} */
  const appConfig = await getAppConfig({ role: req.user?.role });
  /** @type {IUser} */
  const userData = req.user.toObject != null ? req.user.toObject() : { ...req.user };
  /**
   * These fields should not exist due to secure field selection, but deletion
@@ -31,7 +33,7 @@ const getUserController = async (req, res) => {
  delete userData.password;
  delete userData.totpSecret;
  delete userData.backupCodes;
  if (req.app.locals.fileStrategy === FileSources.s3 && userData.avatar) {
  if (appConfig.fileStrategy === FileSources.s3 && userData.avatar) {
    const avatarNeedsRefresh = needsRefresh(userData.avatar, 3600);
    if (!avatarNeedsRefresh) {
      return res.status(200).send(userData);
@@ -87,6 +89,7 @@ const deleteUserFiles = async (req) => {
};

const updateUserPluginsController = async (req, res) => {
  const appConfig = await getAppConfig({ role: req.user?.role });
  const { user } = req;
  const { pluginKey, action, auth, isEntityTool } = req.body;
  try {
@@ -131,7 +134,7 @@ const updateUserPluginsController = async (req, res) => {

    if (pluginKey === Tools.web_search) {
      /** @type {TCustomConfig['webSearch']} */
      const webSearchConfig = req.app.locals?.webSearch;
      const webSearchConfig = appConfig?.webSearch;
      keys = extractWebSearchEnvVars({
        keys: action === 'install' ? keys : webSearchKeys,
        config: webSearchConfig,
@@ -246,6 +246,7 @@ function createToolEndCallback({ req, res, artifactPromises }) {
        const attachment = await processFileCitations({
          user,
          metadata,
          appConfig: req.config,
          toolArtifact: output.artifact,
          toolCallId: output.tool_call_id,
        });
@@ -7,6 +7,7 @@ const {
  createRun,
  Tokenizer,
  checkAccess,
  getBalanceConfig,
  memoryInstructions,
  formatContentStrings,
  createMemoryProcessor,
@@ -446,8 +447,8 @@ class AgentClient extends BaseClient {
      );
      return;
    }
    /** @type {TCustomConfig['memory']} */
    const memoryConfig = this.options.req?.app?.locals?.memory;
    const appConfig = this.options.req.config;
    const memoryConfig = appConfig.memory;
    if (!memoryConfig || memoryConfig.disabled === true) {
      return;
    }
@@ -455,7 +456,7 @@ class AgentClient extends BaseClient {
    /** @type {Agent} */
    let prelimAgent;
    const allowedProviders = new Set(
      this.options.req?.app?.locals?.[EModelEndpoint.agents]?.allowedProviders,
      appConfig?.endpoints?.[EModelEndpoint.agents]?.allowedProviders,
    );
    try {
      if (memoryConfig.agent?.id != null && memoryConfig.agent.id !== this.options.agent.id) {
@@ -577,8 +578,8 @@ class AgentClient extends BaseClient {
    if (this.processMemory == null) {
      return;
    }
    /** @type {TCustomConfig['memory']} */
    const memoryConfig = this.options.req?.app?.locals?.memory;
    const appConfig = this.options.req.config;
    const memoryConfig = appConfig.memory;
    const messageWindowSize = memoryConfig?.messageWindowSize ?? 5;

    let messagesToProcess = [...messages];
@ -620,9 +621,15 @@ class AgentClient extends BaseClient {
|
|||
* @param {Object} params
|
||||
* @param {string} [params.model]
|
||||
* @param {string} [params.context='message']
|
||||
* @param {AppConfig['balance']} [params.balance]
|
||||
* @param {UsageMetadata[]} [params.collectedUsage=this.collectedUsage]
|
||||
*/
|
||||
async recordCollectedUsage({ model, context = 'message', collectedUsage = this.collectedUsage }) {
|
||||
async recordCollectedUsage({
|
||||
model,
|
||||
balance,
|
||||
context = 'message',
|
||||
collectedUsage = this.collectedUsage,
|
||||
}) {
|
||||
if (!collectedUsage || !collectedUsage.length) {
|
||||
return;
|
||||
}
|
||||
|
|
@ -644,6 +651,7 @@ class AgentClient extends BaseClient {
|
|||
|
||||
const txMetadata = {
|
||||
context,
|
||||
balance,
|
||||
conversationId: this.conversationId,
|
||||
user: this.user ?? this.options.req.user?.id,
|
||||
endpointTokenConfig: this.options.endpointTokenConfig,
|
||||
|
|
@ -761,8 +769,9 @@ class AgentClient extends BaseClient {
|
|||
abortController = new AbortController();
|
||||
}
|
||||
|
||||
/** @type {TCustomConfig['endpoints']['agents']} */
|
||||
const agentsEConfig = this.options.req.app.locals[EModelEndpoint.agents];
|
||||
const appConfig = this.options.req.config;
|
||||
/** @type {AppConfig['endpoints']['agents']} */
|
||||
const agentsEConfig = appConfig.endpoints?.[EModelEndpoint.agents];
|
||||
|
||||
config = {
|
||||
configurable: {
|
||||
|
|
@ -1030,7 +1039,8 @@ class AgentClient extends BaseClient {
|
|||
this.artifactPromises.push(...attachments);
|
||||
}
|
||||
|
||||
await this.recordCollectedUsage({ context: 'message' });
|
||||
const balanceConfig = getBalanceConfig(appConfig);
|
||||
await this.recordCollectedUsage({ context: 'message', balance: balanceConfig });
|
||||
} catch (err) {
|
||||
logger.error(
|
||||
'[api/server/controllers/agents/client.js #chatCompletion] Error recording collected usage',
|
||||
|
|
@ -1071,6 +1081,7 @@ class AgentClient extends BaseClient {
|
|||
}
|
||||
const { handleLLMEnd, collected: collectedMetadata } = createMetadataAggregator();
|
||||
const { req, res, agent } = this.options;
|
||||
const appConfig = req.config;
|
||||
let endpoint = agent.endpoint;
|
||||
|
||||
/** @type {import('@librechat/agents').ClientOptions} */
|
||||
|
|
@ -1078,11 +1089,13 @@ class AgentClient extends BaseClient {
|
|||
model: agent.model || agent.model_parameters.model,
|
||||
};
|
||||
|
||||
let titleProviderConfig = await getProviderConfig(endpoint);
|
||||
let titleProviderConfig = getProviderConfig({ provider: endpoint, appConfig });
|
||||
|
||||
/** @type {TEndpoint | undefined} */
|
||||
const endpointConfig =
|
||||
req.app.locals.all ?? req.app.locals[endpoint] ?? titleProviderConfig.customEndpointConfig;
|
||||
appConfig.endpoints?.all ??
|
||||
appConfig.endpoints?.[endpoint] ??
|
||||
titleProviderConfig.customEndpointConfig;
|
||||
if (!endpointConfig) {
|
||||
logger.warn(
|
||||
'[api/server/controllers/agents/client.js #titleConvo] Error getting endpoint config',
|
||||
|
|
@ -1091,7 +1104,10 @@ class AgentClient extends BaseClient {
|
|||
|
||||
if (endpointConfig?.titleEndpoint && endpointConfig.titleEndpoint !== endpoint) {
|
||||
try {
|
||||
titleProviderConfig = await getProviderConfig(endpointConfig.titleEndpoint);
|
||||
titleProviderConfig = getProviderConfig({
|
||||
provider: endpointConfig.titleEndpoint,
|
||||
appConfig,
|
||||
});
|
||||
endpoint = endpointConfig.titleEndpoint;
|
||||
} catch (error) {
|
||||
logger.warn(
|
||||
|
|
@ -1100,7 +1116,7 @@ class AgentClient extends BaseClient {
|
|||
);
|
||||
// Fall back to original provider config
|
||||
endpoint = agent.endpoint;
|
||||
titleProviderConfig = await getProviderConfig(endpoint);
|
||||
titleProviderConfig = getProviderConfig({ provider: endpoint, appConfig });
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -1203,10 +1219,12 @@ class AgentClient extends BaseClient {
|
|||
};
|
||||
});
|
||||
|
||||
const balanceConfig = getBalanceConfig(appConfig);
|
||||
await this.recordCollectedUsage({
|
||||
model: clientOptions.model,
|
||||
context: 'title',
|
||||
collectedUsage,
|
||||
context: 'title',
|
||||
model: clientOptions.model,
|
||||
balance: balanceConfig,
|
||||
}).catch((err) => {
|
||||
logger.error(
|
||||
'[api/server/controllers/agents/client.js #titleConvo] Error recording collected usage',
|
||||
|
|
@ -1225,17 +1243,26 @@ class AgentClient extends BaseClient {
|
|||
* @param {object} params
|
||||
* @param {number} params.promptTokens
|
||||
* @param {number} params.completionTokens
|
||||
* @param {OpenAIUsageMetadata} [params.usage]
|
||||
* @param {string} [params.model]
|
||||
* @param {OpenAIUsageMetadata} [params.usage]
|
||||
* @param {AppConfig['balance']} [params.balance]
|
||||
* @param {string} [params.context='message']
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async recordTokenUsage({ model, promptTokens, completionTokens, usage, context = 'message' }) {
|
||||
async recordTokenUsage({
|
||||
model,
|
||||
usage,
|
||||
balance,
|
||||
promptTokens,
|
||||
completionTokens,
|
||||
context = 'message',
|
||||
}) {
|
||||
try {
|
||||
await spendTokens(
|
||||
{
|
||||
model,
|
||||
context,
|
||||
balance,
|
||||
conversationId: this.conversationId,
|
||||
user: this.user ?? this.options.req.user?.id,
|
||||
endpointTokenConfig: this.options.endpointTokenConfig,
|
||||
|
|
@ -1252,6 +1279,7 @@ class AgentClient extends BaseClient {
|
|||
await spendTokens(
|
||||
{
|
||||
model,
|
||||
balance,
|
||||
context: 'reasoning',
|
||||
conversationId: this.conversationId,
|
||||
user: this.user ?? this.options.req.user?.id,
|
||||
|
|
|
|||
|
|
@ -41,8 +41,16 @@ describe('AgentClient - titleConvo', () => {
|
|||
|
||||
// Mock request and response
|
||||
mockReq = {
|
||||
app: {
|
||||
locals: {
|
||||
user: {
|
||||
id: 'user-123',
|
||||
},
|
||||
body: {
|
||||
model: 'gpt-4',
|
||||
endpoint: EModelEndpoint.openAI,
|
||||
key: null,
|
||||
},
|
||||
config: {
|
||||
endpoints: {
|
||||
[EModelEndpoint.openAI]: {
|
||||
// Match the agent endpoint
|
||||
titleModel: 'gpt-3.5-turbo',
|
||||
|
|
@ -52,14 +60,6 @@ describe('AgentClient - titleConvo', () => {
|
|||
},
|
||||
},
|
||||
},
|
||||
user: {
|
||||
id: 'user-123',
|
||||
},
|
||||
body: {
|
||||
model: 'gpt-4',
|
||||
endpoint: EModelEndpoint.openAI,
|
||||
key: null,
|
||||
},
|
||||
};
|
||||
|
||||
mockRes = {};
|
||||
|
|
@ -143,7 +143,7 @@ describe('AgentClient - titleConvo', () => {
|
|||
|
||||
it('should handle missing endpoint config gracefully', async () => {
|
||||
// Remove endpoint config
|
||||
mockReq.app.locals[EModelEndpoint.openAI] = undefined;
|
||||
mockReq.config = { endpoints: {} };
|
||||
|
||||
const text = 'Test conversation text';
|
||||
const abortController = new AbortController();
|
||||
|
|
@ -161,7 +161,16 @@ describe('AgentClient - titleConvo', () => {
|
|||
|
||||
it('should use agent model when titleModel is not provided', async () => {
|
||||
// Remove titleModel from config
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI].titleModel;
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.openAI]: {
|
||||
titlePrompt: 'Custom title prompt',
|
||||
titleMethod: 'structured',
|
||||
titlePromptTemplate: 'Template: {{content}}',
|
||||
// titleModel is omitted
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = 'Test conversation text';
|
||||
const abortController = new AbortController();
|
||||
|
|
@ -173,7 +182,16 @@ describe('AgentClient - titleConvo', () => {
|
|||
});
|
||||
|
||||
it('should not use titleModel when it equals CURRENT_MODEL constant', async () => {
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titleModel = Constants.CURRENT_MODEL;
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.openAI]: {
|
||||
titleModel: Constants.CURRENT_MODEL,
|
||||
titlePrompt: 'Custom title prompt',
|
||||
titleMethod: 'structured',
|
||||
titlePromptTemplate: 'Template: {{content}}',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = 'Test conversation text';
|
||||
const abortController = new AbortController();
|
||||
|
|
@ -216,6 +234,9 @@ describe('AgentClient - titleConvo', () => {
|
|||
model: 'gpt-3.5-turbo',
|
||||
context: 'title',
|
||||
collectedUsage: expect.any(Array),
|
||||
balance: {
|
||||
enabled: false,
|
||||
},
|
||||
});
|
||||
});
|
||||
|
||||
|
|
@ -245,10 +266,17 @@ describe('AgentClient - titleConvo', () => {
|
|||
process.env.ANTHROPIC_API_KEY = 'test-api-key';
|
||||
|
||||
// Add titleEndpoint to the config
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titleEndpoint = EModelEndpoint.anthropic;
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titleMethod = 'structured';
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titlePrompt = 'Custom title prompt';
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titlePromptTemplate = 'Custom template';
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.openAI]: {
|
||||
titleModel: 'gpt-3.5-turbo',
|
||||
titleEndpoint: EModelEndpoint.anthropic,
|
||||
titleMethod: 'structured',
|
||||
titlePrompt: 'Custom title prompt',
|
||||
titlePromptTemplate: 'Custom template',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = 'Test conversation text';
|
||||
const abortController = new AbortController();
|
||||
|
|
@ -274,18 +302,16 @@ describe('AgentClient - titleConvo', () => {
|
|||
});
|
||||
|
||||
it('should use all config when endpoint config is missing', async () => {
|
||||
// Remove endpoint-specific config
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI].titleModel;
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI].titlePrompt;
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI].titleMethod;
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI].titlePromptTemplate;
|
||||
|
||||
// Set 'all' config
|
||||
mockReq.app.locals.all = {
|
||||
titleModel: 'gpt-4o-mini',
|
||||
titlePrompt: 'All config title prompt',
|
||||
titleMethod: 'completion',
|
||||
titlePromptTemplate: 'All config template: {{content}}',
|
||||
// Set 'all' config without endpoint-specific config
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
all: {
|
||||
titleModel: 'gpt-4o-mini',
|
||||
titlePrompt: 'All config title prompt',
|
||||
titleMethod: 'completion',
|
||||
titlePromptTemplate: 'All config template: {{content}}',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = 'Test conversation text';
|
||||
|
|
@ -309,17 +335,21 @@ describe('AgentClient - titleConvo', () => {
|
|||
|
||||
it('should prioritize all config over endpoint config for title settings', async () => {
|
||||
// Set both endpoint and 'all' config
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titleModel = 'gpt-3.5-turbo';
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titlePrompt = 'Endpoint title prompt';
|
||||
mockReq.app.locals[EModelEndpoint.openAI].titleMethod = 'structured';
|
||||
// Remove titlePromptTemplate from endpoint config to test fallback
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI].titlePromptTemplate;
|
||||
|
||||
mockReq.app.locals.all = {
|
||||
titleModel: 'gpt-4o-mini',
|
||||
titlePrompt: 'All config title prompt',
|
||||
titleMethod: 'completion',
|
||||
titlePromptTemplate: 'All config template',
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.openAI]: {
|
||||
titleModel: 'gpt-3.5-turbo',
|
||||
titlePrompt: 'Endpoint title prompt',
|
||||
titleMethod: 'structured',
|
||||
// titlePromptTemplate is omitted to test fallback
|
||||
},
|
||||
all: {
|
||||
titleModel: 'gpt-4o-mini',
|
||||
titlePrompt: 'All config title prompt',
|
||||
titleMethod: 'completion',
|
||||
titlePromptTemplate: 'All config template',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = 'Test conversation text';
|
||||
|
|
@ -346,17 +376,18 @@ describe('AgentClient - titleConvo', () => {
|
|||
const originalApiKey = process.env.ANTHROPIC_API_KEY;
|
||||
process.env.ANTHROPIC_API_KEY = 'test-anthropic-key';
|
||||
|
||||
// Remove endpoint-specific config to test 'all' config
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI];
|
||||
|
||||
// Set comprehensive 'all' config with all new title options
|
||||
mockReq.app.locals.all = {
|
||||
titleConvo: true,
|
||||
titleModel: 'claude-3-haiku-20240307',
|
||||
titleMethod: 'completion', // Testing the new default method
|
||||
titlePrompt: 'Generate a concise, descriptive title for this conversation',
|
||||
titlePromptTemplate: 'Conversation summary: {{content}}',
|
||||
titleEndpoint: EModelEndpoint.anthropic, // Should switch provider to Anthropic
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
all: {
|
||||
titleConvo: true,
|
||||
titleModel: 'claude-3-haiku-20240307',
|
||||
titleMethod: 'completion', // Testing the new default method
|
||||
titlePrompt: 'Generate a concise, descriptive title for this conversation',
|
||||
titlePromptTemplate: 'Conversation summary: {{content}}',
|
||||
titleEndpoint: EModelEndpoint.anthropic, // Should switch provider to Anthropic
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = 'Test conversation about AI and machine learning';
|
||||
|
|
@ -402,15 +433,16 @@ describe('AgentClient - titleConvo', () => {
|
|||
// Clear previous calls
|
||||
mockRun.generateTitle.mockClear();
|
||||
|
||||
// Remove endpoint config
|
||||
delete mockReq.app.locals[EModelEndpoint.openAI];
|
||||
|
||||
// Set 'all' config with specific titleMethod
|
||||
mockReq.app.locals.all = {
|
||||
titleModel: 'gpt-4o-mini',
|
||||
titleMethod: method,
|
||||
titlePrompt: `Testing ${method} method`,
|
||||
titlePromptTemplate: `Template for ${method}: {{content}}`,
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
all: {
|
||||
titleModel: 'gpt-4o-mini',
|
||||
titleMethod: method,
|
||||
titlePrompt: `Testing ${method} method`,
|
||||
titlePromptTemplate: `Template for ${method}: {{content}}`,
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const text = `Test conversation for ${method} method`;
|
||||
|
|
@ -455,29 +487,33 @@ describe('AgentClient - titleConvo', () => {
|
|||
// Set up Azure endpoint with serverless config
|
||||
mockAgent.endpoint = EModelEndpoint.azureOpenAI;
|
||||
mockAgent.provider = EModelEndpoint.azureOpenAI;
|
||||
mockReq.app.locals[EModelEndpoint.azureOpenAI] = {
|
||||
titleConvo: true,
|
||||
titleModel: 'grok-3',
|
||||
titleMethod: 'completion',
|
||||
titlePrompt: 'Azure serverless title prompt',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'grok-3': {
|
||||
group: 'Azure AI Foundry',
|
||||
deploymentName: 'grok-3',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
'Azure AI Foundry': {
|
||||
apiKey: '${AZURE_API_KEY}',
|
||||
baseURL: 'https://test.services.ai.azure.com/models',
|
||||
version: '2024-05-01-preview',
|
||||
serverless: true,
|
||||
models: {
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.azureOpenAI]: {
|
||||
titleConvo: true,
|
||||
titleModel: 'grok-3',
|
||||
titleMethod: 'completion',
|
||||
titlePrompt: 'Azure serverless title prompt',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'grok-3': {
|
||||
group: 'Azure AI Foundry',
|
||||
deploymentName: 'grok-3',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
'Azure AI Foundry': {
|
||||
apiKey: '${AZURE_API_KEY}',
|
||||
baseURL: 'https://test.services.ai.azure.com/models',
|
||||
version: '2024-05-01-preview',
|
||||
serverless: true,
|
||||
models: {
|
||||
'grok-3': {
|
||||
deploymentName: 'grok-3',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
|
@ -503,28 +539,32 @@ describe('AgentClient - titleConvo', () => {
|
|||
// Set up Azure endpoint
|
||||
mockAgent.endpoint = EModelEndpoint.azureOpenAI;
|
||||
mockAgent.provider = EModelEndpoint.azureOpenAI;
|
||||
mockReq.app.locals[EModelEndpoint.azureOpenAI] = {
|
||||
titleConvo: true,
|
||||
titleModel: 'gpt-4o',
|
||||
titleMethod: 'structured',
|
||||
titlePrompt: 'Azure instance title prompt',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'gpt-4o': {
|
||||
group: 'eastus',
|
||||
deploymentName: 'gpt-4o',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
eastus: {
|
||||
apiKey: '${EASTUS_API_KEY}',
|
||||
instanceName: 'region-instance',
|
||||
version: '2024-02-15-preview',
|
||||
models: {
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.azureOpenAI]: {
|
||||
titleConvo: true,
|
||||
titleModel: 'gpt-4o',
|
||||
titleMethod: 'structured',
|
||||
titlePrompt: 'Azure instance title prompt',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'gpt-4o': {
|
||||
group: 'eastus',
|
||||
deploymentName: 'gpt-4o',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
eastus: {
|
||||
apiKey: '${EASTUS_API_KEY}',
|
||||
instanceName: 'region-instance',
|
||||
version: '2024-02-15-preview',
|
||||
models: {
|
||||
'gpt-4o': {
|
||||
deploymentName: 'gpt-4o',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
|
@ -551,29 +591,33 @@ describe('AgentClient - titleConvo', () => {
|
|||
mockAgent.endpoint = EModelEndpoint.azureOpenAI;
|
||||
mockAgent.provider = EModelEndpoint.azureOpenAI;
|
||||
mockAgent.model_parameters.model = 'gpt-4o-latest';
|
||||
mockReq.app.locals[EModelEndpoint.azureOpenAI] = {
|
||||
titleConvo: true,
|
||||
titleModel: Constants.CURRENT_MODEL,
|
||||
titleMethod: 'functions',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'gpt-4o-latest': {
|
||||
group: 'region-eastus',
|
||||
deploymentName: 'gpt-4o-mini',
|
||||
version: '2024-02-15-preview',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
'region-eastus': {
|
||||
apiKey: '${EASTUS2_API_KEY}',
|
||||
instanceName: 'test-instance',
|
||||
version: '2024-12-01-preview',
|
||||
models: {
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.azureOpenAI]: {
|
||||
titleConvo: true,
|
||||
titleModel: Constants.CURRENT_MODEL,
|
||||
titleMethod: 'functions',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'gpt-4o-latest': {
|
||||
group: 'region-eastus',
|
||||
deploymentName: 'gpt-4o-mini',
|
||||
version: '2024-02-15-preview',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
'region-eastus': {
|
||||
apiKey: '${EASTUS2_API_KEY}',
|
||||
instanceName: 'test-instance',
|
||||
version: '2024-12-01-preview',
|
||||
models: {
|
||||
'gpt-4o-latest': {
|
||||
deploymentName: 'gpt-4o-mini',
|
||||
version: '2024-02-15-preview',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
|
@ -598,56 +642,60 @@ describe('AgentClient - titleConvo', () => {
|
|||
// Set up Azure endpoint
|
||||
mockAgent.endpoint = EModelEndpoint.azureOpenAI;
|
||||
mockAgent.provider = EModelEndpoint.azureOpenAI;
|
||||
mockReq.app.locals[EModelEndpoint.azureOpenAI] = {
|
||||
titleConvo: true,
|
||||
titleModel: 'o1-mini',
|
||||
titleMethod: 'completion',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'gpt-4o': {
|
||||
group: 'eastus',
|
||||
deploymentName: 'gpt-4o',
|
||||
},
|
||||
'o1-mini': {
|
||||
group: 'region-eastus',
|
||||
deploymentName: 'o1-mini',
|
||||
},
|
||||
'codex-mini': {
|
||||
group: 'codex-mini',
|
||||
deploymentName: 'codex-mini',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
eastus: {
|
||||
apiKey: '${EASTUS_API_KEY}',
|
||||
instanceName: 'region-eastus',
|
||||
version: '2024-02-15-preview',
|
||||
models: {
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
[EModelEndpoint.azureOpenAI]: {
|
||||
titleConvo: true,
|
||||
titleModel: 'o1-mini',
|
||||
titleMethod: 'completion',
|
||||
streamRate: 35,
|
||||
modelGroupMap: {
|
||||
'gpt-4o': {
|
||||
group: 'eastus',
|
||||
deploymentName: 'gpt-4o',
|
||||
},
|
||||
},
|
||||
},
|
||||
'region-eastus': {
|
||||
apiKey: '${EASTUS2_API_KEY}',
|
||||
instanceName: 'region-eastus2',
|
||||
version: '2024-12-01-preview',
|
||||
models: {
|
||||
'o1-mini': {
|
||||
group: 'region-eastus',
|
||||
deploymentName: 'o1-mini',
|
||||
},
|
||||
},
|
||||
},
|
||||
'codex-mini': {
|
||||
apiKey: '${AZURE_API_KEY}',
|
||||
baseURL: 'https://example.cognitiveservices.azure.com/openai/',
|
||||
version: '2025-04-01-preview',
|
||||
serverless: true,
|
||||
models: {
|
||||
'codex-mini': {
|
||||
group: 'codex-mini',
|
||||
deploymentName: 'codex-mini',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
eastus: {
|
||||
apiKey: '${EASTUS_API_KEY}',
|
||||
instanceName: 'region-eastus',
|
||||
version: '2024-02-15-preview',
|
||||
models: {
|
||||
'gpt-4o': {
|
||||
deploymentName: 'gpt-4o',
|
||||
},
|
||||
},
|
||||
},
|
||||
'region-eastus': {
|
||||
apiKey: '${EASTUS2_API_KEY}',
|
||||
instanceName: 'region-eastus2',
|
||||
version: '2024-12-01-preview',
|
||||
models: {
|
||||
'o1-mini': {
|
||||
deploymentName: 'o1-mini',
|
||||
},
|
||||
},
|
||||
},
|
||||
'codex-mini': {
|
||||
apiKey: '${AZURE_API_KEY}',
|
||||
baseURL: 'https://example.cognitiveservices.azure.com/openai/',
|
||||
version: '2025-04-01-preview',
|
||||
serverless: true,
|
||||
models: {
|
||||
'codex-mini': {
|
||||
deploymentName: 'codex-mini',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
|
@ -679,33 +727,34 @@ describe('AgentClient - titleConvo', () => {
|
|||
mockReq.body.endpoint = EModelEndpoint.azureOpenAI;
|
||||
mockReq.body.model = 'gpt-4';
|
||||
|
||||
// Remove Azure-specific config
|
||||
delete mockReq.app.locals[EModelEndpoint.azureOpenAI];
|
||||
|
||||
// Set 'all' config as fallback with a serverless Azure config
|
||||
mockReq.app.locals.all = {
|
||||
titleConvo: true,
|
||||
titleModel: 'gpt-4',
|
||||
titleMethod: 'structured',
|
||||
titlePrompt: 'Fallback title prompt from all config',
|
||||
titlePromptTemplate: 'Template: {{content}}',
|
||||
modelGroupMap: {
|
||||
'gpt-4': {
|
||||
group: 'default-group',
|
||||
deploymentName: 'gpt-4',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
'default-group': {
|
||||
apiKey: '${AZURE_API_KEY}',
|
||||
baseURL: 'https://default.openai.azure.com/',
|
||||
version: '2024-02-15-preview',
|
||||
serverless: true,
|
||||
models: {
|
||||
mockReq.config = {
|
||||
endpoints: {
|
||||
all: {
|
||||
titleConvo: true,
|
||||
titleModel: 'gpt-4',
|
||||
titleMethod: 'structured',
|
||||
titlePrompt: 'Fallback title prompt from all config',
|
||||
titlePromptTemplate: 'Template: {{content}}',
|
||||
modelGroupMap: {
|
||||
'gpt-4': {
|
||||
group: 'default-group',
|
||||
deploymentName: 'gpt-4',
|
||||
},
|
||||
},
|
||||
groupMap: {
|
||||
'default-group': {
|
||||
apiKey: '${AZURE_API_KEY}',
|
||||
baseURL: 'https://default.openai.azure.com/',
|
||||
version: '2024-02-15-preview',
|
||||
serverless: true,
|
||||
models: {
|
||||
'gpt-4': {
|
||||
deploymentName: 'gpt-4',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
|
@ -982,13 +1031,6 @@ describe('AgentClient - titleConvo', () => {
|
|||
};
|
||||
|
||||
mockReq = {
|
||||
app: {
|
||||
locals: {
|
||||
memory: {
|
||||
messageWindowSize: 3,
|
||||
},
|
||||
},
|
||||
},
|
||||
user: {
|
||||
id: 'user-123',
|
||||
personalization: {
|
||||
|
|
@ -997,6 +1039,13 @@ describe('AgentClient - titleConvo', () => {
|
|||
},
|
||||
};
|
||||
|
||||
// Mock getAppConfig for memory tests
|
||||
mockReq.config = {
|
||||
memory: {
|
||||
messageWindowSize: 3,
|
||||
},
|
||||
};
|
||||
|
||||
mockRes = {};
|
||||
|
||||
mockOptions = {
|
||||
|
|
|
|||
|
|
@ -21,7 +21,7 @@ const getLogStores = require('~/cache/getLogStores');
|
|||
|
||||
/**
|
||||
* @typedef {Object} ErrorHandlerDependencies
|
||||
* @property {Express.Request} req - The Express request object
|
||||
* @property {ServerRequest} req - The Express request object
|
||||
* @property {Express.Response} res - The Express response object
|
||||
* @property {() => ErrorHandlerContext} getContext - Function to get the current context
|
||||
* @property {string} [originPath] - The origin path for the error handler
|
||||
|
|
|
|||
|
|
@ -487,6 +487,7 @@ const getListAgentsHandler = async (req, res) => {
|
|||
*/
|
||||
const uploadAgentAvatarHandler = async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
filterFile({ req, file: req.file, image: true, isAvatar: true });
|
||||
const { agent_id } = req.params;
|
||||
if (!agent_id) {
|
||||
|
|
@ -510,9 +511,7 @@ const uploadAgentAvatarHandler = async (req, res) => {
|
|||
}
|
||||
|
||||
const buffer = await fs.readFile(req.file.path);
|
||||
|
||||
const fileStrategy = getFileStrategy(req.app.locals, { isAvatar: true });
|
||||
|
||||
const fileStrategy = getFileStrategy(appConfig, { isAvatar: true });
|
||||
const resizedBuffer = await resizeAvatar({
|
||||
userId: req.user.id,
|
||||
input: buffer,
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
const { v4 } = require('uuid');
|
||||
const { sleep } = require('@librechat/agents');
|
||||
const { sendEvent } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { sendEvent, getBalanceConfig } = require('@librechat/api');
|
||||
const {
|
||||
Time,
|
||||
Constants,
|
||||
|
|
@ -47,6 +47,7 @@ const { getOpenAIClient } = require('./helpers');
|
|||
* @returns {void}
|
||||
*/
|
||||
const chatV1 = async (req, res) => {
|
||||
const appConfig = req.config;
|
||||
logger.debug('[/assistants/chat/] req.body', req.body);
|
||||
|
||||
const {
|
||||
|
|
@ -251,8 +252,8 @@ const chatV1 = async (req, res) => {
|
|||
}
|
||||
|
||||
const checkBalanceBeforeRun = async () => {
|
||||
const balance = req.app?.locals?.balance;
|
||||
if (!balance?.enabled) {
|
||||
const balanceConfig = getBalanceConfig(appConfig);
|
||||
if (!balanceConfig?.enabled) {
|
||||
return;
|
||||
}
|
||||
const transactions =
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
const { v4 } = require('uuid');
|
||||
const { sleep } = require('@librechat/agents');
|
||||
const { sendEvent } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { sendEvent, getBalanceConfig } = require('@librechat/api');
|
||||
const {
|
||||
Time,
|
||||
Constants,
|
||||
|
|
@ -38,12 +38,13 @@ const { getOpenAIClient } = require('./helpers');
|
|||
* @route POST /
|
||||
* @desc Chat with an assistant
|
||||
* @access Public
|
||||
* @param {Express.Request} req - The request object, containing the request data.
|
||||
* @param {ServerRequest} req - The request object, containing the request data.
|
||||
* @param {Express.Response} res - The response object, used to send back a response.
|
||||
* @returns {void}
|
||||
*/
|
||||
const chatV2 = async (req, res) => {
|
||||
logger.debug('[/assistants/chat/] req.body', req.body);
|
||||
const appConfig = req.config;
|
||||
|
||||
/** @type {{files: MongoFile[]}} */
|
||||
const {
|
||||
|
|
@ -126,8 +127,8 @@ const chatV2 = async (req, res) => {
|
|||
}
|
||||
|
||||
const checkBalanceBeforeRun = async () => {
|
||||
const balance = req.app?.locals?.balance;
|
||||
if (!balance?.enabled) {
|
||||
const balanceConfig = getBalanceConfig(appConfig);
|
||||
if (!balanceConfig?.enabled) {
|
||||
return;
|
||||
}
|
||||
const transactions =
|
||||
|
|
@ -374,9 +375,9 @@ const chatV2 = async (req, res) => {
|
|||
};
|
||||
|
||||
/** @type {undefined | TAssistantEndpoint} */
|
||||
const config = req.app.locals[endpoint] ?? {};
|
||||
const config = appConfig.endpoints?.[endpoint] ?? {};
|
||||
/** @type {undefined | TBaseEndpoint} */
|
||||
const allConfig = req.app.locals.all;
|
||||
const allConfig = appConfig.endpoints?.all;
|
||||
|
||||
const streamRunManager = new StreamRunManager({
|
||||
req,
|
||||
|
|
|
|||
|
|
@ -22,7 +22,7 @@ const getLogStores = require('~/cache/getLogStores');
|
|||
|
||||
/**
|
||||
* @typedef {Object} ErrorHandlerDependencies
|
||||
* @property {Express.Request} req - The Express request object
|
||||
* @property {ServerRequest} req - The Express request object
|
||||
* @property {Express.Response} res - The Express response object
|
||||
* @property {() => ErrorHandlerContext} getContext - Function to get the current context
|
||||
* @property {string} [originPath] - The origin path for the error handler
|
||||
|
|
|
|||
|
|
@ -11,7 +11,7 @@ const { initializeClient } = require('~/server/services/Endpoints/assistants');
|
|||
const { getEndpointsConfig } = require('~/server/services/Config');
|
||||
|
||||
/**
|
||||
* @param {Express.Request} req
|
||||
* @param {ServerRequest} req
|
||||
* @param {string} [endpoint]
|
||||
* @returns {Promise<string>}
|
||||
*/
|
||||
|
|
@ -210,6 +210,7 @@ async function getOpenAIClient({ req, res, endpointOption, initAppClient, overri
|
|||
* @returns {Promise<AssistantListResponse>} 200 - success response - application/json
|
||||
*/
|
||||
const fetchAssistants = async ({ req, res, overrideEndpoint }) => {
|
||||
const appConfig = req.config;
|
||||
const {
|
||||
limit = 100,
|
||||
order = 'desc',
|
||||
|
|
@ -230,20 +231,20 @@ const fetchAssistants = async ({ req, res, overrideEndpoint }) => {
|
|||
if (endpoint === EModelEndpoint.assistants) {
|
||||
({ body } = await listAllAssistants({ req, res, version, query }));
|
||||
} else if (endpoint === EModelEndpoint.azureAssistants) {
|
||||
const azureConfig = req.app.locals[EModelEndpoint.azureOpenAI];
|
||||
const azureConfig = appConfig.endpoints?.[EModelEndpoint.azureOpenAI];
|
||||
body = await listAssistantsForAzure({ req, res, version, azureConfig, query });
|
||||
}
|
||||
|
||||
if (req.user.role === SystemRoles.ADMIN) {
|
||||
return body;
|
||||
} else if (!req.app.locals[endpoint]) {
|
||||
} else if (!appConfig.endpoints?.[endpoint]) {
|
||||
return body;
|
||||
}
|
||||
|
||||
body.data = filterAssistants({
|
||||
userId: req.user.id,
|
||||
assistants: body.data,
|
||||
assistantsConfig: req.app.locals[endpoint],
|
||||
assistantsConfig: appConfig.endpoints?.[endpoint],
|
||||
});
|
||||
return body;
|
||||
};
|
||||
|
|
|
|||
|
|
@ -258,8 +258,9 @@ function filterAssistantDocs({ documents, userId, assistantsConfig = {} }) {
|
|||
*/
|
||||
const getAssistantDocuments = async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
const endpoint = req.query;
|
||||
const assistantsConfig = req.app.locals[endpoint];
|
||||
const assistantsConfig = appConfig.endpoints?.[endpoint];
|
||||
const documents = await getAssistants(
|
||||
{},
|
||||
{
|
||||
|
|
@ -296,6 +297,7 @@ const getAssistantDocuments = async (req, res) => {
|
|||
*/
|
||||
const uploadAssistantAvatar = async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
filterFile({ req, file: req.file, image: true, isAvatar: true });
|
||||
const { assistant_id } = req.params;
|
||||
if (!assistant_id) {
|
||||
|
|
@ -337,7 +339,7 @@ const uploadAssistantAvatar = async (req, res) => {
|
|||
const metadata = {
|
||||
..._metadata,
|
||||
avatar: image.filepath,
|
||||
avatar_source: req.app.locals.fileStrategy,
|
||||
avatar_source: appConfig.fileStrategy,
|
||||
};
|
||||
|
||||
const promises = [];
|
||||
|
|
@ -347,7 +349,7 @@ const uploadAssistantAvatar = async (req, res) => {
|
|||
{
|
||||
avatar: {
|
||||
filepath: image.filepath,
|
||||
source: req.app.locals.fileStrategy,
|
||||
source: appConfig.fileStrategy,
|
||||
},
|
||||
user: req.user.id,
|
||||
},
|
||||
|
|
|
|||
|
|
@ -94,7 +94,7 @@ const createAssistant = async (req, res) => {
|
|||
/**
|
||||
* Modifies an assistant.
|
||||
* @param {object} params
|
||||
* @param {Express.Request} params.req
|
||||
* @param {ServerRequest} params.req
|
||||
* @param {OpenAIClient} params.openai
|
||||
* @param {string} params.assistant_id
|
||||
* @param {AssistantUpdateParams} params.updateData
|
||||
|
|
@ -199,7 +199,7 @@ const updateAssistant = async ({ req, openai, assistant_id, updateData }) => {
|
|||
/**
|
||||
* Modifies an assistant with the resource file id.
|
||||
* @param {object} params
|
||||
* @param {Express.Request} params.req
|
||||
* @param {ServerRequest} params.req
|
||||
* @param {OpenAIClient} params.openai
|
||||
* @param {string} params.assistant_id
|
||||
* @param {string} params.tool_resource
|
||||
|
|
@ -227,7 +227,7 @@ const addResourceFileId = async ({ req, openai, assistant_id, tool_resource, fil
|
|||
/**
|
||||
* Deletes a file ID from an assistant's resource.
|
||||
* @param {object} params
|
||||
* @param {Express.Request} params.req
|
||||
* @param {ServerRequest} params.req
|
||||
* @param {OpenAIClient} params.openai
|
||||
* @param {string} params.assistant_id
|
||||
* @param {string} [params.tool_resource]
|
||||
|
|
|
|||
|
|
@ -35,9 +35,10 @@ const toolAccessPermType = {
|
|||
*/
|
||||
const verifyWebSearchAuth = async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
const userId = req.user.id;
|
||||
/** @type {TCustomConfig['webSearch']} */
|
||||
const webSearchConfig = req.app.locals?.webSearch || {};
|
||||
const webSearchConfig = appConfig?.webSearch || {};
|
||||
const result = await loadWebSearchAuth({
|
||||
userId,
|
||||
loadAuthValues,
|
||||
|
|
@ -110,6 +111,7 @@ const verifyToolAuth = async (req, res) => {
|
|||
*/
|
||||
const callTool = async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
const { toolId = '' } = req.params;
|
||||
if (!fieldsMap[toolId]) {
|
||||
logger.warn(`[${toolId}/call] User ${req.user.id} attempted call to invalid tool`);
|
||||
|
|
@ -155,8 +157,10 @@ const callTool = async (req, res) => {
|
|||
returnMetadata: true,
|
||||
processFileURL,
|
||||
uploadImageBuffer,
|
||||
fileStrategy: req.app.locals.fileStrategy,
|
||||
},
|
||||
webSearch: appConfig.webSearch,
|
||||
fileStrategy: appConfig.fileStrategy,
|
||||
imageOutputType: appConfig.imageOutputType,
|
||||
});
|
||||
|
||||
const tool = loadedTools[0];
|
||||
|
|
|
|||
|
|
@ -14,12 +14,14 @@ const { isEnabled, ErrorController } = require('@librechat/api');
|
|||
const { connectDb, indexSync } = require('~/db');
|
||||
const validateImageRequest = require('./middleware/validateImageRequest');
|
||||
const { jwtLogin, ldapLogin, passportLogin } = require('~/strategies');
|
||||
const { updateInterfacePermissions } = require('~/models/interface');
|
||||
const { checkMigrations } = require('./services/start/migration');
|
||||
const initializeMCPs = require('./services/initializeMCPs');
|
||||
const configureSocialLogins = require('./socialLogins');
|
||||
const AppService = require('./services/AppService');
|
||||
const { getAppConfig } = require('./services/Config');
|
||||
const staticCache = require('./utils/staticCache');
|
||||
const noIndex = require('./middleware/noIndex');
|
||||
const { seedDatabase } = require('~/models');
|
||||
const routes = require('./routes');
|
||||
|
||||
const { PORT, HOST, ALLOW_SOCIAL_LOGIN, DISABLE_COMPRESSION, TRUST_PROXY } = process.env ?? {};
|
||||
|
|
@@ -45,9 +47,11 @@ const startServer = async () => {
  app.disable('x-powered-by');
  app.set('trust proxy', trusted_proxy);

  await AppService(app);
  await seedDatabase();

  const indexPath = path.join(app.locals.paths.dist, 'index.html');
  const appConfig = await getAppConfig();
  await updateInterfacePermissions(appConfig);
  const indexPath = path.join(appConfig.paths.dist, 'index.html');
  const indexHTML = fs.readFileSync(indexPath, 'utf8');

  app.get('/health', (_req, res) => res.status(200).send('OK'));

@@ -66,10 +70,9 @@ const startServer = async () => {
    console.warn('Response compression has been disabled via DISABLE_COMPRESSION.');
  }

  // Serve static assets with aggressive caching
  app.use(staticCache(app.locals.paths.dist));
  app.use(staticCache(app.locals.paths.fonts));
  app.use(staticCache(app.locals.paths.assets));
  app.use(staticCache(appConfig.paths.dist));
  app.use(staticCache(appConfig.paths.fonts));
  app.use(staticCache(appConfig.paths.assets));

  if (!ALLOW_SOCIAL_LOGIN) {
    console.warn('Social logins are disabled. Set ALLOW_SOCIAL_LOGIN=true to enable them.');

@@ -146,7 +149,7 @@ const startServer = async () => {
    logger.info(`Server listening at http://${host == '0.0.0.0' ? 'localhost' : host}:${port}`);
  }

    initializeMCPs(app).then(() => checkMigrations());
    initializeMCPs().then(() => checkMigrations());
  });
};
|
||||
|
||||
|
|
|
|||
|
|
@ -3,9 +3,27 @@ const request = require('supertest');
|
|||
const { MongoMemoryServer } = require('mongodb-memory-server');
|
||||
const mongoose = require('mongoose');
|
||||
|
||||
jest.mock('~/server/services/Config/loadCustomConfig', () => {
|
||||
return jest.fn(() => Promise.resolve({}));
|
||||
});
|
||||
jest.mock('~/server/services/Config', () => ({
|
||||
loadCustomConfig: jest.fn(() => Promise.resolve({})),
|
||||
getAppConfig: jest.fn().mockResolvedValue({
|
||||
paths: {
|
||||
uploads: '/tmp',
|
||||
dist: '/tmp/dist',
|
||||
fonts: '/tmp/fonts',
|
||||
assets: '/tmp/assets',
|
||||
},
|
||||
fileStrategy: 'local',
|
||||
imageOutputType: 'PNG',
|
||||
}),
|
||||
setCachedTools: jest.fn(),
|
||||
}));
|
||||
|
||||
jest.mock('~/app/clients/tools', () => ({
|
||||
createOpenAIImageTools: jest.fn(() => []),
|
||||
createYouTubeTools: jest.fn(() => []),
|
||||
manifestToolMap: {},
|
||||
toolkits: [],
|
||||
}));
|
||||
|
||||
describe('Server Configuration', () => {
|
||||
// Increase the default timeout to allow for Mongo cleanup
|
||||
|
|
@ -31,6 +49,22 @@ describe('Server Configuration', () => {
|
|||
});
|
||||
|
||||
beforeAll(async () => {
|
||||
// Create the required directories and files for the test
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
|
||||
const dirs = ['/tmp/dist', '/tmp/fonts', '/tmp/assets'];
|
||||
dirs.forEach((dir) => {
|
||||
if (!fs.existsSync(dir)) {
|
||||
fs.mkdirSync(dir, { recursive: true });
|
||||
}
|
||||
});
|
||||
|
||||
fs.writeFileSync(
|
||||
path.join('/tmp/dist', 'index.html'),
|
||||
'<!DOCTYPE html><html><head><title>LibreChat</title></head><body><div id="root"></div></body></html>',
|
||||
);
|
||||
|
||||
mongoServer = await MongoMemoryServer.create();
|
||||
process.env.MONGO_URI = mongoServer.getUri();
|
||||
process.env.PORT = '0'; // Use a random available port
|
||||
|
|
|
|||
|
|
@ -12,8 +12,9 @@ const { handleAbortError } = require('~/server/middleware/abortMiddleware');
|
|||
const validateAssistant = async (req, res, next) => {
|
||||
const { endpoint, conversationId, assistant_id, messageId } = req.body;
|
||||
|
||||
const appConfig = req.config;
|
||||
/** @type {Partial<TAssistantEndpoint>} */
|
||||
const assistantsConfig = req.app.locals?.[endpoint];
|
||||
const assistantsConfig = appConfig.endpoints?.[endpoint];
|
||||
if (!assistantsConfig) {
|
||||
return next();
|
||||
}
|
||||
|
|
|
|||
|
|
@ -20,8 +20,9 @@ const validateAuthor = async ({ req, openai, overrideEndpoint, overrideAssistant
|
|||
const assistant_id =
|
||||
overrideAssistantId ?? req.params.id ?? req.body.assistant_id ?? req.query.assistant_id;
|
||||
|
||||
const appConfig = req.config;
|
||||
/** @type {Partial<TAssistantEndpoint>} */
|
||||
const assistantsConfig = req.app.locals?.[endpoint];
|
||||
const assistantsConfig = appConfig.endpoints?.[endpoint];
|
||||
if (!assistantsConfig) {
|
||||
return;
|
||||
}
|
||||
|
|
|
|||
|
|
@ -40,9 +40,10 @@ async function buildEndpointOption(req, res, next) {
|
|||
return handleError(res, { text: 'Error parsing conversation' });
|
||||
}
|
||||
|
||||
if (req.app.locals.modelSpecs?.list && req.app.locals.modelSpecs?.enforce) {
|
||||
const appConfig = req.config;
|
||||
if (appConfig.modelSpecs?.list && appConfig.modelSpecs?.enforce) {
|
||||
/** @type {{ list: TModelSpec[] }}*/
|
||||
const { list } = req.app.locals.modelSpecs;
|
||||
const { list } = appConfig.modelSpecs;
|
||||
const { spec } = parsedBody;
|
||||
|
||||
if (!spec) {
|
||||
|
|
|
|||
|
|
@@ -1,5 +1,6 @@
const { logger } = require('@librechat/data-schemas');
const { isEmailDomainAllowed } = require('~/server/services/domains');
const { logger } = require('~/config');
const { getAppConfig } = require('~/server/services/Config');

/**
 * Checks the domain's social login is allowed

@@ -14,7 +15,10 @@ const { logger } = require('~/config');
 */
const checkDomainAllowed = async (req, res, next = () => {}) => {
  const email = req?.user?.email;
  if (email && !(await isEmailDomainAllowed(email))) {
  const appConfig = await getAppConfig({
    role: req?.user?.role,
  });
  if (email && !isEmailDomainAllowed(email, appConfig?.registration?.allowedDomains)) {
    logger.error(`[Social Login] [Social Login not allowed] [Email: ${email}]`);
    return res.redirect('/login');
  } else {

api/server/middleware/config/app.js (new file, 27 lines)

@@ -0,0 +1,27 @@
const { logger } = require('@librechat/data-schemas');
const { getAppConfig } = require('~/server/services/Config');

const configMiddleware = async (req, res, next) => {
  try {
    const userRole = req.user?.role;
    req.config = await getAppConfig({ role: userRole });

    next();
  } catch (error) {
    logger.error('Config middleware error:', {
      error: error.message,
      userRole: req.user?.role,
      path: req.path,
    });

    try {
      req.config = await getAppConfig();
      next();
    } catch (fallbackError) {
      logger.error('Fallback config middleware error:', fallbackError);
      next(fallbackError);
    }
  }
};

module.exports = configMiddleware;

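For reference, a minimal sketch of how this middleware would be consumed downstream. The router path and handler below are illustrative assumptions, not part of this diff; they only assume that configMiddleware populates `req.config` via getAppConfig({ role }) as shown above.

// Hypothetical wiring example (not part of this PR): mount configMiddleware so that
// downstream handlers read the role-scoped app config from `req.config` instead of `app.locals`.
const express = require('express');
const configMiddleware = require('~/server/middleware/config/app');

const router = express.Router();
router.use(configMiddleware);

router.get('/example', (req, res) => {
  // `req.config` was set per request by configMiddleware
  const appConfig = req.config;
  const azureConfig = appConfig?.endpoints?.azureOpenAI ?? {};
  res.json({ titleModel: azureConfig.titleModel ?? null });
});

module.exports = router;
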
@ -82,7 +82,7 @@ const sendError = async (req, res, options, callback) => {
|
|||
|
||||
/**
|
||||
* Sends the response based on whether headers have been sent or not.
|
||||
* @param {Express.Request} req - The server response.
|
||||
* @param {ServerRequest} req - The server response.
|
||||
* @param {Express.Response} res - The server response.
|
||||
* @param {Object} data - The data to be sent.
|
||||
* @param {string} [errorMessage] - The error message, if any.
|
||||
|
|
|
|||
|
|
@ -13,6 +13,7 @@ const requireLdapAuth = require('./requireLdapAuth');
|
|||
const abortMiddleware = require('./abortMiddleware');
|
||||
const checkInviteUser = require('./checkInviteUser');
|
||||
const requireJwtAuth = require('./requireJwtAuth');
|
||||
const configMiddleware = require('./config/app');
|
||||
const validateModel = require('./validateModel');
|
||||
const moderateText = require('./moderateText');
|
||||
const logHeaders = require('./logHeaders');
|
||||
|
|
@ -43,6 +44,7 @@ module.exports = {
|
|||
requireLocalAuth,
|
||||
canDeleteAccount,
|
||||
validateEndpoint,
|
||||
configMiddleware,
|
||||
concurrentLimiter,
|
||||
checkDomainAllowed,
|
||||
validateMessageReq,
|
||||
|
|
|
|||
|
|
@ -1,13 +1,18 @@
|
|||
const jwt = require('jsonwebtoken');
|
||||
const validateImageRequest = require('~/server/middleware/validateImageRequest');
|
||||
|
||||
jest.mock('~/server/services/Config/app', () => ({
|
||||
getAppConfig: jest.fn(),
|
||||
}));
|
||||
|
||||
describe('validateImageRequest middleware', () => {
|
||||
let req, res, next;
|
||||
const validObjectId = '65cfb246f7ecadb8b1e8036b';
|
||||
const { getAppConfig } = require('~/server/services/Config/app');
|
||||
|
||||
beforeEach(() => {
|
||||
jest.clearAllMocks();
|
||||
req = {
|
||||
app: { locals: { secureImageLinks: true } },
|
||||
headers: {},
|
||||
originalUrl: '',
|
||||
};
|
||||
|
|
@ -17,79 +22,86 @@ describe('validateImageRequest middleware', () => {
|
|||
};
|
||||
next = jest.fn();
|
||||
process.env.JWT_REFRESH_SECRET = 'test-secret';
|
||||
|
||||
// Mock getAppConfig to return secureImageLinks: true by default
|
||||
getAppConfig.mockResolvedValue({
|
||||
secureImageLinks: true,
|
||||
});
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
jest.clearAllMocks();
|
||||
});
|
||||
|
||||
test('should call next() if secureImageLinks is false', () => {
|
||||
req.app.locals.secureImageLinks = false;
|
||||
validateImageRequest(req, res, next);
|
||||
test('should call next() if secureImageLinks is false', async () => {
|
||||
getAppConfig.mockResolvedValue({
|
||||
secureImageLinks: false,
|
||||
});
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(next).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
test('should return 401 if refresh token is not provided', () => {
|
||||
validateImageRequest(req, res, next);
|
||||
test('should return 401 if refresh token is not provided', async () => {
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(res.status).toHaveBeenCalledWith(401);
|
||||
expect(res.send).toHaveBeenCalledWith('Unauthorized');
|
||||
});
|
||||
|
||||
test('should return 403 if refresh token is invalid', () => {
|
||||
test('should return 403 if refresh token is invalid', async () => {
|
||||
req.headers.cookie = 'refreshToken=invalid-token';
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(res.status).toHaveBeenCalledWith(403);
|
||||
expect(res.send).toHaveBeenCalledWith('Access Denied');
|
||||
});
|
||||
|
||||
test('should return 403 if refresh token is expired', () => {
|
||||
test('should return 403 if refresh token is expired', async () => {
|
||||
const expiredToken = jwt.sign(
|
||||
{ id: validObjectId, exp: Math.floor(Date.now() / 1000) - 3600 },
|
||||
process.env.JWT_REFRESH_SECRET,
|
||||
);
|
||||
req.headers.cookie = `refreshToken=${expiredToken}`;
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(res.status).toHaveBeenCalledWith(403);
|
||||
expect(res.send).toHaveBeenCalledWith('Access Denied');
|
||||
});
|
||||
|
||||
test('should call next() for valid image path', () => {
|
||||
test('should call next() for valid image path', async () => {
|
||||
const validToken = jwt.sign(
|
||||
{ id: validObjectId, exp: Math.floor(Date.now() / 1000) + 3600 },
|
||||
process.env.JWT_REFRESH_SECRET,
|
||||
);
|
||||
req.headers.cookie = `refreshToken=${validToken}`;
|
||||
req.originalUrl = `/images/${validObjectId}/example.jpg`;
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(next).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
test('should return 403 for invalid image path', () => {
|
||||
test('should return 403 for invalid image path', async () => {
|
||||
const validToken = jwt.sign(
|
||||
{ id: validObjectId, exp: Math.floor(Date.now() / 1000) + 3600 },
|
||||
process.env.JWT_REFRESH_SECRET,
|
||||
);
|
||||
req.headers.cookie = `refreshToken=${validToken}`;
|
||||
req.originalUrl = '/images/65cfb246f7ecadb8b1e8036c/example.jpg'; // Different ObjectId
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(res.status).toHaveBeenCalledWith(403);
|
||||
expect(res.send).toHaveBeenCalledWith('Access Denied');
|
||||
});
|
||||
|
||||
test('should return 403 for invalid ObjectId format', () => {
|
||||
test('should return 403 for invalid ObjectId format', async () => {
|
||||
const validToken = jwt.sign(
|
||||
{ id: validObjectId, exp: Math.floor(Date.now() / 1000) + 3600 },
|
||||
process.env.JWT_REFRESH_SECRET,
|
||||
);
|
||||
req.headers.cookie = `refreshToken=${validToken}`;
|
||||
req.originalUrl = '/images/123/example.jpg'; // Invalid ObjectId
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(res.status).toHaveBeenCalledWith(403);
|
||||
expect(res.send).toHaveBeenCalledWith('Access Denied');
|
||||
});
|
||||
|
||||
// File traversal tests
|
||||
test('should prevent file traversal attempts', () => {
|
||||
test('should prevent file traversal attempts', async () => {
|
||||
const validToken = jwt.sign(
|
||||
{ id: validObjectId, exp: Math.floor(Date.now() / 1000) + 3600 },
|
||||
process.env.JWT_REFRESH_SECRET,
|
||||
|
|
@ -103,23 +115,23 @@ describe('validateImageRequest middleware', () => {
|
|||
`/images/${validObjectId}/%2e%2e%2f%2e%2e%2f%2e%2e%2fetc%2fpasswd`,
|
||||
];
|
||||
|
||||
traversalAttempts.forEach((attempt) => {
|
||||
for (const attempt of traversalAttempts) {
|
||||
req.originalUrl = attempt;
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(res.status).toHaveBeenCalledWith(403);
|
||||
expect(res.send).toHaveBeenCalledWith('Access Denied');
|
||||
jest.clearAllMocks();
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
test('should handle URL encoded characters in valid paths', () => {
|
||||
test('should handle URL encoded characters in valid paths', async () => {
|
||||
const validToken = jwt.sign(
|
||||
{ id: validObjectId, exp: Math.floor(Date.now() / 1000) + 3600 },
|
||||
process.env.JWT_REFRESH_SECRET,
|
||||
);
|
||||
req.headers.cookie = `refreshToken=${validToken}`;
|
||||
req.originalUrl = `/images/${validObjectId}/image%20with%20spaces.jpg`;
|
||||
validateImageRequest(req, res, next);
|
||||
await validateImageRequest(req, res, next);
|
||||
expect(next).toHaveBeenCalled();
|
||||
});
|
||||
});
|
||||
|
|
|
|||
|
|
@ -15,7 +15,7 @@ const { USE_REDIS, CONVO_ACCESS_VIOLATION_SCORE: score = 0 } = process.env ?? {}
|
|||
* If the `cache` store is not available, the middleware will skip its logic.
|
||||
*
|
||||
* @function
|
||||
* @param {Express.Request} req - Express request object containing user information.
|
||||
* @param {ServerRequest} req - Express request object containing user information.
|
||||
* @param {Express.Response} res - Express response object.
|
||||
* @param {function} next - Express next middleware function.
|
||||
* @throws {Error} Throws an error if the user doesn't have access to the conversation.
|
||||
|
|
|
|||
|
|
@@ -1,6 +1,7 @@
const cookies = require('cookie');
const jwt = require('jsonwebtoken');
const { logger } = require('~/config');
const { logger } = require('@librechat/data-schemas');
const { getAppConfig } = require('~/server/services/Config/app');

const OBJECT_ID_LENGTH = 24;
const OBJECT_ID_PATTERN = /^[0-9a-f]{24}$/i;

@@ -24,8 +25,9 @@ function isValidObjectId(id) {
 * Middleware to validate image request.
 * Must be set by `secureImageLinks` via custom config file.
 */
function validateImageRequest(req, res, next) {
if (!req.app.locals.secureImageLinks) {
async function validateImageRequest(req, res, next) {
  const appConfig = await getAppConfig({ role: req.user?.role });
  if (!appConfig.secureImageLinks) {
    return next();
  }

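For context, a minimal sketch of how the now-async middleware could be mounted in front of an image route. The '/images' path and the express.static target are assumptions for illustration only; Express accepts an async middleware as long as it eventually calls next() or ends the response, which is what the refactored function does.

// Illustrative mounting only; the mount path and static directory are assumed, not taken from this diff.
const express = require('express');
const validateImageRequest = require('~/server/middleware/validateImageRequest');

const app = express();
// Per request, the middleware resolves the app config and gates access when secureImageLinks is enabled.
app.use('/images', validateImageRequest, express.static('uploads/images'));
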
@ -6,7 +6,7 @@ const { logViolation } = require('~/cache');
|
|||
* Validates the model of the request.
|
||||
*
|
||||
* @async
|
||||
* @param {Express.Request} req - The Express request object.
|
||||
* @param {ServerRequest} req - The Express request object.
|
||||
* @param {Express.Response} res - The Express response object.
|
||||
* @param {Function} next - The Express next function.
|
||||
*/
|
||||
|
|
|
|||
|
|
@ -83,7 +83,11 @@ router.post(
|
|||
}
|
||||
|
||||
let metadata = await encryptMetadata(removeNullishValues(_metadata, true));
|
||||
const isDomainAllowed = await isActionDomainAllowed(metadata.domain);
|
||||
const appConfig = req.config;
|
||||
const isDomainAllowed = await isActionDomainAllowed(
|
||||
metadata.domain,
|
||||
appConfig?.actions?.allowedDomains,
|
||||
);
|
||||
if (!isDomainAllowed) {
|
||||
return res.status(400).json({ message: 'Domain not allowed' });
|
||||
}
|
||||
|
|
|
|||
|
|
@ -4,6 +4,7 @@ const {
|
|||
checkBan,
|
||||
requireJwtAuth,
|
||||
messageIpLimiter,
|
||||
configMiddleware,
|
||||
concurrentLimiter,
|
||||
messageUserLimiter,
|
||||
} = require('~/server/middleware');
|
||||
|
|
@ -22,6 +23,8 @@ router.use(uaParser);
|
|||
router.use('/', v1);
|
||||
|
||||
const chatRouter = express.Router();
|
||||
chatRouter.use(configMiddleware);
|
||||
|
||||
if (isEnabled(LIMIT_CONCURRENT_MESSAGES)) {
|
||||
chatRouter.use(concurrentLimiter);
|
||||
}
|
||||
|
|
@ -37,6 +40,4 @@ if (isEnabled(LIMIT_MESSAGE_USER)) {
|
|||
chatRouter.use('/', chat);
|
||||
router.use('/chat', chatRouter);
|
||||
|
||||
// Add marketplace routes
|
||||
|
||||
module.exports = router;
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
const express = require('express');
|
||||
const { callTool, verifyToolAuth, getToolCalls } = require('~/server/controllers/tools');
|
||||
const { getAvailableTools } = require('~/server/controllers/PluginController');
|
||||
const { toolCallLimiter } = require('~/server/middleware/limiters');
|
||||
const { toolCallLimiter } = require('~/server/middleware');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
const express = require('express');
|
||||
const { generateCheckAccess } = require('@librechat/api');
|
||||
const { PermissionTypes, Permissions, PermissionBits } = require('librechat-data-provider');
|
||||
const { requireJwtAuth, canAccessAgentResource } = require('~/server/middleware');
|
||||
const { requireJwtAuth, configMiddleware, canAccessAgentResource } = require('~/server/middleware');
|
||||
const v1 = require('~/server/controllers/agents/v1');
|
||||
const { getRoleByName } = require('~/models/Role');
|
||||
const actions = require('./actions');
|
||||
|
|
@ -36,13 +36,13 @@ router.use(requireJwtAuth);
|
|||
* Agent actions route.
|
||||
* @route GET|POST /agents/actions
|
||||
*/
|
||||
router.use('/actions', actions);
|
||||
router.use('/actions', configMiddleware, actions);
|
||||
|
||||
/**
|
||||
* Get a list of available tools for agents.
|
||||
* @route GET /agents/tools
|
||||
*/
|
||||
router.use('/tools', tools);
|
||||
router.use('/tools', configMiddleware, tools);
|
||||
|
||||
/**
|
||||
* Get all agent categories with counts
|
||||
|
|
|
|||
|
|
@ -1,12 +1,12 @@
|
|||
const express = require('express');
|
||||
const { nanoid } = require('nanoid');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { actionDelimiter, EModelEndpoint, removeNullishValues } = require('librechat-data-provider');
|
||||
const { encryptMetadata, domainParser } = require('~/server/services/ActionService');
|
||||
const { getOpenAIClient } = require('~/server/controllers/assistants/helpers');
|
||||
const { updateAction, getActions, deleteAction } = require('~/models/Action');
|
||||
const { updateAssistantDoc, getAssistant } = require('~/models/Assistant');
|
||||
const { isActionDomainAllowed } = require('~/server/services/domains');
|
||||
const { logger } = require('~/config');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
|
|
@ -21,6 +21,7 @@ const router = express.Router();
|
|||
*/
|
||||
router.post('/:assistant_id', async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
const { assistant_id } = req.params;
|
||||
|
||||
/** @type {{ functions: FunctionTool[], action_id: string, metadata: ActionMetadata }} */
|
||||
|
|
@ -30,7 +31,10 @@ router.post('/:assistant_id', async (req, res) => {
|
|||
}
|
||||
|
||||
let metadata = await encryptMetadata(removeNullishValues(_metadata, true));
|
||||
const isDomainAllowed = await isActionDomainAllowed(metadata.domain);
|
||||
const isDomainAllowed = await isActionDomainAllowed(
|
||||
metadata.domain,
|
||||
appConfig?.actions?.allowedDomains,
|
||||
);
|
||||
if (!isDomainAllowed) {
|
||||
return res.status(400).json({ message: 'Domain not allowed' });
|
||||
}
|
||||
|
|
@ -125,7 +129,7 @@ router.post('/:assistant_id', async (req, res) => {
|
|||
}
|
||||
|
||||
/* Map Azure OpenAI model to the assistant as defined by config */
|
||||
if (req.app.locals[EModelEndpoint.azureOpenAI]?.assistants) {
|
||||
if (appConfig.endpoints?.[EModelEndpoint.azureOpenAI]?.assistants) {
|
||||
updatedAssistant = {
|
||||
...updatedAssistant,
|
||||
model: req.body.model,
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
const express = require('express');
|
||||
const { uaParser, checkBan, requireJwtAuth, configMiddleware } = require('~/server/middleware');
|
||||
const router = express.Router();
|
||||
const { uaParser, checkBan, requireJwtAuth } = require('~/server/middleware');
|
||||
|
||||
const { v1 } = require('./v1');
|
||||
const chatV1 = require('./chatV1');
|
||||
|
|
@ -10,6 +10,7 @@ const chatV2 = require('./chatV2');
|
|||
router.use(requireJwtAuth);
|
||||
router.use(checkBan);
|
||||
router.use(uaParser);
|
||||
router.use(configMiddleware);
|
||||
router.use('/v1/', v1);
|
||||
router.use('/v1/chat', chatV1);
|
||||
router.use('/v2/', v2);
|
||||
|
|
|
|||
|
|
@ -1,4 +1,5 @@
|
|||
const express = require('express');
|
||||
const { configMiddleware } = require('~/server/middleware');
|
||||
const v1 = require('~/server/controllers/assistants/v1');
|
||||
const v2 = require('~/server/controllers/assistants/v2');
|
||||
const documents = require('./documents');
|
||||
|
|
@ -6,6 +7,7 @@ const actions = require('./actions');
|
|||
const tools = require('./tools');
|
||||
|
||||
const router = express.Router();
|
||||
router.use(configMiddleware);
|
||||
|
||||
/**
|
||||
* Assistant actions route.
|
||||
|
|
|
|||
|
|
@ -17,12 +17,12 @@ const {
|
|||
const { verify2FAWithTempToken } = require('~/server/controllers/auth/TwoFactorAuthController');
|
||||
const { logoutController } = require('~/server/controllers/auth/LogoutController');
|
||||
const { loginController } = require('~/server/controllers/auth/LoginController');
|
||||
const { getBalanceConfig } = require('~/server/services/Config');
|
||||
const { getAppConfig } = require('~/server/services/Config');
|
||||
const middleware = require('~/server/middleware');
|
||||
const { Balance } = require('~/db/models');
|
||||
|
||||
const setBalanceConfig = createSetBalanceConfig({
|
||||
getBalanceConfig,
|
||||
getAppConfig,
|
||||
Balance,
|
||||
});
|
||||
|
||||
|
|
|
|||
|
|
@ -1,9 +1,14 @@
|
|||
const express = require('express');
|
||||
const { isEnabled } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { CacheKeys, defaultSocialLogins, Constants } = require('librechat-data-provider');
|
||||
const { getCustomConfig } = require('~/server/services/Config/getCustomConfig');
|
||||
const { isEnabled, getBalanceConfig } = require('@librechat/api');
|
||||
const {
|
||||
Constants,
|
||||
CacheKeys,
|
||||
removeNullishValues,
|
||||
defaultSocialLogins,
|
||||
} = require('librechat-data-provider');
|
||||
const { getLdapConfig } = require('~/server/services/Config/ldap');
|
||||
const { getAppConfig } = require('~/server/services/Config/app');
|
||||
const { getProjectByName } = require('~/models/Project');
|
||||
const { getMCPManager } = require('~/config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
|
|
@ -43,6 +48,8 @@ router.get('/', async function (req, res) {
|
|||
const ldap = getLdapConfig();
|
||||
|
||||
try {
|
||||
const appConfig = await getAppConfig({ role: req.user?.role });
|
||||
|
||||
const isOpenIdEnabled =
|
||||
!!process.env.OPENID_CLIENT_ID &&
|
||||
!!process.env.OPENID_CLIENT_SECRET &&
|
||||
|
|
@ -55,10 +62,12 @@ router.get('/', async function (req, res) {
|
|||
!!process.env.SAML_CERT &&
|
||||
!!process.env.SAML_SESSION_SECRET;
|
||||
|
||||
const balanceConfig = getBalanceConfig(appConfig);
|
||||
|
||||
/** @type {TStartupConfig} */
|
||||
const payload = {
|
||||
appTitle: process.env.APP_TITLE || 'LibreChat',
|
||||
socialLogins: req.app.locals.socialLogins ?? defaultSocialLogins,
|
||||
socialLogins: appConfig?.registration?.socialLogins ?? defaultSocialLogins,
|
||||
discordLoginEnabled: !!process.env.DISCORD_CLIENT_ID && !!process.env.DISCORD_CLIENT_SECRET,
|
||||
facebookLoginEnabled:
|
||||
!!process.env.FACEBOOK_CLIENT_ID && !!process.env.FACEBOOK_CLIENT_SECRET,
|
||||
|
|
@ -91,10 +100,10 @@ router.get('/', async function (req, res) {
|
|||
isEnabled(process.env.SHOW_BIRTHDAY_ICON) ||
|
||||
process.env.SHOW_BIRTHDAY_ICON === '',
|
||||
helpAndFaqURL: process.env.HELP_AND_FAQ_URL || 'https://librechat.ai',
|
||||
interface: req.app.locals.interfaceConfig,
|
||||
turnstile: req.app.locals.turnstileConfig,
|
||||
modelSpecs: req.app.locals.modelSpecs,
|
||||
balance: req.app.locals.balance,
|
||||
interface: appConfig?.interfaceConfig,
|
||||
turnstile: appConfig?.turnstileConfig,
|
||||
modelSpecs: appConfig?.modelSpecs,
|
||||
balance: balanceConfig,
|
||||
sharedLinksEnabled,
|
||||
publicSharedLinksEnabled,
|
||||
analyticsGtmId: process.env.ANALYTICS_GTM_ID,
|
||||
|
|
@ -109,27 +118,31 @@ router.get('/', async function (req, res) {
|
|||
};
|
||||
|
||||
payload.mcpServers = {};
|
||||
const config = await getCustomConfig();
|
||||
if (config?.mcpServers != null) {
|
||||
const getMCPServers = () => {
|
||||
try {
|
||||
const mcpManager = getMCPManager();
|
||||
if (!mcpManager) {
|
||||
return;
|
||||
}
|
||||
const mcpServers = mcpManager.getAllServers();
|
||||
if (!mcpServers) return;
|
||||
const oauthServers = mcpManager.getOAuthServers();
|
||||
for (const serverName in config.mcpServers) {
|
||||
const serverConfig = config.mcpServers[serverName];
|
||||
payload.mcpServers[serverName] = {
|
||||
for (const serverName in mcpServers) {
|
||||
const serverConfig = mcpServers[serverName];
|
||||
payload.mcpServers[serverName] = removeNullishValues({
|
||||
startup: serverConfig?.startup,
|
||||
chatMenu: serverConfig?.chatMenu,
|
||||
isOAuth: oauthServers?.has(serverName),
|
||||
customUserVars: serverConfig?.customUserVars || {},
|
||||
};
|
||||
customUserVars: serverConfig?.customUserVars,
|
||||
});
|
||||
}
|
||||
} catch (err) {
|
||||
logger.error('Error loading MCP servers', err);
|
||||
} catch (error) {
|
||||
logger.error('Error loading MCP servers', error);
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
/** @type {TCustomConfig['webSearch']} */
|
||||
const webSearchConfig = req.app.locals.webSearch;
|
||||
getMCPServers();
|
||||
const webSearchConfig = appConfig?.webSearch;
|
||||
if (
|
||||
webSearchConfig != null &&
|
||||
(webSearchConfig.searchProvider ||
|
||||
|
|
|
|||
|
|
@ -1,9 +1,7 @@
|
|||
const express = require('express');
|
||||
const router = express.Router();
|
||||
const endpointController = require('~/server/controllers/EndpointController');
|
||||
const overrideController = require('~/server/controllers/OverrideController');
|
||||
|
||||
const router = express.Router();
|
||||
router.get('/', endpointController);
|
||||
router.get('/config/override', overrideController);
|
||||
|
||||
module.exports = router;
|
||||
|
|
|
|||
|
|
@ -1,15 +1,16 @@
|
|||
const fs = require('fs').promises;
|
||||
const express = require('express');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { getStrategyFunctions } = require('~/server/services/Files/strategies');
|
||||
const { resizeAvatar } = require('~/server/services/Files/images/avatar');
|
||||
const { filterFile } = require('~/server/services/Files/process');
|
||||
const { getFileStrategy } = require('~/server/utils/getFileStrategy');
|
||||
const { logger } = require('~/config');
|
||||
const { filterFile } = require('~/server/services/Files/process');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
router.post('/', async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
filterFile({ req, file: req.file, image: true, isAvatar: true });
|
||||
const userId = req.user.id;
|
||||
const { manual } = req.body;
|
||||
|
|
@ -19,8 +20,8 @@ router.post('/', async (req, res) => {
|
|||
throw new Error('User ID is undefined');
|
||||
}
|
||||
|
||||
const fileStrategy = getFileStrategy(req.app.locals, { isAvatar: true });
|
||||
const desiredFormat = req.app.locals.imageOutputType;
|
||||
const fileStrategy = getFileStrategy(appConfig, { isAvatar: true });
|
||||
const desiredFormat = appConfig.imageOutputType;
|
||||
const resizedBuffer = await resizeAvatar({
|
||||
userId,
|
||||
input,
|
||||
|
|
@ -39,7 +40,7 @@ router.post('/', async (req, res) => {
|
|||
try {
|
||||
await fs.unlink(req.file.path);
|
||||
logger.debug('[/files/images/avatar] Temp. image upload file deleted');
|
||||
} catch (error) {
|
||||
} catch {
|
||||
logger.debug('[/files/images/avatar] Temp. image upload file already deleted');
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -36,8 +36,9 @@ const router = express.Router();
|
|||
|
||||
router.get('/', async (req, res) => {
|
||||
try {
|
||||
const appConfig = req.config;
|
||||
const files = await getFiles({ user: req.user.id });
|
||||
if (req.app.locals.fileStrategy === FileSources.s3) {
|
||||
if (appConfig.fileStrategy === FileSources.s3) {
|
||||
try {
|
||||
const cache = getLogStores(CacheKeys.S3_EXPIRY_INTERVAL);
|
||||
const alreadyChecked = await cache.get(req.user.id);
|
||||
|
|
@ -114,7 +115,8 @@ router.get('/agent/:agent_id', async (req, res) => {
|
|||
|
||||
router.get('/config', async (req, res) => {
|
||||
try {
|
||||
res.status(200).json(req.app.locals.fileConfig);
|
||||
const appConfig = req.config;
|
||||
res.status(200).json(appConfig.fileConfig);
|
||||
} catch (error) {
|
||||
logger.error('[/files] Error getting fileConfig', error);
|
||||
res.status(400).json({ message: 'Error in request', error: error.message });
|
||||
|
|
|
|||
|
|
@ -1,18 +1,19 @@
|
|||
const path = require('path');
|
||||
const fs = require('fs').promises;
|
||||
const express = require('express');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { isAgentsEndpoint } = require('librechat-data-provider');
|
||||
const {
|
||||
filterFile,
|
||||
processImageFile,
|
||||
processAgentFileUpload,
|
||||
} = require('~/server/services/Files/process');
|
||||
const { logger } = require('~/config');
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
router.post('/', async (req, res) => {
|
||||
const metadata = req.body;
|
||||
const appConfig = req.config;
|
||||
|
||||
try {
|
||||
filterFile({ req, image: true });
|
||||
|
|
@ -30,7 +31,7 @@ router.post('/', async (req, res) => {
|
|||
logger.error('[/files/images] Error processing file:', error);
|
||||
try {
|
||||
const filepath = path.join(
|
||||
req.app.locals.paths.imageOutput,
|
||||
appConfig.paths.imageOutput,
|
||||
req.user.id,
|
||||
path.basename(req.file.filename),
|
||||
);
|
||||
|
|
@ -43,7 +44,7 @@ router.post('/', async (req, res) => {
|
|||
try {
|
||||
await fs.unlink(req.file.path);
|
||||
logger.debug('[/files/images] Temp. image upload file deleted');
|
||||
} catch (error) {
|
||||
} catch {
|
||||
logger.debug('[/files/images] Temp. image upload file already deleted');
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,5 +1,11 @@
|
|||
const express = require('express');
|
||||
const { uaParser, checkBan, requireJwtAuth, createFileLimiters } = require('~/server/middleware');
|
||||
const {
|
||||
createFileLimiters,
|
||||
configMiddleware,
|
||||
requireJwtAuth,
|
||||
uaParser,
|
||||
checkBan,
|
||||
} = require('~/server/middleware');
|
||||
const { avatar: asstAvatarRouter } = require('~/server/routes/assistants/v1');
|
||||
const { avatar: agentAvatarRouter } = require('~/server/routes/agents/v1');
|
||||
const { createMulterInstance } = require('./multer');
|
||||
|
|
@ -12,6 +18,7 @@ const speech = require('./speech');
|
|||
const initialize = async () => {
|
||||
const router = express.Router();
|
||||
router.use(requireJwtAuth);
|
||||
router.use(configMiddleware);
|
||||
router.use(checkBan);
|
||||
router.use(uaParser);
|
||||
|
||||
|
|
|
|||
|
|
@ -4,11 +4,12 @@ const crypto = require('crypto');
|
|||
const multer = require('multer');
|
||||
const { sanitizeFilename } = require('@librechat/api');
|
||||
const { fileConfig: defaultFileConfig, mergeFileConfig } = require('librechat-data-provider');
|
||||
const { getCustomConfig } = require('~/server/services/Config');
|
||||
const { getAppConfig } = require('~/server/services/Config');
|
||||
|
||||
const storage = multer.diskStorage({
|
||||
destination: function (req, file, cb) {
|
||||
const outputPath = path.join(req.app.locals.paths.uploads, 'temp', req.user.id);
|
||||
const appConfig = req.config;
|
||||
const outputPath = path.join(appConfig.paths.uploads, 'temp', req.user.id);
|
||||
if (!fs.existsSync(outputPath)) {
|
||||
fs.mkdirSync(outputPath, { recursive: true });
|
||||
}
|
||||
|
|
@ -68,8 +69,8 @@ const createFileFilter = (customFileConfig) => {
|
|||
};
|
||||
|
||||
const createMulterInstance = async () => {
|
||||
const customConfig = await getCustomConfig();
|
||||
const fileConfig = mergeFileConfig(customConfig?.fileConfig);
|
||||
const appConfig = await getAppConfig();
|
||||
const fileConfig = mergeFileConfig(appConfig?.fileConfig);
|
||||
const fileFilter = createFileFilter(fileConfig);
|
||||
return multer({
|
||||
storage,
|
||||
|
|
|
|||
|
|
@ -8,21 +8,7 @@ const { createMulterInstance, storage, importFileFilter } = require('./multer');
|
|||
|
||||
// Mock only the config service that requires external dependencies
|
||||
jest.mock('~/server/services/Config', () => ({
|
||||
getCustomConfig: jest.fn(() =>
|
||||
Promise.resolve({
|
||||
fileConfig: {
|
||||
endpoints: {
|
||||
openAI: {
|
||||
supportedMimeTypes: ['image/jpeg', 'image/png', 'application/pdf'],
|
||||
},
|
||||
default: {
|
||||
supportedMimeTypes: ['image/jpeg', 'image/png', 'text/plain'],
|
||||
},
|
||||
},
|
||||
serverFileSizeLimit: 10000000, // 10MB
|
||||
},
|
||||
}),
|
||||
),
|
||||
getAppConfig: jest.fn(),
|
||||
}));
|
||||
|
||||
describe('Multer Configuration', () => {
|
||||
|
|
@ -36,15 +22,13 @@ describe('Multer Configuration', () => {
|
|||
|
||||
mockReq = {
|
||||
user: { id: 'test-user-123' },
|
||||
app: {
|
||||
locals: {
|
||||
paths: {
|
||||
uploads: tempDir,
|
||||
},
|
||||
},
|
||||
},
|
||||
body: {},
|
||||
originalUrl: '/api/files/upload',
|
||||
config: {
|
||||
paths: {
|
||||
uploads: tempDir,
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
mockFile = {
|
||||
|
|
@ -79,7 +63,7 @@ describe('Multer Configuration', () => {
|
|||
|
||||
it("should create directory recursively if it doesn't exist", (done) => {
|
||||
const deepPath = path.join(tempDir, 'deep', 'nested', 'path');
|
||||
mockReq.app.locals.paths.uploads = deepPath;
|
||||
mockReq.config.paths.uploads = deepPath;
|
||||
|
||||
const cb = jest.fn((err, destination) => {
|
||||
expect(err).toBeNull();
|
||||
|
|
@ -331,11 +315,11 @@ describe('Multer Configuration', () => {
|
|||
});
|
||||
|
||||
it('should use real config merging', async () => {
|
||||
const { getCustomConfig } = require('~/server/services/Config');
|
||||
const { getAppConfig } = require('~/server/services/Config');
|
||||
|
||||
const multerInstance = await createMulterInstance();
|
||||
|
||||
expect(getCustomConfig).toHaveBeenCalled();
|
||||
expect(getAppConfig).toHaveBeenCalled();
|
||||
expect(multerInstance).toBeDefined();
|
||||
});
|
||||
|
||||
|
|
@ -462,26 +446,15 @@ describe('Multer Configuration', () => {
|
|||
}).not.toThrow();
|
||||
});
|
||||
|
||||
it('should handle file system errors when directory creation fails', (done) => {
|
||||
it('should handle file system errors when directory creation fails', () => {
|
||||
// Test with a non-existent parent directory to simulate fs issues
|
||||
const invalidPath = '/nonexistent/path/that/should/not/exist';
|
||||
mockReq.app.locals.paths.uploads = invalidPath;
|
||||
mockReq.config.paths.uploads = invalidPath;
|
||||
|
||||
try {
|
||||
// Call getDestination which should fail due to permission/path issues
|
||||
storage.getDestination(mockReq, mockFile, (err, destination) => {
|
||||
// If callback is reached, we didn't get the expected error
|
||||
done(new Error('Expected mkdirSync to throw an error but callback was called'));
|
||||
});
|
||||
// If we get here without throwing, something unexpected happened
|
||||
done(new Error('Expected mkdirSync to throw an error but no error was thrown'));
|
||||
} catch (error) {
|
||||
// This is the expected behavior - mkdirSync throws synchronously for invalid paths
|
||||
// On Linux, this typically returns EACCES (permission denied)
|
||||
// On macOS/Darwin, this returns ENOENT (no such file or directory)
|
||||
expect(['EACCES', 'ENOENT']).toContain(error.code);
|
||||
done();
|
||||
}
|
||||
// The current implementation doesn't catch errors, so they're thrown synchronously
|
||||
expect(() => {
|
||||
storage.getDestination(mockReq, mockFile, jest.fn());
|
||||
}).toThrow();
|
||||
});
|
||||
|
||||
it('should handle malformed filenames with real sanitization', (done) => {
|
||||
|
|
@ -538,10 +511,10 @@ describe('Multer Configuration', () => {
|
|||
|
||||
describe('Real Configuration Testing', () => {
|
||||
it('should handle missing custom config gracefully with real mergeFileConfig', async () => {
|
||||
const { getCustomConfig } = require('~/server/services/Config');
|
||||
const { getAppConfig } = require('~/server/services/Config');
|
||||
|
||||
// Mock getCustomConfig to return undefined
|
||||
getCustomConfig.mockResolvedValueOnce(undefined);
|
||||
// Mock getAppConfig to return undefined
|
||||
getAppConfig.mockResolvedValueOnce(undefined);
|
||||
|
||||
const multerInstance = await createMulterInstance();
|
||||
expect(multerInstance).toBeDefined();
|
||||
|
|
@ -549,25 +522,28 @@ describe('Multer Configuration', () => {
|
|||
});
|
||||
|
||||
it('should properly integrate real fileConfig with custom endpoints', async () => {
|
||||
const { getCustomConfig } = require('~/server/services/Config');
|
||||
const { getAppConfig } = require('~/server/services/Config');
|
||||
|
||||
// Mock a custom config with additional endpoints
|
||||
getCustomConfig.mockResolvedValueOnce({
|
||||
// Mock appConfig with fileConfig
|
||||
getAppConfig.mockResolvedValueOnce({
|
||||
paths: {
|
||||
uploads: tempDir,
|
||||
},
|
||||
fileConfig: {
|
||||
endpoints: {
|
||||
anthropic: {
|
||||
supportedMimeTypes: ['text/plain', 'image/png'],
|
||||
},
|
||||
},
|
||||
serverFileSizeLimit: 20, // 20 MB
|
||||
serverFileSizeLimit: 20971520, // 20 MB in bytes (mergeFileConfig converts)
|
||||
},
|
||||
});
|
||||
|
||||
const multerInstance = await createMulterInstance();
|
||||
expect(multerInstance).toBeDefined();
|
||||
|
||||
// Verify that getCustomConfig was called (we can't spy on the actual merge function easily)
|
||||
expect(getCustomConfig).toHaveBeenCalled();
|
||||
// Verify that getAppConfig was called
|
||||
expect(getAppConfig).toHaveBeenCalled();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
|
|
|||
|
|
@@ -8,7 +8,7 @@ const {
deleteMemory,
setMemory,
} = require('~/models');
-const { requireJwtAuth } = require('~/server/middleware');
+const { requireJwtAuth, configMiddleware } = require('~/server/middleware');
const { getRoleByName } = require('~/models/Role');

const router = express.Router();

@@ -48,7 +48,7 @@ router.use(requireJwtAuth);
* Returns all memories for the authenticated user, sorted by updated_at (newest first).
* Also includes memory usage percentage based on token limit.
*/
-router.get('/', checkMemoryRead, async (req, res) => {
+router.get('/', checkMemoryRead, configMiddleware, async (req, res) => {
try {
const memories = await getAllUserMemories(req.user.id);

@@ -60,7 +60,8 @@ router.get('/', checkMemoryRead, async (req, res) => {
return sum + (memory.tokenCount || 0);
}, 0);

-const memoryConfig = req.app.locals?.memory;
+const appConfig = req.config;
+const memoryConfig = appConfig?.memory;
const tokenLimit = memoryConfig?.tokenLimit;
const charLimit = memoryConfig?.charLimit || 10000;

@@ -87,7 +88,7 @@ router.get('/', checkMemoryRead, async (req, res) => {
* Body: { key: string, value: string }
* Returns 201 and { created: true, memory: <createdDoc> } when successful.
*/
-router.post('/', memoryPayloadLimit, checkMemoryCreate, async (req, res) => {
+router.post('/', memoryPayloadLimit, checkMemoryCreate, configMiddleware, async (req, res) => {
const { key, value } = req.body;

if (typeof key !== 'string' || key.trim() === '') {

@@ -98,7 +99,8 @@ router.post('/', memoryPayloadLimit, checkMemoryCreate, async (req, res) => {
return res.status(400).json({ error: 'Value is required and must be a non-empty string.' });
}

-const memoryConfig = req.app.locals?.memory;
+const appConfig = req.config;
+const memoryConfig = appConfig?.memory;
const charLimit = memoryConfig?.charLimit || 10000;

if (key.length > 1000) {

@@ -117,6 +119,9 @@ router.post('/', memoryPayloadLimit, checkMemoryCreate, async (req, res) => {
const tokenCount = Tokenizer.getTokenCount(value, 'o200k_base');

const memories = await getAllUserMemories(req.user.id);

+const appConfig = req.config;
+const memoryConfig = appConfig?.memory;
+const tokenLimit = memoryConfig?.tokenLimit;

if (tokenLimit) {

@@ -191,7 +196,7 @@ router.patch('/preferences', checkMemoryOptOut, async (req, res) => {
* Body: { key?: string, value: string }
* Returns 200 and { updated: true, memory: <updatedDoc> } when successful.
*/
-router.patch('/:key', memoryPayloadLimit, checkMemoryUpdate, async (req, res) => {
+router.patch('/:key', memoryPayloadLimit, checkMemoryUpdate, configMiddleware, async (req, res) => {
const { key: urlKey } = req.params;
const { key: bodyKey, value } = req.body || {};

@@ -200,8 +205,8 @@ router.patch('/:key', memoryPayloadLimit, checkMemoryUpdate, async (req, res) =>
}

const newKey = bodyKey || urlKey;

-const memoryConfig = req.app.locals?.memory;
+const appConfig = req.config;
+const memoryConfig = appConfig?.memory;
const charLimit = memoryConfig?.charLimit || 10000;

if (newKey.length > 1000) {
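For context, the memory routes above now read configuration from `req.config` instead of `app.locals`. The `configMiddleware` they mount is not shown in this diff; the following is only a plausible minimal sketch of such a middleware, assuming it resolves the cached app config per request (implementation details are assumptions, not taken from this diff):

// Hypothetical sketch only — the real configMiddleware is defined elsewhere in the PR.
const { getAppConfig } = require('~/server/services/Config');

async function configMiddleware(req, res, next) {
  try {
    // Attach the (possibly role-scoped) app config so handlers can read req.config
    req.config = await getAppConfig({ role: req.user?.role });
    next();
  } catch (err) {
    next(err);
  }
}

module.exports = configMiddleware;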
@@ -8,11 +8,11 @@ const { isEnabled, createSetBalanceConfig } = require('@librechat/api');
const { checkDomainAllowed, loginLimiter, logHeaders, checkBan } = require('~/server/middleware');
const { syncUserEntraGroupMemberships } = require('~/server/services/PermissionService');
const { setAuthTokens, setOpenIDAuthTokens } = require('~/server/services/AuthService');
-const { getBalanceConfig } = require('~/server/services/Config');
+const { getAppConfig } = require('~/server/services/Config');
const { Balance } = require('~/db/models');

const setBalanceConfig = createSetBalanceConfig({
-getBalanceConfig,
+getAppConfig,
Balance,
});
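createSetBalanceConfig now receives getAppConfig rather than getBalanceConfig. A minimal sketch of how the resulting middleware might be mounted on a login route; the route path and handler name are illustrative assumptions, not taken from this diff:

// Illustrative wiring only; the actual route definitions live elsewhere in the codebase.
router.post(
  '/login',
  loginLimiter,
  checkBan,
  logHeaders,
  setBalanceConfig, // derives balance settings from appConfig.balance for the request
  loginController, // assumed handler name
);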
@@ -1,6 +1,6 @@
const express = require('express');
-const { getAvailablePluginsController } = require('../controllers/PluginController');
-const requireJwtAuth = require('../middleware/requireJwtAuth');
+const { getAvailablePluginsController } = require('~/server/controllers/PluginController');
+const { requireJwtAuth } = require('~/server/middleware');

const router = express.Router();
@@ -424,7 +424,7 @@ router.get('/', async (req, res) => {
/**
* Deletes a prompt
*
-* @param {Express.Request} req - The request object.
+* @param {ServerRequest} req - The request object.
* @param {TDeletePromptVariables} req.params - The request parameters
* @param {import('mongoose').ObjectId} req.params.promptId - The prompt ID
* @param {Express.Response} res - The response object.
@@ -14,7 +14,6 @@ const {
// Mock modules before importing
jest.mock('~/server/services/Config', () => ({
getCachedTools: jest.fn().mockResolvedValue({}),
-getCustomConfig: jest.fn(),
}));

jest.mock('~/models/Role', () => ({
@@ -1,14 +1,14 @@
const express = require('express');
-const { requireJwtAuth, canDeleteAccount, verifyEmailLimiter } = require('~/server/middleware');
const {
-getUserController,
-deleteUserController,
-verifyEmailController,
updateUserPluginsController,
resendVerificationController,
getTermsStatusController,
acceptTermsController,
+verifyEmailController,
+deleteUserController,
+getUserController,
} = require('~/server/controllers/UserController');
+const { requireJwtAuth, canDeleteAccount, verifyEmailLimiter } = require('~/server/middleware');

const router = express.Router();
@@ -1,10 +1,7 @@
-const { Constants, EModelEndpoint, actionDomainSeparator } = require('librechat-data-provider');
+const { Constants, actionDomainSeparator } = require('librechat-data-provider');
const { domainParser } = require('./ActionService');

jest.mock('keyv');
-jest.mock('~/server/services/Config', () => ({
-getCustomConfig: jest.fn(),
-}));

const globalCache = {};
jest.mock('~/cache/getLogStores', () => {

@@ -53,26 +50,6 @@ jest.mock('~/cache/getLogStores', () => {
});

describe('domainParser', () => {
-const req = {
-app: {
-locals: {
-[EModelEndpoint.azureOpenAI]: {
-assistants: true,
-},
-},
-},
-};
-
-const reqNoAzure = {
-app: {
-locals: {
-[EModelEndpoint.azureOpenAI]: {
-assistants: false,
-},
-},
-},
-};

const TLD = '.com';

// Non-azure request
@@ -1,15 +1,4 @@
-jest.mock('~/models', () => ({
-initializeRoles: jest.fn(),
-seedDefaultRoles: jest.fn(),
-ensureDefaultCategories: jest.fn(),
-}));
-jest.mock('~/models/Role', () => ({
-updateAccessPermissions: jest.fn(),
-getRoleByName: jest.fn().mockResolvedValue(null),
-updateRoleByName: jest.fn(),
-}));
-
-jest.mock('~/config', () => ({
+jest.mock('@librechat/data-schemas', () => ({
logger: {
info: jest.fn(),
warn: jest.fn(),

@@ -17,11 +6,11 @@ jest.mock('~/config', () => ({
},
}));

-jest.mock('./Config/loadCustomConfig', () => jest.fn());
-jest.mock('./start/interface', () => ({
+jest.mock('@librechat/api', () => ({
+...jest.requireActual('@librechat/api'),
loadDefaultInterface: jest.fn(),
}));
-jest.mock('./ToolService', () => ({
+jest.mock('./start/tools', () => ({
loadAndFormatTools: jest.fn().mockReturnValue({}),
}));
jest.mock('./start/checks', () => ({

@@ -32,15 +21,15 @@ jest.mock('./start/checks', () => ({
checkWebSearchConfig: jest.fn(),
}));

+jest.mock('./Config/loadCustomConfig', () => jest.fn());

const AppService = require('./AppService');
-const { loadDefaultInterface } = require('./start/interface');
+const { loadDefaultInterface } = require('@librechat/api');

describe('AppService interface configuration', () => {
let app;
let mockLoadCustomConfig;

beforeEach(() => {
app = { locals: {} };
jest.resetModules();
jest.clearAllMocks();
mockLoadCustomConfig = require('./Config/loadCustomConfig');

@@ -50,10 +39,16 @@ describe('AppService interface configuration', () => {
mockLoadCustomConfig.mockResolvedValue({});
loadDefaultInterface.mockResolvedValue({ prompts: true, bookmarks: true });

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.prompts).toBe(true);
-expect(app.locals.interfaceConfig.bookmarks).toBe(true);
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.objectContaining({
+prompts: true,
+bookmarks: true,
+}),
+}),
+);
expect(loadDefaultInterface).toHaveBeenCalled();
});

@@ -61,10 +56,16 @@ describe('AppService interface configuration', () => {
mockLoadCustomConfig.mockResolvedValue({ interface: { prompts: false, bookmarks: false } });
loadDefaultInterface.mockResolvedValue({ prompts: false, bookmarks: false });

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.prompts).toBe(false);
-expect(app.locals.interfaceConfig.bookmarks).toBe(false);
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.objectContaining({
+prompts: false,
+bookmarks: false,
+}),
+}),
+);
expect(loadDefaultInterface).toHaveBeenCalled();
});

@@ -72,10 +73,17 @@ describe('AppService interface configuration', () => {
mockLoadCustomConfig.mockResolvedValue({});
loadDefaultInterface.mockResolvedValue({});

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.prompts).toBeUndefined();
-expect(app.locals.interfaceConfig.bookmarks).toBeUndefined();
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.anything(),
+}),
+);
+
+// Verify that prompts and bookmarks are undefined when not provided
+expect(result.interfaceConfig.prompts).toBeUndefined();
+expect(result.interfaceConfig.bookmarks).toBeUndefined();
expect(loadDefaultInterface).toHaveBeenCalled();
});

@@ -83,10 +91,16 @@ describe('AppService interface configuration', () => {
mockLoadCustomConfig.mockResolvedValue({ interface: { prompts: true, bookmarks: false } });
loadDefaultInterface.mockResolvedValue({ prompts: true, bookmarks: false });

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.prompts).toBe(true);
-expect(app.locals.interfaceConfig.bookmarks).toBe(false);
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.objectContaining({
+prompts: true,
+bookmarks: false,
+}),
+}),
+);
expect(loadDefaultInterface).toHaveBeenCalled();
});

@@ -108,14 +122,19 @@ describe('AppService interface configuration', () => {
},
});

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.peoplePicker).toBeDefined();
-expect(app.locals.interfaceConfig.peoplePicker).toMatchObject({
-users: true,
-groups: true,
-roles: true,
-});
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.objectContaining({
+peoplePicker: expect.objectContaining({
+users: true,
+groups: true,
+roles: true,
+}),
+}),
+}),
+);
expect(loadDefaultInterface).toHaveBeenCalled();
});

@@ -137,11 +156,19 @@ describe('AppService interface configuration', () => {
},
});

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.peoplePicker.users).toBe(true);
-expect(app.locals.interfaceConfig.peoplePicker.groups).toBe(false);
-expect(app.locals.interfaceConfig.peoplePicker.roles).toBe(true);
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.objectContaining({
+peoplePicker: expect.objectContaining({
+users: true,
+groups: false,
+roles: true,
+}),
+}),
+}),
+);
});

it('should set default peoplePicker permissions when not provided', async () => {

@@ -154,11 +181,18 @@ describe('AppService interface configuration', () => {
},
});

-await AppService(app);
+const result = await AppService();

-expect(app.locals.interfaceConfig.peoplePicker).toBeDefined();
-expect(app.locals.interfaceConfig.peoplePicker.users).toBe(true);
-expect(app.locals.interfaceConfig.peoplePicker.groups).toBe(true);
-expect(app.locals.interfaceConfig.peoplePicker.roles).toBe(true);
+expect(result).toEqual(
+expect.objectContaining({
+interfaceConfig: expect.objectContaining({
+peoplePicker: expect.objectContaining({
+users: true,
+groups: true,
+roles: true,
+}),
+}),
+}),
+);
});
});
@@ -3,6 +3,7 @@ const {
loadMemoryConfig,
agentsConfigSetup,
loadWebSearchConfig,
+loadDefaultInterface,
} = require('@librechat/api');
const {
FileSources,

@@ -12,35 +13,26 @@ const {
} = require('librechat-data-provider');
const {
checkWebSearchConfig,
checkAzureVariables,
checkVariables,
checkHealth,
checkConfig,
} = require('./start/checks');
-const { ensureDefaultCategories, seedDefaultRoles, initializeRoles } = require('~/models');
const { azureAssistantsDefaults, assistantsConfigSetup } = require('./start/assistants');
const { initializeAzureBlobService } = require('./Files/Azure/initialize');
const { initializeFirebase } = require('./Files/Firebase/initialize');
-const loadCustomConfig = require('./Config/loadCustomConfig');
const handleRateLimits = require('./Config/handleRateLimits');
-const { loadDefaultInterface } = require('./start/interface');
+const loadCustomConfig = require('./Config/loadCustomConfig');
const { loadTurnstileConfig } = require('./start/turnstile');
const { azureConfigSetup } = require('./start/azureOpenAI');
const { processModelSpecs } = require('./start/modelSpecs');
const { initializeS3 } = require('./Files/S3/initialize');
-const { loadAndFormatTools } = require('./ToolService');
-const { setCachedTools } = require('./Config');
+const { loadAndFormatTools } = require('./start/tools');
+const { loadEndpoints } = require('./start/endpoints');
const paths = require('~/config/paths');

/**
* Loads custom config and initializes app-wide variables.
* @function AppService
-* @param {Express.Application} app - The Express application object.
*/
-const AppService = async (app) => {
-await initializeRoles();
-await seedDefaultRoles();
-await ensureDefaultCategories();
+const AppService = async () => {
/** @type {TCustomConfig} */
const config = (await loadCustomConfig()) ?? {};
const configDefaults = getConfigDefaults();

@@ -79,101 +71,57 @@ const AppService = async (app) => {
directory: paths.structuredTools,
});

-await setCachedTools(availableTools, { isGlobal: true });

// Store MCP config for later initialization
const mcpConfig = config.mcpServers || null;

-const socialLogins =
-config?.registration?.socialLogins ?? configDefaults?.registration?.socialLogins;
-const interfaceConfig = await loadDefaultInterface(config, configDefaults);
+const registration = config.registration ?? configDefaults.registration;
+const interfaceConfig = await loadDefaultInterface({ config, configDefaults });
const turnstileConfig = loadTurnstileConfig(config, configDefaults);
const speech = config.speech;

-const defaultLocals = {
-config,
+const defaultConfig = {
ocr,
paths,
+config,
memory,
speech,
-balance,
-mcpConfig,
webSearch,
fileStrategy,
-socialLogins,
+registration,
filteredTools,
includedTools,
availableTools,
imageOutputType,
interfaceConfig,
turnstileConfig,
+balance,
+mcpConfig,
+fileStrategies: config.fileStrategies,
};

const agentsDefaults = agentsConfigSetup(config);

if (!Object.keys(config).length) {
-app.locals = {
-...defaultLocals,
-[EModelEndpoint.agents]: agentsDefaults,
+const appConfig = {
+...defaultConfig,
+endpoints: {
+[EModelEndpoint.agents]: agentsDefaults,
+},
};
-return;
+return appConfig;
}

checkConfig(config);
handleRateLimits(config?.rateLimits);
+const loadedEndpoints = loadEndpoints(config, agentsDefaults);

-const endpointLocals = {};
-const endpoints = config?.endpoints;
-
-if (endpoints?.[EModelEndpoint.azureOpenAI]) {
-endpointLocals[EModelEndpoint.azureOpenAI] = azureConfigSetup(config);
-checkAzureVariables();
-}
-
-if (endpoints?.[EModelEndpoint.azureOpenAI]?.assistants) {
-endpointLocals[EModelEndpoint.azureAssistants] = azureAssistantsDefaults();
-}
-
-if (endpoints?.[EModelEndpoint.azureAssistants]) {
-endpointLocals[EModelEndpoint.azureAssistants] = assistantsConfigSetup(
-config,
-EModelEndpoint.azureAssistants,
-endpointLocals[EModelEndpoint.azureAssistants],
-);
-}
-
-if (endpoints?.[EModelEndpoint.assistants]) {
-endpointLocals[EModelEndpoint.assistants] = assistantsConfigSetup(
-config,
-EModelEndpoint.assistants,
-endpointLocals[EModelEndpoint.assistants],
-);
-}
-
-endpointLocals[EModelEndpoint.agents] = agentsConfigSetup(config, agentsDefaults);
-
-const endpointKeys = [
-EModelEndpoint.openAI,
-EModelEndpoint.google,
-EModelEndpoint.bedrock,
-EModelEndpoint.anthropic,
-EModelEndpoint.gptPlugins,
-];
-
-endpointKeys.forEach((key) => {
-if (endpoints?.[key]) {
-endpointLocals[key] = endpoints[key];
-}
-});
-
-if (endpoints?.all) {
-endpointLocals.all = endpoints.all;
-}
-
-app.locals = {
-...defaultLocals,
+const appConfig = {
+...defaultConfig,
fileConfig: config?.fileConfig,
secureImageLinks: config?.secureImageLinks,
-modelSpecs: processModelSpecs(endpoints, config.modelSpecs, interfaceConfig),
-...endpointLocals,
+modelSpecs: processModelSpecs(config?.endpoints, config.modelSpecs, interfaceConfig),
+endpoints: loadedEndpoints,
};

+return appConfig;
};

module.exports = AppService;
(File diff suppressed because it is too large)
@@ -350,6 +350,7 @@ async function runAssistant({
accumulatedMessages = [],
in_progress: inProgress,
}) {
+const appConfig = openai.req.config;
let steps = accumulatedSteps;
let messages = accumulatedMessages;
const in_progress = inProgress ?? createInProgressHandler(openai, thread_id, messages);

@@ -396,8 +397,8 @@ async function runAssistant({
});

const { endpoint = EModelEndpoint.azureAssistants } = openai.req.body;
-/** @type {TCustomConfig.endpoints.assistants} */
-const assistantsEndpointConfig = openai.req.app.locals?.[endpoint] ?? {};
+/** @type {AppConfig['endpoints']['assistants']} */
+const assistantsEndpointConfig = appConfig.endpoints?.[endpoint] ?? {};
const { pollIntervalMs, timeoutMs } = assistantsEndpointConfig;

const run = await waitForRun({
@@ -1,8 +1,8 @@
const bcrypt = require('bcryptjs');
const jwt = require('jsonwebtoken');
const { webcrypto } = require('node:crypto');
-const { isEnabled } = require('@librechat/api');
const { logger } = require('@librechat/data-schemas');
+const { isEnabled, checkEmailConfig } = require('@librechat/api');
const { SystemRoles, errorsToString } = require('librechat-data-provider');
const {
findUser,

@@ -21,9 +21,9 @@ const {
generateRefreshToken,
} = require('~/models');
const { isEmailDomainAllowed } = require('~/server/services/domains');
-const { checkEmailConfig, sendEmail } = require('~/server/utils');
-const { getBalanceConfig } = require('~/server/services/Config');
const { registerSchema } = require('~/strategies/validators');
+const { getAppConfig } = require('~/server/services/Config');
+const { sendEmail } = require('~/server/utils');

const domains = {
client: process.env.DOMAIN_CLIENT,

@@ -78,7 +78,7 @@ const createTokenHash = () => {

/**
* Send Verification Email
-* @param {Partial<MongoUser> & { _id: ObjectId, email: string, name: string}} user
+* @param {Partial<IUser>} user
* @returns {Promise<void>}
*/
const sendVerificationEmail = async (user) => {

@@ -112,7 +112,7 @@ const sendVerificationEmail = async (user) => {

/**
* Verify Email
-* @param {Express.Request} req
+* @param {ServerRequest} req
*/
const verifyEmail = async (req) => {
const { email, token } = req.body;

@@ -160,9 +160,9 @@ const verifyEmail = async (req) => {

/**
* Register a new user.
-* @param {MongoUser} user <email, password, name, username>
-* @param {Partial<MongoUser>} [additionalData={}]
-* @returns {Promise<{status: number, message: string, user?: MongoUser}>}
+* @param {IUser} user <email, password, name, username>
+* @param {Partial<IUser>} [additionalData={}]
+* @returns {Promise<{status: number, message: string, user?: IUser}>}
*/
const registerUser = async (user, additionalData = {}) => {
const { error } = registerSchema.safeParse(user);

@@ -195,7 +195,8 @@ const registerUser = async (user, additionalData = {}) => {
return { status: 200, message: genericVerificationMessage };
}

-if (!(await isEmailDomainAllowed(email))) {
+const appConfig = await getAppConfig({ role: user.role });
+if (!isEmailDomainAllowed(email, appConfig?.registration?.allowedDomains)) {
const errorMessage =
'The email address provided cannot be used. Please use a different email address.';
logger.error(`[registerUser] [Registration not allowed] [Email: ${user.email}]`);

@@ -219,9 +220,8 @@ const registerUser = async (user, additionalData = {}) => {

const emailEnabled = checkEmailConfig();
const disableTTL = isEnabled(process.env.ALLOW_UNVERIFIED_EMAIL_LOGIN);
-const balanceConfig = await getBalanceConfig();

-const newUser = await createUser(newUserData, balanceConfig, disableTTL, true);
+const newUser = await createUser(newUserData, appConfig.balance, disableTTL, true);
newUserId = newUser._id;
if (emailEnabled && !newUser.emailVerified) {
await sendVerificationEmail({

@@ -248,7 +248,7 @@ const registerUser = async (user, additionalData = {}) => {

/**
* Request password reset
-* @param {Express.Request} req
+* @param {ServerRequest} req
*/
const requestPasswordReset = async (req) => {
const { email } = req.body;
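registerUser now resolves the app config for the user's role and passes the allowed domains directly to isEmailDomainAllowed, which no longer performs its own config lookup. A hedged sketch of the call shape implied by the hunk above (the email and domain values are placeholders):

// Shape inferred from the diff above; example values are placeholders.
const appConfig = await getAppConfig({ role: user.role });
const allowedDomains = appConfig?.registration?.allowedDomains; // e.g., ['example.com']

if (!isEmailDomainAllowed('user@example.com', allowedDomains)) {
  // registration is rejected for disallowed domains
}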
api/server/services/Config/app.js (new file, 68 lines)
@@ -0,0 +1,68 @@
const { logger } = require('@librechat/data-schemas');
const { CacheKeys } = require('librechat-data-provider');
const AppService = require('~/server/services/AppService');
const { setCachedTools } = require('./getCachedTools');
const getLogStores = require('~/cache/getLogStores');

/**
* Get the app configuration based on user context
* @param {Object} [options]
* @param {string} [options.role] - User role for role-based config
* @param {boolean} [options.refresh] - Force refresh the cache
* @returns {Promise<AppConfig>}
*/
async function getAppConfig(options = {}) {
const { role, refresh } = options;

const cache = getLogStores(CacheKeys.CONFIG_STORE);
const cacheKey = role ? `${CacheKeys.APP_CONFIG}:${role}` : CacheKeys.APP_CONFIG;

if (!refresh) {
const cached = await cache.get(cacheKey);
if (cached) {
return cached;
}
}

let baseConfig = await cache.get(CacheKeys.APP_CONFIG);
if (!baseConfig) {
logger.info('[getAppConfig] App configuration not initialized. Initializing AppService...');
baseConfig = await AppService();

if (!baseConfig) {
throw new Error('Failed to initialize app configuration through AppService.');
}

if (baseConfig.availableTools) {
await setCachedTools(baseConfig.availableTools, { isGlobal: true });
}

await cache.set(CacheKeys.APP_CONFIG, baseConfig);
}

// For now, return the base config
// In the future, this is where we'll apply role-based modifications
if (role) {
// TODO: Apply role-based config modifications
// const roleConfig = await applyRoleBasedConfig(baseConfig, role);
// await cache.set(cacheKey, roleConfig);
// return roleConfig;
}

return baseConfig;
}

/**
* Clear the app configuration cache
* @returns {Promise<boolean>}
*/
async function clearAppConfigCache() {
const cache = getLogStores(CacheKeys.CONFIG_STORE);
const cacheKey = CacheKeys.APP_CONFIG;
return await cache.delete(cacheKey);
}

module.exports = {
getAppConfig,
clearAppConfigCache,
};
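The new Config/app.js module caches the result of AppService() and becomes the single entry point for reading app configuration. A brief usage sketch under the assumptions visible above (role-based overrides are still a TODO, so role-scoped and base reads currently return the same config):

// Hedged usage sketch; assumes an Express req is available where noted.
const { getAppConfig, clearAppConfigCache } = require('~/server/services/Config');

async function example(req) {
  // Typical read path: resolve the cached config, optionally scoped by role
  const appConfig = await getAppConfig({ role: req.user?.role });
  const agentsEndpoint = appConfig?.endpoints?.agents;

  // After the underlying YAML config changes, clear and force a rebuild
  await clearAppConfigCache();
  const refreshed = await getAppConfig({ refresh: true });
  return { agentsEndpoint, refreshed };
}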
@@ -1,69 +0,0 @@
const { isEnabled } = require('@librechat/api');
const { CacheKeys, EModelEndpoint } = require('librechat-data-provider');
const { normalizeEndpointName } = require('~/server/utils');
const loadCustomConfig = require('./loadCustomConfig');
const getLogStores = require('~/cache/getLogStores');

/**
* Retrieves the configuration object
* @function getCustomConfig
* @returns {Promise<TCustomConfig | null>}
* */
async function getCustomConfig() {
const cache = getLogStores(CacheKeys.STATIC_CONFIG);
return (await cache.get(CacheKeys.LIBRECHAT_YAML_CONFIG)) || (await loadCustomConfig());
}

/**
* Retrieves the configuration object
* @function getBalanceConfig
* @returns {Promise<TCustomConfig['balance'] | null>}
* */
async function getBalanceConfig() {
const isLegacyEnabled = isEnabled(process.env.CHECK_BALANCE);
const startBalance = process.env.START_BALANCE;
/** @type {TCustomConfig['balance']} */
const config = {
enabled: isLegacyEnabled,
startBalance: startBalance != null && startBalance ? parseInt(startBalance, 10) : undefined,
};
const customConfig = await getCustomConfig();
if (!customConfig) {
return config;
}
return { ...config, ...(customConfig?.['balance'] ?? {}) };
}

/**
*
* @param {string | EModelEndpoint} endpoint
* @returns {Promise<TEndpoint | undefined>}
*/
const getCustomEndpointConfig = async (endpoint) => {
const customConfig = await getCustomConfig();
if (!customConfig) {
throw new Error(`Config not found for the ${endpoint} custom endpoint.`);
}

const { endpoints = {} } = customConfig;
const customEndpoints = endpoints[EModelEndpoint.custom] ?? [];
return customEndpoints.find(
(endpointConfig) => normalizeEndpointName(endpointConfig.name) === endpoint,
);
};

/**
* @returns {Promise<boolean>}
*/
async function hasCustomUserVars() {
const customConfig = await getCustomConfig();
const mcpServers = customConfig?.mcpServers;
return Object.values(mcpServers ?? {}).some((server) => server.customUserVars);
}

module.exports = {
getCustomConfig,
getBalanceConfig,
hasCustomUserVars,
getCustomEndpointConfig,
};
@@ -1,3 +1,4 @@
+const { loadCustomEndpointsConfig } = require('@librechat/api');
const {
CacheKeys,
EModelEndpoint,

@@ -6,8 +7,8 @@ const {
defaultAgentCapabilities,
} = require('librechat-data-provider');
const loadDefaultEndpointsConfig = require('./loadDefaultEConfig');
-const loadConfigEndpoints = require('./loadConfigEndpoints');
const getLogStores = require('~/cache/getLogStores');
+const { getAppConfig } = require('./app');

/**
*

@@ -21,14 +22,36 @@ async function getEndpointsConfig(req) {
return cachedEndpointsConfig;
}

-const defaultEndpointsConfig = await loadDefaultEndpointsConfig(req);
-const customConfigEndpoints = await loadConfigEndpoints(req);
+const appConfig = req.config ?? (await getAppConfig({ role: req.user?.role }));
+const defaultEndpointsConfig = await loadDefaultEndpointsConfig(appConfig);
+const customEndpointsConfig = loadCustomEndpointsConfig(appConfig?.endpoints?.custom);

/** @type {TEndpointsConfig} */
-const mergedConfig = { ...defaultEndpointsConfig, ...customConfigEndpoints };
-if (mergedConfig[EModelEndpoint.assistants] && req.app.locals?.[EModelEndpoint.assistants]) {
+const mergedConfig = {
+...defaultEndpointsConfig,
+...customEndpointsConfig,
+};

+if (appConfig.endpoints?.[EModelEndpoint.azureOpenAI]) {
+/** @type {Omit<TConfig, 'order'>} */
+mergedConfig[EModelEndpoint.azureOpenAI] = {
+userProvide: false,
+};
+}

+if (appConfig.endpoints?.[EModelEndpoint.azureOpenAI]?.assistants) {
+/** @type {Omit<TConfig, 'order'>} */
+mergedConfig[EModelEndpoint.azureAssistants] = {
+userProvide: false,
+};
+}

+if (
+mergedConfig[EModelEndpoint.assistants] &&
+appConfig?.endpoints?.[EModelEndpoint.assistants]
+) {
const { disableBuilder, retrievalModels, capabilities, version, ..._rest } =
-req.app.locals[EModelEndpoint.assistants];
+appConfig.endpoints[EModelEndpoint.assistants];

mergedConfig[EModelEndpoint.assistants] = {
...mergedConfig[EModelEndpoint.assistants],

@@ -38,9 +61,9 @@ async function getEndpointsConfig(req) {
capabilities,
};
}
-if (mergedConfig[EModelEndpoint.agents] && req.app.locals?.[EModelEndpoint.agents]) {
+if (mergedConfig[EModelEndpoint.agents] && appConfig?.endpoints?.[EModelEndpoint.agents]) {
const { disableBuilder, capabilities, allowedProviders, ..._rest } =
-req.app.locals[EModelEndpoint.agents];
+appConfig.endpoints[EModelEndpoint.agents];

mergedConfig[EModelEndpoint.agents] = {
...mergedConfig[EModelEndpoint.agents],

@@ -52,10 +75,10 @@ async function getEndpointsConfig(req) {

if (
mergedConfig[EModelEndpoint.azureAssistants] &&
-req.app.locals?.[EModelEndpoint.azureAssistants]
+appConfig?.endpoints?.[EModelEndpoint.azureAssistants]
) {
const { disableBuilder, retrievalModels, capabilities, version, ..._rest } =
-req.app.locals[EModelEndpoint.azureAssistants];
+appConfig.endpoints[EModelEndpoint.azureAssistants];

mergedConfig[EModelEndpoint.azureAssistants] = {
...mergedConfig[EModelEndpoint.azureAssistants],

@@ -66,8 +89,8 @@ async function getEndpointsConfig(req) {
};
}

-if (mergedConfig[EModelEndpoint.bedrock] && req.app.locals?.[EModelEndpoint.bedrock]) {
-const { availableRegions } = req.app.locals[EModelEndpoint.bedrock];
+if (mergedConfig[EModelEndpoint.bedrock] && appConfig?.endpoints?.[EModelEndpoint.bedrock]) {
+const { availableRegions } = appConfig.endpoints[EModelEndpoint.bedrock];
mergedConfig[EModelEndpoint.bedrock] = {
...mergedConfig[EModelEndpoint.bedrock],
availableRegions,
@@ -1,12 +1,11 @@
+const appConfig = require('./app');
const { config } = require('./EndpointService');
const getCachedTools = require('./getCachedTools');
-const getCustomConfig = require('./getCustomConfig');
const mcpToolsCache = require('./mcpToolsCache');
const loadCustomConfig = require('./loadCustomConfig');
const loadConfigModels = require('./loadConfigModels');
const loadDefaultModels = require('./loadDefaultModels');
const getEndpointsConfig = require('./getEndpointsConfig');
-const loadOverrideConfig = require('./loadOverrideConfig');
const loadAsyncEndpoints = require('./loadAsyncEndpoints');

module.exports = {

@@ -14,10 +13,9 @@ module.exports = {
loadCustomConfig,
loadConfigModels,
loadDefaultModels,
-loadOverrideConfig,
loadAsyncEndpoints,
+...appConfig,
...getCachedTools,
-...getCustomConfig,
...mcpToolsCache,
...getEndpointsConfig,
};
@@ -1,16 +1,16 @@
const path = require('path');
const { logger } = require('@librechat/data-schemas');
-const { loadServiceKey, isUserProvided } = require('@librechat/api');
const { EModelEndpoint } = require('librechat-data-provider');
+const { loadServiceKey, isUserProvided } = require('@librechat/api');
const { config } = require('./EndpointService');

const { openAIApiKey, azureOpenAIApiKey, useAzurePlugins, userProvidedOpenAI, googleKey } = config;

/**
* Load async endpoints and return a configuration object
-* @param {Express.Request} req - The request object
+* @param {AppConfig} [appConfig] - The app configuration object
*/
-async function loadAsyncEndpoints(req) {
+async function loadAsyncEndpoints(appConfig) {
let serviceKey, googleUserProvides;

/** Check if GOOGLE_KEY is provided at all(including 'user_provided') */

@@ -34,7 +34,7 @@ async function loadAsyncEndpoints(req) {

const google = serviceKey || isGoogleKeyProvided ? { userProvide: googleUserProvides } : false;

-const useAzure = req.app.locals[EModelEndpoint.azureOpenAI]?.plugins;
+const useAzure = !!appConfig?.endpoints?.[EModelEndpoint.azureOpenAI]?.plugins;
const gptPlugins =
useAzure || openAIApiKey || azureOpenAIApiKey
? {
@@ -1,73 +0,0 @@
const { EModelEndpoint, extractEnvVariable } = require('librechat-data-provider');
const { isUserProvided, normalizeEndpointName } = require('~/server/utils');
const { getCustomConfig } = require('./getCustomConfig');

/**
* Load config endpoints from the cached configuration object
* @param {Express.Request} req - The request object
* @returns {Promise<TEndpointsConfig>} A promise that resolves to an object containing the endpoints configuration
*/
async function loadConfigEndpoints(req) {
const customConfig = await getCustomConfig();

if (!customConfig) {
return {};
}

const { endpoints = {} } = customConfig ?? {};
const endpointsConfig = {};

if (Array.isArray(endpoints[EModelEndpoint.custom])) {
const customEndpoints = endpoints[EModelEndpoint.custom].filter(
(endpoint) =>
endpoint.baseURL &&
endpoint.apiKey &&
endpoint.name &&
endpoint.models &&
(endpoint.models.fetch || endpoint.models.default),
);

for (let i = 0; i < customEndpoints.length; i++) {
const endpoint = customEndpoints[i];
const {
baseURL,
apiKey,
name: configName,
iconURL,
modelDisplayLabel,
customParams,
} = endpoint;
const name = normalizeEndpointName(configName);

const resolvedApiKey = extractEnvVariable(apiKey);
const resolvedBaseURL = extractEnvVariable(baseURL);

endpointsConfig[name] = {
type: EModelEndpoint.custom,
userProvide: isUserProvided(resolvedApiKey),
userProvideURL: isUserProvided(resolvedBaseURL),
modelDisplayLabel,
iconURL,
customParams,
};
}
}

if (req.app.locals[EModelEndpoint.azureOpenAI]) {
/** @type {Omit<TConfig, 'order'>} */
endpointsConfig[EModelEndpoint.azureOpenAI] = {
userProvide: false,
};
}

if (req.app.locals[EModelEndpoint.azureOpenAI]?.assistants) {
/** @type {Omit<TConfig, 'order'>} */
endpointsConfig[EModelEndpoint.azureAssistants] = {
userProvide: false,
};
}

return endpointsConfig;
}

module.exports = loadConfigEndpoints;
@@ -1,43 +1,39 @@
+const { isUserProvided, normalizeEndpointName } = require('@librechat/api');
const { EModelEndpoint, extractEnvVariable } = require('librechat-data-provider');
-const { isUserProvided, normalizeEndpointName } = require('~/server/utils');
const { fetchModels } = require('~/server/services/ModelService');
-const { getCustomConfig } = require('./getCustomConfig');
+const { getAppConfig } = require('./app');

/**
* Load config endpoints from the cached configuration object
* @function loadConfigModels
-* @param {Express.Request} req - The Express request object.
+* @param {ServerRequest} req - The Express request object.
*/
async function loadConfigModels(req) {
-const customConfig = await getCustomConfig();
-
-if (!customConfig) {
+const appConfig = await getAppConfig({ role: req.user?.role });
+if (!appConfig) {
return {};
}

-const { endpoints = {} } = customConfig ?? {};
const modelsConfig = {};
-const azureEndpoint = endpoints[EModelEndpoint.azureOpenAI];
-const azureConfig = req.app.locals[EModelEndpoint.azureOpenAI];
+const azureConfig = appConfig.endpoints?.[EModelEndpoint.azureOpenAI];
const { modelNames } = azureConfig ?? {};

-if (modelNames && azureEndpoint) {
+if (modelNames && azureConfig) {
modelsConfig[EModelEndpoint.azureOpenAI] = modelNames;
}

-if (modelNames && azureEndpoint && azureEndpoint.plugins) {
+if (modelNames && azureConfig && azureConfig.plugins) {
modelsConfig[EModelEndpoint.gptPlugins] = modelNames;
}

-if (azureEndpoint?.assistants && azureConfig.assistantModels) {
+if (azureConfig?.assistants && azureConfig.assistantModels) {
modelsConfig[EModelEndpoint.azureAssistants] = azureConfig.assistantModels;
}

-if (!Array.isArray(endpoints[EModelEndpoint.custom])) {
+if (!Array.isArray(appConfig.endpoints?.[EModelEndpoint.custom])) {
return modelsConfig;
}

-const customEndpoints = endpoints[EModelEndpoint.custom].filter(
+const customEndpoints = appConfig.endpoints[EModelEndpoint.custom].filter(
(endpoint) =>
endpoint.baseURL &&
endpoint.apiKey &&
@@ -1,9 +1,9 @@
const { fetchModels } = require('~/server/services/ModelService');
-const { getCustomConfig } = require('./getCustomConfig');
const loadConfigModels = require('./loadConfigModels');
+const { getAppConfig } = require('./app');

jest.mock('~/server/services/ModelService');
-jest.mock('./getCustomConfig');
+jest.mock('./app');

const exampleConfig = {
endpoints: {

@@ -60,7 +60,7 @@ const exampleConfig = {
};

describe('loadConfigModels', () => {
-const mockRequest = { app: { locals: {} }, user: { id: 'testUserId' } };
+const mockRequest = { user: { id: 'testUserId' } };

const originalEnv = process.env;

@@ -68,6 +68,9 @@ describe('loadConfigModels', () => {
jest.resetAllMocks();
jest.resetModules();
process.env = { ...originalEnv };

+// Default mock for getAppConfig
+getAppConfig.mockResolvedValue({});
});

afterEach(() => {

@@ -75,18 +78,15 @@ describe('loadConfigModels', () => {
});

it('should return an empty object if customConfig is null', async () => {
-getCustomConfig.mockResolvedValue(null);
+getAppConfig.mockResolvedValue(null);
const result = await loadConfigModels(mockRequest);
expect(result).toEqual({});
});

it('handles azure models and endpoint correctly', async () => {
-mockRequest.app.locals.azureOpenAI = { modelNames: ['model1', 'model2'] };
-getCustomConfig.mockResolvedValue({
+getAppConfig.mockResolvedValue({
endpoints: {
-azureOpenAI: {
-models: ['model1', 'model2'],
-},
+azureOpenAI: { modelNames: ['model1', 'model2'] },
},
});

@@ -97,18 +97,16 @@ describe('loadConfigModels', () => {
it('fetches custom models based on the unique key', async () => {
process.env.BASE_URL = 'http://example.com';
process.env.API_KEY = 'some-api-key';
-const customEndpoints = {
-custom: [
-{
-baseURL: '${BASE_URL}',
-apiKey: '${API_KEY}',
-name: 'CustomModel',
-models: { fetch: true },
-},
-],
-};
+const customEndpoints = [
+{
+baseURL: '${BASE_URL}',
+apiKey: '${API_KEY}',
+name: 'CustomModel',
+models: { fetch: true },
+},
+];

-getCustomConfig.mockResolvedValue({ endpoints: customEndpoints });
+getAppConfig.mockResolvedValue({ endpoints: { custom: customEndpoints } });
fetchModels.mockResolvedValue(['customModel1', 'customModel2']);

const result = await loadConfigModels(mockRequest);

@@ -117,7 +115,7 @@ describe('loadConfigModels', () => {
});

it('correctly associates models to names using unique keys', async () => {
-getCustomConfig.mockResolvedValue({
+getAppConfig.mockResolvedValue({
endpoints: {
custom: [
{

@@ -146,7 +144,7 @@ describe('loadConfigModels', () => {

it('correctly handles multiple endpoints with the same baseURL but different apiKeys', async () => {
// Mock the custom configuration to simulate the user's scenario
-getCustomConfig.mockResolvedValue({
+getAppConfig.mockResolvedValue({
endpoints: {
custom: [
{

@@ -210,7 +208,7 @@ describe('loadConfigModels', () => {
process.env.MY_OPENROUTER_API_KEY = 'actual_openrouter_api_key';
// Setup custom configuration with specific API keys for Mistral and OpenRouter
// and "user_provided" for groq and Ollama, indicating no fetch for the latter two
-getCustomConfig.mockResolvedValue(exampleConfig);
+getAppConfig.mockResolvedValue(exampleConfig);

// Assuming fetchModels would be called only for Mistral and OpenRouter
fetchModels.mockImplementation(({ name }) => {

@@ -273,7 +271,7 @@ describe('loadConfigModels', () => {
});

it('falls back to default models if fetching returns an empty array', async () => {
-getCustomConfig.mockResolvedValue({
+getAppConfig.mockResolvedValue({
endpoints: {
custom: [
{

@@ -306,7 +304,7 @@ describe('loadConfigModels', () => {
});

it('falls back to default models if fetching returns a falsy value', async () => {
-getCustomConfig.mockResolvedValue({
+getAppConfig.mockResolvedValue({
endpoints: {
custom: [
{

@@ -367,7 +365,7 @@ describe('loadConfigModels', () => {
},
];

-getCustomConfig.mockResolvedValue({
+getAppConfig.mockResolvedValue({
endpoints: {
custom: testCases,
},
@@ -4,11 +4,11 @@ const { config } = require('./EndpointService');

/**
* Load async endpoints and return a configuration object
-* @param {Express.Request} req - The request object
+* @param {AppConfig} appConfig - The app configuration object
* @returns {Promise<Object.<string, EndpointWithOrder>>} An object whose keys are endpoint names and values are objects that contain the endpoint configuration and an order.
*/
-async function loadDefaultEndpointsConfig(req) {
-const { google, gptPlugins } = await loadAsyncEndpoints(req);
+async function loadDefaultEndpointsConfig(appConfig) {
+const { google, gptPlugins } = await loadAsyncEndpoints(appConfig);
const { assistants, azureAssistants, azureOpenAI, chatGPTBrowser } = config;

const enabledEndpoints = getEnabledEndpoints();
@@ -11,7 +11,7 @@ const {
* Loads the default models for the application.
* @async
* @function
-* @param {Express.Request} req - The Express request object.
+* @param {ServerRequest} req - The Express request object.
*/
async function loadDefaultModels(req) {
try {
@@ -1,6 +0,0 @@
// fetch some remote config
async function loadOverrideConfig() {
return false;
}

module.exports = loadOverrideConfig;
@@ -49,6 +49,7 @@ const initializeAgent = async ({
allowedProviders,
isInitialAgent = false,
}) => {
+const appConfig = req.config;
if (
isAgentsEndpoint(endpointOption?.endpoint) &&
allowedProviders.size > 0 &&

@@ -90,10 +91,11 @@ const initializeAgent = async ({
const { attachments, tool_resources } = await primeResources({
req,
getFiles,
+appConfig,
+agentId: agent.id,
attachments: currentFiles,
tool_resources: agent.tool_resources,
requestFileSet: new Set(requestFiles?.map((file) => file.file_id)),
-agentId: agent.id,
});

const provider = agent.provider;

@@ -112,7 +114,7 @@ const initializeAgent = async ({
})) ?? {};

agent.endpoint = provider;
-const { getOptions, overrideProvider } = await getProviderConfig(provider);
+const { getOptions, overrideProvider } = getProviderConfig({ provider, appConfig });
if (overrideProvider !== agent.provider) {
agent.provider = overrideProvider;
}
@@ -1,6 +1,6 @@
const { logger } = require('@librechat/data-schemas');
-const { validateAgentModel } = require('@librechat/api');
const { createContentAggregator } = require('@librechat/agents');
+const { validateAgentModel, getCustomEndpointConfig } = require('@librechat/api');
const {
Constants,
EModelEndpoint,

@@ -13,7 +13,6 @@ const {
} = require('~/server/controllers/agents/callbacks');
const { initializeAgent } = require('~/server/services/Endpoints/agents/agent');
const { getModelsConfig } = require('~/server/controllers/ModelController');
-const { getCustomEndpointConfig } = require('~/server/services/Config');
const { loadAgentTools } = require('~/server/services/ToolService');
const AgentClient = require('~/server/controllers/agents/client');
const { getAgent } = require('~/models/Agent');

@@ -58,6 +57,7 @@ const initializeClient = async ({ req, res, signal, endpointOption }) => {
if (!endpointOption) {
throw new Error('Endpoint option not provided');
}
+const appConfig = req.config;

// TODO: use endpointOption to determine options/modelOptions
/** @type {Array<UsageMetadata>} */

@@ -97,8 +97,7 @@ const initializeClient = async ({ req, res, signal, endpointOption }) => {
}

const agentConfigs = new Map();
/** @type {Set<string>} */
-const allowedProviders = new Set(req?.app?.locals?.[EModelEndpoint.agents]?.allowedProviders);
+const allowedProviders = new Set(appConfig?.endpoints?.[EModelEndpoint.agents]?.allowedProviders);

const loadTools = createToolLoader(signal);
/** @type {Array<MongoFile>} */

@@ -158,10 +157,13 @@ const initializeClient = async ({ req, res, signal, endpointOption }) => {
}
}

-let endpointConfig = req.app.locals[primaryConfig.endpoint];
+let endpointConfig = appConfig.endpoints?.[primaryConfig.endpoint];
if (!isAgentsEndpoint(primaryConfig.endpoint) && !endpointConfig) {
try {
-endpointConfig = await getCustomEndpointConfig(primaryConfig.endpoint);
+endpointConfig = getCustomEndpointConfig({
+endpoint: primaryConfig.endpoint,
+appConfig,
+});
} catch (err) {
logger.error(
'[api/server/controllers/agents/client.js #titleConvo] Error getting custom endpoint config',
@@ -4,6 +4,7 @@ const { getLLMConfig } = require('~/server/services/Endpoints/anthropic/llm');
const AnthropicClient = require('~/app/clients/AnthropicClient');

const initializeClient = async ({ req, res, endpointOption, overrideModel, optionsOnly }) => {
+const appConfig = req.config;
const { ANTHROPIC_API_KEY, ANTHROPIC_REVERSE_PROXY, PROXY } = process.env;
const expiresAt = req.body.key;
const isUserProvided = ANTHROPIC_API_KEY === 'user_provided';

@@ -23,15 +24,14 @@ const initializeClient = async ({ req, res, endpointOption, overrideModel, optio
let clientOptions = {};

/** @type {undefined | TBaseEndpoint} */
-const anthropicConfig = req.app.locals[EModelEndpoint.anthropic];
+const anthropicConfig = appConfig.endpoints?.[EModelEndpoint.anthropic];

if (anthropicConfig) {
clientOptions.streamRate = anthropicConfig.streamRate;
clientOptions.titleModel = anthropicConfig.titleModel;
}

-/** @type {undefined | TBaseEndpoint} */
-const allConfig = req.app.locals.all;
+const allConfig = appConfig.endpoints?.all;
if (allConfig) {
clientOptions.streamRate = allConfig.streamRate;
}
@@ -48,6 +48,7 @@ class Files {
}

const initializeClient = async ({ req, res, version, endpointOption, initAppClient = false }) => {
+const appConfig = req.config;
const { PROXY, OPENAI_ORGANIZATION, AZURE_ASSISTANTS_API_KEY, AZURE_ASSISTANTS_BASE_URL } =
process.env;

@@ -81,7 +82,7 @@ const initializeClient = async ({ req, res, version, endpointOption, initAppClie
};

/** @type {TAzureConfig | undefined} */
-const azureConfig = req.app.locals[EModelEndpoint.azureOpenAI];
+const azureConfig = appConfig.endpoints?.[EModelEndpoint.azureOpenAI];

/** @type {AzureOptions | undefined} */
let azureOptions;
@@ -1,6 +1,6 @@
// const OpenAI = require('openai');
const { ProxyAgent } = require('undici');
-const { ErrorTypes } = require('librechat-data-provider');
+const { ErrorTypes, EModelEndpoint } = require('librechat-data-provider');
const { getUserKey, getUserKeyExpiry, getUserKeyValues } = require('~/server/services/UserService');
const initializeClient = require('./initialize');
// const { OpenAIClient } = require('~/app');

@@ -12,6 +12,8 @@ jest.mock('~/server/services/UserService', () => ({
checkUserKeyExpiry: jest.requireActual('~/server/services/UserService').checkUserKeyExpiry,
}));

+// Config is now passed via req.config, not getAppConfig

const today = new Date();
const tenDaysFromToday = new Date(today.setDate(today.getDate() + 10));
const isoString = tenDaysFromToday.toISOString();

@@ -41,7 +43,11 @@ describe('initializeClient', () => {
isUserProvided: jest.fn().mockReturnValueOnce(false),
}));

-const req = { user: { id: 'user123' }, app };
+const req = {
+user: { id: 'user123' },
+app,
+config: { endpoints: { [EModelEndpoint.azureOpenAI]: {} } },
+};
const res = {};

const { openai, openAIApiKey } = await initializeClient({ req, res });

@@ -57,7 +63,11 @@ describe('initializeClient', () => {
getUserKeyValues.mockResolvedValue({ apiKey: 'user-api-key', baseURL: 'https://user.api.url' });
getUserKeyExpiry.mockResolvedValue(isoString);

-const req = { user: { id: 'user123' }, app };
+const req = {
+user: { id: 'user123' },
+app,
+config: { endpoints: { [EModelEndpoint.azureOpenAI]: {} } },
+};
const res = {};

const { openai, openAIApiKey } = await initializeClient({ req, res });

@@ -74,7 +84,7 @@ describe('initializeClient', () => {
let userValues = getUserKey();
try {
userValues = JSON.parse(userValues);
-} catch (e) {
+} catch {
throw new Error(
JSON.stringify({
type: ErrorTypes.INVALID_USER_KEY,

@@ -84,7 +94,10 @@ describe('initializeClient', () => {
return userValues;
});

-const req = { user: { id: 'user123' } };
+const req = {
+user: { id: 'user123' },
+config: { endpoints: { [EModelEndpoint.azureOpenAI]: {} } },
+};
const res = {};

await expect(initializeClient({ req, res })).rejects.toThrow(/invalid_user_key/);

@@ -93,7 +106,11 @@ describe('initializeClient', () => {
test('throws error if API key is not provided', async () => {
delete process.env.AZURE_ASSISTANTS_API_KEY; // Simulate missing API key

-const req = { user: { id: 'user123' }, app };
+const req = {
+user: { id: 'user123' },
+app,
+config: { endpoints: { [EModelEndpoint.azureOpenAI]: {} } },
+};
const res = {};

await expect(initializeClient({ req, res })).rejects.toThrow(/Assistants API key not/);

@@ -103,7 +120,11 @@ describe('initializeClient', () => {
process.env.AZURE_ASSISTANTS_API_KEY = 'test-key';
process.env.PROXY = 'http://proxy.server';

-const req = { user: { id: 'user123' }, app };
+const req = {
+user: { id: 'user123' },
+app,
+config: { endpoints: { [EModelEndpoint.azureOpenAI]: {} } },
+};
const res = {};

const { openai } = await initializeClient({ req, res });
@@ -11,6 +11,7 @@ const {
const { getUserKey, checkUserKeyExpiry } = require('~/server/services/UserService');

const getOptions = async ({ req, overrideModel, endpointOption }) => {
+const appConfig = req.config;
const {
BEDROCK_AWS_SECRET_ACCESS_KEY,
BEDROCK_AWS_ACCESS_KEY_ID,

@@ -50,14 +51,13 @@ const getOptions = async ({ req, overrideModel, endpointOption }) => {
let streamRate = Constants.DEFAULT_STREAM_RATE;

/** @type {undefined | TBaseEndpoint} */
-const bedrockConfig = req.app.locals[EModelEndpoint.bedrock];
+const bedrockConfig = appConfig.endpoints?.[EModelEndpoint.bedrock];

if (bedrockConfig && bedrockConfig.streamRate) {
streamRate = bedrockConfig.streamRate;
}

-/** @type {undefined | TBaseEndpoint} */
-const allConfig = req.app.locals.all;
+const allConfig = appConfig.endpoints?.all;
if (allConfig && allConfig.streamRate) {
streamRate = allConfig.streamRate;
}
@@ -1,3 +1,11 @@
+const { Providers } = require('@librechat/agents');
+const {
+  resolveHeaders,
+  isUserProvided,
+  getOpenAIConfig,
+  getCustomEndpointConfig,
+  createHandleLLMNewToken,
+} = require('@librechat/api');
 const {
   CacheKeys,
   ErrorTypes,

@@ -5,22 +13,22 @@ const {
   FetchTokenConfig,
   extractEnvVariable,
 } = require('librechat-data-provider');
-const { Providers } = require('@librechat/agents');
-const { getOpenAIConfig, createHandleLLMNewToken, resolveHeaders } = require('@librechat/api');
 const { getUserKeyValues, checkUserKeyExpiry } = require('~/server/services/UserService');
-const { getCustomEndpointConfig } = require('~/server/services/Config');
 const { fetchModels } = require('~/server/services/ModelService');
 const OpenAIClient = require('~/app/clients/OpenAIClient');
-const { isUserProvided } = require('~/server/utils');
 const getLogStores = require('~/cache/getLogStores');

 const { PROXY } = process.env;

 const initializeClient = async ({ req, res, endpointOption, optionsOnly, overrideEndpoint }) => {
+  const appConfig = req.config;
   const { key: expiresAt } = req.body;
   const endpoint = overrideEndpoint ?? req.body.endpoint;

-  const endpointConfig = await getCustomEndpointConfig(endpoint);
+  const endpointConfig = getCustomEndpointConfig({
+    endpoint,
+    appConfig,
+  });
   if (!endpointConfig) {
     throw new Error(`Config not found for the ${endpoint} custom endpoint.`);
   }

@@ -117,8 +125,7 @@ const initializeClient = async ({ req, res, endpointOption, optionsOnly, overrid
     endpointTokenConfig,
   };

-  /** @type {undefined | TBaseEndpoint} */
-  const allConfig = req.app.locals.all;
+  const allConfig = appConfig.endpoints?.all;
   if (allConfig) {
     customOptions.streamRate = allConfig.streamRate;
   }
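
For orientation, a hedged sketch of the lookup pattern this file adopts: custom endpoint configuration is now read synchronously from the request-scoped app config instead of being awaited from the Config service. The handler name is hypothetical, and it assumes req.config has already been attached upstream, the same way the tests below construct req with a config property:

const { getCustomEndpointConfig } = require('@librechat/api');

function exampleCustomEndpointHandler(req, res, next) {
  const appConfig = req.config;
  const endpointConfig = getCustomEndpointConfig({
    endpoint: req.body.endpoint,
    appConfig,
  });
  if (!endpointConfig) {
    return next(new Error(`Config not found for the ${req.body.endpoint} custom endpoint.`));
  }
  // apiKey, baseURL, and headers are resolved from the returned endpoint config
  return res.json({ baseURL: endpointConfig.baseURL });
}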
@@ -1,21 +1,16 @@
 const initializeClient = require('./initialize');

+jest.mock('@librechat/api', () => ({
+  ...jest.requireActual('@librechat/api'),
+  resolveHeaders: jest.fn(),
+  getOpenAIConfig: jest.fn(),
+  createHandleLLMNewToken: jest.fn(),
+}));
+
 jest.mock('librechat-data-provider', () => ({
   CacheKeys: { TOKEN_CONFIG: 'token_config' },
   ErrorTypes: { NO_USER_KEY: 'NO_USER_KEY', NO_BASE_URL: 'NO_BASE_URL' },
   envVarRegex: /\$\{([^}]+)\}/,
   FetchTokenConfig: {},
   extractEnvVariable: jest.fn((value) => value),
 }));

 jest.mock('@librechat/agents', () => ({
   Providers: { OLLAMA: 'ollama' },
   getCustomEndpointConfig: jest.fn().mockReturnValue({
     apiKey: 'test-key',
     baseURL: 'https://test.com',
     headers: { 'x-user': '{{LIBRECHAT_USER_ID}}', 'x-email': '{{LIBRECHAT_USER_EMAIL}}' },
     models: { default: ['test-model'] },
   }),
 }));

 jest.mock('~/server/services/UserService', () => ({

@@ -23,14 +18,7 @@ jest.mock('~/server/services/UserService', () => ({
   checkUserKeyExpiry: jest.fn(),
 }));

-jest.mock('~/server/services/Config', () => ({
-  getCustomEndpointConfig: jest.fn().mockResolvedValue({
-    apiKey: 'test-key',
-    baseURL: 'https://test.com',
-    headers: { 'x-user': '{{LIBRECHAT_USER_ID}}', 'x-email': '{{LIBRECHAT_USER_EMAIL}}' },
-    models: { default: ['test-model'] },
-  }),
-}));
+// Config is now passed via req.config, not getAppConfig

 jest.mock('~/server/services/ModelService', () => ({
   fetchModels: jest.fn(),

@@ -42,10 +30,6 @@ jest.mock('~/app/clients/OpenAIClient', () => {
   }));
 });

-jest.mock('~/server/utils', () => ({
-  isUserProvided: jest.fn().mockReturnValue(false),
-}));
-
 jest.mock('~/cache/getLogStores', () =>
   jest.fn().mockReturnValue({
     get: jest.fn(),

@@ -55,13 +39,35 @@ jest.mock('~/cache/getLogStores', () =>
 describe('custom/initializeClient', () => {
   const mockRequest = {
     body: { endpoint: 'test-endpoint' },
-    user: { id: 'user-123', email: 'test@example.com' },
+    user: { id: 'user-123', email: 'test@example.com', role: 'user' },
     app: { locals: {} },
+    config: {
+      endpoints: {
+        all: {
+          streamRate: 25,
+        },
+      },
+    },
   };
   const mockResponse = {};

   beforeEach(() => {
     jest.clearAllMocks();
+    const { getCustomEndpointConfig, resolveHeaders, getOpenAIConfig } = require('@librechat/api');
+    getCustomEndpointConfig.mockReturnValue({
+      apiKey: 'test-key',
+      baseURL: 'https://test.com',
+      headers: { 'x-user': '{{LIBRECHAT_USER_ID}}', 'x-email': '{{LIBRECHAT_USER_EMAIL}}' },
+      models: { default: ['test-model'] },
+    });
+    resolveHeaders.mockReturnValue({ 'x-user': 'user-123', 'x-email': 'test@example.com' });
+    getOpenAIConfig.mockReturnValue({
+      useLegacyContent: true,
+      endpointTokenConfig: null,
+      llmConfig: {
+        callbacks: [],
+      },
+    });
   });

   it('calls resolveHeaders with headers, user, and body for body placeholder support', async () => {

@@ -69,14 +75,14 @@ describe('custom/initializeClient', () => {
     await initializeClient({ req: mockRequest, res: mockResponse, optionsOnly: true });
     expect(resolveHeaders).toHaveBeenCalledWith({
       headers: { 'x-user': '{{LIBRECHAT_USER_ID}}', 'x-email': '{{LIBRECHAT_USER_EMAIL}}' },
-      user: { id: 'user-123', email: 'test@example.com' },
+      user: { id: 'user-123', email: 'test@example.com', role: 'user' },
       body: { endpoint: 'test-endpoint' }, // body - supports {{LIBRECHAT_BODY_*}} placeholders
     });
   });

   it('throws if endpoint config is missing', async () => {
-    const { getCustomEndpointConfig } = require('~/server/services/Config');
-    getCustomEndpointConfig.mockResolvedValueOnce(null);
+    const { getCustomEndpointConfig } = require('@librechat/api');
+    getCustomEndpointConfig.mockReturnValueOnce(null);
     await expect(
       initializeClient({ req: mockRequest, res: mockResponse, optionsOnly: true }),
     ).rejects.toThrow('Config not found for the test-endpoint custom endpoint.');
@@ -46,10 +46,11 @@ const initializeClient = async ({ req, res, endpointOption, overrideModel, optio

   let clientOptions = {};

+  const appConfig = req.config;
   /** @type {undefined | TBaseEndpoint} */
-  const allConfig = req.app.locals.all;
+  const allConfig = appConfig.endpoints?.all;
   /** @type {undefined | TBaseEndpoint} */
-  const googleConfig = req.app.locals[EModelEndpoint.google];
+  const googleConfig = appConfig.endpoints?.[EModelEndpoint.google];

   if (googleConfig) {
     clientOptions.streamRate = googleConfig.streamRate;
@@ -8,6 +8,8 @@ jest.mock('~/server/services/UserService', () => ({
   getUserKey: jest.fn().mockImplementation(() => ({})),
 }));

+// Config is now passed via req.config, not getAppConfig
+
 const app = { locals: {} };

 describe('google/initializeClient', () => {

@@ -26,6 +28,12 @@ describe('google/initializeClient', () => {
       body: { key: expiresAt },
       user: { id: '123' },
       app,
+      config: {
+        endpoints: {
+          all: {},
+          google: {},
+        },
+      },
     };
     const res = {};
     const endpointOption = { modelOptions: { model: 'default-model' } };

@@ -48,6 +56,12 @@ describe('google/initializeClient', () => {
       body: { key: null },
       user: { id: '123' },
       app,
+      config: {
+        endpoints: {
+          all: {},
+          google: {},
+        },
+      },
     };
     const res = {};
     const endpointOption = { modelOptions: { model: 'default-model' } };

@@ -71,6 +85,12 @@ describe('google/initializeClient', () => {
       body: { key: expiresAt },
       user: { id: '123' },
       app,
+      config: {
+        endpoints: {
+          all: {},
+          google: {},
+        },
+      },
     };
     const res = {};
     const endpointOption = { modelOptions: { model: 'default-model' } };
@@ -1,7 +1,7 @@
+const { isEnabled } = require('@librechat/api');
 const { EModelEndpoint, CacheKeys, Constants, googleSettings } = require('librechat-data-provider');
 const getLogStores = require('~/cache/getLogStores');
 const initializeClient = require('./initialize');
-const { isEnabled } = require('~/server/utils');
 const { saveConvo } = require('~/models');

 const addTitle = async (req, { text, response, client }) => {

@@ -14,7 +14,8 @@ const addTitle = async (req, { text, response, client }) => {
     return;
   }
   const { GOOGLE_TITLE_MODEL } = process.env ?? {};
-  const providerConfig = req.app.locals[EModelEndpoint.google];
+  const appConfig = req.config;
+  const providerConfig = appConfig.endpoints?.[EModelEndpoint.google];
   let model =
     providerConfig?.titleModel ??
     GOOGLE_TITLE_MODEL ??
@@ -1,11 +1,11 @@
 const { Providers } = require('@librechat/agents');
 const { EModelEndpoint } = require('librechat-data-provider');
+const { getCustomEndpointConfig } = require('@librechat/api');
 const initAnthropic = require('~/server/services/Endpoints/anthropic/initialize');
 const getBedrockOptions = require('~/server/services/Endpoints/bedrock/options');
 const initOpenAI = require('~/server/services/Endpoints/openAI/initialize');
 const initCustom = require('~/server/services/Endpoints/custom/initialize');
 const initGoogle = require('~/server/services/Endpoints/google/initialize');
-const { getCustomEndpointConfig } = require('~/server/services/Config');

 /** Check if the provider is a known custom provider
  * @param {string | undefined} [provider] - The provider string

@@ -31,14 +31,16 @@ const providerConfigMap = {

 /**
  * Get the provider configuration and override endpoint based on the provider string
- * @param {string} provider - The provider string
- * @returns {Promise<{
- *   getOptions: Function,
+ * @param {Object} params
+ * @param {string} params.provider - The provider string
+ * @param {AppConfig} params.appConfig - The application configuration
+ * @returns {{
+ *   getOptions: (typeof providerConfigMap)[keyof typeof providerConfigMap],
  *   overrideProvider: string,
  *   customEndpointConfig?: TEndpoint
- * }>}
+ * }}
  */
-async function getProviderConfig(provider) {
+function getProviderConfig({ provider, appConfig }) {
   let getOptions = providerConfigMap[provider];
   let overrideProvider = provider;
   /** @type {TEndpoint | undefined} */

@@ -48,7 +50,7 @@ async function getProviderConfig(provider) {
     overrideProvider = provider.toLowerCase();
     getOptions = providerConfigMap[overrideProvider];
   } else if (!getOptions) {
-    customEndpointConfig = await getCustomEndpointConfig(provider);
+    customEndpointConfig = getCustomEndpointConfig({ endpoint: provider, appConfig });
     if (!customEndpointConfig) {
       throw new Error(`Provider ${provider} not supported`);
     }

@@ -57,7 +59,7 @@ async function getProviderConfig(provider) {
   }

   if (isKnownCustomProvider(overrideProvider) && !customEndpointConfig) {
-    customEndpointConfig = await getCustomEndpointConfig(provider);
+    customEndpointConfig = getCustomEndpointConfig({ endpoint: provider, appConfig });
     if (!customEndpointConfig) {
       throw new Error(`Provider ${provider} not supported`);
     }
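
Usage note, given as a sketch rather than code from this PR: getProviderConfig is now synchronous and takes a single params object, so a caller that already holds the request-scoped config can resolve a provider as below. The wrapper name and the assumption that getProviderConfig is in scope (imported from the module shown above) are placeholders:

function resolveProvider(req, provider) {
  // appConfig is the per-request AppConfig attached as req.config
  const { getOptions, overrideProvider, customEndpointConfig } = getProviderConfig({
    provider,
    appConfig: req.config,
  });
  // getOptions is one of the initialize/options functions from providerConfigMap;
  // customEndpointConfig is only populated for custom endpoints
  return { getOptions, overrideProvider, customEndpointConfig };
}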
@@ -18,6 +18,7 @@ const initializeClient = async ({
   overrideEndpoint,
   overrideModel,
 }) => {
+  const appConfig = req.config;
   const {
     PROXY,
     OPENAI_API_KEY,

@@ -64,7 +65,7 @@ const initializeClient = async ({

   const isAzureOpenAI = endpoint === EModelEndpoint.azureOpenAI;
   /** @type {false | TAzureConfig} */
-  const azureConfig = isAzureOpenAI && req.app.locals[EModelEndpoint.azureOpenAI];
+  const azureConfig = isAzureOpenAI && appConfig.endpoints?.[EModelEndpoint.azureOpenAI];
   let serverless = false;
   if (isAzureOpenAI && azureConfig) {
     const { modelGroupMap, groupMap } = azureConfig;

@@ -113,15 +114,14 @@ const initializeClient = async ({
   }

   /** @type {undefined | TBaseEndpoint} */
-  const openAIConfig = req.app.locals[EModelEndpoint.openAI];
+  const openAIConfig = appConfig.endpoints?.[EModelEndpoint.openAI];

   if (!isAzureOpenAI && openAIConfig) {
     clientOptions.streamRate = openAIConfig.streamRate;
     clientOptions.titleModel = openAIConfig.titleModel;
   }

-  /** @type {undefined | TBaseEndpoint} */
-  const allConfig = req.app.locals.all;
+  const allConfig = appConfig.endpoints?.all;
   if (allConfig) {
     clientOptions.streamRate = allConfig.streamRate;
   }
@@ -1,4 +1,13 @@
-jest.mock('~/cache/getLogStores');
+jest.mock('~/cache/getLogStores', () => ({
+  getLogStores: jest.fn().mockReturnValue({
+    get: jest.fn().mockResolvedValue({
+      openAI: { apiKey: 'test-key' },
+    }),
+    set: jest.fn(),
+    delete: jest.fn(),
+  }),
+}));

 const { EModelEndpoint, ErrorTypes, validateAzureGroups } = require('librechat-data-provider');
 const { getUserKey, getUserKeyValues } = require('~/server/services/UserService');
 const initializeClient = require('./initialize');

@@ -11,6 +20,38 @@ jest.mock('~/server/services/UserService', () => ({
   checkUserKeyExpiry: jest.requireActual('~/server/services/UserService').checkUserKeyExpiry,
 }));

+const mockAppConfig = {
+  endpoints: {
+    openAI: {
+      apiKey: 'test-key',
+    },
+    azureOpenAI: {
+      apiKey: 'test-azure-key',
+      modelNames: ['gpt-4-vision-preview', 'gpt-3.5-turbo', 'gpt-4'],
+      modelGroupMap: {
+        'gpt-4-vision-preview': {
+          group: 'librechat-westus',
+          deploymentName: 'gpt-4-vision-preview',
+          version: '2024-02-15-preview',
+        },
+      },
+      groupMap: {
+        'librechat-westus': {
+          apiKey: 'WESTUS_API_KEY',
+          instanceName: 'librechat-westus',
+          version: '2023-12-01-preview',
+          models: {
+            'gpt-4-vision-preview': {
+              deploymentName: 'gpt-4-vision-preview',
+              version: '2024-02-15-preview',
+            },
+          },
+        },
+      },
+    },
+  },
+};
+
 describe('initializeClient', () => {
   // Set up environment variables
   const originalEnvironment = process.env;

@@ -79,7 +120,7 @@ describe('initializeClient', () => {
     },
   ];

-  const { modelNames, modelGroupMap, groupMap } = validateAzureGroups(validAzureConfigs);
+  const { modelNames } = validateAzureGroups(validAzureConfigs);

   beforeEach(() => {
     jest.resetModules(); // Clears the cache

@@ -99,6 +140,7 @@ describe('initializeClient', () => {
       body: { key: null, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -112,25 +154,30 @@ describe('initializeClient', () => {
   test('should initialize client with Azure credentials when endpoint is azureOpenAI', async () => {
     process.env.AZURE_API_KEY = 'test-azure-api-key';
-    (process.env.AZURE_OPENAI_API_INSTANCE_NAME = 'some-value'),
-      (process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME = 'some-value'),
-      (process.env.AZURE_OPENAI_API_VERSION = 'some-value'),
-      (process.env.AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME = 'some-value'),
-      (process.env.AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME = 'some-value'),
-      (process.env.OPENAI_API_KEY = 'test-openai-api-key');
+    (process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME = 'some-value'),
+      (process.env.AZURE_OPENAI_API_VERSION = 'some-value'),
+      (process.env.AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME = 'some-value'),
+      (process.env.AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME = 'some-value'),
+      (process.env.OPENAI_API_KEY = 'test-openai-api-key');
     process.env.DEBUG_OPENAI = 'false';
     process.env.OPENAI_SUMMARIZE = 'false';

     const req = {
-      body: { key: null, endpoint: 'azureOpenAI' },
+      body: {
+        key: null,
+        endpoint: 'azureOpenAI',
+        model: 'gpt-4-vision-preview',
+      },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
-    const endpointOption = { modelOptions: { model: 'test-model' } };
+    const endpointOption = {};

     const client = await initializeClient({ req, res, endpointOption });

-    expect(client.openAIApiKey).toBe('test-azure-api-key');
+    expect(client.openAIApiKey).toBe('WESTUS_API_KEY');
     expect(client.client).toBeInstanceOf(OpenAIClient);
   });

@@ -142,6 +189,7 @@ describe('initializeClient', () => {
       body: { key: null, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -159,6 +207,7 @@ describe('initializeClient', () => {
       body: { key: null, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -177,6 +226,7 @@ describe('initializeClient', () => {
       body: { key: null, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -198,6 +248,7 @@ describe('initializeClient', () => {
       body: { key: expiresAt, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -216,6 +267,7 @@ describe('initializeClient', () => {
       body: { key: null, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -236,6 +288,7 @@ describe('initializeClient', () => {
         id: '123',
       },
       app,
+      config: mockAppConfig,
     };

     const res = {};

@@ -260,6 +313,7 @@ describe('initializeClient', () => {
       body: { key: invalidKey, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -281,6 +335,7 @@ describe('initializeClient', () => {
       body: { key: new Date(Date.now() + 10000).toISOString(), endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -291,7 +346,7 @@ describe('initializeClient', () => {
       let userValues = getUserKey();
       try {
         userValues = JSON.parse(userValues);
-      } catch (e) {
+      } catch {
         throw new Error(
           JSON.stringify({
             type: ErrorTypes.INVALID_USER_KEY,

@@ -307,6 +362,9 @@ describe('initializeClient', () => {
   });

   test('should initialize client correctly for Azure OpenAI with valid configuration', async () => {
+    // Set up Azure environment variables
+    process.env.WESTUS_API_KEY = 'test-westus-key';
+
     const req = {
       body: {
         key: null,

@@ -314,15 +372,7 @@ describe('initializeClient', () => {
         model: modelNames[0],
       },
       user: { id: '123' },
-      app: {
-        locals: {
-          [EModelEndpoint.azureOpenAI]: {
-            modelNames,
-            modelGroupMap,
-            groupMap,
-          },
-        },
-      },
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -340,6 +390,7 @@ describe('initializeClient', () => {
       body: { key: null, endpoint: EModelEndpoint.openAI },
       user: { id: '123' },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};

@@ -362,6 +413,7 @@ describe('initializeClient', () => {
         id: '123',
       },
       app,
+      config: mockAppConfig,
     };
     const res = {};
     const endpointOption = {};
@@ -2,10 +2,10 @@ const axios = require('axios');
 const fs = require('fs').promises;
 const FormData = require('form-data');
 const { Readable } = require('stream');
+const { logger } = require('@librechat/data-schemas');
 const { genAzureEndpoint } = require('@librechat/api');
 const { extractEnvVariable, STTProviders } = require('librechat-data-provider');
-const { getCustomConfig } = require('~/server/services/Config');
-const { logger } = require('~/config');
+const { getAppConfig } = require('~/server/services/Config');

 /**
  * Maps MIME types to their corresponding file extensions for audio files.

@@ -84,12 +84,7 @@ function getFileExtensionFromMime(mimeType) {
  * @class
  */
 class STTService {
-  /**
-   * Creates an instance of STTService.
-   * @param {Object} customConfig - The custom configuration object.
-   */
-  constructor(customConfig) {
-    this.customConfig = customConfig;
+  constructor() {
     this.providerStrategies = {
       [STTProviders.OPENAI]: this.openAIProvider,
       [STTProviders.AZURE_OPENAI]: this.azureOpenAIProvider,

@@ -104,21 +99,20 @@ class STTService {
   * @throws {Error} If the custom config is not found.
   */
  static async getInstance() {
-    const customConfig = await getCustomConfig();
-    if (!customConfig) {
-      throw new Error('Custom config not found');
-    }
-    return new STTService(customConfig);
+    return new STTService();
  }

  /**
   * Retrieves the configured STT provider and its schema.
+   * @param {ServerRequest} req - The request object.
   * @returns {Promise<[string, Object]>} A promise that resolves to an array containing the provider name and its schema.
   * @throws {Error} If no STT schema is set, multiple providers are set, or no provider is set.
   */
-  async getProviderSchema() {
-    const sttSchema = this.customConfig.speech.stt;
+  async getProviderSchema(req) {
+    const appConfig = await getAppConfig({
+      role: req?.user?.role,
+    });
+    const sttSchema = appConfig?.speech?.stt;
    if (!sttSchema) {
      throw new Error(
        'No STT schema is set. Did you configure STT in the custom config (librechat.yaml)?',

@@ -274,7 +268,7 @@ class STTService {
   * @param {Object} res - The response object.
   * @returns {Promise<void>}
   */
-  async processTextToSpeech(req, res) {
+  async processSpeechToText(req, res) {
    if (!req.file) {
      return res.status(400).json({ message: 'No audio file provided in the FormData' });
    }

@@ -287,7 +281,7 @@ class STTService {
    };

    try {
-      const [provider, sttSchema] = await this.getProviderSchema();
+      const [provider, sttSchema] = await this.getProviderSchema(req);
      const text = await this.sttRequest(provider, sttSchema, { audioBuffer, audioFile });
      res.json({ text });
    } catch (error) {

@@ -297,7 +291,7 @@ class STTService {
      try {
        await fs.unlink(req.file.path);
        logger.debug('[/speech/stt] Temp. audio upload file deleted');
-      } catch (error) {
+      } catch {
        logger.debug('[/speech/stt] Temp. audio upload file already deleted');
      }
    }

@@ -322,7 +316,7 @@ async function createSTTService() {
  */
 async function speechToText(req, res) {
   const sttService = await createSTTService();
-  await sttService.processTextToSpeech(req, res);
+  await sttService.processSpeechToText(req, res);
 }

 module.exports = { speechToText };
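
To illustrate the new retrieval path (a hedged sketch; the handler name and response shape are made up for the example): both speech services now ask the Config service for the app config scoped to the requesting user's role, instead of holding a customConfig captured at construction time.

const { getAppConfig } = require('~/server/services/Config');

async function exampleSpeechConfigHandler(req, res) {
  // role-aware lookup, as used by STTService.getProviderSchema and TTSService above
  const appConfig = await getAppConfig({ role: req.user?.role });
  const sttConfigured = !!appConfig?.speech?.stt;
  const ttsConfigured = !!appConfig?.speech?.tts;
  return res.status(200).send({ sttConfigured, ttsConfigured });
}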
@@ -1,9 +1,9 @@
 const axios = require('axios');
+const { logger } = require('@librechat/data-schemas');
 const { genAzureEndpoint } = require('@librechat/api');
 const { extractEnvVariable, TTSProviders } = require('librechat-data-provider');
 const { getRandomVoiceId, createChunkProcessor, splitTextIntoChunks } = require('./streamAudio');
-const { getCustomConfig } = require('~/server/services/Config');
-const { logger } = require('~/config');
+const { getAppConfig } = require('~/server/services/Config');

 /**
  * Service class for handling Text-to-Speech (TTS) operations.

@@ -12,10 +12,8 @@ const { logger } = require('~/config');
 class TTSService {
   /**
    * Creates an instance of TTSService.
-   * @param {Object} customConfig - The custom configuration object.
    */
-  constructor(customConfig) {
-    this.customConfig = customConfig;
+  constructor() {
     this.providerStrategies = {
       [TTSProviders.OPENAI]: this.openAIProvider.bind(this),
       [TTSProviders.AZURE_OPENAI]: this.azureOpenAIProvider.bind(this),

@@ -32,11 +30,7 @@ class TTSService {
    * @throws {Error} If the custom config is not found.
    */
   static async getInstance() {
-    const customConfig = await getCustomConfig();
-    if (!customConfig) {
-      throw new Error('Custom config not found');
-    }
-    return new TTSService(customConfig);
+    return new TTSService();
   }

   /**

@@ -293,10 +287,13 @@ class TTSService {
       return res.status(400).send('Missing text in request body');
     }

+    const appConfig = await getAppConfig({
+      role: req.user?.role,
+    });
     try {
       res.setHeader('Content-Type', 'audio/mpeg');
       const provider = this.getProvider();
-      const ttsSchema = this.customConfig.speech.tts[provider];
+      const ttsSchema = appConfig?.speech?.tts?.[provider];
       const voice = await this.getVoice(ttsSchema, requestVoice);

       if (input.length < 4096) {
@@ -1,5 +1,5 @@
-const { getCustomConfig } = require('~/server/services/Config');
-const { logger } = require('~/config');
+const { logger } = require('@librechat/data-schemas');
+const { getAppConfig } = require('~/server/services/Config');

 /**
  * This function retrieves the speechTab settings from the custom configuration

@@ -15,26 +15,28 @@ const { logger } = require('~/config');
  */
 async function getCustomConfigSpeech(req, res) {
   try {
-    const customConfig = await getCustomConfig();
+    const appConfig = await getAppConfig({
+      role: req.user?.role,
+    });

-    if (!customConfig) {
+    if (!appConfig) {
       return res.status(200).send({
         message: 'not_found',
       });
     }

-    const sttExternal = !!customConfig.speech?.stt;
-    const ttsExternal = !!customConfig.speech?.tts;
+    const sttExternal = !!appConfig.speech?.stt;
+    const ttsExternal = !!appConfig.speech?.tts;
     let settings = {
       sttExternal,
       ttsExternal,
     };

-    if (!customConfig.speech?.speechTab) {
+    if (!appConfig.speech?.speechTab) {
       return res.status(200).send(settings);
     }

-    const speechTab = customConfig.speech.speechTab;
+    const speechTab = appConfig.speech.speechTab;

     if (speechTab.advancedMode !== undefined) {
       settings.advancedMode = speechTab.advancedMode;
@@ -1,5 +1,5 @@
 const { TTSProviders } = require('librechat-data-provider');
-const { getCustomConfig } = require('~/server/services/Config');
+const { getAppConfig } = require('~/server/services/Config');
 const { getProvider } = require('./TTSService');

 /**

@@ -14,13 +14,15 @@ const { getProvider } = require('./TTSService');
  */
 async function getVoices(req, res) {
   try {
-    const customConfig = await getCustomConfig();
+    const appConfig = await getAppConfig({
+      role: req.user?.role,
+    });

-    if (!customConfig || !customConfig?.speech?.tts) {
+    if (!appConfig || !appConfig?.speech?.tts) {
       throw new Error('Configuration or TTS schema is missing');
     }

-    const ttsSchema = customConfig?.speech?.tts;
+    const ttsSchema = appConfig?.speech?.tts;
     const provider = await getProvider(ttsSchema);
     let voices;
@@ -30,6 +30,7 @@ async function uploadImageToAzure({
   containerName,
 }) {
   try {
+    const appConfig = req.config;
     const inputFilePath = file.path;
     const inputBuffer = await fs.promises.readFile(inputFilePath);
     const {

@@ -41,12 +42,12 @@ async function uploadImageToAzure({
     const userId = req.user.id;
     let webPBuffer;
     let fileName = `${file_id}__${path.basename(inputFilePath)}`;
-    const targetExtension = `.${req.app.locals.imageOutputType}`;
+    const targetExtension = `.${appConfig.imageOutputType}`;

     if (extension.toLowerCase() === targetExtension) {
       webPBuffer = resizedBuffer;
     } else {
-      webPBuffer = await sharp(resizedBuffer).toFormat(req.app.locals.imageOutputType).toBuffer();
+      webPBuffer = await sharp(resizedBuffer).toFormat(appConfig.imageOutputType).toBuffer();
       const extRegExp = new RegExp(path.extname(fileName) + '$');
       fileName = fileName.replace(extRegExp, targetExtension);
       if (!path.extname(fileName)) {
@@ -1,21 +1,27 @@
 const { nanoid } = require('nanoid');
 const { checkAccess } = require('@librechat/api');
-const { Tools, PermissionTypes, Permissions } = require('librechat-data-provider');
-const { getCustomConfig } = require('~/server/services/Config/getCustomConfig');
+const { logger } = require('@librechat/data-schemas');
+const {
+  Tools,
+  Permissions,
+  FileSources,
+  EModelEndpoint,
+  PermissionTypes,
+} = require('librechat-data-provider');
 const { getRoleByName } = require('~/models/Role');
-const { logger } = require('~/config');
 const { Files } = require('~/models');

 /**
  * Process file search results from tool calls
  * @param {Object} options
  * @param {IUser} options.user - The user object
+ * @param {AppConfig} options.appConfig - The app configuration object
  * @param {GraphRunnableConfig['configurable']} options.metadata - The metadata
  * @param {any} options.toolArtifact - The tool artifact containing structured data
  * @param {string} options.toolCallId - The tool call ID
  * @returns {Promise<Object|null>} The file search attachment or null
  */
-async function processFileCitations({ user, toolArtifact, toolCallId, metadata }) {
+async function processFileCitations({ user, appConfig, toolArtifact, toolCallId, metadata }) {
   try {
     if (!toolArtifact?.[Tools.file_search]?.sources) {
       return null;

@@ -44,10 +50,11 @@ async function processFileCitations({ user, toolArtifact, toolCallId, metadata }
       }
     }

-    const customConfig = await getCustomConfig();
-    const maxCitations = customConfig?.endpoints?.agents?.maxCitations ?? 30;
-    const maxCitationsPerFile = customConfig?.endpoints?.agents?.maxCitationsPerFile ?? 5;
-    const minRelevanceScore = customConfig?.endpoints?.agents?.minRelevanceScore ?? 0.45;
+    const maxCitations = appConfig.endpoints?.[EModelEndpoint.agents]?.maxCitations ?? 30;
+    const maxCitationsPerFile =
+      appConfig.endpoints?.[EModelEndpoint.agents]?.maxCitationsPerFile ?? 5;
+    const minRelevanceScore =
+      appConfig.endpoints?.[EModelEndpoint.agents]?.minRelevanceScore ?? 0.45;

     const sources = toolArtifact[Tools.file_search].sources || [];
     const filteredSources = sources.filter((source) => source.relevance >= minRelevanceScore);

@@ -59,7 +66,7 @@ async function processFileCitations({ user, toolArtifact, toolCallId, metadata }
     }

     const selectedSources = applyCitationLimits(filteredSources, maxCitations, maxCitationsPerFile);
-    const enhancedSources = await enhanceSourcesWithMetadata(selectedSources, customConfig);
+    const enhancedSources = await enhanceSourcesWithMetadata(selectedSources, appConfig);

     if (enhancedSources.length > 0) {
       const fileSearchAttachment = {

@@ -110,10 +117,10 @@ function applyCitationLimits(sources, maxCitations, maxCitationsPerFile) {
 /**
  * Enhance sources with file metadata from database
  * @param {Array} sources - Selected sources
- * @param {Object} customConfig - Custom configuration
+ * @param {AppConfig} appConfig - Custom configuration
  * @returns {Promise<Array>} Enhanced sources
  */
-async function enhanceSourcesWithMetadata(sources, customConfig) {
+async function enhanceSourcesWithMetadata(sources, appConfig) {
   const fileIds = [...new Set(sources.map((source) => source.fileId))];

   let fileMetadataMap = {};

@@ -129,7 +136,7 @@ async function enhanceSourcesWithMetadata(sources, customConfig) {

   return sources.map((source) => {
     const fileRecord = fileMetadataMap[source.fileId] || {};
-    const configuredStorageType = fileRecord.source || customConfig?.fileStrategy || 'local';
+    const configuredStorageType = fileRecord.source || appConfig?.fileStrategy || FileSources.local;

     return {
       ...source,
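
For reference, a small sketch of how the citation limits above resolve from the request-scoped config; the defaults are the ones shown in the diff, while the helper name is illustrative:

const { EModelEndpoint } = require('librechat-data-provider');

// assumes appConfig is the AppConfig passed in (e.g. from req.config upstream)
function getCitationLimits(appConfig) {
  const agentsConfig = appConfig.endpoints?.[EModelEndpoint.agents];
  return {
    maxCitations: agentsConfig?.maxCitations ?? 30,
    maxCitationsPerFile: agentsConfig?.maxCitationsPerFile ?? 5,
    minRelevanceScore: agentsConfig?.minRelevanceScore ?? 0.45,
  };
}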
@@ -43,8 +43,7 @@ async function getCodeOutputDownloadStream(fileIdentifier, apiKey) {
 /**
  * Uploads a file to the Code Environment server.
  * @param {Object} params - The params object.
- * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id`
- * representing the user, and an `app.locals.paths` object with an `uploads` path.
+ * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id` representing the user
  * @param {import('fs').ReadStream | import('stream').Readable} params.stream - The read stream for the file.
  * @param {string} params.filename - The name of the file.
  * @param {string} params.apiKey - The API key for authentication.
@@ -38,6 +38,7 @@ const processCodeOutput = async ({
   messageId,
   session_id,
 }) => {
+  const appConfig = req.config;
   const currentDate = new Date();
   const baseURL = getCodeBaseURL();
   const fileExt = path.extname(name);

@@ -77,10 +78,10 @@ const processCodeOutput = async ({
     filename: name,
     conversationId,
     user: req.user.id,
-    type: `image/${req.app.locals.imageOutputType}`,
+    type: `image/${appConfig.imageOutputType}`,
     createdAt: formattedDate,
     updatedAt: formattedDate,
-    source: req.app.locals.fileStrategy,
+    source: appConfig.fileStrategy,
     context: FileContext.execute_code,
   };
   createFile(file, true);
@@ -11,8 +11,7 @@ const { saveBufferToFirebase } = require('./crud');
  * resolution.
  *
  * @param {Object} params - The params object.
- * @param {Express.Request} params.req - The request object from Express. It should have a `user` property with an `id`
- * representing the user, and an `app.locals.paths` object with an `imageOutput` path.
+ * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id` representing the user
  * @param {Express.Multer.File} params.file - The file object, which is part of the request. The file object should
  * have a `path` property that points to the location of the uploaded file.
  * @param {EModelEndpoint} params.endpoint - The params object.

@@ -26,6 +25,7 @@ const { saveBufferToFirebase } = require('./crud');
  * - height: The height of the converted image.
  */
 async function uploadImageToFirebase({ req, file, file_id, endpoint, resolution = 'high' }) {
+  const appConfig = req.config;
   const inputFilePath = file.path;
   const inputBuffer = await fs.promises.readFile(inputFilePath);
   const {

@@ -38,11 +38,11 @@ async function uploadImageToFirebase({ req, file, file_id, endpoint, resolution

   let webPBuffer;
   let fileName = `${file_id}__${path.basename(inputFilePath)}`;
-  const targetExtension = `.${req.app.locals.imageOutputType}`;
+  const targetExtension = `.${appConfig.imageOutputType}`;
   if (extension.toLowerCase() === targetExtension) {
     webPBuffer = resizedBuffer;
   } else {
-    webPBuffer = await sharp(resizedBuffer).toFormat(req.app.locals.imageOutputType).toBuffer();
+    webPBuffer = await sharp(resizedBuffer).toFormat(appConfig.imageOutputType).toBuffer();
     // Replace or append the correct extension
     const extRegExp = new RegExp(path.extname(fileName) + '$');
     fileName = fileName.replace(extRegExp, targetExtension);
@@ -38,14 +38,15 @@ async function saveLocalFile(file, outputPath, outputFilename) {
 /**
  * Saves an uploaded image file to a specified directory based on the user's ID and a filename.
  *
- * @param {Express.Request} req - The Express request object, containing the user's information and app configuration.
+ * @param {ServerRequest} req - The Express request object, containing the user's information and app configuration.
  * @param {Express.Multer.File} file - The uploaded file object.
  * @param {string} filename - The new filename to assign to the saved image (without extension).
  * @returns {Promise<void>}
  * @throws Will throw an error if the image saving process fails.
  */
 const saveLocalImage = async (req, file, filename) => {
-  const imagePath = req.app.locals.paths.imageOutput;
+  const appConfig = req.config;
+  const imagePath = appConfig.paths.imageOutput;
   const outputPath = path.join(imagePath, req.user.id ?? '');
   await saveLocalFile(file, outputPath, filename);
 };

@@ -162,7 +163,7 @@ async function getLocalFileURL({ fileName, basePath = 'images' }) {
  * the expected base path using the base, subfolder, and user id from the request, and then checks if the
  * provided filepath starts with this constructed base path.
  *
- * @param {Express.Request} req - The request object from Express. It should contain a `user` property with an `id`.
+ * @param {ServerRequest} req - The request object from Express. It should contain a `user` property with an `id`.
  * @param {string} base - The base directory path.
  * @param {string} subfolder - The subdirectory under the base path.
  * @param {string} filepath - The complete file path to be validated.

@@ -191,8 +192,7 @@ const unlinkFile = async (filepath) => {
  * Deletes a file from the filesystem. This function takes a file object, constructs the full path, and
  * verifies the path's validity before deleting the file. If the path is invalid, an error is thrown.
  *
- * @param {Express.Request} req - The request object from Express. It should have an `app.locals.paths` object with
- * a `publicPath` property.
+ * @param {ServerRequest} req - The request object from Express.
  * @param {MongoFile} file - The file object to be deleted. It should have a `filepath` property that is
  * a string representing the path of the file relative to the publicPath.
  *

@@ -201,7 +201,8 @@ const unlinkFile = async (filepath) => {
  * file path is invalid or if there is an error in deletion.
  */
 const deleteLocalFile = async (req, file) => {
-  const { publicPath, uploads } = req.app.locals.paths;
+  const appConfig = req.config;
+  const { publicPath, uploads } = appConfig.paths;

   /** Filepath stripped of query parameters (e.g., ?manual=true) */
   const cleanFilepath = file.filepath.split('?')[0];

@@ -256,8 +257,7 @@ const deleteLocalFile = async (req, file) => {
  * Uploads a file to the specified upload directory.
  *
  * @param {Object} params - The params object.
- * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id`
- * representing the user, and an `app.locals.paths` object with an `uploads` path.
+ * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id` representing the user
  * @param {Express.Multer.File} params.file - The file object, which is part of the request. The file object should
  * have a `path` property that points to the location of the uploaded file.
  * @param {string} params.file_id - The file ID.

@@ -268,11 +268,12 @@ const deleteLocalFile = async (req, file) => {
  * - bytes: The size of the file in bytes.
  */
 async function uploadLocalFile({ req, file, file_id }) {
+  const appConfig = req.config;
   const inputFilePath = file.path;
   const inputBuffer = await fs.promises.readFile(inputFilePath);
   const bytes = Buffer.byteLength(inputBuffer);

-  const { uploads } = req.app.locals.paths;
+  const { uploads } = appConfig.paths;
   const userPath = path.join(uploads, req.user.id);

   if (!fs.existsSync(userPath)) {

@@ -295,8 +296,9 @@ async function uploadLocalFile({ req, file, file_id }) {
  * @param {string} filepath - The filepath.
  * @returns {ReadableStream} A readable stream of the file.
  */
-function getLocalFileStream(req, filepath) {
+async function getLocalFileStream(req, filepath) {
   try {
+    const appConfig = req.config;
     if (filepath.includes('/uploads/')) {
       const basePath = filepath.split('/uploads/')[1];

@@ -305,8 +307,8 @@ function getLocalFileStream(req, filepath) {
         throw new Error(`Invalid file path: ${filepath}`);
       }

-      const fullPath = path.join(req.app.locals.paths.uploads, basePath);
-      const uploadsDir = req.app.locals.paths.uploads;
+      const fullPath = path.join(appConfig.paths.uploads, basePath);
+      const uploadsDir = appConfig.paths.uploads;

       const rel = path.relative(uploadsDir, fullPath);
       if (rel.startsWith('..') || path.isAbsolute(rel) || rel.includes(`..${path.sep}`)) {

@@ -323,8 +325,8 @@ function getLocalFileStream(req, filepath) {
         throw new Error(`Invalid file path: ${filepath}`);
       }

-      const fullPath = path.join(req.app.locals.paths.imageOutput, basePath);
-      const publicDir = req.app.locals.paths.imageOutput;
+      const fullPath = path.join(appConfig.paths.imageOutput, basePath);
+      const publicDir = appConfig.paths.imageOutput;

       const rel = path.relative(publicDir, fullPath);
       if (rel.startsWith('..') || path.isAbsolute(rel) || rel.includes(`..${path.sep}`)) {
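
A sketch (with illustrative names) of the request-scoped config access that replaces req.app.locals throughout the file-storage code above; it assumes req.config is the per-request AppConfig, as the hunks show:

const path = require('path');

function buildUserUploadPath(req, filename) {
  const appConfig = req.config;
  const { uploads } = appConfig.paths;
  const targetExtension = `.${appConfig.imageOutputType}`;
  // store uploads under a per-user directory, converted to the configured output type
  return path.join(uploads, req.user.id, `${filename}${targetExtension}`);
}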
@@ -13,8 +13,7 @@ const { updateUser, updateFile } = require('~/models');
  *
  * The original image is deleted after conversion.
  * @param {Object} params - The params object.
- * @param {Object} params.req - The request object from Express. It should have a `user` property with an `id`
- * representing the user, and an `app.locals.paths` object with an `imageOutput` path.
+ * @param {Object} params.req - The request object from Express. It should have a `user` property with an `id` representing the user
  * @param {Express.Multer.File} params.file - The file object, which is part of the request. The file object should
  * have a `path` property that points to the location of the uploaded file.
  * @param {string} params.file_id - The file ID.

@@ -29,6 +28,7 @@ const { updateUser, updateFile } = require('~/models');
  * - height: The height of the converted image.
  */
 async function uploadLocalImage({ req, file, file_id, endpoint, resolution = 'high' }) {
+  const appConfig = req.config;
   const inputFilePath = file.path;
   const inputBuffer = await fs.promises.readFile(inputFilePath);
   const {

@@ -38,7 +38,7 @@ async function uploadLocalImage({ req, file, file_id, endpoint, resolution = 'hi
   } = await resizeImageBuffer(inputBuffer, resolution, endpoint);
   const extension = path.extname(inputFilePath);

-  const { imageOutput } = req.app.locals.paths;
+  const { imageOutput } = appConfig.paths;
   const userPath = path.join(imageOutput, req.user.id);

   if (!fs.existsSync(userPath)) {

@@ -47,7 +47,7 @@ async function uploadLocalImage({ req, file, file_id, endpoint, resolution = 'hi

   const fileName = `${file_id}__${path.basename(inputFilePath)}`;
   const newPath = path.join(userPath, fileName);
-  const targetExtension = `.${req.app.locals.imageOutputType}`;
+  const targetExtension = `.${appConfig.imageOutputType}`;

   if (extension.toLowerCase() === targetExtension) {
     const bytes = Buffer.byteLength(resizedBuffer);

@@ -57,7 +57,7 @@ async function uploadLocalImage({ req, file, file_id, endpoint, resolution = 'hi
   }

   const outputFilePath = newPath.replace(extension, targetExtension);
-  const data = await sharp(resizedBuffer).toFormat(req.app.locals.imageOutputType).toBuffer();
+  const data = await sharp(resizedBuffer).toFormat(appConfig.imageOutputType).toBuffer();
   await fs.promises.writeFile(outputFilePath, data);
   const bytes = Buffer.byteLength(data);
   const filepath = path.posix.join('/', 'images', req.user.id, path.basename(outputFilePath));

@@ -90,7 +90,8 @@ function encodeImage(imagePath) {
  * @returns {Promise<[MongoFile, string]>} - A promise that resolves to an array of results from updateFile and encodeImage.
  */
 async function prepareImagesLocal(req, file) {
-  const { publicPath, imageOutput } = req.app.locals.paths;
+  const appConfig = req.config;
+  const { publicPath, imageOutput } = appConfig.paths;
   const userPath = path.join(imageOutput, req.user.id);

   if (!fs.existsSync(userPath)) {
@@ -7,8 +7,7 @@ const { logger } = require('~/config');
  * Uploads a file that can be used across various OpenAI services.
  *
  * @param {Object} params - The params object.
- * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id`
- * representing the user, and an `app.locals.paths` object with an `imageOutput` path.
+ * @param {ServerRequest} params.req - The request object from Express. It should have a `user` property with an `id` representing the user
  * @param {Express.Multer.File} params.file - The file uploaded to the server via multer.
  * @param {OpenAIClient} params.openai - The initialized OpenAI client.
  * @returns {Promise<OpenAIFile>}
@@ -12,7 +12,7 @@ const defaultBasePath = 'images';
  * Resizes, converts, and uploads an image file to S3.
  *
  * @param {Object} params
- * @param {import('express').Request} params.req - Express request (expects user and app.locals.imageOutputType).
+ * @param {import('express').Request} params.req - Express request (expects `user` and `appConfig.imageOutputType`).
  * @param {Express.Multer.File} params.file - File object from Multer.
  * @param {string} params.file_id - Unique file identifier.
  * @param {any} params.endpoint - Endpoint identifier used in image processing.

@@ -29,6 +29,7 @@ async function uploadImageToS3({
   basePath = defaultBasePath,
 }) {
   try {
+    const appConfig = req.config;
     const inputFilePath = file.path;
     const inputBuffer = await fs.promises.readFile(inputFilePath);
     const {

@@ -41,14 +42,12 @@ async function uploadImageToS3({

     let processedBuffer;
     let fileName = `${file_id}__${path.basename(inputFilePath)}`;
-    const targetExtension = `.${req.app.locals.imageOutputType}`;
+    const targetExtension = `.${appConfig.imageOutputType}`;

     if (extension.toLowerCase() === targetExtension) {
       processedBuffer = resizedBuffer;
     } else {
-      processedBuffer = await sharp(resizedBuffer)
-        .toFormat(req.app.locals.imageOutputType)
-        .toBuffer();
+      processedBuffer = await sharp(resizedBuffer).toFormat(appConfig.imageOutputType).toBuffer();
       fileName = fileName.replace(new RegExp(path.extname(fileName) + '$'), targetExtension);
       if (!path.extname(fileName)) {
         fileName += targetExtension;
@@ -10,8 +10,7 @@ const { generateShortLivedToken } = require('~/server/services/AuthService');
  * Deletes a file from the vector database. This function takes a file object, constructs the full path, and
  * verifies the path's validity before deleting the file. If the path is invalid, an error is thrown.
  *
- * @param {ServerRequest} req - The request object from Express. It should have an `app.locals.paths` object with
- * a `publicPath` property.
+ * @param {ServerRequest} req - The request object from Express.
  * @param {MongoFile} file - The file object to be deleted. It should have a `filepath` property that is
  * a string representing the path of the file relative to the publicPath.
  *

@@ -54,8 +53,7 @@ const deleteVectors = async (req, file) => {
  * Uploads a file to the configured Vector database
  *
  * @param {Object} params - The params object.
- * @param {Object} params.req - The request object from Express. It should have a `user` property with an `id`
- * representing the user, and an `app.locals.paths` object with an `uploads` path.
+ * @param {Object} params.req - The request object from Express. It should have a `user` property with an `id` representing the user
  * @param {Express.Multer.File} params.file - The file object, which is part of the request. The file object should
  * have a `path` property that points to the location of the uploaded file.
  * @param {string} params.file_id - The file ID.
@@ -1,14 +1,14 @@
 const fs = require('fs');
 const path = require('path');
 const sharp = require('sharp');
-const { resizeImageBuffer } = require('./resize');
 const { getStrategyFunctions } = require('../strategies');
+const { resizeImageBuffer } = require('./resize');
 const { logger } = require('~/config');

 /**
  * Converts an image file or buffer to target output type with specified resolution.
  *
- * @param {Express.Request} req - The request object, containing user and app configuration data.
+ * @param {ServerRequest} req - The request object, containing user and app configuration data.
  * @param {Buffer | Express.Multer.File} file - The file object, containing either a path or a buffer.
  * @param {'low' | 'high'} [resolution='high'] - The desired resolution for the output image.
  * @param {string} [basename=''] - The basename of the input file, if it is a buffer.

@@ -17,6 +17,7 @@ const { logger } = require('~/config');
  */
 async function convertImage(req, file, resolution = 'high', basename = '') {
   try {
+    const appConfig = req.config;
     let inputBuffer;
     let outputBuffer;
     let extension = path.extname(file.path ?? basename).toLowerCase();

@@ -39,11 +40,11 @@ async function convertImage(req, file, resolution = 'high', basename = '') {
     } = await resizeImageBuffer(inputBuffer, resolution);

     // Check if the file is already in target format; if it isn't, convert it:
-    const targetExtension = `.${req.app.locals.imageOutputType}`;
+    const targetExtension = `.${appConfig.imageOutputType}`;
     if (extension === targetExtension) {
       outputBuffer = resizedBuffer;
     } else {
-      outputBuffer = await sharp(resizedBuffer).toFormat(req.app.locals.imageOutputType).toBuffer();
+      outputBuffer = await sharp(resizedBuffer).toFormat(appConfig.imageOutputType).toBuffer();
       extension = targetExtension;
     }

@@ -51,7 +52,7 @@ async function convertImage(req, file, resolution = 'high', basename = '') {
     const newFileName =
       path.basename(file.path ?? basename, path.extname(file.path ?? basename)) + extension;

-    const { saveBuffer } = getStrategyFunctions(req.app.locals.fileStrategy);
+    const { saveBuffer } = getStrategyFunctions(appConfig.fileStrategy);

     const savedFilePath = await saveBuffer({
       userId: req.user.id,
@@ -81,7 +81,7 @@ const blobStorageSources = new Set([FileSources.azure_blob, FileSources.s3]);

 /**
  * Encodes and formats the given files.
- * @param {Express.Request} req - The request object.
+ * @param {ServerRequest} req - The request object.
  * @param {Array<MongoFile>} files - The array of files to encode and format.
  * @param {EModelEndpoint} [endpoint] - Optional: The endpoint for the image.
  * @param {string} [mode] - Optional: The endpoint mode for the image.
@@ -1,13 +1,11 @@
 const avatar = require('./avatar');
 const convert = require('./convert');
 const encode = require('./encode');
-const parse = require('./parse');
 const resize = require('./resize');

 module.exports = {
   ...convert,
   ...encode,
-  ...parse,
   ...resize,
   avatar,
 };
@@ -1,45 +0,0 @@
-const URL = require('url').URL;
-const path = require('path');
-
-const imageExtensionRegex = /\.(jpg|jpeg|png|gif|bmp|tiff|svg|webp)$/i;
-
-/**
- * Extracts the image basename from a given URL.
- *
- * @param {string} urlString - The URL string from which the image basename is to be extracted.
- * @returns {string} The basename of the image file from the URL.
- *          Returns an empty string if the URL does not contain a valid image basename.
- */
-function getImageBasename(urlString) {
-  try {
-    const url = new URL(urlString);
-    const basename = path.basename(url.pathname);
-
-    return imageExtensionRegex.test(basename) ? basename : '';
-  } catch (error) {
-    // If URL parsing fails, return an empty string
-    return '';
-  }
-}
-
-/**
- * Extracts the basename of a file from a given URL.
- *
- * @param {string} urlString - The URL string from which the file basename is to be extracted.
- * @returns {string} The basename of the file from the URL.
- *          Returns an empty string if the URL parsing fails.
- */
-function getFileBasename(urlString) {
-  try {
-    const url = new URL(urlString);
-    return path.basename(url.pathname);
-  } catch (error) {
-    // If URL parsing fails, return an empty string
-    return '';
-  }
-}
-
-module.exports = {
-  getImageBasename,
-  getFileBasename,
-};
Some files were not shown because too many files have changed in this diff.