Mirror of https://github.com/danny-avila/LibreChat.git (synced 2025-09-21 21:50:49 +02:00)
🪐 feat: MCP OAuth 2.0 Discovery Support (#7924)
* chore: Update @modelcontextprotocol/sdk to version 1.12.3 in package.json and package-lock.json
  - Bump version of @modelcontextprotocol/sdk to 1.12.3 to incorporate recent updates.
  - Update dependencies for ajv and cross-spawn to their latest versions.
  - Add ajv as a new dependency in the sdk module.
  - Include json-schema-traverse as a new dependency in the sdk module.
* feat: @librechat/auth
* feat: Add crypto module exports to auth package
  - Introduced a new crypto module by creating index.ts in the crypto directory.
  - Updated the main index.ts of the auth package to export from the new crypto module.
* feat: Update package dependencies and build scripts for auth package
  - Added @librechat/auth as a dependency in package.json and package-lock.json.
  - Updated build scripts to include the auth package in both frontend and bun build processes.
  - Removed unused mongoose and openid-client dependencies from package-lock.json for cleaner dependency management.
* refactor: Migrate crypto utility functions to @librechat/auth
  - Replaced local crypto utility imports with the new @librechat/auth package across multiple files.
  - Removed the obsolete crypto.js file and its exports.
  - Updated relevant services and models to utilize the new encryption and decryption methods from @librechat/auth.
* feat: Enhance OAuth token handling and update dependencies in auth package
* chore: Remove Token model and TokenService due to restructuring of OAuth handling
  - Deleted the Token.js model and TokenService.js, which were responsible for managing OAuth tokens.
  - This change is part of a broader refactor to streamline OAuth token management and improve code organization.
* refactor: imports from '@librechat/auth' to '@librechat/api' and add OAuth token handling functionality
* refactor: Simplify logger usage in MCP and FlowStateManager classes
* chore: fix imports
* feat: Add OAuth configuration schema to MCP with token exchange method support
* feat: FIRST PASS Implement MCP OAuth flow with token management and error handling
  - Added a new route for handling OAuth callbacks and token retrieval.
  - Integrated OAuth token storage and retrieval mechanisms.
  - Enhanced MCP connection to support automatic OAuth flow initiation on 401 errors.
  - Implemented dynamic client registration and metadata discovery for OAuth.
  - Updated MCPManager to manage OAuth tokens and handle authentication requirements.
  - Introduced comprehensive logging for OAuth processes and error handling.
* refactor: Update MCPConnection and MCPManager to utilize new URL handling
  - Added a `url` property to MCPConnection for better URL management.
  - Refactored MCPManager to use the new `url` property instead of a deprecated method for OAuth handling.
  - Changed logging from info to debug level for flow manager and token methods initialization.
  - Improved comments for clarity on existing tokens and OAuth event listener setup.
* refactor: Improve connection timeout error messages in MCPConnection and MCPManager and use initTimeout for connection
  - Updated the connection timeout error messages to include the duration of the timeout.
  - Introduced a configurable `connectTimeout` variable in both MCPConnection and MCPManager for better flexibility.
* chore: cleanup MCP OAuth Token exchange handling; fix: erroneous use of flowsCache and remove verbose logs
* refactor: Update MCPManager and MCPTokenStorage to use TokenMethods for token management
  - Removed direct token storage handling in MCPManager and replaced it with TokenMethods for better abstraction.
  - Refactored MCPTokenStorage methods to accept parameters for token operations, enhancing flexibility and readability.
  - Improved logging messages related to token persistence and retrieval processes.
* refactor: Update MCP OAuth handling to use static methods and improve flow management
  - Refactored MCPOAuthHandler to utilize static methods for initiating and completing OAuth flows, enhancing clarity and reducing instance dependencies.
  - Updated MCPManager to pass flowManager explicitly to OAuth handling methods, improving flexibility in flow state management.
  - Enhanced comments and logging for better understanding of OAuth processes and flow state retrieval.
* refactor: Integrate token methods into createMCPTool for enhanced token management
* refactor: Change logging from info to debug level in MCPOAuthHandler for improved log management
* chore: clean up logging
* feat: first pass, auth URL from MCP OAuth flow
* chore: Improve logging format for OAuth authentication URL display
* chore: cleanup mcp manager comments
* feat: add connection reconnection logic in MCPManager
* refactor: reorganize token storage handling in MCP
  - Moved token storage logic from MCPManager to a new MCPTokenStorage class for better separation of concerns.
  - Updated imports to reflect the new token storage structure.
  - Enhanced methods for storing, retrieving, updating, and deleting OAuth tokens, improving overall token management.
* chore: update comment for SYSTEM_USER_ID in MCPManager for clarity
* feat: implement refresh token functionality in MCP
  - Added refresh token handling in MCPManager to support token renewal for both app-level and user-specific connections.
  - Introduced a refreshTokens function to facilitate token refresh logic.
  - Enhanced MCPTokenStorage to manage client information and refresh token processes.
  - Updated logging for better traceability during token operations.
* chore: cleanup @librechat/auth
* feat: implement MCP server initialization in a separate service
  - Added a new service to handle the initialization of MCP servers, improving code organization and readability.
  - Refactored the server startup logic to utilize the new initializeMCP function.
  - Removed redundant MCP initialization code from the main server file.
* fix: don't log auth url for user connections
* feat: enhance OAuth flow with success and error handling components
  - Updated OAuth callback routes to redirect to new success and error pages instead of sending status messages.
  - Introduced `OAuthSuccess` and `OAuthError` components to provide user feedback during authentication.
  - Added localization support for success and error messages in the translation files.
  - Implemented countdown functionality in the success component for a better user experience.
* fix: refresh token handling for user connections, add missing URL and methods
  - add standard enum for system user id and helper for determining app-level vs. user-level connections
* refactor: update token handling in MCPManager and MCPTokenStorage
* fix: improve error logging in OAuth authentication handler
* fix: concurrency issues for both login url emission and concurrency of oauth flows for shared flows (same user, same server, multiple calls for same server)
* fix: properly fail shared flows for concurrent server calls and prevent duplication of tokens
* chore: remove unused auth package directory from update configuration
* ci: fix mocks in samlStrategy tests
* ci: add mcpConfig to AppService test setup
* chore: remove obsolete MCP OAuth implementation documentation
* fix: update build script for API to use correct command
* chore: bump version of @librechat/api to 1.2.4
* fix: update abort signal handling in createMCPTool function
* fix: add optional clientInfo parameter to refreshTokensFunction metadata
* refactor: replace app.locals.availableTools with getCachedTools in multiple services and controllers for improved tool management
* fix: concurrent refresh token handling issue
* refactor: add signal parameter to getUserConnection method for improved abort handling
* chore: JSDoc typing for `loadEphemeralAgent`
* refactor: update isConnectionActive method to use destructured parameters for improved readability
* feat: implement caching for MCP tools to handle app-level disconnects for loading list of tools
* ci: fix agent test
Parent: b412455e9d
Commit: ec7370dfe9
60 changed files with 3399 additions and 764 deletions
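The recurring refactor across these files is the move from reading tool definitions off `req.app.locals.availableTools` to awaiting the new `getCachedTools` helper. A minimal before/after sketch (not part of the diff), using the import path and option name exactly as they appear in the hunks below:

// Before: tool definitions were read synchronously from Express app locals
const appTools = options.req?.app?.locals?.availableTools ?? {};

// After: tool definitions come from the cache layer and must be awaited
const { getCachedTools } = require('~/server/services/Config');
const appTools = (await getCachedTools({ includeGlobal: true })) ?? {};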
|
@ -1,3 +1,4 @@
|
|||
const { logger } = require('@librechat/data-schemas');
|
||||
const { SerpAPI } = require('@langchain/community/tools/serpapi');
|
||||
const { Calculator } = require('@langchain/community/tools/calculator');
|
||||
const { EnvVar, createCodeExecutionTool, createSearchTool } = require('@librechat/agents');
|
||||
|
@ -29,8 +30,8 @@ const {
|
|||
const { primeFiles: primeCodeFiles } = require('~/server/services/Files/Code/process');
|
||||
const { createFileSearchTool, primeFiles: primeSearchFiles } = require('./fileSearch');
|
||||
const { loadAuthValues } = require('~/server/services/Tools/credentials');
|
||||
const { getCachedTools } = require('~/server/services/Config');
|
||||
const { createMCPTool } = require('~/server/services/MCP');
|
||||
const { logger } = require('~/config');
|
||||
|
||||
const mcpToolPattern = new RegExp(`^.+${Constants.mcp_delimiter}.+$`);
|
||||
|
||||
|
@ -236,7 +237,7 @@ const loadTools = async ({
|
|||
|
||||
/** @type {Record<string, string>} */
|
||||
const toolContextMap = {};
|
||||
const appTools = options.req?.app?.locals?.availableTools ?? {};
|
||||
const appTools = (await getCachedTools({ includeGlobal: true })) ?? {};
|
||||
|
||||
for (const tool of tools) {
|
||||
if (tool === Tools.execute_code) {
|
||||
|
@ -299,6 +300,7 @@ Current Date & Time: ${replaceSpecialVars({ text: '{{iso_datetime}}' })}
|
|||
requestedTools[tool] = async () =>
|
||||
createMCPTool({
|
||||
req: options.req,
|
||||
res: options.res,
|
||||
toolKey: tool,
|
||||
model: agent?.model ?? model,
|
||||
provider: agent?.provider ?? endpoint,
|
||||
|
|
api/cache/getLogStores.js (vendored, 5 changes)
|
@ -29,6 +29,10 @@ const roles = isRedisEnabled
|
|||
? new Keyv({ store: keyvRedis })
|
||||
: new Keyv({ namespace: CacheKeys.ROLES });
|
||||
|
||||
const mcpTools = isRedisEnabled
|
||||
? new Keyv({ store: keyvRedis })
|
||||
: new Keyv({ namespace: CacheKeys.MCP_TOOLS });
|
||||
|
||||
const audioRuns = isRedisEnabled
|
||||
? new Keyv({ store: keyvRedis, ttl: Time.TEN_MINUTES })
|
||||
: new Keyv({ namespace: CacheKeys.AUDIO_RUNS, ttl: Time.TEN_MINUTES });
|
||||
|
@ -67,6 +71,7 @@ const openIdExchangedTokensCache = isRedisEnabled
|
|||
|
||||
const namespaces = {
|
||||
[CacheKeys.ROLES]: roles,
|
||||
[CacheKeys.MCP_TOOLS]: mcpTools,
|
||||
[CacheKeys.CONFIG_STORE]: config,
|
||||
[CacheKeys.PENDING_REQ]: pending_req,
|
||||
[ViolationTypes.BAN]: new Keyv({ store: keyvMongo, namespace: CacheKeys.BANS, ttl: duration }),
|
||||
|
|
|
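The MCP_TOOLS namespace registered above is read and written through getLogStores elsewhere in this PR (see the PluginController changes further down); a condensed sketch of that pattern, where the wrapper function name is illustrative only:

const { CacheKeys } = require('librechat-data-provider');
const { getLogStores } = require('~/cache');

// Cache the tool manifest for one MCP server, then read it back (illustrative helper).
async function cacheServerTools(serverName, serverTools) {
  const mcpToolsCache = getLogStores(CacheKeys.MCP_TOOLS);
  await mcpToolsCache.set(serverName, serverTools);
  return await mcpToolsCache.get(serverName);
}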
@ -15,7 +15,7 @@ let flowManager = null;
|
|||
*/
|
||||
function getMCPManager(userId) {
|
||||
if (!mcpManager) {
|
||||
mcpManager = MCPManager.getInstance(logger);
|
||||
mcpManager = MCPManager.getInstance();
|
||||
} else {
|
||||
mcpManager.checkIdleConnections(userId);
|
||||
}
|
||||
|
@ -30,7 +30,6 @@ function getFlowStateManager(flowsCache) {
|
|||
if (!flowManager) {
|
||||
flowManager = new FlowStateManager(flowsCache, {
|
||||
ttl: Time.ONE_MINUTE * 3,
|
||||
logger,
|
||||
});
|
||||
}
|
||||
return flowManager;
|
||||
|
|
|
@ -11,6 +11,7 @@ const {
|
|||
removeAgentIdsFromProject,
|
||||
removeAgentFromAllProjects,
|
||||
} = require('./Project');
|
||||
const { getCachedTools } = require('~/server/services/Config');
|
||||
const getLogStores = require('~/cache/getLogStores');
|
||||
const { getActions } = require('./Action');
|
||||
const { Agent } = require('~/db/models');
|
||||
|
@ -55,12 +56,12 @@ const getAgent = async (searchParameter) => await Agent.findOne(searchParameter)
|
|||
* @param {string} params.agent_id
|
||||
* @param {string} params.endpoint
|
||||
* @param {import('@librechat/agents').ClientOptions} [params.model_parameters]
|
||||
* @returns {Agent|null} The agent document as a plain object, or null if not found.
|
||||
* @returns {Promise<Agent|null>} The agent document as a plain object, or null if not found.
|
||||
*/
|
||||
const loadEphemeralAgent = ({ req, agent_id, endpoint, model_parameters: _m }) => {
|
||||
const loadEphemeralAgent = async ({ req, agent_id, endpoint, model_parameters: _m }) => {
|
||||
const { model, ...model_parameters } = _m;
|
||||
/** @type {Record<string, FunctionTool>} */
|
||||
const availableTools = req.app.locals.availableTools;
|
||||
const availableTools = await getCachedTools({ includeGlobal: true });
|
||||
/** @type {TEphemeralAgent | null} */
|
||||
const ephemeralAgent = req.body.ephemeralAgent;
|
||||
const mcpServers = new Set(ephemeralAgent?.mcp);
|
||||
|
@ -111,7 +112,7 @@ const loadAgent = async ({ req, agent_id, endpoint, model_parameters }) => {
|
|||
return null;
|
||||
}
|
||||
if (agent_id === EPHEMERAL_AGENT_ID) {
|
||||
return loadEphemeralAgent({ req, agent_id, endpoint, model_parameters });
|
||||
return await loadEphemeralAgent({ req, agent_id, endpoint, model_parameters });
|
||||
}
|
||||
const agent = await getAgent({
|
||||
id: agent_id,
|
||||
|
|
|
@ -6,6 +6,10 @@ const originalEnv = {
|
|||
process.env.CREDS_KEY = '0123456789abcdef0123456789abcdef';
|
||||
process.env.CREDS_IV = '0123456789abcdef';
|
||||
|
||||
jest.mock('~/server/services/Config', () => ({
|
||||
getCachedTools: jest.fn(),
|
||||
}));
|
||||
|
||||
const mongoose = require('mongoose');
|
||||
const { v4: uuidv4 } = require('uuid');
|
||||
const { agentSchema } = require('@librechat/data-schemas');
|
||||
|
@ -23,6 +27,7 @@ const {
|
|||
generateActionMetadataHash,
|
||||
revertAgentVersion,
|
||||
} = require('./Agent');
|
||||
const { getCachedTools } = require('~/server/services/Config');
|
||||
|
||||
/**
|
||||
* @type {import('mongoose').Model<import('@librechat/data-schemas').IAgent>}
|
||||
|
@ -406,6 +411,7 @@ describe('models/Agent', () => {
|
|||
beforeAll(async () => {
|
||||
mongoServer = await MongoMemoryServer.create();
|
||||
const mongoUri = mongoServer.getUri();
|
||||
Agent = mongoose.models.Agent || mongoose.model('Agent', agentSchema);
|
||||
await mongoose.connect(mongoUri);
|
||||
});
|
||||
|
||||
|
@ -1546,6 +1552,12 @@ describe('models/Agent', () => {
|
|||
test('should test ephemeral agent loading logic', async () => {
|
||||
const { EPHEMERAL_AGENT_ID } = require('librechat-data-provider').Constants;
|
||||
|
||||
getCachedTools.mockResolvedValue({
|
||||
tool1_mcp_server1: {},
|
||||
tool2_mcp_server2: {},
|
||||
another_tool: {},
|
||||
});
|
||||
|
||||
const mockReq = {
|
||||
user: { id: 'user123' },
|
||||
body: {
|
||||
|
@ -1556,15 +1568,6 @@ describe('models/Agent', () => {
|
|||
mcp: ['server1', 'server2'],
|
||||
},
|
||||
},
|
||||
app: {
|
||||
locals: {
|
||||
availableTools: {
|
||||
tool1_mcp_server1: {},
|
||||
tool2_mcp_server2: {},
|
||||
another_tool: {},
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const result = await loadAgent({
|
||||
|
@ -1657,6 +1660,8 @@ describe('models/Agent', () => {
|
|||
test('should handle ephemeral agent with no MCP servers', async () => {
|
||||
const { EPHEMERAL_AGENT_ID } = require('librechat-data-provider').Constants;
|
||||
|
||||
getCachedTools.mockResolvedValue({});
|
||||
|
||||
const mockReq = {
|
||||
user: { id: 'user123' },
|
||||
body: {
|
||||
|
@ -1667,11 +1672,6 @@ describe('models/Agent', () => {
|
|||
mcp: [],
|
||||
},
|
||||
},
|
||||
app: {
|
||||
locals: {
|
||||
availableTools: {},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const result = await loadAgent({
|
||||
|
@ -1692,16 +1692,13 @@ describe('models/Agent', () => {
|
|||
test('should handle ephemeral agent with undefined ephemeralAgent in body', async () => {
|
||||
const { EPHEMERAL_AGENT_ID } = require('librechat-data-provider').Constants;
|
||||
|
||||
getCachedTools.mockResolvedValue({});
|
||||
|
||||
const mockReq = {
|
||||
user: { id: 'user123' },
|
||||
body: {
|
||||
promptPrefix: 'Basic instructions',
|
||||
},
|
||||
app: {
|
||||
locals: {
|
||||
availableTools: {},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const result = await loadAgent({
|
||||
|
@ -1734,6 +1731,13 @@ describe('models/Agent', () => {
|
|||
const { EPHEMERAL_AGENT_ID } = require('librechat-data-provider').Constants;
|
||||
|
||||
const largeToolList = Array.from({ length: 100 }, (_, i) => `tool_${i}_mcp_server1`);
|
||||
const availableTools = largeToolList.reduce((acc, tool) => {
|
||||
acc[tool] = {};
|
||||
return acc;
|
||||
}, {});
|
||||
|
||||
getCachedTools.mockResolvedValue(availableTools);
|
||||
|
||||
const mockReq = {
|
||||
user: { id: 'user123' },
|
||||
body: {
|
||||
|
@ -1744,14 +1748,6 @@ describe('models/Agent', () => {
|
|||
mcp: ['server1'],
|
||||
},
|
||||
},
|
||||
app: {
|
||||
locals: {
|
||||
availableTools: largeToolList.reduce((acc, tool) => {
|
||||
acc[tool] = {};
|
||||
return acc;
|
||||
}, {}),
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const result = await loadAgent({
|
||||
|
@ -2272,6 +2268,13 @@ describe('models/Agent', () => {
|
|||
test('should handle loadEphemeralAgent with malformed MCP tool names', async () => {
|
||||
const { EPHEMERAL_AGENT_ID } = require('librechat-data-provider').Constants;
|
||||
|
||||
getCachedTools.mockResolvedValue({
|
||||
malformed_tool_name: {}, // No mcp delimiter
|
||||
tool__server1: {}, // Wrong delimiter
|
||||
tool_mcp_server1: {}, // Correct format
|
||||
tool_mcp_server2: {}, // Different server
|
||||
});
|
||||
|
||||
const mockReq = {
|
||||
user: { id: 'user123' },
|
||||
body: {
|
||||
|
@ -2282,16 +2285,6 @@ describe('models/Agent', () => {
|
|||
mcp: ['server1'],
|
||||
},
|
||||
},
|
||||
app: {
|
||||
locals: {
|
||||
availableTools: {
|
||||
malformed_tool_name: {}, // No mcp delimiter
|
||||
tool__server1: {}, // Wrong delimiter
|
||||
tool_mcp_server1: {}, // Correct format
|
||||
tool_mcp_server2: {}, // Different server
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const result = await loadAgent({
|
||||
|
|
|
@ -1,42 +0,0 @@
|
|||
const { findToken, updateToken, createToken } = require('~/models');
|
||||
const { encryptV2 } = require('~/server/utils/crypto');
|
||||
|
||||
/**
|
||||
* Handles the OAuth token by creating or updating the token.
|
||||
* @param {object} fields
|
||||
* @param {string} fields.userId - The user's ID.
|
||||
* @param {string} fields.token - The full token to store.
|
||||
* @param {string} fields.identifier - Unique, alternative identifier for the token.
|
||||
* @param {number} fields.expiresIn - The number of seconds until the token expires.
|
||||
* @param {object} fields.metadata - Additional metadata to store with the token.
|
||||
* @param {string} [fields.type="oauth"] - The type of token. Default is 'oauth'.
|
||||
*/
|
||||
async function handleOAuthToken({
|
||||
token,
|
||||
userId,
|
||||
identifier,
|
||||
expiresIn,
|
||||
metadata,
|
||||
type = 'oauth',
|
||||
}) {
|
||||
const encrypedToken = await encryptV2(token);
|
||||
const tokenData = {
|
||||
type,
|
||||
userId,
|
||||
metadata,
|
||||
identifier,
|
||||
token: encrypedToken,
|
||||
expiresIn: parseInt(expiresIn, 10) || 3600,
|
||||
};
|
||||
|
||||
const existingToken = await findToken({ userId, identifier });
|
||||
if (existingToken) {
|
||||
return await updateToken({ identifier }, tokenData);
|
||||
} else {
|
||||
return await createToken(tokenData);
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
handleOAuthToken,
|
||||
};
|
|
@ -1,6 +1,6 @@
|
|||
const mongoose = require('mongoose');
|
||||
const { getRandomValues } = require('@librechat/api');
|
||||
const { logger, hashToken } = require('@librechat/data-schemas');
|
||||
const { getRandomValues } = require('~/server/utils/crypto');
|
||||
const { createToken, findToken } = require('~/models');
|
||||
|
||||
/**
|
||||
|
|
|
@ -1,8 +1,9 @@
|
|||
const { logger } = require('@librechat/data-schemas');
|
||||
const { CacheKeys, AuthType } = require('librechat-data-provider');
|
||||
const { getCustomConfig, getCachedTools } = require('~/server/services/Config');
|
||||
const { getToolkitKey } = require('~/server/services/ToolService');
|
||||
const { getCustomConfig } = require('~/server/services/Config');
|
||||
const { getMCPManager, getFlowStateManager } = require('~/config');
|
||||
const { availableTools } = require('~/app/clients/tools');
|
||||
const { getMCPManager } = require('~/config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
|
||||
/**
|
||||
|
@ -84,6 +85,45 @@ const getAvailablePluginsController = async (req, res) => {
|
|||
}
|
||||
};
|
||||
|
||||
function createServerToolsCallback() {
|
||||
/**
|
||||
* @param {string} serverName
|
||||
* @param {TPlugin[] | null} serverTools
|
||||
*/
|
||||
return async function (serverName, serverTools) {
|
||||
try {
|
||||
const mcpToolsCache = getLogStores(CacheKeys.MCP_TOOLS);
|
||||
if (!serverName || !mcpToolsCache) {
|
||||
return;
|
||||
}
|
||||
await mcpToolsCache.set(serverName, serverTools);
|
||||
logger.debug(`MCP tools for ${serverName} added to cache.`);
|
||||
} catch (error) {
|
||||
logger.error('Error retrieving MCP tools from cache:', error);
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
function createGetServerTools() {
|
||||
/**
|
||||
* Retrieves cached server tools
|
||||
* @param {string} serverName
|
||||
* @returns {Promise<TPlugin[] | null>}
|
||||
*/
|
||||
return async function (serverName) {
|
||||
try {
|
||||
const mcpToolsCache = getLogStores(CacheKeys.MCP_TOOLS);
|
||||
if (!mcpToolsCache) {
|
||||
return null;
|
||||
}
|
||||
return await mcpToolsCache.get(serverName);
|
||||
} catch (error) {
|
||||
logger.error('Error retrieving MCP tools from cache:', error);
|
||||
return null;
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Retrieves and returns a list of available tools, either from a cache or by reading a plugin manifest file.
|
||||
*
|
||||
|
@ -109,7 +149,16 @@ const getAvailableTools = async (req, res) => {
|
|||
const customConfig = await getCustomConfig();
|
||||
if (customConfig?.mcpServers != null) {
|
||||
const mcpManager = getMCPManager();
|
||||
pluginManifest = await mcpManager.loadManifestTools(pluginManifest);
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = flowsCache ? getFlowStateManager(flowsCache) : null;
|
||||
const serverToolsCallback = createServerToolsCallback();
|
||||
const getServerTools = createGetServerTools();
|
||||
const mcpTools = await mcpManager.loadManifestTools({
|
||||
flowManager,
|
||||
serverToolsCallback,
|
||||
getServerTools,
|
||||
});
|
||||
pluginManifest = [...mcpTools, ...pluginManifest];
|
||||
}
|
||||
|
||||
/** @type {TPlugin[]} */
|
||||
|
@ -123,7 +172,7 @@ const getAvailableTools = async (req, res) => {
|
|||
}
|
||||
});
|
||||
|
||||
const toolDefinitions = req.app.locals.availableTools;
|
||||
const toolDefinitions = await getCachedTools({ includeGlobal: true });
|
||||
const tools = authenticatedPlugins.filter(
|
||||
(plugin) =>
|
||||
toolDefinitions[plugin.pluginKey] !== undefined ||
|
||||
|
|
|
@ -1,3 +1,4 @@
|
|||
const { encryptV3 } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const {
|
||||
verifyTOTP,
|
||||
|
@ -7,7 +8,6 @@ const {
|
|||
generateBackupCodes,
|
||||
} = require('~/server/services/twoFactorService');
|
||||
const { getUserById, updateUser } = require('~/models');
|
||||
const { encryptV3 } = require('~/server/utils/crypto');
|
||||
|
||||
const safeAppTitle = (process.env.APP_TITLE || 'LibreChat').replace(/\s+/g, '');
|
||||
|
||||
|
|
|
@ -1,9 +1,9 @@
|
|||
const fs = require('fs').promises;
|
||||
const { nanoid } = require('nanoid');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const {
|
||||
Tools,
|
||||
Constants,
|
||||
FileContext,
|
||||
FileSources,
|
||||
SystemRoles,
|
||||
EToolResources,
|
||||
|
@ -16,16 +16,16 @@ const {
|
|||
deleteAgent,
|
||||
getListAgents,
|
||||
} = require('~/models/Agent');
|
||||
const { uploadImageBuffer, filterFile } = require('~/server/services/Files/process');
|
||||
const { getStrategyFunctions } = require('~/server/services/Files/strategies');
|
||||
const { resizeAvatar } = require('~/server/services/Files/images/avatar');
|
||||
const { refreshS3Url } = require('~/server/services/Files/S3/crud');
|
||||
const { filterFile } = require('~/server/services/Files/process');
|
||||
const { updateAction, getActions } = require('~/models/Action');
|
||||
const { getCachedTools } = require('~/server/services/Config');
|
||||
const { updateAgentProjects } = require('~/models/Agent');
|
||||
const { getProjectByName } = require('~/models/Project');
|
||||
const { deleteFileByFilter } = require('~/models/File');
|
||||
const { revertAgentVersion } = require('~/models/Agent');
|
||||
const { logger } = require('~/config');
|
||||
const { deleteFileByFilter } = require('~/models/File');
|
||||
|
||||
const systemTools = {
|
||||
[Tools.execute_code]: true,
|
||||
|
@ -47,8 +47,9 @@ const createAgentHandler = async (req, res) => {
|
|||
|
||||
agentData.tools = [];
|
||||
|
||||
const availableTools = await getCachedTools({ includeGlobal: true });
|
||||
for (const tool of tools) {
|
||||
if (req.app.locals.availableTools[tool]) {
|
||||
if (availableTools[tool]) {
|
||||
agentData.tools.push(tool);
|
||||
}
|
||||
|
||||
|
@ -445,7 +446,7 @@ const uploadAgentAvatarHandler = async (req, res) => {
|
|||
try {
|
||||
await fs.unlink(req.file.path);
|
||||
logger.debug('[/:agent_id/avatar] Temp. image upload file deleted');
|
||||
} catch (error) {
|
||||
} catch {
|
||||
logger.debug('[/:agent_id/avatar] Temp. image upload file already deleted');
|
||||
}
|
||||
}
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
const fs = require('fs').promises;
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { FileContext } = require('librechat-data-provider');
|
||||
const { uploadImageBuffer, filterFile } = require('~/server/services/Files/process');
|
||||
const validateAuthor = require('~/server/middleware/assistants/validateAuthor');
|
||||
|
@ -6,9 +7,9 @@ const { getStrategyFunctions } = require('~/server/services/Files/strategies');
|
|||
const { deleteAssistantActions } = require('~/server/services/ActionService');
|
||||
const { updateAssistantDoc, getAssistants } = require('~/models/Assistant');
|
||||
const { getOpenAIClient, fetchAssistants } = require('./helpers');
|
||||
const { getCachedTools } = require('~/server/services/Config');
|
||||
const { manifestToolMap } = require('~/app/clients/tools');
|
||||
const { deleteFileByFilter } = require('~/models/File');
|
||||
const { logger } = require('~/config');
|
||||
|
||||
/**
|
||||
* Create an assistant.
|
||||
|
@ -30,21 +31,20 @@ const createAssistant = async (req, res) => {
|
|||
delete assistantData.conversation_starters;
|
||||
delete assistantData.append_current_datetime;
|
||||
|
||||
const toolDefinitions = await getCachedTools({ includeGlobal: true });
|
||||
|
||||
assistantData.tools = tools
|
||||
.map((tool) => {
|
||||
if (typeof tool !== 'string') {
|
||||
return tool;
|
||||
}
|
||||
|
||||
const toolDefinitions = req.app.locals.availableTools;
|
||||
const toolDef = toolDefinitions[tool];
|
||||
if (!toolDef && manifestToolMap[tool] && manifestToolMap[tool].toolkit === true) {
|
||||
return (
|
||||
Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
// eslint-disable-next-line no-unused-vars
|
||||
.map(([_, val]) => val)
|
||||
);
|
||||
return Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
|
||||
.map(([_, val]) => val);
|
||||
}
|
||||
|
||||
return toolDef;
|
||||
|
@ -135,21 +135,21 @@ const patchAssistant = async (req, res) => {
|
|||
append_current_datetime,
|
||||
...updateData
|
||||
} = req.body;
|
||||
|
||||
const toolDefinitions = await getCachedTools({ includeGlobal: true });
|
||||
|
||||
updateData.tools = (updateData.tools ?? [])
|
||||
.map((tool) => {
|
||||
if (typeof tool !== 'string') {
|
||||
return tool;
|
||||
}
|
||||
|
||||
const toolDefinitions = req.app.locals.availableTools;
|
||||
const toolDef = toolDefinitions[tool];
|
||||
if (!toolDef && manifestToolMap[tool] && manifestToolMap[tool].toolkit === true) {
|
||||
return (
|
||||
Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
// eslint-disable-next-line no-unused-vars
|
||||
.map(([_, val]) => val)
|
||||
);
|
||||
return Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
|
||||
.map(([_, val]) => val);
|
||||
}
|
||||
|
||||
return toolDef;
|
||||
|
|
|
@ -1,10 +1,11 @@
|
|||
const { logger } = require('@librechat/data-schemas');
|
||||
const { ToolCallTypes } = require('librechat-data-provider');
|
||||
const validateAuthor = require('~/server/middleware/assistants/validateAuthor');
|
||||
const { validateAndUpdateTool } = require('~/server/services/ActionService');
|
||||
const { getCachedTools } = require('~/server/services/Config');
|
||||
const { updateAssistantDoc } = require('~/models/Assistant');
|
||||
const { manifestToolMap } = require('~/app/clients/tools');
|
||||
const { getOpenAIClient } = require('./helpers');
|
||||
const { logger } = require('~/config');
|
||||
|
||||
/**
|
||||
* Create an assistant.
|
||||
|
@ -27,21 +28,20 @@ const createAssistant = async (req, res) => {
|
|||
delete assistantData.conversation_starters;
|
||||
delete assistantData.append_current_datetime;
|
||||
|
||||
const toolDefinitions = await getCachedTools({ includeGlobal: true });
|
||||
|
||||
assistantData.tools = tools
|
||||
.map((tool) => {
|
||||
if (typeof tool !== 'string') {
|
||||
return tool;
|
||||
}
|
||||
|
||||
const toolDefinitions = req.app.locals.availableTools;
|
||||
const toolDef = toolDefinitions[tool];
|
||||
if (!toolDef && manifestToolMap[tool] && manifestToolMap[tool].toolkit === true) {
|
||||
return (
|
||||
Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
// eslint-disable-next-line no-unused-vars
|
||||
.map(([_, val]) => val)
|
||||
);
|
||||
return Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
|
||||
.map(([_, val]) => val);
|
||||
}
|
||||
|
||||
return toolDef;
|
||||
|
@ -125,13 +125,13 @@ const updateAssistant = async ({ req, openai, assistant_id, updateData }) => {
|
|||
|
||||
let hasFileSearch = false;
|
||||
for (const tool of updateData.tools ?? []) {
|
||||
const toolDefinitions = req.app.locals.availableTools;
|
||||
const toolDefinitions = await getCachedTools({ includeGlobal: true });
|
||||
let actualTool = typeof tool === 'string' ? toolDefinitions[tool] : tool;
|
||||
|
||||
if (!actualTool && manifestToolMap[tool] && manifestToolMap[tool].toolkit === true) {
|
||||
actualTool = Object.entries(toolDefinitions)
|
||||
.filter(([key]) => key.startsWith(`${tool}_`))
|
||||
// eslint-disable-next-line no-unused-vars
|
||||
|
||||
.map(([_, val]) => val);
|
||||
} else if (!actualTool) {
|
||||
continue;
|
||||
|
|
|
@ -1,22 +1,22 @@
|
|||
require('dotenv').config();
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
require('module-alias')({ base: path.resolve(__dirname, '..') });
|
||||
const cors = require('cors');
|
||||
const axios = require('axios');
|
||||
const express = require('express');
|
||||
const compression = require('compression');
|
||||
const passport = require('passport');
|
||||
const mongoSanitize = require('express-mongo-sanitize');
|
||||
const fs = require('fs');
|
||||
const compression = require('compression');
|
||||
const cookieParser = require('cookie-parser');
|
||||
const { isEnabled } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const mongoSanitize = require('express-mongo-sanitize');
|
||||
const { connectDb, indexSync } = require('~/db');
|
||||
|
||||
const { jwtLogin, passportLogin } = require('~/strategies');
|
||||
const { isEnabled } = require('~/server/utils');
|
||||
const { ldapLogin } = require('~/strategies');
|
||||
const { logger } = require('~/config');
|
||||
const validateImageRequest = require('./middleware/validateImageRequest');
|
||||
const { jwtLogin, ldapLogin, passportLogin } = require('~/strategies');
|
||||
const errorController = require('./controllers/ErrorController');
|
||||
const initializeMCP = require('./services/initializeMCP');
|
||||
const configureSocialLogins = require('./socialLogins');
|
||||
const AppService = require('./services/AppService');
|
||||
const staticCache = require('./utils/staticCache');
|
||||
|
@ -119,6 +119,7 @@ const startServer = async () => {
|
|||
app.use('/api/bedrock', routes.bedrock);
|
||||
app.use('/api/memories', routes.memories);
|
||||
app.use('/api/tags', routes.tags);
|
||||
app.use('/api/mcp', routes.mcp);
|
||||
|
||||
app.use((req, res) => {
|
||||
res.set({
|
||||
|
@ -142,6 +143,8 @@ const startServer = async () => {
|
|||
} else {
|
||||
logger.info(`Server listening at http://${host == '0.0.0.0' ? 'localhost' : host}:${port}`);
|
||||
}
|
||||
|
||||
initializeMCP(app);
|
||||
});
|
||||
};
|
||||
|
||||
|
@ -184,5 +187,5 @@ process.on('uncaughtException', (err) => {
|
|||
process.exit(1);
|
||||
});
|
||||
|
||||
// export app for easier testing purposes
|
||||
/** Export app for easier testing purposes */
|
||||
module.exports = app;
|
||||
|
|
|
@ -1,8 +1,10 @@
|
|||
const express = require('express');
|
||||
const jwt = require('jsonwebtoken');
|
||||
const { getAccessToken } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { CacheKeys } = require('librechat-data-provider');
|
||||
const { getAccessToken } = require('~/server/services/TokenService');
|
||||
const { logger, getFlowStateManager } = require('~/config');
|
||||
const { findToken, updateToken, createToken } = require('~/models');
|
||||
const { getFlowStateManager } = require('~/config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
|
||||
const router = express.Router();
|
||||
|
@ -28,18 +30,19 @@ router.get('/:action_id/oauth/callback', async (req, res) => {
|
|||
try {
|
||||
decodedState = jwt.verify(state, JWT_SECRET);
|
||||
} catch (err) {
|
||||
logger.error('Error verifying state parameter:', err);
|
||||
await flowManager.failFlow(identifier, 'oauth', 'Invalid or expired state parameter');
|
||||
return res.status(400).send('Invalid or expired state parameter');
|
||||
return res.redirect('/oauth/error?error=invalid_state');
|
||||
}
|
||||
|
||||
if (decodedState.action_id !== action_id) {
|
||||
await flowManager.failFlow(identifier, 'oauth', 'Mismatched action ID in state parameter');
|
||||
return res.status(400).send('Mismatched action ID in state parameter');
|
||||
return res.redirect('/oauth/error?error=invalid_state');
|
||||
}
|
||||
|
||||
if (!decodedState.user) {
|
||||
await flowManager.failFlow(identifier, 'oauth', 'Invalid user ID in state parameter');
|
||||
return res.status(400).send('Invalid user ID in state parameter');
|
||||
return res.redirect('/oauth/error?error=invalid_state');
|
||||
}
|
||||
identifier = `${decodedState.user}:${action_id}`;
|
||||
const flowState = await flowManager.getFlowState(identifier, 'oauth');
|
||||
|
@ -47,91 +50,34 @@ router.get('/:action_id/oauth/callback', async (req, res) => {
|
|||
throw new Error('OAuth flow not found');
|
||||
}
|
||||
|
||||
const tokenData = await getAccessToken({
|
||||
code,
|
||||
userId: decodedState.user,
|
||||
identifier,
|
||||
client_url: flowState.metadata.client_url,
|
||||
redirect_uri: flowState.metadata.redirect_uri,
|
||||
token_exchange_method: flowState.metadata.token_exchange_method,
|
||||
/** Encrypted values */
|
||||
encrypted_oauth_client_id: flowState.metadata.encrypted_oauth_client_id,
|
||||
encrypted_oauth_client_secret: flowState.metadata.encrypted_oauth_client_secret,
|
||||
});
|
||||
const tokenData = await getAccessToken(
|
||||
{
|
||||
code,
|
||||
userId: decodedState.user,
|
||||
identifier,
|
||||
client_url: flowState.metadata.client_url,
|
||||
redirect_uri: flowState.metadata.redirect_uri,
|
||||
token_exchange_method: flowState.metadata.token_exchange_method,
|
||||
/** Encrypted values */
|
||||
encrypted_oauth_client_id: flowState.metadata.encrypted_oauth_client_id,
|
||||
encrypted_oauth_client_secret: flowState.metadata.encrypted_oauth_client_secret,
|
||||
},
|
||||
{
|
||||
findToken,
|
||||
updateToken,
|
||||
createToken,
|
||||
},
|
||||
);
|
||||
await flowManager.completeFlow(identifier, 'oauth', tokenData);
|
||||
res.send(`
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
<head>
|
||||
<title>Authentication Successful</title>
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
|
||||
<style>
|
||||
body {
|
||||
font-family: ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont;
|
||||
background-color: rgb(249, 250, 251);
|
||||
margin: 0;
|
||||
padding: 2rem;
|
||||
display: flex;
|
||||
justify-content: center;
|
||||
align-items: center;
|
||||
min-height: 100vh;
|
||||
}
|
||||
.card {
|
||||
background-color: white;
|
||||
border-radius: 0.5rem;
|
||||
padding: 2rem;
|
||||
max-width: 28rem;
|
||||
width: 100%;
|
||||
box-shadow: 0 4px 6px -1px rgb(0 0 0 / 0.1), 0 2px 4px -2px rgb(0 0 0 / 0.1);
|
||||
text-align: center;
|
||||
}
|
||||
.heading {
|
||||
color: rgb(17, 24, 39);
|
||||
font-size: 1.875rem;
|
||||
font-weight: 700;
|
||||
margin: 0 0 1rem;
|
||||
}
|
||||
.description {
|
||||
color: rgb(75, 85, 99);
|
||||
font-size: 0.875rem;
|
||||
margin: 0.5rem 0;
|
||||
}
|
||||
.countdown {
|
||||
color: rgb(99, 102, 241);
|
||||
font-weight: 500;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="card">
|
||||
<h1 class="heading">Authentication Successful</h1>
|
||||
<p class="description">
|
||||
Your authentication was successful. This window will close in
|
||||
<span class="countdown" id="countdown">3</span> seconds.
|
||||
</p>
|
||||
</div>
|
||||
<script>
|
||||
let secondsLeft = 3;
|
||||
const countdownElement = document.getElementById('countdown');
|
||||
|
||||
const countdown = setInterval(() => {
|
||||
secondsLeft--;
|
||||
countdownElement.textContent = secondsLeft;
|
||||
|
||||
if (secondsLeft <= 0) {
|
||||
clearInterval(countdown);
|
||||
window.close();
|
||||
}
|
||||
}, 1000);
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||
`);
|
||||
|
||||
/** Redirect to React success page */
|
||||
const serverName = flowState.metadata?.action_name || `Action ${action_id}`;
|
||||
const redirectUrl = `/oauth/success?serverName=${encodeURIComponent(serverName)}`;
|
||||
res.redirect(redirectUrl);
|
||||
} catch (error) {
|
||||
logger.error('Error in OAuth callback:', error);
|
||||
await flowManager.failFlow(identifier, 'oauth', error);
|
||||
res.status(500).send('Authentication failed. Please try again.');
|
||||
res.redirect('/oauth/error?error=callback_failed');
|
||||
}
|
||||
});
|
||||
|
||||
|
|
|
@ -27,6 +27,7 @@ const edit = require('./edit');
|
|||
const keys = require('./keys');
|
||||
const user = require('./user');
|
||||
const ask = require('./ask');
|
||||
const mcp = require('./mcp');
|
||||
|
||||
module.exports = {
|
||||
ask,
|
||||
|
@ -58,4 +59,5 @@ module.exports = {
|
|||
assistants,
|
||||
categories,
|
||||
staticRoute,
|
||||
mcp,
|
||||
};
|
||||
|
|
api/server/routes/mcp.js (new file, 205 lines)
|
@ -0,0 +1,205 @@
|
|||
const { Router } = require('express');
|
||||
const { MCPOAuthHandler } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { CacheKeys } = require('librechat-data-provider');
|
||||
const { requireJwtAuth } = require('~/server/middleware');
|
||||
const { getFlowStateManager } = require('~/config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
|
||||
const router = Router();
|
||||
|
||||
/**
|
||||
* Initiate OAuth flow
|
||||
* This endpoint is called when the user clicks the auth link in the UI
|
||||
*/
|
||||
router.get('/:serverName/oauth/initiate', requireJwtAuth, async (req, res) => {
|
||||
try {
|
||||
const { serverName } = req.params;
|
||||
const { userId, flowId } = req.query;
|
||||
const user = req.user;
|
||||
|
||||
// Verify the userId matches the authenticated user
|
||||
if (userId !== user.id) {
|
||||
return res.status(403).json({ error: 'User mismatch' });
|
||||
}
|
||||
|
||||
logger.debug('[MCP OAuth] Initiate request', { serverName, userId, flowId });
|
||||
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = getFlowStateManager(flowsCache);
|
||||
|
||||
/** Flow state to retrieve OAuth config */
|
||||
const flowState = await flowManager.getFlowState(flowId, 'mcp_oauth');
|
||||
if (!flowState) {
|
||||
logger.error('[MCP OAuth] Flow state not found', { flowId });
|
||||
return res.status(404).json({ error: 'Flow not found' });
|
||||
}
|
||||
|
||||
const { serverUrl, oauth: oauthConfig } = flowState.metadata || {};
|
||||
if (!serverUrl || !oauthConfig) {
|
||||
logger.error('[MCP OAuth] Missing server URL or OAuth config in flow state');
|
||||
return res.status(400).json({ error: 'Invalid flow state' });
|
||||
}
|
||||
|
||||
const { authorizationUrl, flowId: oauthFlowId } = await MCPOAuthHandler.initiateOAuthFlow(
|
||||
serverName,
|
||||
serverUrl,
|
||||
userId,
|
||||
oauthConfig,
|
||||
);
|
||||
|
||||
logger.debug('[MCP OAuth] OAuth flow initiated', { oauthFlowId, authorizationUrl });
|
||||
|
||||
// Redirect user to the authorization URL
|
||||
res.redirect(authorizationUrl);
|
||||
} catch (error) {
|
||||
logger.error('[MCP OAuth] Failed to initiate OAuth', error);
|
||||
res.status(500).json({ error: 'Failed to initiate OAuth' });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* OAuth callback handler
|
||||
* This handles the OAuth callback after the user has authorized the application
|
||||
*/
|
||||
router.get('/:serverName/oauth/callback', async (req, res) => {
|
||||
try {
|
||||
const { serverName } = req.params;
|
||||
const { code, state, error: oauthError } = req.query;
|
||||
|
||||
logger.debug('[MCP OAuth] Callback received', {
|
||||
serverName,
|
||||
code: code ? 'present' : 'missing',
|
||||
state,
|
||||
error: oauthError,
|
||||
});
|
||||
|
||||
if (oauthError) {
|
||||
logger.error('[MCP OAuth] OAuth error received', { error: oauthError });
|
||||
return res.redirect(`/oauth/error?error=${encodeURIComponent(String(oauthError))}`);
|
||||
}
|
||||
|
||||
if (!code || typeof code !== 'string') {
|
||||
logger.error('[MCP OAuth] Missing or invalid code');
|
||||
return res.redirect('/oauth/error?error=missing_code');
|
||||
}
|
||||
|
||||
if (!state || typeof state !== 'string') {
|
||||
logger.error('[MCP OAuth] Missing or invalid state');
|
||||
return res.redirect('/oauth/error?error=missing_state');
|
||||
}
|
||||
|
||||
// Extract flow ID from state
|
||||
const flowId = state;
|
||||
logger.debug('[MCP OAuth] Using flow ID from state', { flowId });
|
||||
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = getFlowStateManager(flowsCache);
|
||||
|
||||
logger.debug('[MCP OAuth] Getting flow state for flowId: ' + flowId);
|
||||
const flowState = await MCPOAuthHandler.getFlowState(flowId, flowManager);
|
||||
|
||||
if (!flowState) {
|
||||
logger.error('[MCP OAuth] Flow state not found for flowId:', flowId);
|
||||
return res.redirect('/oauth/error?error=invalid_state');
|
||||
}
|
||||
|
||||
logger.debug('[MCP OAuth] Flow state details', {
|
||||
serverName: flowState.serverName,
|
||||
userId: flowState.userId,
|
||||
hasMetadata: !!flowState.metadata,
|
||||
hasClientInfo: !!flowState.clientInfo,
|
||||
hasCodeVerifier: !!flowState.codeVerifier,
|
||||
});
|
||||
|
||||
// Complete the OAuth flow
|
||||
logger.debug('[MCP OAuth] Completing OAuth flow');
|
||||
const tokens = await MCPOAuthHandler.completeOAuthFlow(flowId, code, flowManager);
|
||||
logger.info('[MCP OAuth] OAuth flow completed, tokens received in callback route');
|
||||
|
||||
// For system-level OAuth, we need to store the tokens and retry the connection
|
||||
if (flowState.userId === 'system') {
|
||||
logger.debug(`[MCP OAuth] System-level OAuth completed for ${serverName}`);
|
||||
}
|
||||
|
||||
/** ID of the flow that the tool/connection is waiting for */
|
||||
const toolFlowId = flowState.metadata?.toolFlowId;
|
||||
if (toolFlowId) {
|
||||
logger.debug('[MCP OAuth] Completing tool flow', { toolFlowId });
|
||||
await flowManager.completeFlow(toolFlowId, 'mcp_oauth', tokens);
|
||||
}
|
||||
|
||||
/** Redirect to success page with flowId and serverName */
|
||||
const redirectUrl = `/oauth/success?serverName=${encodeURIComponent(serverName)}`;
|
||||
res.redirect(redirectUrl);
|
||||
} catch (error) {
|
||||
logger.error('[MCP OAuth] OAuth callback error', error);
|
||||
res.redirect('/oauth/error?error=callback_failed');
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* Get OAuth tokens for a completed flow
|
||||
* This is primarily for user-level OAuth flows
|
||||
*/
|
||||
router.get('/oauth/tokens/:flowId', requireJwtAuth, async (req, res) => {
|
||||
try {
|
||||
const { flowId } = req.params;
|
||||
const user = req.user;
|
||||
|
||||
if (!user?.id) {
|
||||
return res.status(401).json({ error: 'User not authenticated' });
|
||||
}
|
||||
|
||||
// Allow system flows or user-owned flows
|
||||
if (!flowId.startsWith(`${user.id}:`) && !flowId.startsWith('system:')) {
|
||||
return res.status(403).json({ error: 'Access denied' });
|
||||
}
|
||||
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = getFlowStateManager(flowsCache);
|
||||
|
||||
const flowState = await flowManager.getFlowState(flowId, 'mcp_oauth');
|
||||
if (!flowState) {
|
||||
return res.status(404).json({ error: 'Flow not found' });
|
||||
}
|
||||
|
||||
if (flowState.status !== 'COMPLETED') {
|
||||
return res.status(400).json({ error: 'Flow not completed' });
|
||||
}
|
||||
|
||||
res.json({ tokens: flowState.result });
|
||||
} catch (error) {
|
||||
logger.error('[MCP OAuth] Failed to get tokens', error);
|
||||
res.status(500).json({ error: 'Failed to get tokens' });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* Check OAuth flow status
|
||||
* This endpoint can be used to poll the status of an OAuth flow
|
||||
*/
|
||||
router.get('/oauth/status/:flowId', async (req, res) => {
|
||||
try {
|
||||
const { flowId } = req.params;
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = getFlowStateManager(flowsCache);
|
||||
|
||||
const flowState = await flowManager.getFlowState(flowId, 'mcp_oauth');
|
||||
if (!flowState) {
|
||||
return res.status(404).json({ error: 'Flow not found' });
|
||||
}
|
||||
|
||||
res.json({
|
||||
status: flowState.status,
|
||||
completed: flowState.status === 'COMPLETED',
|
||||
failed: flowState.status === 'FAILED',
|
||||
error: flowState.error,
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error('[MCP OAuth] Failed to get flow status', error);
|
||||
res.status(500).json({ error: 'Failed to get flow status' });
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
|
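A hypothetical client-side sketch of how the status endpoint above could be polled once a flow ID is known; the router is mounted under /api/mcp (see the index.js change earlier in this diff), and the response fields mirror the handler's res.json payload. The function name, interval, and attempt limit are illustrative:

// Hypothetical polling helper for the /oauth/status/:flowId route above.
async function waitForMCPOAuth(flowId, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const response = await fetch(`/api/mcp/oauth/status/${flowId}`);
    if (!response.ok) {
      throw new Error(`Status check failed: ${response.status}`);
    }
    const { completed, failed, error } = await response.json();
    if (completed) {
      return true;
    }
    if (failed) {
      throw new Error(error || 'OAuth flow failed');
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for MCP OAuth flow');
}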
@ -47,7 +47,9 @@ const oauthHandler = async (req, res) => {
|
|||
|
||||
router.get('/error', (req, res) => {
|
||||
// A single error message is pushed by passport when authentication fails.
|
||||
logger.error('Error in OAuth authentication:', { message: req.session.messages.pop() });
|
||||
logger.error('Error in OAuth authentication:', {
|
||||
message: req.session?.messages?.pop() || 'Unknown error',
|
||||
});
|
||||
|
||||
// Redirect to login page with auth_failed parameter to prevent infinite redirect loops
|
||||
res.redirect(`${domains.client}/login?redirect=false`);
|
||||
|
|
|
@ -3,7 +3,13 @@ const { nanoid } = require('nanoid');
|
|||
const { tool } = require('@langchain/core/tools');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { GraphEvents, sleep } = require('@librechat/agents');
|
||||
const { sendEvent, logAxiosError } = require('@librechat/api');
|
||||
const {
|
||||
sendEvent,
|
||||
encryptV2,
|
||||
decryptV2,
|
||||
logAxiosError,
|
||||
refreshAccessToken,
|
||||
} = require('@librechat/api');
|
||||
const {
|
||||
Time,
|
||||
CacheKeys,
|
||||
|
@ -14,13 +20,11 @@ const {
|
|||
isImageVisionTool,
|
||||
actionDomainSeparator,
|
||||
} = require('librechat-data-provider');
|
||||
const { refreshAccessToken } = require('~/server/services/TokenService');
|
||||
const { encryptV2, decryptV2 } = require('~/server/utils/crypto');
|
||||
const { findToken, updateToken, createToken } = require('~/models');
|
||||
const { getActions, deleteActions } = require('~/models/Action');
|
||||
const { deleteAssistant } = require('~/models/Assistant');
|
||||
const { getFlowStateManager } = require('~/config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
const { findToken } = require('~/models');
|
||||
|
||||
const JWT_SECRET = process.env.JWT_SECRET;
|
||||
const toolNameRegex = /^[a-zA-Z0-9_-]+$/;
|
||||
|
@ -258,15 +262,22 @@ async function createActionTool({
|
|||
try {
|
||||
const refresh_token = await decryptV2(refreshTokenData.token);
|
||||
const refreshTokens = async () =>
|
||||
await refreshAccessToken({
|
||||
userId,
|
||||
identifier,
|
||||
refresh_token,
|
||||
client_url: metadata.auth.client_url,
|
||||
encrypted_oauth_client_id: encrypted.oauth_client_id,
|
||||
token_exchange_method: metadata.auth.token_exchange_method,
|
||||
encrypted_oauth_client_secret: encrypted.oauth_client_secret,
|
||||
});
|
||||
await refreshAccessToken(
|
||||
{
|
||||
userId,
|
||||
identifier,
|
||||
refresh_token,
|
||||
client_url: metadata.auth.client_url,
|
||||
encrypted_oauth_client_id: encrypted.oauth_client_id,
|
||||
token_exchange_method: metadata.auth.token_exchange_method,
|
||||
encrypted_oauth_client_secret: encrypted.oauth_client_secret,
|
||||
},
|
||||
{
|
||||
findToken,
|
||||
updateToken,
|
||||
createToken,
|
||||
},
|
||||
);
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = getFlowStateManager(flowsCache);
|
||||
const refreshData = await flowManager.createFlowWithHandler(
|
||||
|
|
|
@ -1,7 +1,6 @@
|
|||
const {
|
||||
FileSources,
|
||||
loadOCRConfig,
|
||||
processMCPEnv,
|
||||
EModelEndpoint,
|
||||
loadMemoryConfig,
|
||||
getConfigDefaults,
|
||||
|
@ -28,7 +27,7 @@ const { initializeS3 } = require('./Files/S3/initialize');
|
|||
const { loadAndFormatTools } = require('./ToolService');
|
||||
const { isEnabled } = require('~/server/utils');
|
||||
const { initializeRoles } = require('~/models');
|
||||
const { getMCPManager } = require('~/config');
|
||||
const { setCachedTools } = require('./Config');
|
||||
const paths = require('~/config/paths');
|
||||
|
||||
/**
|
||||
|
@ -76,11 +75,10 @@ const AppService = async (app) => {
|
|||
directory: paths.structuredTools,
|
||||
});
|
||||
|
||||
if (config.mcpServers != null) {
|
||||
const mcpManager = getMCPManager();
|
||||
await mcpManager.initializeMCP(config.mcpServers, processMCPEnv);
|
||||
await mcpManager.mapAvailableTools(availableTools);
|
||||
}
|
||||
await setCachedTools(availableTools, { isGlobal: true });
|
||||
|
||||
// Store MCP config for later initialization
|
||||
const mcpConfig = config.mcpServers || null;
|
||||
|
||||
const socialLogins =
|
||||
config?.registration?.socialLogins ?? configDefaults?.registration?.socialLogins;
|
||||
|
@ -96,11 +94,11 @@ const AppService = async (app) => {
|
|||
socialLogins,
|
||||
filteredTools,
|
||||
includedTools,
|
||||
availableTools,
|
||||
imageOutputType,
|
||||
interfaceConfig,
|
||||
turnstileConfig,
|
||||
balance,
|
||||
mcpConfig,
|
||||
};
|
||||
|
||||
const agentsDefaults = agentsConfigSetup(config);
|
||||
|
|
|
@ -32,6 +32,25 @@ jest.mock('~/models', () => ({
|
|||
jest.mock('~/models/Role', () => ({
|
||||
updateAccessPermissions: jest.fn(),
|
||||
}));
|
||||
jest.mock('./Config', () => ({
|
||||
setCachedTools: jest.fn(),
|
||||
getCachedTools: jest.fn().mockResolvedValue({
|
||||
ExampleTool: {
|
||||
type: 'function',
|
||||
function: {
|
||||
description: 'Example tool function',
|
||||
name: 'exampleFunction',
|
||||
parameters: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
param1: { type: 'string', description: 'An example parameter' },
|
||||
},
|
||||
required: ['param1'],
|
||||
},
|
||||
},
|
||||
},
|
||||
}),
|
||||
}));
|
||||
jest.mock('./ToolService', () => ({
|
||||
loadAndFormatTools: jest.fn().mockReturnValue({
|
||||
ExampleTool: {
|
||||
|
@ -121,22 +140,9 @@ describe('AppService', () => {
|
|||
sidePanel: true,
|
||||
presets: true,
|
||||
}),
|
||||
mcpConfig: null,
|
||||
turnstileConfig: mockedTurnstileConfig,
|
||||
modelSpecs: undefined,
|
||||
availableTools: {
|
||||
ExampleTool: {
|
||||
type: 'function',
|
||||
function: expect.objectContaining({
|
||||
description: 'Example tool function',
|
||||
name: 'exampleFunction',
|
||||
parameters: expect.objectContaining({
|
||||
type: 'object',
|
||||
properties: expect.any(Object),
|
||||
required: expect.arrayContaining(['param1']),
|
||||
}),
|
||||
}),
|
||||
},
|
||||
},
|
||||
paths: expect.anything(),
|
||||
ocr: expect.anything(),
|
||||
imageOutputType: expect.any(String),
|
||||
|
@ -223,14 +229,41 @@ describe('AppService', () => {
|
|||
|
||||
it('should load and format tools accurately with defined structure', async () => {
|
||||
const { loadAndFormatTools } = require('./ToolService');
|
||||
const { setCachedTools, getCachedTools } = require('./Config');
|
||||
|
||||
await AppService(app);
|
||||
|
||||
expect(loadAndFormatTools).toHaveBeenCalledWith({
|
||||
adminFilter: undefined,
|
||||
adminIncluded: undefined,
|
||||
directory: expect.anything(),
|
||||
});
|
||||
|
||||
expect(app.locals.availableTools.ExampleTool).toBeDefined();
|
||||
expect(app.locals.availableTools.ExampleTool).toEqual({
|
||||
// Verify setCachedTools was called with the tools
|
||||
expect(setCachedTools).toHaveBeenCalledWith(
|
||||
{
|
||||
ExampleTool: {
|
||||
type: 'function',
|
||||
function: {
|
||||
description: 'Example tool function',
|
||||
name: 'exampleFunction',
|
||||
parameters: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
param1: { type: 'string', description: 'An example parameter' },
|
||||
},
|
||||
required: ['param1'],
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{ isGlobal: true },
|
||||
);
|
||||
|
||||
// Verify we can retrieve the tools from cache
|
||||
const cachedTools = await getCachedTools({ includeGlobal: true });
|
||||
expect(cachedTools.ExampleTool).toBeDefined();
|
||||
expect(cachedTools.ExampleTool).toEqual({
|
||||
type: 'function',
|
||||
function: {
|
||||
description: 'Example tool function',
|
||||
|
@ -535,7 +568,6 @@ describe('AppService updating app.locals and issuing warnings', () => {
|
|||
|
||||
expect(app.locals).toBeDefined();
|
||||
expect(app.locals.paths).toBeDefined();
|
||||
expect(app.locals.availableTools).toBeDefined();
|
||||
expect(app.locals.fileStrategy).toEqual(FileSources.local);
|
||||
expect(app.locals.socialLogins).toEqual(defaultSocialLogins);
|
||||
expect(app.locals.balance).toEqual(
|
||||
|
@ -568,7 +600,6 @@ describe('AppService updating app.locals and issuing warnings', () => {
|
|||
|
||||
expect(app.locals).toBeDefined();
|
||||
expect(app.locals.paths).toBeDefined();
|
||||
expect(app.locals.availableTools).toBeDefined();
|
||||
expect(app.locals.fileStrategy).toEqual(customConfig.fileStrategy);
|
||||
expect(app.locals.socialLogins).toEqual(customConfig.registration.socialLogins);
|
||||
expect(app.locals.balance).toEqual(customConfig.balance);
|
||||
|
|
|
@@ -1,5 +1,7 @@
const bcrypt = require('bcryptjs');
const { webcrypto } = require('node:crypto');
const { isEnabled } = require('@librechat/api');
const { logger } = require('@librechat/data-schemas');
const { SystemRoles, errorsToString } = require('librechat-data-provider');
const {
  findUser,

@@ -17,11 +19,10 @@ const {
  deleteUserById,
  generateRefreshToken,
} = require('~/models');
const { isEnabled, checkEmailConfig, sendEmail } = require('~/server/utils');
const { isEmailDomainAllowed } = require('~/server/services/domains');
const { checkEmailConfig, sendEmail } = require('~/server/utils');
const { getBalanceConfig } = require('~/server/services/Config');
const { registerSchema } = require('~/strategies/validators');
const { logger } = require('~/config');

const domains = {
  client: process.env.DOMAIN_CLIENT,
258  api/server/services/Config/getCachedTools.js  (new file)
@@ -0,0 +1,258 @@
const { CacheKeys } = require('librechat-data-provider');
const getLogStores = require('~/cache/getLogStores');

/**
 * Cache key generators for different tool access patterns
 * These will support future permission-based caching
 */
const ToolCacheKeys = {
  /** Global tools available to all users */
  GLOBAL: 'tools:global',
  /** Tools available to a specific user */
  USER: (userId) => `tools:user:${userId}`,
  /** Tools available to a specific role */
  ROLE: (roleId) => `tools:role:${roleId}`,
  /** Tools available to a specific group */
  GROUP: (groupId) => `tools:group:${groupId}`,
  /** Combined effective tools for a user (computed from all sources) */
  EFFECTIVE: (userId) => `tools:effective:${userId}`,
};

/**
 * Retrieves available tools from cache
 * @function getCachedTools
 * @param {Object} options - Options for retrieving tools
 * @param {string} [options.userId] - User ID for user-specific tools
 * @param {string[]} [options.roleIds] - Role IDs for role-based tools
 * @param {string[]} [options.groupIds] - Group IDs for group-based tools
 * @param {boolean} [options.includeGlobal=true] - Whether to include global tools
 * @returns {Promise<Object|null>} The available tools object or null if not cached
 */
async function getCachedTools(options = {}) {
  const cache = getLogStores(CacheKeys.CONFIG_STORE);
  const { userId, roleIds = [], groupIds = [], includeGlobal = true } = options;

  // For now, return global tools (current behavior)
  // This will be expanded to merge tools from different sources
  if (!userId && includeGlobal) {
    return await cache.get(ToolCacheKeys.GLOBAL);
  }

  // Future implementation will merge tools from multiple sources
  // based on user permissions, roles, and groups
  if (userId) {
    // Check if we have pre-computed effective tools for this user
    const effectiveTools = await cache.get(ToolCacheKeys.EFFECTIVE(userId));
    if (effectiveTools) {
      return effectiveTools;
    }

    // Otherwise, compute from individual sources
    const toolSources = [];

    if (includeGlobal) {
      const globalTools = await cache.get(ToolCacheKeys.GLOBAL);
      if (globalTools) {
        toolSources.push(globalTools);
      }
    }

    // User-specific tools
    const userTools = await cache.get(ToolCacheKeys.USER(userId));
    if (userTools) {
      toolSources.push(userTools);
    }

    // Role-based tools
    for (const roleId of roleIds) {
      const roleTools = await cache.get(ToolCacheKeys.ROLE(roleId));
      if (roleTools) {
        toolSources.push(roleTools);
      }
    }

    // Group-based tools
    for (const groupId of groupIds) {
      const groupTools = await cache.get(ToolCacheKeys.GROUP(groupId));
      if (groupTools) {
        toolSources.push(groupTools);
      }
    }

    // Merge all tool sources (for now, simple merge - future will handle conflicts)
    if (toolSources.length > 0) {
      return mergeToolSources(toolSources);
    }
  }

  return null;
}

/**
 * Sets available tools in cache
 * @function setCachedTools
 * @param {Object} tools - The tools object to cache
 * @param {Object} options - Options for caching tools
 * @param {string} [options.userId] - User ID for user-specific tools
 * @param {string} [options.roleId] - Role ID for role-based tools
 * @param {string} [options.groupId] - Group ID for group-based tools
 * @param {boolean} [options.isGlobal=false] - Whether these are global tools
 * @param {number} [options.ttl] - Time to live in milliseconds
 * @returns {Promise<boolean>} Whether the operation was successful
 */
async function setCachedTools(tools, options = {}) {
  const cache = getLogStores(CacheKeys.CONFIG_STORE);
  const { userId, roleId, groupId, isGlobal = false, ttl } = options;

  let cacheKey;
  if (isGlobal || (!userId && !roleId && !groupId)) {
    cacheKey = ToolCacheKeys.GLOBAL;
  } else if (userId) {
    cacheKey = ToolCacheKeys.USER(userId);
  } else if (roleId) {
    cacheKey = ToolCacheKeys.ROLE(roleId);
  } else if (groupId) {
    cacheKey = ToolCacheKeys.GROUP(groupId);
  }

  if (!cacheKey) {
    throw new Error('Invalid cache key options provided');
  }

  return await cache.set(cacheKey, tools, ttl);
}

/**
 * Invalidates cached tools
 * @function invalidateCachedTools
 * @param {Object} options - Options for invalidating tools
 * @param {string} [options.userId] - User ID to invalidate
 * @param {string} [options.roleId] - Role ID to invalidate
 * @param {string} [options.groupId] - Group ID to invalidate
 * @param {boolean} [options.invalidateGlobal=false] - Whether to invalidate global tools
 * @param {boolean} [options.invalidateEffective=true] - Whether to invalidate effective tools
 * @returns {Promise<void>}
 */
async function invalidateCachedTools(options = {}) {
  const cache = getLogStores(CacheKeys.CONFIG_STORE);
  const { userId, roleId, groupId, invalidateGlobal = false, invalidateEffective = true } = options;

  const keysToDelete = [];

  if (invalidateGlobal) {
    keysToDelete.push(ToolCacheKeys.GLOBAL);
  }

  if (userId) {
    keysToDelete.push(ToolCacheKeys.USER(userId));
    if (invalidateEffective) {
      keysToDelete.push(ToolCacheKeys.EFFECTIVE(userId));
    }
  }

  if (roleId) {
    keysToDelete.push(ToolCacheKeys.ROLE(roleId));
    // TODO: In future, invalidate all users with this role
  }

  if (groupId) {
    keysToDelete.push(ToolCacheKeys.GROUP(groupId));
    // TODO: In future, invalidate all users in this group
  }

  await Promise.all(keysToDelete.map((key) => cache.delete(key)));
}

/**
 * Computes and caches effective tools for a user
 * @function computeEffectiveTools
 * @param {string} userId - The user ID
 * @param {Object} context - Context containing user's roles and groups
 * @param {string[]} [context.roleIds=[]] - User's role IDs
 * @param {string[]} [context.groupIds=[]] - User's group IDs
 * @param {number} [ttl] - Time to live for the computed result
 * @returns {Promise<Object>} The computed effective tools
 */
async function computeEffectiveTools(userId, context = {}, ttl) {
  const { roleIds = [], groupIds = [] } = context;

  // Get all tool sources
  const tools = await getCachedTools({
    userId,
    roleIds,
    groupIds,
    includeGlobal: true,
  });

  if (tools) {
    // Cache the computed result
    const cache = getLogStores(CacheKeys.CONFIG_STORE);
    await cache.set(ToolCacheKeys.EFFECTIVE(userId), tools, ttl);
  }

  return tools;
}

/**
 * Merges multiple tool sources into a single tools object
 * @function mergeToolSources
 * @param {Object[]} sources - Array of tool objects to merge
 * @returns {Object} Merged tools object
 */
function mergeToolSources(sources) {
  // For now, simple merge that combines all tools
  // Future implementation will handle:
  // - Permission precedence (deny > allow)
  // - Tool property conflicts
  // - Metadata merging
  const merged = {};

  for (const source of sources) {
    if (!source || typeof source !== 'object') {
      continue;
    }

    for (const [toolId, toolConfig] of Object.entries(source)) {
      // Simple last-write-wins for now
      // Future: merge based on permission levels
      merged[toolId] = toolConfig;
    }
  }

  return merged;
}

/**
 * Middleware-friendly function to get tools for a request
 * @function getToolsForRequest
 * @param {Object} req - Express request object
 * @returns {Promise<Object|null>} Available tools for the request
 */
async function getToolsForRequest(req) {
  const userId = req.user?.id;

  // For now, return global tools if no user
  if (!userId) {
    return getCachedTools({ includeGlobal: true });
  }

  // Future: Extract roles and groups from req.user
  const roleIds = req.user?.roles || [];
  const groupIds = req.user?.groups || [];

  return getCachedTools({
    userId,
    roleIds,
    groupIds,
    includeGlobal: true,
  });
}

module.exports = {
  ToolCacheKeys,
  getCachedTools,
  setCachedTools,
  getToolsForRequest,
  invalidateCachedTools,
  computeEffectiveTools,
};
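For orientation, a minimal usage sketch of the tool-cache helpers above (hypothetical consumer code, not part of this change set): the global registry is seeded once, per-user lookups fall back to it until user/role/group entries exist, and invalidation clears the user's cached and pre-computed entries.

// Hypothetical consumer sketch; the helpers are exposed via the Config barrel export.
const { setCachedTools, getCachedTools, invalidateCachedTools } = require('~/server/services/Config');

async function exampleToolCacheUsage(userId) {
  // Seed the global registry, e.g. with the output of loadAndFormatTools.
  await setCachedTools(
    { ExampleTool: { type: 'function', function: { name: 'exampleFunction', parameters: {} } } },
    { isGlobal: true },
  );

  // Resolve tools for a user; with no user/role/group entries cached this returns the global set.
  const tools = await getCachedTools({ userId, includeGlobal: true });

  // Clear the user's cached and computed effective tools after a permission change.
  await invalidateCachedTools({ userId });
  return tools;
}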
@@ -1,4 +1,5 @@
const { config } = require('./EndpointService');
const getCachedTools = require('./getCachedTools');
const getCustomConfig = require('./getCustomConfig');
const loadCustomConfig = require('./loadCustomConfig');
const loadConfigModels = require('./loadConfigModels');

@@ -14,6 +15,7 @@ module.exports = {
  loadDefaultModels,
  loadOverrideConfig,
  loadAsyncEndpoints,
  ...getCachedTools,
  ...getCustomConfig,
  ...getEndpointsConfig,
};
@ -1,27 +1,111 @@
|
|||
const { z } = require('zod');
|
||||
const { tool } = require('@langchain/core/tools');
|
||||
const { normalizeServerName } = require('@librechat/api');
|
||||
const { Constants: AgentConstants, Providers } = require('@librechat/agents');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { Time, CacheKeys, StepTypes } = require('librechat-data-provider');
|
||||
const { sendEvent, normalizeServerName, MCPOAuthHandler } = require('@librechat/api');
|
||||
const { Constants: AgentConstants, Providers, GraphEvents } = require('@librechat/agents');
|
||||
const {
|
||||
Constants,
|
||||
ContentTypes,
|
||||
isAssistantsEndpoint,
|
||||
convertJsonSchemaToZod,
|
||||
} = require('librechat-data-provider');
|
||||
const { logger, getMCPManager } = require('~/config');
|
||||
const { getMCPManager, getFlowStateManager } = require('~/config');
|
||||
const { findToken, createToken, updateToken } = require('~/models');
|
||||
const { getCachedTools } = require('./Config');
|
||||
const { getLogStores } = require('~/cache');
|
||||
|
||||
/**
|
||||
* @param {object} params
|
||||
* @param {ServerResponse} params.res - The Express response object for sending events.
|
||||
* @param {string} params.stepId - The ID of the step in the flow.
|
||||
* @param {ToolCallChunk} params.toolCall - The tool call object containing tool information.
|
||||
* @param {string} params.loginFlowId - The ID of the login flow.
|
||||
* @param {FlowStateManager<any>} params.flowManager - The flow manager instance.
|
||||
*/
|
||||
function createOAuthStart({ res, stepId, toolCall, loginFlowId, flowManager, signal }) {
|
||||
/**
|
||||
* Creates a function to handle OAuth login requests.
|
||||
* @param {string} authURL - The URL to redirect the user for OAuth authentication.
|
||||
* @returns {Promise<boolean>} Returns true to indicate the event was sent successfully.
|
||||
*/
|
||||
return async function (authURL) {
|
||||
/** @type {{ id: string; delta: AgentToolCallDelta }} */
|
||||
const data = {
|
||||
id: stepId,
|
||||
delta: {
|
||||
type: StepTypes.TOOL_CALLS,
|
||||
tool_calls: [{ ...toolCall, args: '' }],
|
||||
auth: authURL,
|
||||
expires_at: Date.now() + Time.TWO_MINUTES,
|
||||
},
|
||||
};
|
||||
/** Used to ensure the handler (use of `sendEvent`) is only invoked once */
|
||||
await flowManager.createFlowWithHandler(
|
||||
loginFlowId,
|
||||
'oauth_login',
|
||||
async () => {
|
||||
sendEvent(res, { event: GraphEvents.ON_RUN_STEP_DELTA, data });
|
||||
logger.debug('Sent OAuth login request to client');
|
||||
return true;
|
||||
},
|
||||
signal,
|
||||
);
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* @param {object} params
|
||||
* @param {ServerResponse} params.res - The Express response object for sending events.
|
||||
* @param {string} params.stepId - The ID of the step in the flow.
|
||||
* @param {ToolCallChunk} params.toolCall - The tool call object containing tool information.
|
||||
* @param {string} params.loginFlowId - The ID of the login flow.
|
||||
* @param {FlowStateManager<any>} params.flowManager - The flow manager instance.
|
||||
*/
|
||||
function createOAuthEnd({ res, stepId, toolCall }) {
|
||||
return async function () {
|
||||
/** @type {{ id: string; delta: AgentToolCallDelta }} */
|
||||
const data = {
|
||||
id: stepId,
|
||||
delta: {
|
||||
type: StepTypes.TOOL_CALLS,
|
||||
tool_calls: [{ ...toolCall }],
|
||||
},
|
||||
};
|
||||
sendEvent(res, { event: GraphEvents.ON_RUN_STEP_DELTA, data });
|
||||
logger.debug('Sent OAuth login success to client');
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* @param {object} params
|
||||
* @param {string} params.userId - The ID of the user.
|
||||
* @param {string} params.serverName - The name of the server.
|
||||
* @param {string} params.toolName - The name of the tool.
|
||||
* @param {FlowStateManager<any>} params.flowManager - The flow manager instance.
|
||||
*/
|
||||
function createAbortHandler({ userId, serverName, toolName, flowManager }) {
|
||||
return function () {
|
||||
logger.info(`[MCP][User: ${userId}][${serverName}][${toolName}] Tool call aborted`);
|
||||
const flowId = MCPOAuthHandler.generateFlowId(userId, serverName);
|
||||
flowManager.failFlow(flowId, 'mcp_oauth', new Error('Tool call aborted'));
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a general tool for an entire action set.
|
||||
*
|
||||
* @param {Object} params - The parameters for loading action sets.
|
||||
* @param {ServerRequest} params.req - The Express request object, containing user/request info.
|
||||
* @param {ServerResponse} params.res - The Express response object for sending events.
|
||||
* @param {string} params.toolKey - The toolKey for the tool.
|
||||
* @param {import('@librechat/agents').Providers | EModelEndpoint} params.provider - The provider for the tool.
|
||||
* @param {string} params.model - The model for the tool.
|
||||
* @returns { Promise<typeof tool | { _call: (toolInput: Object | string) => unknown}> } An object with `_call` method to execute the tool input.
|
||||
*/
|
||||
async function createMCPTool({ req, toolKey, provider: _provider }) {
|
||||
const toolDefinition = req.app.locals.availableTools[toolKey]?.function;
|
||||
async function createMCPTool({ req, res, toolKey, provider: _provider }) {
|
||||
const availableTools = await getCachedTools({ includeGlobal: true });
|
||||
const toolDefinition = availableTools?.[toolKey]?.function;
|
||||
if (!toolDefinition) {
|
||||
logger.error(`Tool ${toolKey} not found in available tools`);
|
||||
return null;
|
||||
|
@ -51,10 +135,39 @@ async function createMCPTool({ req, toolKey, provider: _provider }) {
|
|||
/** @type {(toolArguments: Object | string, config?: GraphRunnableConfig) => Promise<unknown>} */
|
||||
const _call = async (toolArguments, config) => {
|
||||
const userId = config?.configurable?.user?.id || config?.configurable?.user_id;
|
||||
/** @type {ReturnType<typeof createAbortHandler>} */
|
||||
let abortHandler = null;
|
||||
/** @type {AbortSignal} */
|
||||
let derivedSignal = null;
|
||||
|
||||
try {
|
||||
const derivedSignal = config?.signal ? AbortSignal.any([config.signal]) : undefined;
|
||||
const flowsCache = getLogStores(CacheKeys.FLOWS);
|
||||
const flowManager = getFlowStateManager(flowsCache);
|
||||
derivedSignal = config?.signal ? AbortSignal.any([config.signal]) : undefined;
|
||||
const mcpManager = getMCPManager(userId);
|
||||
const provider = (config?.metadata?.provider || _provider)?.toLowerCase();
|
||||
|
||||
const { args: _args, stepId, ...toolCall } = config.toolCall ?? {};
|
||||
const loginFlowId = `${serverName}:oauth_login:${config.metadata.thread_id}:${config.metadata.run_id}`;
|
||||
const oauthStart = createOAuthStart({
|
||||
res,
|
||||
stepId,
|
||||
toolCall,
|
||||
loginFlowId,
|
||||
flowManager,
|
||||
signal: derivedSignal,
|
||||
});
|
||||
const oauthEnd = createOAuthEnd({
|
||||
res,
|
||||
stepId,
|
||||
toolCall,
|
||||
});
|
||||
|
||||
if (derivedSignal) {
|
||||
abortHandler = createAbortHandler({ userId, serverName, toolName, flowManager });
|
||||
derivedSignal.addEventListener('abort', abortHandler, { once: true });
|
||||
}
|
||||
|
||||
const result = await mcpManager.callTool({
|
||||
serverName,
|
||||
toolName,
|
||||
|
@ -64,6 +177,14 @@ async function createMCPTool({ req, toolKey, provider: _provider }) {
|
|||
signal: derivedSignal,
|
||||
user: config?.configurable?.user,
|
||||
},
|
||||
flowManager,
|
||||
tokenMethods: {
|
||||
findToken,
|
||||
createToken,
|
||||
updateToken,
|
||||
},
|
||||
oauthStart,
|
||||
oauthEnd,
|
||||
});
|
||||
|
||||
if (isAssistantsEndpoint(provider) && Array.isArray(result)) {
|
||||
|
@ -78,9 +199,28 @@ async function createMCPTool({ req, toolKey, provider: _provider }) {
|
|||
`[MCP][User: ${userId}][${serverName}] Error calling "${toolName}" MCP tool:`,
|
||||
error,
|
||||
);
|
||||
|
||||
/** OAuth error, provide a helpful message */
|
||||
const isOAuthError =
|
||||
error.message?.includes('401') ||
|
||||
error.message?.includes('OAuth') ||
|
||||
error.message?.includes('authentication') ||
|
||||
error.message?.includes('Non-200 status code (401)');
|
||||
|
||||
if (isOAuthError) {
|
||||
throw new Error(
|
||||
`OAuth authentication required for ${serverName}. Please check the server logs for the authentication URL.`,
|
||||
);
|
||||
}
|
||||
|
||||
throw new Error(
|
||||
`"${toolKey}" tool call failed${error?.message ? `: ${error?.message}` : '.'}`,
|
||||
);
|
||||
} finally {
|
||||
// Clean up abort handler to prevent memory leaks
|
||||
if (abortHandler && derivedSignal) {
|
||||
derivedSignal.removeEventListener('abort', abortHandler);
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
|
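To make the OAuth kick-off above concrete, here is a minimal sketch (illustrative names only) of the once-only behavior that createOAuthStart relies on: createFlowWithHandler runs its handler for the first caller of a given loginFlowId, while concurrent tool calls for the same login simply await the existing flow state.

// Sketch of the deduplicated login prompt used by createOAuthStart (not part of this diff).
const { sendEvent } = require('@librechat/api');
const { GraphEvents } = require('@librechat/agents');

async function sendLoginPromptOnce({ flowManager, loginFlowId, res, data, signal }) {
  return flowManager.createFlowWithHandler(
    loginFlowId,
    'oauth_login',
    async () => {
      // Only the first caller for this loginFlowId reaches here; later callers await the same flow.
      sendEvent(res, { event: GraphEvents.ON_RUN_STEP_DELTA, data });
      return true;
    },
    signal,
  );
}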
|
|
@@ -1,6 +1,6 @@
const { encrypt, decrypt } = require('~/server/utils/crypto');
const { logger } = require('@librechat/data-schemas');
const { encrypt, decrypt } = require('@librechat/api');
const { PluginAuth } = require('~/db/models');
const { logger } = require('~/config');

/**
 * Asynchronously retrieves and decrypts the authentication value for a user's plugin, based on a specified authentication field.
@ -1,195 +0,0 @@
|
|||
const axios = require('axios');
|
||||
const { logAxiosError } = require('@librechat/api');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { TokenExchangeMethodEnum } = require('librechat-data-provider');
|
||||
const { handleOAuthToken } = require('~/models/Token');
|
||||
const { decryptV2 } = require('~/server/utils/crypto');
|
||||
|
||||
/**
|
||||
* Processes the access tokens and stores them in the database.
|
||||
* @param {object} tokenData
|
||||
* @param {string} tokenData.access_token
|
||||
* @param {number} tokenData.expires_in
|
||||
* @param {string} [tokenData.refresh_token]
|
||||
* @param {number} [tokenData.refresh_token_expires_in]
|
||||
* @param {object} metadata
|
||||
* @param {string} metadata.userId
|
||||
* @param {string} metadata.identifier
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async function processAccessTokens(tokenData, { userId, identifier }) {
|
||||
const { access_token, expires_in = 3600, refresh_token, refresh_token_expires_in } = tokenData;
|
||||
if (!access_token) {
|
||||
logger.error('Access token not found: ', tokenData);
|
||||
throw new Error('Access token not found');
|
||||
}
|
||||
await handleOAuthToken({
|
||||
identifier,
|
||||
token: access_token,
|
||||
expiresIn: expires_in,
|
||||
userId,
|
||||
});
|
||||
|
||||
if (refresh_token != null) {
|
||||
logger.debug('Processing refresh token');
|
||||
await handleOAuthToken({
|
||||
token: refresh_token,
|
||||
type: 'oauth_refresh',
|
||||
userId,
|
||||
identifier: `${identifier}:refresh`,
|
||||
expiresIn: refresh_token_expires_in ?? null,
|
||||
});
|
||||
}
|
||||
logger.debug('Access tokens processed');
|
||||
}
|
||||
|
||||
/**
|
||||
* Refreshes the access token using the refresh token.
|
||||
* @param {object} fields
|
||||
* @param {string} fields.userId - The ID of the user.
|
||||
* @param {string} fields.client_url - The URL of the OAuth provider.
|
||||
* @param {string} fields.identifier - The identifier for the token.
|
||||
* @param {string} fields.refresh_token - The refresh token to use.
|
||||
* @param {string} fields.token_exchange_method - The token exchange method ('default_post' or 'basic_auth_header').
|
||||
* @param {string} fields.encrypted_oauth_client_id - The client ID for the OAuth provider.
|
||||
* @param {string} fields.encrypted_oauth_client_secret - The client secret for the OAuth provider.
|
||||
* @returns {Promise<{
|
||||
* access_token: string,
|
||||
* expires_in: number,
|
||||
* refresh_token?: string,
|
||||
* refresh_token_expires_in?: number,
|
||||
* }>}
|
||||
*/
|
||||
const refreshAccessToken = async ({
|
||||
userId,
|
||||
client_url,
|
||||
identifier,
|
||||
refresh_token,
|
||||
token_exchange_method,
|
||||
encrypted_oauth_client_id,
|
||||
encrypted_oauth_client_secret,
|
||||
}) => {
|
||||
try {
|
||||
const oauth_client_id = await decryptV2(encrypted_oauth_client_id);
|
||||
const oauth_client_secret = await decryptV2(encrypted_oauth_client_secret);
|
||||
|
||||
const headers = {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
Accept: 'application/json',
|
||||
};
|
||||
|
||||
const params = new URLSearchParams({
|
||||
grant_type: 'refresh_token',
|
||||
refresh_token,
|
||||
});
|
||||
|
||||
if (token_exchange_method === TokenExchangeMethodEnum.BasicAuthHeader) {
|
||||
const basicAuth = Buffer.from(`${oauth_client_id}:${oauth_client_secret}`).toString('base64');
|
||||
headers['Authorization'] = `Basic ${basicAuth}`;
|
||||
} else {
|
||||
params.append('client_id', oauth_client_id);
|
||||
params.append('client_secret', oauth_client_secret);
|
||||
}
|
||||
|
||||
const response = await axios({
|
||||
method: 'POST',
|
||||
url: client_url,
|
||||
headers,
|
||||
data: params.toString(),
|
||||
});
|
||||
await processAccessTokens(response.data, {
|
||||
userId,
|
||||
identifier,
|
||||
});
|
||||
logger.debug(`Access token refreshed successfully for ${identifier}`);
|
||||
return response.data;
|
||||
} catch (error) {
|
||||
const message = 'Error refreshing OAuth tokens';
|
||||
throw new Error(
|
||||
logAxiosError({
|
||||
message,
|
||||
error,
|
||||
}),
|
||||
);
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Handles the OAuth callback and exchanges the authorization code for tokens.
|
||||
* @param {object} fields
|
||||
* @param {string} fields.code - The authorization code returned by the provider.
|
||||
* @param {string} fields.userId - The ID of the user.
|
||||
* @param {string} fields.identifier - The identifier for the token.
|
||||
* @param {string} fields.client_url - The URL of the OAuth provider.
|
||||
* @param {string} fields.redirect_uri - The redirect URI for the OAuth provider.
|
||||
* @param {string} fields.token_exchange_method - The token exchange method ('default_post' or 'basic_auth_header').
|
||||
* @param {string} fields.encrypted_oauth_client_id - The client ID for the OAuth provider.
|
||||
* @param {string} fields.encrypted_oauth_client_secret - The client secret for the OAuth provider.
|
||||
* @returns {Promise<{
|
||||
* access_token: string,
|
||||
* expires_in: number,
|
||||
* refresh_token?: string,
|
||||
* refresh_token_expires_in?: number,
|
||||
* }>}
|
||||
*/
|
||||
const getAccessToken = async ({
|
||||
code,
|
||||
userId,
|
||||
identifier,
|
||||
client_url,
|
||||
redirect_uri,
|
||||
token_exchange_method,
|
||||
encrypted_oauth_client_id,
|
||||
encrypted_oauth_client_secret,
|
||||
}) => {
|
||||
const oauth_client_id = await decryptV2(encrypted_oauth_client_id);
|
||||
const oauth_client_secret = await decryptV2(encrypted_oauth_client_secret);
|
||||
|
||||
const headers = {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
Accept: 'application/json',
|
||||
};
|
||||
|
||||
const params = new URLSearchParams({
|
||||
code,
|
||||
grant_type: 'authorization_code',
|
||||
redirect_uri,
|
||||
});
|
||||
|
||||
if (token_exchange_method === TokenExchangeMethodEnum.BasicAuthHeader) {
|
||||
const basicAuth = Buffer.from(`${oauth_client_id}:${oauth_client_secret}`).toString('base64');
|
||||
headers['Authorization'] = `Basic ${basicAuth}`;
|
||||
} else {
|
||||
params.append('client_id', oauth_client_id);
|
||||
params.append('client_secret', oauth_client_secret);
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await axios({
|
||||
method: 'POST',
|
||||
url: client_url,
|
||||
headers,
|
||||
data: params.toString(),
|
||||
});
|
||||
|
||||
await processAccessTokens(response.data, {
|
||||
userId,
|
||||
identifier,
|
||||
});
|
||||
logger.debug(`Access tokens successfully created for ${identifier}`);
|
||||
return response.data;
|
||||
} catch (error) {
|
||||
const message = 'Error exchanging OAuth code';
|
||||
throw new Error(
|
||||
logAxiosError({
|
||||
message,
|
||||
error,
|
||||
}),
|
||||
);
|
||||
}
|
||||
};
|
||||
|
||||
module.exports = {
|
||||
getAccessToken,
|
||||
refreshAccessToken,
|
||||
};
|
|
@ -1,5 +1,7 @@
|
|||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const { sleep } = require('@librechat/agents');
|
||||
const { logger } = require('@librechat/data-schemas');
|
||||
const { zodToJsonSchema } = require('zod-to-json-schema');
|
||||
const { Calculator } = require('@langchain/community/tools/calculator');
|
||||
const { tool: toolFn, Tool, DynamicStructuredTool } = require('@langchain/core/tools');
|
||||
|
@ -31,14 +33,12 @@ const {
|
|||
toolkits,
|
||||
} = require('~/app/clients/tools');
|
||||
const { processFileURL, uploadImageBuffer } = require('~/server/services/Files/process');
|
||||
const { getEndpointsConfig, getCachedTools } = require('~/server/services/Config');
|
||||
const { createOnSearchResults } = require('~/server/services/Tools/search');
|
||||
const { isActionDomainAllowed } = require('~/server/services/domains');
|
||||
const { getEndpointsConfig } = require('~/server/services/Config');
|
||||
const { recordUsage } = require('~/server/services/Threads');
|
||||
const { loadTools } = require('~/app/clients/tools/util');
|
||||
const { redactMessage } = require('~/config/parsers');
|
||||
const { sleep } = require('~/server/utils');
|
||||
const { logger } = require('~/config');
|
||||
|
||||
/**
|
||||
* @param {string} toolName
|
||||
|
@ -226,7 +226,7 @@ async function processRequiredActions(client, requiredActions) {
|
|||
`[required actions] user: ${client.req.user.id} | thread_id: ${requiredActions[0].thread_id} | run_id: ${requiredActions[0].run_id}`,
|
||||
requiredActions,
|
||||
);
|
||||
const toolDefinitions = client.req.app.locals.availableTools;
|
||||
const toolDefinitions = await getCachedTools({ includeGlobal: true });
|
||||
const seenToolkits = new Set();
|
||||
const tools = requiredActions
|
||||
.map((action) => {
|
||||
|
@ -553,6 +553,7 @@ async function loadAgentTools({ req, res, agent, tool_resources, openAIApiKey })
|
|||
tools: _agentTools,
|
||||
options: {
|
||||
req,
|
||||
res,
|
||||
openAIApiKey,
|
||||
tool_resources,
|
||||
processFileURL,
|
||||
|
|
|
@ -1,6 +1,6 @@
|
|||
const { logger } = require('@librechat/data-schemas');
|
||||
const { encrypt, decrypt } = require('@librechat/api');
|
||||
const { ErrorTypes } = require('librechat-data-provider');
|
||||
const { encrypt, decrypt } = require('~/server/utils/crypto');
|
||||
const { updateUser } = require('~/models');
|
||||
const { Key } = require('~/db/models');
|
||||
|
||||
|
@ -70,6 +70,7 @@ const getUserKeyValues = async ({ userId, name }) => {
|
|||
try {
|
||||
userValues = JSON.parse(userValues);
|
||||
} catch (e) {
|
||||
logger.error('[getUserKeyValues]', e);
|
||||
throw new Error(
|
||||
JSON.stringify({
|
||||
type: ErrorTypes.INVALID_USER_KEY,
|
||||
|
|
54  api/server/services/initializeMCP.js  (new file)
@@ -0,0 +1,54 @@
const { logger } = require('@librechat/data-schemas');
const { CacheKeys, processMCPEnv } = require('librechat-data-provider');
const { getMCPManager, getFlowStateManager } = require('~/config');
const { getCachedTools, setCachedTools } = require('./Config');
const { getLogStores } = require('~/cache');
const { findToken, updateToken, createToken, deleteTokens } = require('~/models');

/**
 * Initialize MCP servers
 * @param {import('express').Application} app - Express app instance
 */
async function initializeMCP(app) {
  const mcpServers = app.locals.mcpConfig;
  if (!mcpServers) {
    return;
  }

  logger.info('Initializing MCP servers...');
  const mcpManager = getMCPManager();
  const flowsCache = getLogStores(CacheKeys.FLOWS);
  const flowManager = flowsCache ? getFlowStateManager(flowsCache) : null;

  try {
    await mcpManager.initializeMCP({
      mcpServers,
      flowManager,
      tokenMethods: {
        findToken,
        updateToken,
        createToken,
        deleteTokens,
      },
      processMCPEnv,
    });

    delete app.locals.mcpConfig;
    const availableTools = await getCachedTools();

    if (!availableTools) {
      logger.warn('No available tools found in cache during MCP initialization');
      return;
    }

    const toolsCopy = { ...availableTools };
    await mcpManager.mapAvailableTools(toolsCopy, flowManager);
    await setCachedTools(toolsCopy, { isGlobal: true });

    logger.info('MCP servers initialized successfully');
  } catch (error) {
    logger.error('Failed to initialize MCP servers:', error);
  }
}

module.exports = initializeMCP;
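A minimal startup sketch for the module above (hypothetical wiring, assuming the caller has already placed the parsed mcpServers block on app.locals.mcpConfig, as the function expects):

// Hypothetical wiring sketch; the real caller would be the app startup service.
const express = require('express');
const initializeMCP = require('~/server/services/initializeMCP');

async function startServer(mcpServersConfig) {
  const app = express();
  app.locals.mcpConfig = mcpServersConfig;
  // Connects configured MCP servers, maps their tools, and refreshes the global tool cache.
  await initializeMCP(app);
  return app;
}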
@@ -1,5 +1,5 @@
const { webcrypto } = require('node:crypto');
const { hashBackupCode, decryptV3, decryptV2 } = require('~/server/utils/crypto');
const { hashBackupCode, decryptV3, decryptV2 } = require('@librechat/api');
const { updateUser } = require('~/models');

// Base32 alphabet for TOTP secret encoding.
@@ -3,7 +3,6 @@ const removePorts = require('./removePorts');
const countTokens = require('./countTokens');
const handleText = require('./handleText');
const sendEmail = require('./sendEmail');
const cryptoUtils = require('./crypto');
const queue = require('./queue');
const files = require('./files');
const math = require('./math');

@@ -31,7 +30,6 @@ function checkEmailConfig() {
module.exports = {
  ...streamResponse,
  checkEmailConfig,
  ...cryptoUtils,
  ...handleText,
  countTokens,
  removePorts,
@ -21,19 +21,18 @@ jest.mock('~/models', () => ({
|
|||
createUser: jest.fn(),
|
||||
updateUser: jest.fn(),
|
||||
}));
|
||||
jest.mock('~/server/utils/crypto', () => ({
|
||||
hashToken: jest.fn().mockResolvedValue('hashed-token'),
|
||||
}));
|
||||
jest.mock('~/server/utils', () => ({
|
||||
jest.mock('@librechat/api', () => ({
|
||||
...jest.requireActual('@librechat/api'),
|
||||
isEnabled: jest.fn(() => false),
|
||||
}));
|
||||
jest.mock('~/config', () => ({
|
||||
jest.mock('@librechat/data-schemas', () => ({
|
||||
...jest.requireActual('@librechat/api'),
|
||||
logger: {
|
||||
info: jest.fn(),
|
||||
debug: jest.fn(),
|
||||
error: jest.fn(),
|
||||
warn: jest.fn(),
|
||||
},
|
||||
hashToken: jest.fn().mockResolvedValue('hashed-token'),
|
||||
}));
|
||||
jest.mock('~/cache/getLogStores', () =>
|
||||
jest.fn(() => ({
|
||||
|
|
|
@ -1,15 +1,17 @@
|
|||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const fetch = require('node-fetch');
|
||||
const { Strategy: SamlStrategy } = require('@node-saml/passport-saml');
|
||||
const { findUser, createUser, updateUser } = require('~/models');
|
||||
const { setupSaml, getCertificateContent } = require('./samlStrategy');
|
||||
|
||||
// --- Mocks ---
|
||||
jest.mock('tiktoken');
|
||||
jest.mock('fs');
|
||||
jest.mock('path');
|
||||
jest.mock('node-fetch');
|
||||
jest.mock('@node-saml/passport-saml');
|
||||
jest.mock('@librechat/data-schemas', () => ({
|
||||
logger: {
|
||||
info: jest.fn(),
|
||||
debug: jest.fn(),
|
||||
error: jest.fn(),
|
||||
},
|
||||
hashToken: jest.fn().mockResolvedValue('hashed-token'),
|
||||
}));
|
||||
jest.mock('~/models', () => ({
|
||||
findUser: jest.fn(),
|
||||
createUser: jest.fn(),
|
||||
|
@ -29,26 +31,26 @@ jest.mock('~/server/services/Config', () => ({
|
|||
jest.mock('~/server/services/Config/EndpointService', () => ({
|
||||
config: {},
|
||||
}));
|
||||
jest.mock('~/server/utils', () => ({
|
||||
isEnabled: jest.fn(() => false),
|
||||
isUserProvided: jest.fn(() => false),
|
||||
}));
|
||||
jest.mock('~/server/services/Files/strategies', () => ({
|
||||
getStrategyFunctions: jest.fn(() => ({
|
||||
saveBuffer: jest.fn().mockResolvedValue('/fake/path/to/avatar.png'),
|
||||
})),
|
||||
}));
|
||||
jest.mock('~/server/utils/crypto', () => ({
|
||||
hashToken: jest.fn().mockResolvedValue('hashed-token'),
|
||||
}));
|
||||
jest.mock('~/config', () => ({
|
||||
logger: {
|
||||
info: jest.fn(),
|
||||
debug: jest.fn(),
|
||||
error: jest.fn(),
|
||||
},
|
||||
jest.mock('~/config/paths', () => ({
|
||||
root: '/fake/root/path',
|
||||
}));
|
||||
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const fetch = require('node-fetch');
|
||||
const { Strategy: SamlStrategy } = require('@node-saml/passport-saml');
|
||||
const { setupSaml, getCertificateContent } = require('./samlStrategy');
|
||||
|
||||
// Configure fs mock
|
||||
jest.mocked(fs).existsSync = jest.fn();
|
||||
jest.mocked(fs).statSync = jest.fn();
|
||||
jest.mocked(fs).readFileSync = jest.fn();
|
||||
|
||||
// To capture the verify callback from the strategy, we grab it from the mock constructor
|
||||
let verifyCallback;
|
||||
SamlStrategy.mockImplementation((options, verify) => {
|
||||
|
|
|
@@ -476,11 +476,18 @@
 * @memberof typedefs
 */

/**
 * @exports ToolCallChunk
 * @typedef {import('librechat-data-provider').Agents.ToolCallChunk} ToolCallChunk
 * @memberof typedefs
 */

/**
 * @exports MessageContentImageUrl
 * @typedef {import('librechat-data-provider').Agents.MessageContentImageUrl} MessageContentImageUrl
 * @memberof typedefs
 */

/** Web Search */

/**
72  client/src/components/OAuth/OAuthError.tsx  (new file)
@ -0,0 +1,72 @@
|
|||
import React from 'react';
|
||||
import { useSearchParams } from 'react-router-dom';
|
||||
import { useLocalize } from '~/hooks';
|
||||
|
||||
export default function OAuthError() {
|
||||
const localize = useLocalize();
|
||||
const [searchParams] = useSearchParams();
|
||||
const error = searchParams.get('error') || 'unknown_error';
|
||||
|
||||
const getErrorMessage = (error: string): string => {
|
||||
switch (error) {
|
||||
case 'missing_code':
|
||||
return (
|
||||
localize('com_ui_oauth_error_missing_code') ||
|
||||
'Authorization code is missing. Please try again.'
|
||||
);
|
||||
case 'missing_state':
|
||||
return (
|
||||
localize('com_ui_oauth_error_missing_state') ||
|
||||
'State parameter is missing. Please try again.'
|
||||
);
|
||||
case 'invalid_state':
|
||||
return (
|
||||
localize('com_ui_oauth_error_invalid_state') ||
|
||||
'Invalid state parameter. Please try again.'
|
||||
);
|
||||
case 'callback_failed':
|
||||
return (
|
||||
localize('com_ui_oauth_error_callback_failed') ||
|
||||
'Authentication callback failed. Please try again.'
|
||||
);
|
||||
default:
|
||||
return localize('com_ui_oauth_error_generic') || error.replace(/_/g, ' ');
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<div className="flex min-h-screen items-center justify-center bg-gray-50 p-8">
|
||||
<div className="w-full max-w-md rounded-lg bg-white p-8 text-center shadow-lg">
|
||||
<div className="mb-4 flex justify-center">
|
||||
<div className="flex h-12 w-12 items-center justify-center rounded-full bg-red-100">
|
||||
<svg
|
||||
className="h-6 w-6 text-red-600"
|
||||
fill="none"
|
||||
viewBox="0 0 24 24"
|
||||
stroke="currentColor"
|
||||
aria-hidden="true"
|
||||
>
|
||||
<path
|
||||
strokeLinecap="round"
|
||||
strokeLinejoin="round"
|
||||
strokeWidth={2}
|
||||
d="M6 18L18 6M6 6l12 12"
|
||||
/>
|
||||
</svg>
|
||||
</div>
|
||||
</div>
|
||||
<h1 className="mb-4 text-3xl font-bold text-gray-900">
|
||||
{localize('com_ui_oauth_error_title') || 'Authentication Failed'}
|
||||
</h1>
|
||||
<p className="mb-6 text-sm text-gray-600">{getErrorMessage(error)}</p>
|
||||
<button
|
||||
onClick={() => window.close()}
|
||||
className="rounded-md bg-gray-900 px-4 py-2 text-sm font-medium text-white hover:bg-gray-800 focus:outline-none focus:ring-2 focus:ring-gray-500 focus:ring-offset-2"
|
||||
aria-label={localize('com_ui_close_window') || 'Close Window'}
|
||||
>
|
||||
{localize('com_ui_close_window') || 'Close Window'}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
47  client/src/components/OAuth/OAuthSuccess.tsx  (new file)
@ -0,0 +1,47 @@
|
|||
import React, { useEffect, useState } from 'react';
|
||||
import { useSearchParams } from 'react-router-dom';
|
||||
import { useLocalize } from '~/hooks';
|
||||
|
||||
export default function OAuthSuccess() {
|
||||
const localize = useLocalize();
|
||||
const [searchParams] = useSearchParams();
|
||||
const [secondsLeft, setSecondsLeft] = useState(3);
|
||||
const serverName = searchParams.get('serverName');
|
||||
|
||||
useEffect(() => {
|
||||
const countdown = setInterval(() => {
|
||||
setSecondsLeft((prev) => {
|
||||
if (prev <= 1) {
|
||||
clearInterval(countdown);
|
||||
window.close();
|
||||
return 0;
|
||||
}
|
||||
return prev - 1;
|
||||
});
|
||||
}, 1000);
|
||||
|
||||
return () => clearInterval(countdown);
|
||||
}, []);
|
||||
|
||||
return (
|
||||
<div className="flex min-h-screen items-center justify-center bg-gray-50 p-8">
|
||||
<div className="w-full max-w-md rounded-lg bg-white p-8 text-center shadow-lg">
|
||||
<h1 className="mb-4 text-3xl font-bold text-gray-900">
|
||||
{localize('com_ui_oauth_success_title') || 'Authentication Successful'}
|
||||
</h1>
|
||||
<p className="mb-2 text-sm text-gray-600">
|
||||
{localize('com_ui_oauth_success_description') ||
|
||||
'Your authentication was successful. This window will close in'}{' '}
|
||||
<span className="font-medium text-indigo-500">{secondsLeft}</span>{' '}
|
||||
{localize('com_ui_seconds') || 'seconds'}.
|
||||
</p>
|
||||
{serverName && (
|
||||
<p className="mt-4 text-xs text-gray-500">
|
||||
{localize('com_ui_oauth_connected_to') || 'Connected to'}:{' '}
|
||||
<span className="font-medium">{serverName}</span>
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
2  client/src/components/OAuth/index.ts  (new file)

@@ -0,0 +1,2 @@
export { default as OAuthSuccess } from './OAuthSuccess';
export { default as OAuthError } from './OAuthError';
|
@@ -833,6 +833,17 @@
  "com_ui_not_used": "Not Used",
  "com_ui_nothing_found": "Nothing found",
  "com_ui_oauth": "OAuth",
  "com_ui_oauth_success_title": "Authentication Successful",
  "com_ui_oauth_success_description": "Your authentication was successful. This window will close in",
  "com_ui_oauth_connected_to": "Connected to",
  "com_ui_oauth_error_title": "Authentication Failed",
  "com_ui_oauth_error_missing_code": "Authorization code is missing. Please try again.",
  "com_ui_oauth_error_missing_state": "State parameter is missing. Please try again.",
  "com_ui_oauth_error_invalid_state": "Invalid state parameter. Please try again.",
  "com_ui_oauth_error_callback_failed": "Authentication callback failed. Please try again.",
  "com_ui_oauth_error_generic": "Authentication failed. Please try again.",
  "com_ui_close_window": "Close Window",
  "com_ui_seconds": "seconds",
  "com_ui_of": "of",
  "com_ui_off": "Off",
  "com_ui_on": "On",
@ -1006,7 +1017,7 @@
|
|||
"com_ui_zoom": "Zoom",
|
||||
"com_user_message": "You",
|
||||
"com_warning_resubmit_unsupported": "Resubmitting the AI message is not supported for this endpoint.",
|
||||
"com_ui_add_mcp": "Add MCP",
|
||||
"com_ui_add_mcp": "Add MCP",
|
||||
"com_ui_add_mcp_server": "Add MCP Server",
|
||||
"com_ui_edit_mcp_server": "Edit MCP Server",
|
||||
"com_agents_mcps_disabled": "You need to create an agent before adding MCPs.",
|
||||
|
@ -1028,4 +1039,4 @@
|
|||
"com_agents_mcp_trust_subtext": "Custom connectors are not verified by LibreChat",
|
||||
"com_ui_icon": "Icon",
|
||||
"com_agents_mcp_icon_size": "Minimum size 128 x 128 px"
|
||||
}
||||
|
|
|
@ -1,13 +1,14 @@
|
|||
import { createBrowserRouter, Navigate, Outlet } from 'react-router-dom';
|
||||
import {
|
||||
Login,
|
||||
Registration,
|
||||
RequestPasswordReset,
|
||||
ResetPassword,
|
||||
VerifyEmail,
|
||||
Registration,
|
||||
ResetPassword,
|
||||
ApiErrorWatcher,
|
||||
TwoFactorScreen,
|
||||
RequestPasswordReset,
|
||||
} from '~/components/Auth';
|
||||
import { OAuthSuccess, OAuthError } from '~/components/OAuth';
|
||||
import { AuthContextProvider } from '~/hooks/AuthContext';
|
||||
import RouteErrorBoundary from './RouteErrorBoundary';
|
||||
import StartupLayout from './Layouts/Startup';
|
||||
|
@ -31,6 +32,20 @@ export const router = createBrowserRouter([
|
|||
element: <ShareRoute />,
|
||||
errorElement: <RouteErrorBoundary />,
|
||||
},
|
||||
{
|
||||
path: 'oauth',
|
||||
errorElement: <RouteErrorBoundary />,
|
||||
children: [
|
||||
{
|
||||
path: 'success',
|
||||
element: <OAuthSuccess />,
|
||||
},
|
||||
{
|
||||
path: 'error',
|
||||
element: <OAuthError />,
|
||||
},
|
||||
],
|
||||
},
|
||||
{
|
||||
path: '/',
|
||||
element: <StartupLayout />,
|
||||
|
|
87  package-lock.json  (generated)
@ -20056,15 +20056,16 @@
|
|||
}
|
||||
},
|
||||
"node_modules/@modelcontextprotocol/sdk": {
|
||||
"version": "1.11.2",
|
||||
"resolved": "https://registry.npmjs.org/@modelcontextprotocol/sdk/-/sdk-1.11.2.tgz",
|
||||
"integrity": "sha512-H9vwztj5OAqHg9GockCQC06k1natgcxWQSRpQcPJf6i5+MWBzfKkRtxGbjQf0X2ihii0ffLZCRGbYV2f2bjNCQ==",
|
||||
"version": "1.12.3",
|
||||
"resolved": "https://registry.npmjs.org/@modelcontextprotocol/sdk/-/sdk-1.12.3.tgz",
|
||||
"integrity": "sha512-DyVYSOafBvk3/j1Oka4z5BWT8o4AFmoNyZY9pALOm7Lh3GZglR71Co4r4dEUoqDWdDazIZQHBe7J2Nwkg6gHgQ==",
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"ajv": "^6.12.6",
|
||||
"content-type": "^1.0.5",
|
||||
"cors": "^2.8.5",
|
||||
"cross-spawn": "^7.0.3",
|
||||
"cross-spawn": "^7.0.5",
|
||||
"eventsource": "^3.0.2",
|
||||
"express": "^5.0.1",
|
||||
"express-rate-limit": "^7.5.0",
|
||||
|
@ -20091,6 +20092,23 @@
|
|||
"node": ">= 0.6"
|
||||
}
|
||||
},
|
||||
"node_modules/@modelcontextprotocol/sdk/node_modules/ajv": {
|
||||
"version": "6.12.6",
|
||||
"resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz",
|
||||
"integrity": "sha512-j3fVLgvTo527anyYyJOGTYJbG+vnnQYvE0m5mmkc1TK+nxAppkCLMIL0aZ4dblVCNoGShhm+kzE4ZUykBoMg4g==",
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"fast-deep-equal": "^3.1.1",
|
||||
"fast-json-stable-stringify": "^2.0.0",
|
||||
"json-schema-traverse": "^0.4.1",
|
||||
"uri-js": "^4.2.2"
|
||||
},
|
||||
"funding": {
|
||||
"type": "github",
|
||||
"url": "https://github.com/sponsors/epoberezkin"
|
||||
}
|
||||
},
|
||||
"node_modules/@modelcontextprotocol/sdk/node_modules/body-parser": {
|
||||
"version": "2.2.0",
|
||||
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-2.2.0.tgz",
|
||||
|
@ -20219,6 +20237,13 @@
|
|||
"node": ">=0.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@modelcontextprotocol/sdk/node_modules/json-schema-traverse": {
|
||||
"version": "0.4.1",
|
||||
"resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz",
|
||||
"integrity": "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==",
|
||||
"license": "MIT",
|
||||
"peer": true
|
||||
},
|
||||
"node_modules/@modelcontextprotocol/sdk/node_modules/media-typer": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-1.1.0.tgz",
|
||||
|
@ -30602,8 +30627,7 @@
|
|||
"node_modules/fast-json-stable-stringify": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz",
|
||||
"integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw==",
|
||||
"dev": true
|
||||
"integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw=="
|
||||
},
|
||||
"node_modules/fast-levenshtein": {
|
||||
"version": "2.0.6",
|
||||
|
@ -37599,9 +37623,9 @@
|
|||
"integrity": "sha512-1orQ9MT1vHFGQxhuy7E/0gECD3fd2fCC+PIX+/jgmU/gI3EpRocXtmtvxCO5x3WZ443FLTLFWNDjl5MPJf9u+Q=="
|
||||
},
|
||||
"node_modules/oauth4webapi": {
|
||||
"version": "3.5.1",
|
||||
"resolved": "https://registry.npmjs.org/oauth4webapi/-/oauth4webapi-3.5.1.tgz",
|
||||
"integrity": "sha512-txg/jZQwcbaF7PMJgY7aoxc9QuCxHVFMiEkDIJ60DwDz3PbtXPQnrzo+3X4IRYGChIwWLabRBRpf1k9hO9+xrQ==",
|
||||
"version": "3.5.2",
|
||||
"resolved": "https://registry.npmjs.org/oauth4webapi/-/oauth4webapi-3.5.2.tgz",
|
||||
"integrity": "sha512-VYz5BaP3izIrUc1GAVzIoz4JnljiW0YAUFObMBwsqDnfHxz2sjLu3W7/8vE8Ms9IbMewN9+1kcvhY3tMscAeGQ==",
|
||||
"license": "MIT",
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/panva"
|
||||
|
@ -44419,7 +44443,6 @@
|
|||
"version": "4.4.1",
|
||||
"resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz",
|
||||
"integrity": "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==",
|
||||
"dev": true,
|
||||
"license": "BSD-2-Clause",
|
||||
"dependencies": {
|
||||
"punycode": "^2.1.0"
|
||||
|
@ -46150,7 +46173,7 @@
|
|||
},
|
||||
"packages/api": {
|
||||
"name": "@librechat/api",
|
||||
"version": "1.2.3",
|
||||
"version": "1.2.4",
|
||||
"license": "ISC",
|
||||
"devDependencies": {
|
||||
"@babel/preset-env": "^7.21.5",
|
||||
|
@ -46184,7 +46207,7 @@
|
|||
"peerDependencies": {
|
||||
"@librechat/agents": "^2.4.37",
|
||||
"@librechat/data-schemas": "*",
|
||||
"@modelcontextprotocol/sdk": "^1.11.2",
|
||||
"@modelcontextprotocol/sdk": "^1.12.3",
|
||||
"axios": "^1.8.2",
|
||||
"diff": "^7.0.0",
|
||||
"eventsource": "^3.0.2",
|
||||
|
@ -46270,46 +46293,6 @@
|
|||
"url": "https://github.com/sponsors/isaacs"
|
||||
}
|
||||
},
|
||||
"packages/auth": {
|
||||
"name": "@librechat/auth",
|
||||
"version": "0.0.1",
|
||||
"extraneous": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"https-proxy-agent": "^7.0.6",
|
||||
"jsonwebtoken": "^9.0.2",
|
||||
"mongoose": "^8.12.1",
|
||||
"openid-client": "^6.5.0",
|
||||
"passport": "^0.7.0",
|
||||
"passport-facebook": "^3.0.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@librechat/data-schemas": "^0.0.9",
|
||||
"@rollup/plugin-alias": "^5.1.0",
|
||||
"@rollup/plugin-commonjs": "^25.0.2",
|
||||
"@rollup/plugin-json": "^6.1.0",
|
||||
"@rollup/plugin-node-resolve": "^15.1.0",
|
||||
"@rollup/plugin-replace": "^5.0.5",
|
||||
"@rollup/plugin-terser": "^0.4.4",
|
||||
"@rollup/plugin-typescript": "^12.1.2",
|
||||
"@types/diff": "^6.0.0",
|
||||
"@types/express": "^5.0.0",
|
||||
"@types/jest": "^29.5.2",
|
||||
"@types/node": "^20.3.0",
|
||||
"jest": "^29.5.0",
|
||||
"jest-junit": "^16.0.0",
|
||||
"rimraf": "^5.0.1",
|
||||
"rollup": "^4.22.4",
|
||||
"rollup-plugin-generate-package-json": "^3.2.0",
|
||||
"rollup-plugin-peer-deps-external": "^2.2.4",
|
||||
"rollup-plugin-typescript2": "^0.35.0",
|
||||
"ts-node": "^10.9.2",
|
||||
"typescript": "^5.0.4"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"keyv": "^5.3.2"
|
||||
}
|
||||
},
|
||||
"packages/data-provider": {
|
||||
"name": "librechat-data-provider",
|
||||
"version": "0.7.87",
|
||||
|
|
|
@ -64,6 +64,7 @@
|
|||
"b:data": "cd packages/data-provider && bun run b:build",
|
||||
"b:mcp": "cd packages/api && bun run b:build",
|
||||
"b:data-schemas": "cd packages/data-schemas && bun run b:build",
|
||||
"b:build:api": "cd packages/api && bun run b:build",
|
||||
"b:client": "bun --bun run b:data && bun --bun run b:mcp && bun --bun run b:data-schemas && cd client && bun --bun run b:build",
|
||||
"b:client:dev": "cd client && bun run b:dev",
|
||||
"b:test:client": "cd client && bun run b:test",
|
||||
|
|
|
@ -1,6 +1,6 @@
|
|||
{
|
||||
"name": "@librechat/api",
|
||||
"version": "1.2.3",
|
||||
"version": "1.2.4",
|
||||
"type": "commonjs",
|
||||
"description": "MCP services for LibreChat",
|
||||
"main": "dist/index.js",
|
||||
|
@ -71,7 +71,7 @@
|
|||
"peerDependencies": {
|
||||
"@librechat/agents": "^2.4.37",
|
||||
"@librechat/data-schemas": "*",
|
||||
"@modelcontextprotocol/sdk": "^1.11.2",
|
||||
"@modelcontextprotocol/sdk": "^1.12.3",
|
||||
"axios": "^1.8.2",
|
||||
"diff": "^7.0.0",
|
||||
"eventsource": "^3.0.2",
|
||||
|
|
|
@ -36,10 +36,11 @@ const plugins = [
|
|||
const cjsBuild = {
|
||||
input: 'src/index.ts',
|
||||
output: {
|
||||
file: pkg.main,
|
||||
dir: 'dist',
|
||||
format: 'cjs',
|
||||
sourcemap: true,
|
||||
exports: 'named',
|
||||
entryFileNames: '[name].js',
|
||||
},
|
||||
external: [...Object.keys(pkg.dependencies || {}), ...Object.keys(pkg.devDependencies || {})],
|
||||
preserveSymlinks: true,
|
||||
|
|
|
@ -1,15 +1,15 @@
|
|||
require('dotenv').config();
|
||||
const crypto = require('node:crypto');
|
||||
import 'dotenv/config';
|
||||
import crypto from 'node:crypto';
|
||||
const { webcrypto } = crypto;
|
||||
|
||||
// Use hex decoding for both key and IV for legacy methods.
|
||||
const key = Buffer.from(process.env.CREDS_KEY, 'hex');
|
||||
const iv = Buffer.from(process.env.CREDS_IV, 'hex');
|
||||
const key = Buffer.from(process.env.CREDS_KEY ?? '', 'hex');
|
||||
const iv = Buffer.from(process.env.CREDS_IV ?? '', 'hex');
|
||||
const algorithm = 'AES-CBC';
|
||||
|
||||
// --- Legacy v1/v2 Setup: AES-CBC with fixed key and IV ---
|
||||
|
||||
async function encrypt(value) {
|
||||
export async function encrypt(value: string) {
|
||||
const cryptoKey = await webcrypto.subtle.importKey('raw', key, { name: algorithm }, false, [
|
||||
'encrypt',
|
||||
]);
|
||||
|
@ -23,7 +23,7 @@ async function encrypt(value) {
|
|||
return Buffer.from(encryptedBuffer).toString('hex');
|
||||
}
|
||||
|
||||
async function decrypt(encryptedValue) {
|
||||
export async function decrypt(encryptedValue: string) {
|
||||
const cryptoKey = await webcrypto.subtle.importKey('raw', key, { name: algorithm }, false, [
|
||||
'decrypt',
|
||||
]);
|
||||
|
@ -39,7 +39,7 @@ async function decrypt(encryptedValue) {
|
|||
|
||||
// --- v2: AES-CBC with a random IV per encryption ---
|
||||
|
||||
async function encryptV2(value) {
|
||||
export async function encryptV2(value: string) {
|
||||
const gen_iv = webcrypto.getRandomValues(new Uint8Array(16));
|
||||
const cryptoKey = await webcrypto.subtle.importKey('raw', key, { name: algorithm }, false, [
|
||||
'encrypt',
|
||||
|
@ -54,12 +54,12 @@ async function encryptV2(value) {
|
|||
return Buffer.from(gen_iv).toString('hex') + ':' + Buffer.from(encryptedBuffer).toString('hex');
|
||||
}
|
||||
|
||||
async function decryptV2(encryptedValue) {
|
||||
export async function decryptV2(encryptedValue: string) {
|
||||
const parts = encryptedValue.split(':');
|
||||
if (parts.length === 1) {
|
||||
return parts[0];
|
||||
}
|
||||
const gen_iv = Buffer.from(parts.shift(), 'hex');
|
||||
const gen_iv = Buffer.from(parts.shift() ?? '', 'hex');
|
||||
const encrypted = parts.join(':');
|
||||
const cryptoKey = await webcrypto.subtle.importKey('raw', key, { name: algorithm }, false, [
|
||||
'decrypt',
|
||||
|
@ -81,10 +81,10 @@ const algorithm_v3 = 'aes-256-ctr';
|
|||
* Encrypts a value using AES-256-CTR.
|
||||
* Note: AES-256 requires a 32-byte key. Ensure that process.env.CREDS_KEY is a 64-character hex string.
|
||||
*
|
||||
* @param {string} value - The plaintext to encrypt.
|
||||
* @returns {string} The encrypted string with a "v3:" prefix.
|
||||
* @param value - The plaintext to encrypt.
|
||||
* @returns The encrypted string with a "v3:" prefix.
|
||||
*/
|
||||
function encryptV3(value) {
|
||||
export function encryptV3(value: string) {
|
||||
if (key.length !== 32) {
|
||||
throw new Error(`Invalid key length: expected 32 bytes, got ${key.length} bytes`);
|
||||
}
|
||||
|
@ -94,7 +94,7 @@ function encryptV3(value) {
|
|||
return `v3:${iv_v3.toString('hex')}:${encrypted.toString('hex')}`;
|
||||
}
|
||||
|
||||
function decryptV3(encryptedValue) {
|
||||
export function decryptV3(encryptedValue: string) {
|
||||
const parts = encryptedValue.split(':');
|
||||
if (parts[0] !== 'v3') {
|
||||
throw new Error('Not a v3 encrypted value');
|
||||
|
@ -106,7 +106,7 @@ function decryptV3(encryptedValue) {
|
|||
return decrypted.toString('utf8');
|
||||
}
|
||||
|
||||
async function getRandomValues(length) {
|
||||
export async function getRandomValues(length: number) {
|
||||
if (!Number.isInteger(length) || length <= 0) {
|
||||
throw new Error('Length must be a positive integer');
|
||||
}
|
||||
|
@ -117,24 +117,13 @@ async function getRandomValues(length) {
|
|||
|
||||
/**
|
||||
* Computes SHA-256 hash for the given input.
|
||||
* @param {string} input
|
||||
* @returns {Promise<string>}
|
||||
* @param input - The input to hash.
|
||||
* @returns The SHA-256 hash of the input.
|
||||
*/
|
||||
async function hashBackupCode(input) {
|
||||
export async function hashBackupCode(input: string) {
|
||||
const encoder = new TextEncoder();
|
||||
const data = encoder.encode(input);
|
||||
const hashBuffer = await webcrypto.subtle.digest('SHA-256', data);
|
||||
const hashArray = Array.from(new Uint8Array(hashBuffer));
|
||||
return hashArray.map((b) => b.toString(16).padStart(2, '0')).join('');
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
encrypt,
|
||||
decrypt,
|
||||
encryptV2,
|
||||
decryptV2,
|
||||
encryptV3,
|
||||
decryptV3,
|
||||
hashBackupCode,
|
||||
getRandomValues,
|
||||
};
|
1  packages/api/src/crypto/index.ts  (new file)

@@ -0,0 +1 @@
export * from './encryption';
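As a quick orientation to the relocated helpers, a hedged round-trip sketch (assumes CREDS_KEY is a 64-character hex string and CREDS_IV a 32-character hex string in the environment, as the encryption module requires):

// Round-trip sketch using the crypto helpers now re-exported from @librechat/api.
const { encryptV2, decryptV2, encryptV3, decryptV3 } = require('@librechat/api');

async function roundTrip(secret) {
  const v2 = await encryptV2(secret); // "<iv-hex>:<ciphertext-hex>", random IV per call
  const v3 = encryptV3(secret); // "v3:<iv-hex>:<ciphertext-hex>", synchronous AES-256-CTR
  const fromV2 = await decryptV2(v2);
  const fromV3 = decryptV3(v3);
  return fromV2 === secret && fromV3 === secret;
}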
|
@ -1,8 +1,8 @@
|
|||
import { FlowStateManager } from './manager';
|
||||
import { Keyv } from 'keyv';
|
||||
import { FlowStateManager } from './manager';
|
||||
import type { FlowState } from './types';
|
||||
|
||||
// Create a mock class without extending Keyv
|
||||
/** Mock class without extending Keyv */
|
||||
class MockKeyv {
|
||||
private store: Map<string, FlowState<string>>;
|
||||
|
||||
|
|
|
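For reference, a construction sketch of the refactored manager (the export path is an assumption; the logger is now the shared @librechat/data-schemas logger rather than an injected option):

// Construction sketch: only the Keyv store and ttl are configured now.
const { Keyv } = require('keyv');
const { FlowStateManager } = require('@librechat/api'); // assumption: exported from the api package

const flowManager = new FlowStateManager(new Keyv(), { ttl: 60000 * 3 });

// An aborted tool call can fail the pending OAuth flow so any waiters reject promptly.
function abortLogin(flowId) {
  flowManager.failFlow(flowId, 'mcp_oauth', new Error('Tool call aborted'));
}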
@ -1,28 +1,18 @@
|
|||
import { Keyv } from 'keyv';
|
||||
import { logger } from '@librechat/data-schemas';
|
||||
import type { StoredDataNoRaw } from 'keyv';
|
||||
import type { Logger } from 'winston';
|
||||
import type { FlowState, FlowMetadata, FlowManagerOptions } from './types';
|
||||
|
||||
export class FlowStateManager<T = unknown> {
|
||||
private keyv: Keyv;
|
||||
private ttl: number;
|
||||
private logger: Logger;
|
||||
private intervals: Set<NodeJS.Timeout>;
|
||||
|
||||
private static getDefaultLogger(): Logger {
|
||||
return {
|
||||
error: console.error,
|
||||
warn: console.warn,
|
||||
info: console.info,
|
||||
debug: console.debug,
|
||||
} as Logger;
|
||||
}
|
||||
|
||||
constructor(store: Keyv, options?: FlowManagerOptions) {
|
||||
if (!options) {
|
||||
options = { ttl: 60000 * 3 };
|
||||
}
|
||||
const { ci = false, ttl, logger } = options;
|
||||
const { ci = false, ttl } = options;
|
||||
|
||||
if (!ci && !(store instanceof Keyv)) {
|
||||
throw new Error('Invalid store provided to FlowStateManager');
|
||||
|
@ -30,14 +20,13 @@ export class FlowStateManager<T = unknown> {
|
|||
|
||||
this.ttl = ttl;
|
||||
this.keyv = store;
|
||||
this.logger = logger || FlowStateManager.getDefaultLogger();
|
||||
this.intervals = new Set();
|
||||
this.setupCleanupHandlers();
|
||||
}
|
||||
|
||||
private setupCleanupHandlers() {
|
||||
const cleanup = () => {
|
||||
this.logger.info('Cleaning up FlowStateManager intervals...');
|
||||
logger.info('Cleaning up FlowStateManager intervals...');
|
||||
this.intervals.forEach((interval) => clearInterval(interval));
|
||||
this.intervals.clear();
|
||||
process.exit(0);
|
||||
|
@ -66,7 +55,7 @@ export class FlowStateManager<T = unknown> {
|
|||
|
||||
let existingState = (await this.keyv.get(flowKey)) as FlowState<T> | undefined;
|
||||
if (existingState) {
|
||||
this.logger.debug(`[${flowKey}] Flow already exists`);
|
||||
logger.debug(`[${flowKey}] Flow already exists`);
|
||||
return this.monitorFlow(flowKey, type, signal);
|
||||
}
|
||||
|
||||
|
@ -74,7 +63,7 @@ export class FlowStateManager<T = unknown> {
|
|||
|
||||
existingState = (await this.keyv.get(flowKey)) as FlowState<T> | undefined;
|
||||
if (existingState) {
|
||||
this.logger.debug(`[${flowKey}] Flow exists on 2nd check`);
|
||||
logger.debug(`[${flowKey}] Flow exists on 2nd check`);
|
||||
return this.monitorFlow(flowKey, type, signal);
|
||||
}
|
||||
|
||||
|
@ -85,7 +74,7 @@ export class FlowStateManager<T = unknown> {
|
|||
createdAt: Date.now(),
|
||||
};
|
||||
|
||||
this.logger.debug('Creating initial flow state:', flowKey);
|
||||
logger.debug('Creating initial flow state:', flowKey);
|
||||
await this.keyv.set(flowKey, initialState, this.ttl);
|
||||
return this.monitorFlow(flowKey, type, signal);
|
||||
}
|
||||
|
@ -102,7 +91,7 @@ export class FlowStateManager<T = unknown> {
|
|||
if (!flowState) {
|
||||
clearInterval(intervalId);
|
||||
this.intervals.delete(intervalId);
|
||||
this.logger.error(`[${flowKey}] Flow state not found`);
|
||||
logger.error(`[${flowKey}] Flow state not found`);
|
||||
reject(new Error(`${type} Flow state not found`));
|
||||
return;
|
||||
}
|
||||
|
@ -110,7 +99,7 @@ export class FlowStateManager<T = unknown> {
|
|||
if (signal?.aborted) {
|
||||
clearInterval(intervalId);
|
||||
this.intervals.delete(intervalId);
|
||||
this.logger.warn(`[${flowKey}] Flow aborted`);
|
||||
logger.warn(`[${flowKey}] Flow aborted`);
|
||||
const message = `${type} flow aborted`;
|
||||
await this.keyv.delete(flowKey);
|
||||
reject(new Error(message));
|
||||
|
@ -120,7 +109,7 @@ export class FlowStateManager<T = unknown> {
|
|||
if (flowState.status !== 'PENDING') {
|
||||
clearInterval(intervalId);
|
||||
this.intervals.delete(intervalId);
|
||||
this.logger.debug(`[${flowKey}] Flow completed`);
|
||||
logger.debug(`[${flowKey}] Flow completed`);
|
||||
|
||||
if (flowState.status === 'COMPLETED' && flowState.result !== undefined) {
|
||||
resolve(flowState.result);
|
||||
|
@ -135,17 +124,15 @@ export class FlowStateManager<T = unknown> {
|
|||
if (elapsedTime >= this.ttl) {
|
||||
clearInterval(intervalId);
|
||||
this.intervals.delete(intervalId);
|
||||
this.logger.error(
|
||||
logger.error(
|
||||
`[${flowKey}] Flow timed out | Elapsed time: ${elapsedTime} | TTL: ${this.ttl}`,
|
||||
);
|
||||
await this.keyv.delete(flowKey);
|
||||
reject(new Error(`${type} flow timed out`));
|
||||
}
|
||||
this.logger.debug(
|
||||
`[${flowKey}] Flow state elapsed time: ${elapsedTime}, checking again...`,
|
||||
);
|
||||
logger.debug(`[${flowKey}] Flow state elapsed time: ${elapsedTime}, checking again...`);
|
||||
} catch (error) {
|
||||
this.logger.error(`[${flowKey}] Error checking flow state:`, error);
|
||||
logger.error(`[${flowKey}] Error checking flow state:`, error);
|
||||
clearInterval(intervalId);
|
||||
this.intervals.delete(intervalId);
|
||||
reject(error);
|
||||
|
@ -224,7 +211,7 @@ export class FlowStateManager<T = unknown> {
|
|||
const flowKey = this.getFlowKey(flowId, type);
|
||||
let existingState = (await this.keyv.get(flowKey)) as FlowState<T> | undefined;
|
||||
if (existingState) {
|
||||
this.logger.debug(`[${flowKey}] Flow already exists`);
|
||||
logger.debug(`[${flowKey}] Flow already exists`);
|
||||
return this.monitorFlow(flowKey, type, signal);
|
||||
}
|
||||
|
||||
|
@ -232,7 +219,7 @@ export class FlowStateManager<T = unknown> {
|
|||
|
||||
existingState = (await this.keyv.get(flowKey)) as FlowState<T> | undefined;
|
||||
if (existingState) {
|
||||
this.logger.debug(`[${flowKey}] Flow exists on 2nd check`);
|
||||
logger.debug(`[${flowKey}] Flow exists on 2nd check`);
|
||||
return this.monitorFlow(flowKey, type, signal);
|
||||
}
|
||||
|
||||
|
@ -242,7 +229,7 @@ export class FlowStateManager<T = unknown> {
|
|||
metadata: {},
|
||||
createdAt: Date.now(),
|
||||
};
|
||||
this.logger.debug(`[${flowKey}] Creating initial flow state`);
|
||||
logger.debug(`[${flowKey}] Creating initial flow state`);
|
||||
await this.keyv.set(flowKey, initialState, this.ttl);
|
||||
|
||||
try {
|
||||
|
|
|
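For context on how the refactored FlowStateManager is consumed, here is a minimal sketch. It assumes createFlow is the public entry point backing the monitorFlow polling shown above, and that both sides agree on the same flow id and type; completeFlow with a result is the same call the OAuth handler makes later in this PR.

import { Keyv } from 'keyv';
import { FlowStateManager } from './manager';

const flows = new FlowStateManager<string>(new Keyv(), { ttl: 3 * 60 * 1000 });

// One caller waits for the flow to resolve (polls until COMPLETED, FAILED, abort, or TTL):
const pending = flows.createFlow('user123:my-server', 'mcp_oauth');

// Another caller (e.g. an OAuth callback route) completes it with a result:
await flows.completeFlow('user123:my-server', 'mcp_oauth', 'authorization-code');
const result = await pending; // 'authorization-code'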
@ -1,8 +1,13 @@
|
|||
/* MCP */
|
||||
export * from './mcp/manager';
|
||||
export * from './mcp/oauth';
|
||||
/* Utilities */
|
||||
export * from './mcp/utils';
|
||||
export * from './utils';
|
||||
/* OAuth */
|
||||
export * from './oauth';
|
||||
/* Crypto */
|
||||
export * from './crypto';
|
||||
/* Flow */
|
||||
export * from './flow/manager';
|
||||
/* Agents */
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
import { EventEmitter } from 'events';
|
||||
import { logger } from '@librechat/data-schemas';
|
||||
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
|
||||
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';
|
||||
import {
|
||||
|
@ -10,7 +11,7 @@ import { ResourceListChangedNotificationSchema } from '@modelcontextprotocol/sdk
|
|||
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
|
||||
import type { Transport } from '@modelcontextprotocol/sdk/shared/transport.js';
|
||||
import type { JSONRPCMessage } from '@modelcontextprotocol/sdk/types.js';
|
||||
import type { Logger } from 'winston';
|
||||
import type { MCPOAuthTokens } from './oauth/types';
|
||||
import type * as t from './types';
|
||||
|
||||
function isStdioOptions(options: t.MCPOptions): options is t.StdioOptions {
|
||||
|
@ -67,24 +68,29 @@ export class MCPConnection extends EventEmitter {
|
|||
private isReconnecting = false;
|
||||
private isInitializing = false;
|
||||
private reconnectAttempts = 0;
|
||||
iconPath?: string;
|
||||
timeout?: number;
|
||||
private readonly userId?: string;
|
||||
private lastPingTime: number;
|
||||
private oauthTokens?: MCPOAuthTokens | null;
|
||||
private oauthRequired = false;
|
||||
iconPath?: string;
|
||||
timeout?: number;
|
||||
url?: string;
|
||||
|
||||
constructor(
|
||||
serverName: string,
|
||||
private readonly options: t.MCPOptions,
|
||||
private logger?: Logger,
|
||||
userId?: string,
|
||||
oauthTokens?: MCPOAuthTokens | null,
|
||||
) {
|
||||
super();
|
||||
this.serverName = serverName;
|
||||
this.logger = logger;
|
||||
this.userId = userId;
|
||||
this.iconPath = options.iconPath;
|
||||
this.timeout = options.timeout;
|
||||
this.lastPingTime = Date.now();
|
||||
if (oauthTokens) {
|
||||
this.oauthTokens = oauthTokens;
|
||||
}
|
||||
this.client = new Client(
|
||||
{
|
||||
name: '@librechat/api-client',
|
||||
|
@ -107,11 +113,10 @@ export class MCPConnection extends EventEmitter {
|
|||
public static getInstance(
|
||||
serverName: string,
|
||||
options: t.MCPOptions,
|
||||
logger?: Logger,
|
||||
userId?: string,
|
||||
): MCPConnection {
|
||||
if (!MCPConnection.instance) {
|
||||
MCPConnection.instance = new MCPConnection(serverName, options, logger, userId);
|
||||
MCPConnection.instance = new MCPConnection(serverName, options, userId);
|
||||
}
|
||||
return MCPConnection.instance;
|
||||
}
|
||||
|
@ -129,7 +134,7 @@ export class MCPConnection extends EventEmitter {
|
|||
|
||||
private emitError(error: unknown, errorContext: string): void {
|
||||
const errorMessage = error instanceof Error ? error.message : String(error);
|
||||
this.logger?.error(`${this.getLogPrefix()} ${errorContext}: ${errorMessage}`);
|
||||
logger.error(`${this.getLogPrefix()} ${errorContext}: ${errorMessage}`);
|
||||
this.emit('error', new Error(`${errorContext}: ${errorMessage}`));
|
||||
}
|
||||
|
||||
|
@ -167,45 +172,52 @@ export class MCPConnection extends EventEmitter {
|
|||
if (!isWebSocketOptions(options)) {
|
||||
throw new Error('Invalid options for websocket transport.');
|
||||
}
|
||||
this.url = options.url;
|
||||
return new WebSocketClientTransport(new URL(options.url));
|
||||
|
||||
case 'sse': {
|
||||
if (!isSSEOptions(options)) {
|
||||
throw new Error('Invalid options for sse transport.');
|
||||
}
|
||||
this.url = options.url;
|
||||
const url = new URL(options.url);
|
||||
this.logger?.info(`${this.getLogPrefix()} Creating SSE transport: ${url.toString()}`);
|
||||
logger.info(`${this.getLogPrefix()} Creating SSE transport: ${url.toString()}`);
|
||||
const abortController = new AbortController();
|
||||
|
||||
/** Add OAuth token to headers if available */
|
||||
const headers = { ...options.headers };
|
||||
if (this.oauthTokens?.access_token) {
|
||||
headers['Authorization'] = `Bearer ${this.oauthTokens.access_token}`;
|
||||
}
|
||||
|
||||
const transport = new SSEClientTransport(url, {
|
||||
requestInit: {
|
||||
headers: options.headers,
|
||||
headers,
|
||||
signal: abortController.signal,
|
||||
},
|
||||
eventSourceInit: {
|
||||
fetch: (url, init) => {
|
||||
const headers = new Headers(Object.assign({}, init?.headers, options.headers));
|
||||
const fetchHeaders = new Headers(Object.assign({}, init?.headers, headers));
|
||||
return fetch(url, {
|
||||
...init,
|
||||
headers,
|
||||
headers: fetchHeaders,
|
||||
});
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
transport.onclose = () => {
|
||||
this.logger?.info(`${this.getLogPrefix()} SSE transport closed`);
|
||||
logger.info(`${this.getLogPrefix()} SSE transport closed`);
|
||||
this.emit('connectionChange', 'disconnected');
|
||||
};
|
||||
|
||||
transport.onerror = (error) => {
|
||||
this.logger?.error(`${this.getLogPrefix()} SSE transport error:`, error);
|
||||
logger.error(`${this.getLogPrefix()} SSE transport error:`, error);
|
||||
this.emitError(error, 'SSE transport error:');
|
||||
};
|
||||
|
||||
transport.onmessage = (message) => {
|
||||
this.logger?.info(
|
||||
`${this.getLogPrefix()} Message received: ${JSON.stringify(message)}`,
|
||||
);
|
||||
logger.info(`${this.getLogPrefix()} Message received: ${JSON.stringify(message)}`);
|
||||
};
|
||||
|
||||
this.setupTransportErrorHandlers(transport);
|
||||
|
@ -216,33 +228,38 @@ export class MCPConnection extends EventEmitter {
|
|||
if (!isStreamableHTTPOptions(options)) {
|
||||
throw new Error('Invalid options for streamable-http transport.');
|
||||
}
|
||||
this.url = options.url;
|
||||
const url = new URL(options.url);
|
||||
this.logger?.info(
|
||||
logger.info(
|
||||
`${this.getLogPrefix()} Creating streamable-http transport: ${url.toString()}`,
|
||||
);
|
||||
const abortController = new AbortController();
|
||||
|
||||
// Add OAuth token to headers if available
|
||||
const headers = { ...options.headers };
|
||||
if (this.oauthTokens?.access_token) {
|
||||
headers['Authorization'] = `Bearer ${this.oauthTokens.access_token}`;
|
||||
}
|
||||
|
||||
const transport = new StreamableHTTPClientTransport(url, {
|
||||
requestInit: {
|
||||
headers: options.headers,
|
||||
headers,
|
||||
signal: abortController.signal,
|
||||
},
|
||||
});
|
||||
|
||||
transport.onclose = () => {
|
||||
this.logger?.info(`${this.getLogPrefix()} Streamable-http transport closed`);
|
||||
logger.info(`${this.getLogPrefix()} Streamable-http transport closed`);
|
||||
this.emit('connectionChange', 'disconnected');
|
||||
};
|
||||
|
||||
transport.onerror = (error: Error | unknown) => {
|
||||
this.logger?.error(`${this.getLogPrefix()} Streamable-http transport error:`, error);
|
||||
logger.error(`${this.getLogPrefix()} Streamable-http transport error:`, error);
|
||||
this.emitError(error, 'Streamable-http transport error:');
|
||||
};
|
||||
|
||||
transport.onmessage = (message: JSONRPCMessage) => {
|
||||
this.logger?.info(
|
||||
`${this.getLogPrefix()} Message received: ${JSON.stringify(message)}`,
|
||||
);
|
||||
logger.info(`${this.getLogPrefix()} Message received: ${JSON.stringify(message)}`);
|
||||
};
|
||||
|
||||
this.setupTransportErrorHandlers(transport);
|
||||
|
@ -271,17 +288,17 @@ export class MCPConnection extends EventEmitter {
|
|||
/**
|
||||
* // FOR DEBUGGING
|
||||
* // this.client.setRequestHandler(PingRequestSchema, async (request, extra) => {
|
||||
* // this.logger?.info(`[MCP][${this.serverName}] PingRequest: ${JSON.stringify(request)}`);
|
||||
* // logger.info(`[MCP][${this.serverName}] PingRequest: ${JSON.stringify(request)}`);
|
||||
* // if (getEventListeners && extra.signal) {
|
||||
* // const listenerCount = getEventListeners(extra.signal, 'abort').length;
|
||||
* // this.logger?.debug(`Signal has ${listenerCount} abort listeners`);
|
||||
* // logger.debug(`Signal has ${listenerCount} abort listeners`);
|
||||
* // }
|
||||
* // return {};
|
||||
* // });
|
||||
*/
|
||||
} else if (state === 'error' && !this.isReconnecting && !this.isInitializing) {
|
||||
this.handleReconnection().catch((error) => {
|
||||
this.logger?.error(`${this.getLogPrefix()} Reconnection handler failed:`, error);
|
||||
logger.error(`${this.getLogPrefix()} Reconnection handler failed:`, error);
|
||||
});
|
||||
}
|
||||
});
|
||||
|
@ -290,7 +307,15 @@ export class MCPConnection extends EventEmitter {
|
|||
}
|
||||
|
||||
private async handleReconnection(): Promise<void> {
|
||||
if (this.isReconnecting || this.shouldStopReconnecting || this.isInitializing) {
|
||||
if (
|
||||
this.isReconnecting ||
|
||||
this.shouldStopReconnecting ||
|
||||
this.isInitializing ||
|
||||
this.oauthRequired
|
||||
) {
|
||||
if (this.oauthRequired) {
|
||||
logger.info(`${this.getLogPrefix()} OAuth required, skipping reconnection attempts`);
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
|
@ -305,7 +330,7 @@ export class MCPConnection extends EventEmitter {
|
|||
this.reconnectAttempts++;
|
||||
const delay = backoffDelay(this.reconnectAttempts);
|
||||
|
||||
this.logger?.info(
|
||||
logger.info(
|
||||
`${this.getLogPrefix()} Reconnecting ${this.reconnectAttempts}/${this.MAX_RECONNECT_ATTEMPTS} (delay: ${delay}ms)`,
|
||||
);
|
||||
|
||||
|
@ -316,13 +341,13 @@ export class MCPConnection extends EventEmitter {
|
|||
this.reconnectAttempts = 0;
|
||||
return;
|
||||
} catch (error) {
|
||||
this.logger?.error(`${this.getLogPrefix()} Reconnection attempt failed:`, error);
|
||||
logger.error(`${this.getLogPrefix()} Reconnection attempt failed:`, error);
|
||||
|
||||
if (
|
||||
this.reconnectAttempts === this.MAX_RECONNECT_ATTEMPTS ||
|
||||
(this.shouldStopReconnecting as boolean)
|
||||
) {
|
||||
this.logger?.error(`${this.getLogPrefix()} Stopping reconnection attempts`);
|
||||
logger.error(`${this.getLogPrefix()} Stopping reconnection attempts`);
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
@ -366,18 +391,21 @@ export class MCPConnection extends EventEmitter {
|
|||
await this.client.close();
|
||||
this.transport = null;
|
||||
} catch (error) {
|
||||
this.logger?.warn(`${this.getLogPrefix()} Error closing connection:`, error);
|
||||
logger.warn(`${this.getLogPrefix()} Error closing connection:`, error);
|
||||
}
|
||||
}
|
||||
|
||||
this.transport = this.constructTransport(this.options);
|
||||
this.setupTransportDebugHandlers();
|
||||
|
||||
const connectTimeout = this.options.initTimeout ?? 10000;
|
||||
const connectTimeout = this.options.initTimeout ?? 120000;
|
||||
await Promise.race([
|
||||
this.client.connect(this.transport),
|
||||
new Promise((_resolve, reject) =>
|
||||
setTimeout(() => reject(new Error('Connection timeout')), connectTimeout),
|
||||
setTimeout(
|
||||
() => reject(new Error(`Connection timeout after ${connectTimeout}ms`)),
|
||||
connectTimeout,
|
||||
),
|
||||
),
|
||||
]);
|
||||
|
||||
|
@ -385,9 +413,85 @@ export class MCPConnection extends EventEmitter {
|
|||
this.emit('connectionChange', 'connected');
|
||||
this.reconnectAttempts = 0;
|
||||
} catch (error) {
|
||||
// Check if it's an OAuth authentication error
|
||||
if (this.isOAuthError(error)) {
|
||||
logger.warn(`${this.getLogPrefix()} OAuth authentication required`);
|
||||
this.oauthRequired = true;
|
||||
const serverUrl = this.url;
|
||||
logger.debug(`${this.getLogPrefix()} Server URL for OAuth: ${serverUrl}`);
|
||||
|
||||
const oauthTimeout = this.options.initTimeout ?? 60000;
|
||||
/** Promise that will resolve when OAuth is handled */
|
||||
const oauthHandledPromise = new Promise<void>((resolve, reject) => {
|
||||
let timeoutId: NodeJS.Timeout | null = null;
|
||||
let oauthHandledListener: (() => void) | null = null;
|
||||
let oauthFailedListener: ((error: Error) => void) | null = null;
|
||||
|
||||
/** Cleanup function to remove listeners and clear timeout */
|
||||
const cleanup = () => {
|
||||
if (timeoutId) {
|
||||
clearTimeout(timeoutId);
|
||||
}
|
||||
if (oauthHandledListener) {
|
||||
this.off('oauthHandled', oauthHandledListener);
|
||||
}
|
||||
if (oauthFailedListener) {
|
||||
this.off('oauthFailed', oauthFailedListener);
|
||||
}
|
||||
};
|
||||
|
||||
// Success handler
|
||||
oauthHandledListener = () => {
|
||||
cleanup();
|
||||
resolve();
|
||||
};
|
||||
|
||||
// Failure handler
|
||||
oauthFailedListener = (error: Error) => {
|
||||
cleanup();
|
||||
reject(error);
|
||||
};
|
||||
|
||||
// Timeout handler
|
||||
timeoutId = setTimeout(() => {
|
||||
cleanup();
|
||||
reject(new Error(`OAuth handling timeout after ${oauthTimeout}ms`));
|
||||
}, oauthTimeout);
|
||||
|
||||
// Listen for both success and failure events
|
||||
this.once('oauthHandled', oauthHandledListener);
|
||||
this.once('oauthFailed', oauthFailedListener);
|
||||
});
|
||||
|
||||
// Emit the event
|
||||
this.emit('oauthRequired', {
|
||||
serverName: this.serverName,
|
||||
error,
|
||||
serverUrl,
|
||||
userId: this.userId,
|
||||
});
|
||||
|
||||
try {
|
||||
// Wait for OAuth to be handled
|
||||
await oauthHandledPromise;
|
||||
// Reset the oauthRequired flag
|
||||
this.oauthRequired = false;
|
||||
// Don't throw the error - just return so connection can be retried
|
||||
logger.info(
|
||||
`${this.getLogPrefix()} OAuth handled successfully, connection will be retried`,
|
||||
);
|
||||
return;
|
||||
} catch (oauthError) {
|
||||
// OAuth failed or timed out
|
||||
this.oauthRequired = false;
|
||||
logger.error(`${this.getLogPrefix()} OAuth handling failed:`, oauthError);
|
||||
// Re-throw the original authentication error
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
this.connectionState = 'error';
|
||||
this.emit('connectionChange', 'error');
|
||||
this.lastError = error instanceof Error ? error : new Error(String(error));
|
||||
throw error;
|
||||
} finally {
|
||||
this.connectPromise = null;
|
||||
|
@ -403,7 +507,7 @@ export class MCPConnection extends EventEmitter {
|
|||
}
|
||||
|
||||
this.transport.onmessage = (msg) => {
|
||||
this.logger?.debug(`${this.getLogPrefix()} Transport received: ${JSON.stringify(msg)}`);
|
||||
logger.debug(`${this.getLogPrefix()} Transport received: ${JSON.stringify(msg)}`);
|
||||
};
|
||||
|
||||
const originalSend = this.transport.send.bind(this.transport);
|
||||
|
@ -414,7 +518,7 @@ export class MCPConnection extends EventEmitter {
|
|||
}
|
||||
this.lastPingTime = Date.now();
|
||||
}
|
||||
this.logger?.debug(`${this.getLogPrefix()} Transport sending: ${JSON.stringify(msg)}`);
|
||||
logger.debug(`${this.getLogPrefix()} Transport sending: ${JSON.stringify(msg)}`);
|
||||
return originalSend(msg);
|
||||
};
|
||||
}
|
||||
|
@ -427,14 +531,24 @@ export class MCPConnection extends EventEmitter {
|
|||
throw new Error('Connection not established');
|
||||
}
|
||||
} catch (error) {
|
||||
this.logger?.error(`${this.getLogPrefix()} Connection failed:`, error);
|
||||
logger.error(`${this.getLogPrefix()} Connection failed:`, error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
private setupTransportErrorHandlers(transport: Transport): void {
|
||||
transport.onerror = (error) => {
|
||||
this.logger?.error(`${this.getLogPrefix()} Transport error:`, error);
|
||||
logger.error(`${this.getLogPrefix()} Transport error:`, error);
|
||||
|
||||
// Check if it's an OAuth authentication error
|
||||
if (error && typeof error === 'object' && 'code' in error) {
|
||||
const errorCode = (error as unknown as { code?: number }).code;
|
||||
if (errorCode === 401 || errorCode === 403) {
|
||||
logger.warn(`${this.getLogPrefix()} OAuth authentication error detected`);
|
||||
this.emit('oauthError', error);
|
||||
}
|
||||
}
|
||||
|
||||
this.emit('connectionChange', 'error');
|
||||
};
|
||||
}
|
||||
|
@ -562,22 +676,36 @@ export class MCPConnection extends EventEmitter {
|
|||
// }
|
||||
// }
|
||||
|
||||
// Public getters for state information
|
||||
public getConnectionState(): t.ConnectionState {
|
||||
return this.connectionState;
|
||||
}
|
||||
|
||||
public async isConnected(): Promise<boolean> {
|
||||
try {
|
||||
await this.client.ping();
|
||||
return this.connectionState === 'connected';
|
||||
} catch (error) {
|
||||
this.logger?.error(`${this.getLogPrefix()} Ping failed:`, error);
|
||||
logger.error(`${this.getLogPrefix()} Ping failed:`, error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
public getLastError(): Error | null {
|
||||
return this.lastError;
|
||||
public setOAuthTokens(tokens: MCPOAuthTokens): void {
|
||||
this.oauthTokens = tokens;
|
||||
}
|
||||
|
||||
private isOAuthError(error: unknown): boolean {
|
||||
if (!error || typeof error !== 'object') {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Check for SSE error with 401 status
|
||||
if ('message' in error && typeof error.message === 'string') {
|
||||
return error.message.includes('401') || error.message.includes('Non-200 status code (401)');
|
||||
}
|
||||
|
||||
// Check for error code
|
||||
if ('code' in error) {
|
||||
const code = (error as { code?: number }).code;
|
||||
return code === 401 || code === 403;
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
}
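The connect() error path above now pauses on a 401 and waits for `oauthHandled` or `oauthFailed` instead of failing outright. A hedged sketch of the consumer side of that contract follows; only the event names, the payload shape, and setOAuthTokens come from this diff, while the helper name and the obtainTokens callback are assumptions (MCPConnection refers to the class defined above):

import type { MCPOAuthTokens } from './oauth/types';

function wireOAuthEvents(
  connection: MCPConnection,
  obtainTokens: (serverUrl?: string) => Promise<MCPOAuthTokens>, // assumed: runs the OAuth flow
) {
  connection.on('oauthRequired', async ({ serverUrl }: { serverUrl?: string }) => {
    try {
      const tokens = await obtainTokens(serverUrl);
      connection.setOAuthTokens(tokens);
      connection.emit('oauthHandled'); // unblocks the pending connect() attempt
    } catch (err) {
      connection.emit('oauthFailed', err instanceof Error ? err : new Error(String(err)));
    }
  });
}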
|
||||
|
|
|
@ -1,3 +1,9 @@
|
|||
export enum CONSTANTS {
|
||||
mcp_delimiter = '_mcp_',
|
||||
/** System user ID for app-level OAuth tokens (all zeros ObjectId) */
|
||||
SYSTEM_USER_ID = '000000000000000000000000',
|
||||
}
|
||||
|
||||
export function isSystemUserId(userId?: string): boolean {
|
||||
return userId === CONSTANTS.SYSTEM_USER_ID;
|
||||
}
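A small illustration: tokens stored for the whole app use the all-zeros id, so lookups and log prefixes can branch on it (the second value is an illustrative user ObjectId):

isSystemUserId(CONSTANTS.SYSTEM_USER_ID); // true  -> app-level token, log prefix omits the user
isSystemUserId('665f1c2e9b3d4a0012345678'); // false -> per-user token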
|
||||
|
|
(Diff for one file suppressed because it is too large.)
packages/api/src/mcp/oauth/handler.ts (new file, 603 lines)
|
@ -0,0 +1,603 @@
|
|||
import { randomBytes } from 'crypto';
|
||||
import { logger } from '@librechat/data-schemas';
|
||||
import {
|
||||
discoverOAuthMetadata,
|
||||
registerClient,
|
||||
startAuthorization,
|
||||
exchangeAuthorization,
|
||||
discoverOAuthProtectedResourceMetadata,
|
||||
} from '@modelcontextprotocol/sdk/client/auth.js';
|
||||
import { OAuthMetadataSchema } from '@modelcontextprotocol/sdk/shared/auth.js';
|
||||
import type { MCPOptions } from 'librechat-data-provider';
|
||||
import type { FlowStateManager } from '~/flow/manager';
|
||||
import type {
|
||||
OAuthClientInformation,
|
||||
OAuthProtectedResourceMetadata,
|
||||
MCPOAuthFlowMetadata,
|
||||
MCPOAuthTokens,
|
||||
OAuthMetadata,
|
||||
} from './types';
|
||||
|
||||
/** Type for the OAuth metadata from the SDK */
|
||||
type SDKOAuthMetadata = Parameters<typeof registerClient>[1]['metadata'];
|
||||
|
||||
export class MCPOAuthHandler {
|
||||
private static readonly FLOW_TYPE = 'mcp_oauth';
|
||||
private static readonly FLOW_TTL = 10 * 60 * 1000; // 10 minutes
|
||||
|
||||
/**
|
||||
* Discovers OAuth metadata from the server
|
||||
*/
|
||||
private static async discoverMetadata(serverUrl: string): Promise<{
|
||||
metadata: OAuthMetadata;
|
||||
resourceMetadata?: OAuthProtectedResourceMetadata;
|
||||
authServerUrl: URL;
|
||||
}> {
|
||||
logger.debug(`[MCPOAuth] discoverMetadata called with serverUrl: ${serverUrl}`);
|
||||
|
||||
let authServerUrl = new URL(serverUrl);
|
||||
let resourceMetadata: OAuthProtectedResourceMetadata | undefined;
|
||||
|
||||
try {
|
||||
// Try to discover resource metadata first
|
||||
logger.debug(
|
||||
`[MCPOAuth] Attempting to discover protected resource metadata from ${serverUrl}`,
|
||||
);
|
||||
resourceMetadata = await discoverOAuthProtectedResourceMetadata(serverUrl);
|
||||
|
||||
if (resourceMetadata?.authorization_servers?.length) {
|
||||
authServerUrl = new URL(resourceMetadata.authorization_servers[0]);
|
||||
logger.debug(
|
||||
`[MCPOAuth] Found authorization server from resource metadata: ${authServerUrl}`,
|
||||
);
|
||||
} else {
|
||||
logger.debug(`[MCPOAuth] No authorization servers found in resource metadata`);
|
||||
}
|
||||
} catch (error) {
|
||||
logger.debug('[MCPOAuth] Resource metadata discovery failed, continuing with server URL', {
|
||||
error,
|
||||
});
|
||||
}
|
||||
|
||||
// Discover OAuth metadata
|
||||
logger.debug(`[MCPOAuth] Discovering OAuth metadata from ${authServerUrl}`);
|
||||
const rawMetadata = await discoverOAuthMetadata(authServerUrl);
|
||||
|
||||
if (!rawMetadata) {
|
||||
logger.error(`[MCPOAuth] Failed to discover OAuth metadata from ${authServerUrl}`);
|
||||
throw new Error('Failed to discover OAuth metadata');
|
||||
}
|
||||
|
||||
logger.debug(`[MCPOAuth] OAuth metadata discovered successfully`);
|
||||
const metadata = await OAuthMetadataSchema.parseAsync(rawMetadata);
|
||||
|
||||
logger.debug(`[MCPOAuth] OAuth metadata parsed successfully`);
|
||||
return {
|
||||
metadata: metadata as unknown as OAuthMetadata,
|
||||
resourceMetadata,
|
||||
authServerUrl,
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Registers an OAuth client dynamically
|
||||
*/
|
||||
private static async registerOAuthClient(
|
||||
serverUrl: string,
|
||||
metadata: OAuthMetadata,
|
||||
resourceMetadata?: OAuthProtectedResourceMetadata,
|
||||
redirectUri?: string,
|
||||
): Promise<OAuthClientInformation> {
|
||||
logger.debug(`[MCPOAuth] Starting client registration for ${serverUrl}, server metadata:`, {
|
||||
grant_types_supported: metadata.grant_types_supported,
|
||||
response_types_supported: metadata.response_types_supported,
|
||||
token_endpoint_auth_methods_supported: metadata.token_endpoint_auth_methods_supported,
|
||||
scopes_supported: metadata.scopes_supported,
|
||||
});
|
||||
|
||||
/** Client metadata based on what the server supports */
|
||||
const clientMetadata = {
|
||||
client_name: 'LibreChat MCP Client',
|
||||
redirect_uris: [redirectUri || this.getDefaultRedirectUri()],
|
||||
grant_types: ['authorization_code'] as string[],
|
||||
response_types: ['code'] as string[],
|
||||
token_endpoint_auth_method: 'client_secret_basic',
|
||||
scope: undefined as string | undefined,
|
||||
};
|
||||
|
||||
const supportedGrantTypes = metadata.grant_types_supported || ['authorization_code'];
|
||||
const requestedGrantTypes = ['authorization_code'];
|
||||
|
||||
if (supportedGrantTypes.includes('refresh_token')) {
|
||||
requestedGrantTypes.push('refresh_token');
|
||||
logger.debug(
|
||||
`[MCPOAuth] Server ${serverUrl} supports \`refresh_token\` grant type, adding to request`,
|
||||
);
|
||||
} else {
|
||||
logger.debug(`[MCPOAuth] Server ${serverUrl} does not support \`refresh_token\` grant type`);
|
||||
}
|
||||
clientMetadata.grant_types = requestedGrantTypes;
|
||||
|
||||
clientMetadata.response_types = metadata.response_types_supported || ['code'];
|
||||
|
||||
if (metadata.token_endpoint_auth_methods_supported) {
|
||||
// Prefer client_secret_basic if supported, otherwise use the first supported method
|
||||
if (metadata.token_endpoint_auth_methods_supported.includes('client_secret_basic')) {
|
||||
clientMetadata.token_endpoint_auth_method = 'client_secret_basic';
|
||||
} else if (metadata.token_endpoint_auth_methods_supported.includes('client_secret_post')) {
|
||||
clientMetadata.token_endpoint_auth_method = 'client_secret_post';
|
||||
} else if (metadata.token_endpoint_auth_methods_supported.includes('none')) {
|
||||
clientMetadata.token_endpoint_auth_method = 'none';
|
||||
} else {
|
||||
clientMetadata.token_endpoint_auth_method =
|
||||
metadata.token_endpoint_auth_methods_supported[0];
|
||||
}
|
||||
}
|
||||
|
||||
const availableScopes = resourceMetadata?.scopes_supported || metadata.scopes_supported;
|
||||
if (availableScopes) {
|
||||
clientMetadata.scope = availableScopes.join(' ');
|
||||
}
|
||||
|
||||
logger.debug(`[MCPOAuth] Registering client for ${serverUrl} with metadata:`, clientMetadata);
|
||||
|
||||
const clientInfo = await registerClient(serverUrl, {
|
||||
metadata: metadata as unknown as SDKOAuthMetadata,
|
||||
clientMetadata,
|
||||
});
|
||||
|
||||
logger.debug(`[MCPOAuth] Client registered successfully for ${serverUrl}:`, {
|
||||
client_id: clientInfo.client_id,
|
||||
has_client_secret: !!clientInfo.client_secret,
|
||||
grant_types: clientInfo.grant_types,
|
||||
scope: clientInfo.scope,
|
||||
});
|
||||
|
||||
return clientInfo;
|
||||
}
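To make the selection logic above concrete: for an assumed server advertising grant_types_supported ['authorization_code', 'refresh_token'], token_endpoint_auth_methods_supported ['client_secret_post'], response_types_supported ['code'], and scopes_supported ['mcp:read', 'mcp:write'], with no protected-resource scope override, DOMAIN_SERVER=https://chat.example.com, and serverName 'my-server' (all illustrative), the registration request built here would carry:

const exampleClientMetadata = {
  client_name: 'LibreChat MCP Client',
  redirect_uris: ['https://chat.example.com/api/mcp/my-server/oauth/callback'],
  grant_types: ['authorization_code', 'refresh_token'], // refresh_token added because the server supports it
  response_types: ['code'],
  token_endpoint_auth_method: 'client_secret_post', // client_secret_basic not offered, so the next preference wins
  scope: 'mcp:read mcp:write',
};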
|
||||
|
||||
/**
|
||||
* Initiates the OAuth flow for an MCP server
|
||||
*/
|
||||
static async initiateOAuthFlow(
|
||||
serverName: string,
|
||||
serverUrl: string,
|
||||
userId: string,
|
||||
config: MCPOptions['oauth'] | undefined,
|
||||
): Promise<{ authorizationUrl: string; flowId: string; flowMetadata: MCPOAuthFlowMetadata }> {
|
||||
logger.debug(`[MCPOAuth] initiateOAuthFlow called for ${serverName} with URL: ${serverUrl}`);
|
||||
|
||||
const flowId = this.generateFlowId(userId, serverName);
|
||||
const state = this.generateState();
|
||||
|
||||
logger.debug(`[MCPOAuth] Generated flowId: ${flowId}, state: ${state}`);
|
||||
|
||||
try {
|
||||
// Check if we have pre-configured OAuth settings
|
||||
if (config?.authorization_url && config?.token_url && config?.client_id) {
|
||||
logger.debug(`[MCPOAuth] Using pre-configured OAuth settings for ${serverName}`);
|
||||
/** Metadata based on pre-configured settings */
|
||||
const metadata: OAuthMetadata = {
|
||||
authorization_endpoint: config.authorization_url,
|
||||
token_endpoint: config.token_url,
|
||||
issuer: serverUrl,
|
||||
scopes_supported: config.scope?.split(' '),
|
||||
};
|
||||
|
||||
const clientInfo: OAuthClientInformation = {
|
||||
client_id: config.client_id,
|
||||
client_secret: config.client_secret,
|
||||
redirect_uris: [config.redirect_uri || this.getDefaultRedirectUri(serverName)],
|
||||
scope: config.scope,
|
||||
};
|
||||
|
||||
logger.debug(`[MCPOAuth] Starting authorization with pre-configured settings`);
|
||||
const { authorizationUrl, codeVerifier } = await startAuthorization(serverUrl, {
|
||||
metadata: metadata as unknown as SDKOAuthMetadata,
|
||||
clientInformation: clientInfo,
|
||||
redirectUrl: clientInfo.redirect_uris?.[0] || this.getDefaultRedirectUri(serverName),
|
||||
scope: config.scope,
|
||||
});
|
||||
|
||||
/** Add state parameter with flowId to the authorization URL */
|
||||
authorizationUrl.searchParams.set('state', flowId);
|
||||
logger.debug(`[MCPOAuth] Added state parameter to authorization URL`);
|
||||
|
||||
const flowMetadata: MCPOAuthFlowMetadata = {
|
||||
serverName,
|
||||
userId,
|
||||
serverUrl,
|
||||
state,
|
||||
codeVerifier,
|
||||
clientInfo,
|
||||
metadata,
|
||||
};
|
||||
|
||||
logger.debug(`[MCPOAuth] Authorization URL generated: ${authorizationUrl.toString()}`);
|
||||
return {
|
||||
authorizationUrl: authorizationUrl.toString(),
|
||||
flowId,
|
||||
flowMetadata,
|
||||
};
|
||||
}
|
||||
|
||||
logger.debug(`[MCPOAuth] Starting auto-discovery of OAuth metadata from ${serverUrl}`);
|
||||
const { metadata, resourceMetadata, authServerUrl } = await this.discoverMetadata(serverUrl);
|
||||
|
||||
logger.debug(`[MCPOAuth] OAuth metadata discovered, auth server URL: ${authServerUrl}`);
|
||||
|
||||
/** Dynamic client registration based on the discovered metadata */
|
||||
const redirectUri = config?.redirect_uri || this.getDefaultRedirectUri(serverName);
|
||||
logger.debug(`[MCPOAuth] Registering OAuth client with redirect URI: ${redirectUri}`);
|
||||
|
||||
const clientInfo = await this.registerOAuthClient(
|
||||
authServerUrl.toString(),
|
||||
metadata,
|
||||
resourceMetadata,
|
||||
redirectUri,
|
||||
);
|
||||
|
||||
logger.debug(`[MCPOAuth] Client registered with ID: ${clientInfo.client_id}`);
|
||||
|
||||
/** Authorization Scope */
|
||||
const scope =
|
||||
config?.scope ||
|
||||
resourceMetadata?.scopes_supported?.join(' ') ||
|
||||
metadata.scopes_supported?.join(' ');
|
||||
|
||||
logger.debug(`[MCPOAuth] Starting authorization with scope: ${scope}`);
|
||||
|
||||
let authorizationUrl: URL;
|
||||
let codeVerifier: string;
|
||||
|
||||
try {
|
||||
logger.debug(`[MCPOAuth] Calling startAuthorization...`);
|
||||
const authResult = await startAuthorization(serverUrl, {
|
||||
metadata: metadata as unknown as SDKOAuthMetadata,
|
||||
clientInformation: clientInfo,
|
||||
redirectUrl: redirectUri,
|
||||
scope,
|
||||
});
|
||||
|
||||
authorizationUrl = authResult.authorizationUrl;
|
||||
codeVerifier = authResult.codeVerifier;
|
||||
|
||||
logger.debug(`[MCPOAuth] startAuthorization completed successfully`);
|
||||
logger.debug(`[MCPOAuth] Authorization URL: ${authorizationUrl.toString()}`);
|
||||
|
||||
/** Add state parameter with flowId to the authorization URL */
|
||||
authorizationUrl.searchParams.set('state', flowId);
|
||||
logger.debug(`[MCPOAuth] Added state parameter to authorization URL`);
|
||||
} catch (error) {
|
||||
logger.error(`[MCPOAuth] startAuthorization failed:`, error);
|
||||
throw error;
|
||||
}
|
||||
|
||||
const flowMetadata: MCPOAuthFlowMetadata = {
|
||||
serverName,
|
||||
userId,
|
||||
serverUrl,
|
||||
state,
|
||||
codeVerifier,
|
||||
clientInfo,
|
||||
metadata,
|
||||
resourceMetadata,
|
||||
};
|
||||
|
||||
logger.debug(
|
||||
`[MCPOAuth] Authorization URL generated for ${serverName}: ${authorizationUrl.toString()}`,
|
||||
);
|
||||
|
||||
const result = {
|
||||
authorizationUrl: authorizationUrl.toString(),
|
||||
flowId,
|
||||
flowMetadata,
|
||||
};
|
||||
|
||||
logger.debug(
|
||||
`[MCPOAuth] Returning from initiateOAuthFlow with result ${flowId} for ${serverName}`,
|
||||
result,
|
||||
);
|
||||
return result;
|
||||
} catch (error) {
|
||||
logger.error('[MCPOAuth] Failed to initiate OAuth flow', { error, serverName, userId });
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Completes the OAuth flow by exchanging the authorization code for tokens
|
||||
*/
|
||||
static async completeOAuthFlow(
|
||||
flowId: string,
|
||||
authorizationCode: string,
|
||||
flowManager: FlowStateManager<MCPOAuthTokens>,
|
||||
): Promise<MCPOAuthTokens> {
|
||||
try {
|
||||
/** Flow state which contains our metadata */
|
||||
const flowState = await flowManager.getFlowState(flowId, this.FLOW_TYPE);
|
||||
if (!flowState) {
|
||||
throw new Error('OAuth flow not found');
|
||||
}
|
||||
|
||||
const flowMetadata = flowState.metadata as MCPOAuthFlowMetadata;
|
||||
if (!flowMetadata) {
|
||||
throw new Error('OAuth flow metadata not found');
|
||||
}
|
||||
|
||||
const metadata = flowMetadata;
|
||||
if (!metadata.metadata || !metadata.clientInfo || !metadata.codeVerifier) {
|
||||
throw new Error('Invalid flow metadata');
|
||||
}
|
||||
|
||||
const tokens = await exchangeAuthorization(metadata.serverUrl, {
|
||||
metadata: metadata.metadata as unknown as SDKOAuthMetadata,
|
||||
clientInformation: metadata.clientInfo,
|
||||
authorizationCode,
|
||||
codeVerifier: metadata.codeVerifier,
|
||||
redirectUri: metadata.clientInfo.redirect_uris?.[0] || this.getDefaultRedirectUri(),
|
||||
});
|
||||
|
||||
logger.debug('[MCPOAuth] Raw tokens from exchange:', {
|
||||
access_token: tokens.access_token ? '[REDACTED]' : undefined,
|
||||
refresh_token: tokens.refresh_token ? '[REDACTED]' : undefined,
|
||||
expires_in: tokens.expires_in,
|
||||
token_type: tokens.token_type,
|
||||
scope: tokens.scope,
|
||||
});
|
||||
|
||||
const mcpTokens: MCPOAuthTokens = {
|
||||
...tokens,
|
||||
obtained_at: Date.now(),
|
||||
expires_at: tokens.expires_in ? Date.now() + tokens.expires_in * 1000 : undefined,
|
||||
};
|
||||
|
||||
/** Now complete the flow with the tokens */
|
||||
await flowManager.completeFlow(flowId, this.FLOW_TYPE, mcpTokens);
|
||||
|
||||
return mcpTokens;
|
||||
} catch (error) {
|
||||
logger.error('[MCPOAuth] Failed to complete OAuth flow', { error, flowId });
|
||||
await flowManager.failFlow(flowId, this.FLOW_TYPE, error as Error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Gets the OAuth flow metadata
|
||||
*/
|
||||
static async getFlowState(
|
||||
flowId: string,
|
||||
flowManager: FlowStateManager<MCPOAuthTokens>,
|
||||
): Promise<MCPOAuthFlowMetadata | null> {
|
||||
const flowState = await flowManager.getFlowState(flowId, this.FLOW_TYPE);
|
||||
if (!flowState) {
|
||||
return null;
|
||||
}
|
||||
return flowState.metadata as MCPOAuthFlowMetadata;
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a flow ID for the OAuth flow
|
||||
* @returns Consistent ID so concurrent requests share the same flow
|
||||
*/
|
||||
public static generateFlowId(userId: string, serverName: string): string {
|
||||
return `${userId}:${serverName}`;
|
||||
}
|
||||
|
||||
/**
|
||||
* Generates a secure state parameter
|
||||
*/
|
||||
private static generateState(): string {
|
||||
return randomBytes(32).toString('base64url');
|
||||
}
|
||||
|
||||
/**
|
||||
* Gets the default redirect URI for a server
|
||||
*/
|
||||
private static getDefaultRedirectUri(serverName?: string): string {
|
||||
const baseUrl = process.env.DOMAIN_SERVER || 'http://localhost:3080';
|
||||
return serverName
|
||||
? `${baseUrl}/api/mcp/${serverName}/oauth/callback`
|
||||
: `${baseUrl}/api/mcp/oauth/callback`;
|
||||
}
|
||||
|
||||
/**
|
||||
* Refreshes OAuth tokens using a refresh token
|
||||
*/
|
||||
static async refreshOAuthTokens(
|
||||
refreshToken: string,
|
||||
metadata: { serverName: string; serverUrl?: string; clientInfo?: OAuthClientInformation },
|
||||
config?: MCPOptions['oauth'],
|
||||
): Promise<MCPOAuthTokens> {
|
||||
logger.debug(`[MCPOAuth] Refreshing tokens for ${metadata.serverName}`);
|
||||
|
||||
try {
|
||||
/** If we have stored client information from the original flow, use that first */
|
||||
if (metadata.clientInfo?.client_id) {
|
||||
logger.debug(
|
||||
`[MCPOAuth] Using stored client information for token refresh for ${metadata.serverName}`,
|
||||
);
|
||||
logger.debug(
|
||||
`[MCPOAuth] Client ID: ${metadata.clientInfo.client_id} for ${metadata.serverName}`,
|
||||
);
|
||||
logger.debug(
|
||||
`[MCPOAuth] Has client secret: ${!!metadata.clientInfo.client_secret} for ${metadata.serverName}`,
|
||||
);
|
||||
logger.debug(`[MCPOAuth] Stored client info for ${metadata.serverName}:`, {
|
||||
client_id: metadata.clientInfo.client_id,
|
||||
has_client_secret: !!metadata.clientInfo.client_secret,
|
||||
grant_types: metadata.clientInfo.grant_types,
|
||||
scope: metadata.clientInfo.scope,
|
||||
});
|
||||
|
||||
/** Use the stored client information and metadata to determine the token URL */
|
||||
let tokenUrl: string;
|
||||
if (config?.token_url) {
|
||||
tokenUrl = config.token_url;
|
||||
} else if (!metadata.serverUrl) {
|
||||
throw new Error('No token URL available for refresh');
|
||||
} else {
|
||||
/** Auto-discover OAuth configuration for refresh */
|
||||
const { metadata: oauthMetadata } = await this.discoverMetadata(metadata.serverUrl);
|
||||
if (!oauthMetadata.token_endpoint) {
|
||||
throw new Error('No token endpoint found in OAuth metadata');
|
||||
}
|
||||
tokenUrl = oauthMetadata.token_endpoint;
|
||||
}
|
||||
|
||||
const body = new URLSearchParams({
|
||||
grant_type: 'refresh_token',
|
||||
refresh_token: refreshToken,
|
||||
});
|
||||
|
||||
/** Add scope if available */
|
||||
if (metadata.clientInfo.scope) {
|
||||
body.append('scope', metadata.clientInfo.scope);
|
||||
}
|
||||
|
||||
const headers: HeadersInit = {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
Accept: 'application/json',
|
||||
};
|
||||
|
||||
/** Use client_secret for authentication if available */
|
||||
if (metadata.clientInfo.client_secret) {
|
||||
const clientAuth = Buffer.from(
|
||||
`${metadata.clientInfo.client_id}:${metadata.clientInfo.client_secret}`,
|
||||
).toString('base64');
|
||||
headers['Authorization'] = `Basic ${clientAuth}`;
|
||||
} else {
|
||||
/** For public clients, client_id must be in the body */
|
||||
body.append('client_id', metadata.clientInfo.client_id);
|
||||
}
|
||||
|
||||
logger.debug(`[MCPOAuth] Refresh request to: ${tokenUrl}`, {
|
||||
body: body.toString(),
|
||||
headers,
|
||||
});
|
||||
|
||||
const response = await fetch(tokenUrl, {
|
||||
method: 'POST',
|
||||
headers,
|
||||
body,
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorText = await response.text();
|
||||
throw new Error(
|
||||
`Token refresh failed: ${response.status} ${response.statusText} - ${errorText}`,
|
||||
);
|
||||
}
|
||||
|
||||
const tokens = await response.json();
|
||||
|
||||
return {
|
||||
...tokens,
|
||||
obtained_at: Date.now(),
|
||||
expires_at: tokens.expires_in ? Date.now() + tokens.expires_in * 1000 : undefined,
|
||||
};
|
||||
}
|
||||
|
||||
// Fallback: If we have pre-configured OAuth settings, use them
|
||||
if (config?.token_url && config?.client_id) {
|
||||
logger.debug(`[MCPOAuth] Using pre-configured OAuth settings for token refresh`);
|
||||
|
||||
const tokenUrl = new URL(config.token_url);
|
||||
const clientAuth = config.client_secret
|
||||
? Buffer.from(`${config.client_id}:${config.client_secret}`).toString('base64')
|
||||
: null;
|
||||
|
||||
const body = new URLSearchParams({
|
||||
grant_type: 'refresh_token',
|
||||
refresh_token: refreshToken,
|
||||
});
|
||||
|
||||
if (config.scope) {
|
||||
body.append('scope', config.scope);
|
||||
}
|
||||
|
||||
const headers: HeadersInit = {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
Accept: 'application/json',
|
||||
};
|
||||
|
||||
if (clientAuth) {
|
||||
headers['Authorization'] = `Basic ${clientAuth}`;
|
||||
} else {
|
||||
// Use client_id in body for public clients
|
||||
body.append('client_id', config.client_id);
|
||||
}
|
||||
|
||||
const response = await fetch(tokenUrl, {
|
||||
method: 'POST',
|
||||
headers,
|
||||
body,
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorText = await response.text();
|
||||
throw new Error(
|
||||
`Token refresh failed: ${response.status} ${response.statusText} - ${errorText}`,
|
||||
);
|
||||
}
|
||||
|
||||
const tokens = await response.json();
|
||||
|
||||
return {
|
||||
...tokens,
|
||||
obtained_at: Date.now(),
|
||||
expires_at: tokens.expires_in ? Date.now() + tokens.expires_in * 1000 : undefined,
|
||||
};
|
||||
}
|
||||
|
||||
/** For auto-discovered OAuth, we need the server URL */
|
||||
if (!metadata.serverUrl) {
|
||||
throw new Error('Server URL required for auto-discovered OAuth token refresh');
|
||||
}
|
||||
|
||||
/** Auto-discover OAuth configuration for refresh */
|
||||
const { metadata: oauthMetadata } = await this.discoverMetadata(metadata.serverUrl);
|
||||
|
||||
if (!oauthMetadata.token_endpoint) {
|
||||
throw new Error('No token endpoint found in OAuth metadata');
|
||||
}
|
||||
|
||||
const tokenUrl = new URL(oauthMetadata.token_endpoint);
|
||||
|
||||
const body = new URLSearchParams({
|
||||
grant_type: 'refresh_token',
|
||||
refresh_token: refreshToken,
|
||||
});
|
||||
|
||||
const headers: HeadersInit = {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
Accept: 'application/json',
|
||||
};
|
||||
|
||||
const response = await fetch(tokenUrl, {
|
||||
method: 'POST',
|
||||
headers,
|
||||
body,
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorText = await response.text();
|
||||
throw new Error(
|
||||
`Token refresh failed: ${response.status} ${response.statusText} - ${errorText}`,
|
||||
);
|
||||
}
|
||||
|
||||
const tokens = await response.json();
|
||||
|
||||
return {
|
||||
...tokens,
|
||||
obtained_at: Date.now(),
|
||||
expires_at: tokens.expires_in ? Date.now() + tokens.expires_in * 1000 : undefined,
|
||||
};
|
||||
} catch (error) {
|
||||
logger.error(`[MCPOAuth] Failed to refresh tokens for ${metadata.serverName}`, error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
}
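Putting the two halves of the handler together, a hedged end-to-end sketch: the import paths, server name, and URL are illustrative, and persisting flowMetadata under flowId between the two steps (plus the callback route itself) happens outside this file.

import type { FlowStateManager } from '~/flow/manager';
import { MCPOAuthHandler, type MCPOAuthTokens } from '~/mcp/oauth';

/** Step 1: kick off the flow; the caller persists flowMetadata and redirects the user. */
async function startServerOAuth(userId: string) {
  return MCPOAuthHandler.initiateOAuthFlow(
    'my-server',
    'https://mcp.example.com',
    userId,
    undefined, // no pre-configured oauth block: metadata is auto-discovered
  );
}

/** Step 2: the callback route receives ?code=...&state=<flowId> and finishes the exchange. */
async function finishServerOAuth(
  flowId: string,
  code: string,
  flowManager: FlowStateManager<MCPOAuthTokens>,
) {
  return MCPOAuthHandler.completeOAuthFlow(flowId, code, flowManager);
}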
|
packages/api/src/mcp/oauth/index.ts (new file, 3 lines)
|
@ -0,0 +1,3 @@
|
|||
export * from './types';
|
||||
export * from './handler';
|
||||
export * from './tokens';
|
packages/api/src/mcp/oauth/tokens.ts (new file, 382 lines)
|
@ -0,0 +1,382 @@
|
|||
import { logger } from '@librechat/data-schemas';
|
||||
import type { OAuthTokens, OAuthClientInformation } from '@modelcontextprotocol/sdk/shared/auth.js';
|
||||
import type { TokenMethods, IToken } from '@librechat/data-schemas';
|
||||
import type { MCPOAuthTokens, ExtendedOAuthTokens } from './types';
|
||||
import { encryptV2, decryptV2 } from '~/crypto';
|
||||
import { isSystemUserId } from '~/mcp/enum';
|
||||
|
||||
interface StoreTokensParams {
|
||||
userId: string;
|
||||
serverName: string;
|
||||
tokens: OAuthTokens | ExtendedOAuthTokens | MCPOAuthTokens;
|
||||
createToken: TokenMethods['createToken'];
|
||||
updateToken?: TokenMethods['updateToken'];
|
||||
findToken?: TokenMethods['findToken'];
|
||||
clientInfo?: OAuthClientInformation;
|
||||
/** Optional: Pass existing token state to avoid duplicate DB calls */
|
||||
existingTokens?: {
|
||||
accessToken?: IToken | null;
|
||||
refreshToken?: IToken | null;
|
||||
clientInfoToken?: IToken | null;
|
||||
};
|
||||
}
|
||||
|
||||
interface GetTokensParams {
|
||||
userId: string;
|
||||
serverName: string;
|
||||
findToken: TokenMethods['findToken'];
|
||||
refreshTokens?: (
|
||||
refreshToken: string,
|
||||
metadata: { userId: string; serverName: string; identifier: string },
|
||||
) => Promise<MCPOAuthTokens>;
|
||||
createToken?: TokenMethods['createToken'];
|
||||
updateToken?: TokenMethods['updateToken'];
|
||||
}
|
||||
|
||||
export class MCPTokenStorage {
|
||||
static getLogPrefix(userId: string, serverName: string): string {
|
||||
return isSystemUserId(userId)
|
||||
? `[MCP][${serverName}]`
|
||||
: `[MCP][User: ${userId}][${serverName}]`;
|
||||
}
|
||||
|
||||
/**
|
||||
* Stores OAuth tokens for an MCP server
|
||||
*
|
||||
* @param params.existingTokens - Optional: Pass existing token state to avoid duplicate DB calls.
|
||||
* This is useful when refreshing tokens, as getTokens() already has the token state.
|
||||
*/
|
||||
static async storeTokens({
|
||||
userId,
|
||||
serverName,
|
||||
tokens,
|
||||
createToken,
|
||||
updateToken,
|
||||
findToken,
|
||||
clientInfo,
|
||||
existingTokens,
|
||||
}: StoreTokensParams): Promise<void> {
|
||||
const logPrefix = this.getLogPrefix(userId, serverName);
|
||||
|
||||
try {
|
||||
const identifier = `mcp:${serverName}`;
|
||||
|
||||
// Encrypt and store access token
|
||||
const encryptedAccessToken = await encryptV2(tokens.access_token);
|
||||
|
||||
logger.debug(
|
||||
`${logPrefix} Token expires_in: ${'expires_in' in tokens ? tokens.expires_in : 'N/A'}, expires_at: ${'expires_at' in tokens ? tokens.expires_at : 'N/A'}`,
|
||||
);
|
||||
|
||||
// Handle both expires_in and expires_at formats
|
||||
let accessTokenExpiry: Date;
|
||||
if ('expires_at' in tokens && tokens.expires_at) {
|
||||
/** MCPOAuthTokens format - already has calculated expiry */
|
||||
logger.debug(`${logPrefix} Using expires_at: ${tokens.expires_at}`);
|
||||
accessTokenExpiry = new Date(tokens.expires_at);
|
||||
} else if (tokens.expires_in) {
|
||||
/** Standard OAuthTokens format - calculate expiry */
|
||||
logger.debug(`${logPrefix} Using expires_in: ${tokens.expires_in}`);
|
||||
accessTokenExpiry = new Date(Date.now() + tokens.expires_in * 1000);
|
||||
} else {
|
||||
/** No expiry provided - default to 1 year */
|
||||
logger.debug(`${logPrefix} No expiry provided, using default`);
|
||||
accessTokenExpiry = new Date(Date.now() + 365 * 24 * 60 * 60 * 1000);
|
||||
}
|
||||
|
||||
logger.debug(`${logPrefix} Calculated expiry date: ${accessTokenExpiry.toISOString()}`);
|
||||
logger.debug(
|
||||
`${logPrefix} Date object: ${JSON.stringify({
|
||||
time: accessTokenExpiry.getTime(),
|
||||
valid: !isNaN(accessTokenExpiry.getTime()),
|
||||
iso: accessTokenExpiry.toISOString(),
|
||||
})}`,
|
||||
);
|
||||
|
||||
// Ensure the date is valid before passing to createToken
|
||||
if (isNaN(accessTokenExpiry.getTime())) {
|
||||
logger.error(`${logPrefix} Invalid expiry date calculated, using default`);
|
||||
accessTokenExpiry = new Date(Date.now() + 365 * 24 * 60 * 60 * 1000);
|
||||
}
|
||||
|
||||
// Calculate expiresIn (seconds from now)
|
||||
const expiresIn = Math.floor((accessTokenExpiry.getTime() - Date.now()) / 1000);
|
||||
|
||||
const accessTokenData = {
|
||||
userId,
|
||||
type: 'mcp_oauth',
|
||||
identifier,
|
||||
token: encryptedAccessToken,
|
||||
expiresIn: expiresIn > 0 ? expiresIn : 365 * 24 * 60 * 60, // Default to 1 year if negative
|
||||
};
|
||||
|
||||
// Check if token already exists and update if it does
|
||||
if (findToken && updateToken) {
|
||||
// Use provided existing token state if available, otherwise look it up
|
||||
const existingToken =
|
||||
existingTokens?.accessToken !== undefined
|
||||
? existingTokens.accessToken
|
||||
: await findToken({ userId, identifier });
|
||||
|
||||
if (existingToken) {
|
||||
await updateToken({ userId, identifier }, accessTokenData);
|
||||
logger.debug(`${logPrefix} Updated existing access token`);
|
||||
} else {
|
||||
await createToken(accessTokenData);
|
||||
logger.debug(`${logPrefix} Created new access token`);
|
||||
}
|
||||
} else {
|
||||
// Create new token if it's initial store or update methods not provided
|
||||
await createToken(accessTokenData);
|
||||
logger.debug(`${logPrefix} Created access token (no update methods available)`);
|
||||
}
|
||||
|
||||
// Store refresh token if available
|
||||
if (tokens.refresh_token) {
|
||||
const encryptedRefreshToken = await encryptV2(tokens.refresh_token);
|
||||
const extendedTokens = tokens as ExtendedOAuthTokens;
|
||||
const refreshTokenExpiry = extendedTokens.refresh_token_expires_in
|
||||
? new Date(Date.now() + extendedTokens.refresh_token_expires_in * 1000)
|
||||
: new Date(Date.now() + 365 * 24 * 60 * 60 * 1000); // Default to 1 year
|
||||
|
||||
/** Calculated expiresIn for refresh token */
|
||||
const refreshExpiresIn = Math.floor((refreshTokenExpiry.getTime() - Date.now()) / 1000);
|
||||
|
||||
const refreshTokenData = {
|
||||
userId,
|
||||
type: 'mcp_oauth_refresh',
|
||||
identifier: `${identifier}:refresh`,
|
||||
token: encryptedRefreshToken,
|
||||
expiresIn: refreshExpiresIn > 0 ? refreshExpiresIn : 365 * 24 * 60 * 60,
|
||||
};
|
||||
|
||||
// Check if refresh token already exists and update if it does
|
||||
if (findToken && updateToken) {
|
||||
// Use provided existing token state if available, otherwise look it up
|
||||
const existingRefreshToken =
|
||||
existingTokens?.refreshToken !== undefined
|
||||
? existingTokens.refreshToken
|
||||
: await findToken({
|
||||
userId,
|
||||
identifier: `${identifier}:refresh`,
|
||||
});
|
||||
|
||||
if (existingRefreshToken) {
|
||||
await updateToken({ userId, identifier: `${identifier}:refresh` }, refreshTokenData);
|
||||
logger.debug(`${logPrefix} Updated existing refresh token`);
|
||||
} else {
|
||||
await createToken(refreshTokenData);
|
||||
logger.debug(`${logPrefix} Created new refresh token`);
|
||||
}
|
||||
} else {
|
||||
await createToken(refreshTokenData);
|
||||
logger.debug(`${logPrefix} Created refresh token (no update methods available)`);
|
||||
}
|
||||
}
|
||||
|
||||
/** Store client information if provided */
|
||||
if (clientInfo) {
|
||||
logger.debug(`${logPrefix} Storing client info:`, {
|
||||
client_id: clientInfo.client_id,
|
||||
has_client_secret: !!clientInfo.client_secret,
|
||||
});
|
||||
        const encryptedClientInfo = await encryptV2(JSON.stringify(clientInfo));
        const clientInfoData = {
          userId,
          type: 'mcp_oauth_client',
          identifier: `${identifier}:client`,
          token: encryptedClientInfo,
          expiresIn: 365 * 24 * 60 * 60,
        };

        // Check if client info already exists and update if it does
        if (findToken && updateToken) {
          // Use provided existing token state if available, otherwise look it up
          const existingClientInfo =
            existingTokens?.clientInfoToken !== undefined
              ? existingTokens.clientInfoToken
              : await findToken({
                  userId,
                  identifier: `${identifier}:client`,
                });

          if (existingClientInfo) {
            await updateToken({ userId, identifier: `${identifier}:client` }, clientInfoData);
            logger.debug(`${logPrefix} Updated existing client info`);
          } else {
            await createToken(clientInfoData);
            logger.debug(`${logPrefix} Created new client info`);
          }
        } else {
          await createToken(clientInfoData);
          logger.debug(`${logPrefix} Created client info (no update methods available)`);
        }
      }

      logger.debug(`${logPrefix} Stored OAuth tokens`);
    } catch (error) {
      const logPrefix = this.getLogPrefix(userId, serverName);
      logger.error(`${logPrefix} Failed to store tokens`, error);
      throw error;
    }
  }

  /**
   * Retrieves OAuth tokens for an MCP server
   */
  static async getTokens({
    userId,
    serverName,
    findToken,
    createToken,
    updateToken,
    refreshTokens,
  }: GetTokensParams): Promise<MCPOAuthTokens | null> {
    const logPrefix = this.getLogPrefix(userId, serverName);

    try {
      const identifier = `mcp:${serverName}`;

      // Get access token
      const accessTokenData = await findToken({
        userId,
        type: 'mcp_oauth',
        identifier,
      });

      /** Check if access token is missing or expired */
      const isMissing = !accessTokenData;
      const isExpired = accessTokenData?.expiresAt && new Date() >= accessTokenData.expiresAt;

      if (isMissing || isExpired) {
        logger.info(`${logPrefix} Access token ${isMissing ? 'missing' : 'expired'}`);

        /** Refresh data if we have a refresh token and refresh function */
        const refreshTokenData = await findToken({
          userId,
          type: 'mcp_oauth_refresh',
          identifier: `${identifier}:refresh`,
        });

        if (!refreshTokenData) {
          logger.info(
            `${logPrefix} Access token ${isMissing ? 'missing' : 'expired'} and no refresh token available`,
          );
          return null;
        }

        if (!refreshTokens) {
          logger.warn(
            `${logPrefix} Access token ${isMissing ? 'missing' : 'expired'}, refresh token available but no \`refreshTokens\` provided`,
          );
          return null;
        }

        if (!createToken) {
          logger.warn(
            `${logPrefix} Access token ${isMissing ? 'missing' : 'expired'}, refresh token available but no \`createToken\` function provided`,
          );
          return null;
        }

        try {
          logger.info(`${logPrefix} Attempting to refresh token`);
          const decryptedRefreshToken = await decryptV2(refreshTokenData.token);

          /** Client information if available */
          let clientInfo;
          let clientInfoData;
          try {
            clientInfoData = await findToken({
              userId,
              type: 'mcp_oauth_client',
              identifier: `${identifier}:client`,
            });
            if (clientInfoData) {
              const decryptedClientInfo = await decryptV2(clientInfoData.token);
              clientInfo = JSON.parse(decryptedClientInfo);
              logger.debug(`${logPrefix} Retrieved client info:`, {
                client_id: clientInfo.client_id,
                has_client_secret: !!clientInfo.client_secret,
              });
            }
          } catch {
            logger.debug(`${logPrefix} No client info found`);
          }

          const metadata = {
            userId,
            serverName,
            identifier,
            clientInfo,
          };

          const newTokens = await refreshTokens(decryptedRefreshToken, metadata);

          // Store the refreshed tokens (handles both create and update)
          // Pass existing token state to avoid duplicate DB calls
          await this.storeTokens({
            userId,
            serverName,
            tokens: newTokens,
            createToken,
            updateToken,
            findToken,
            clientInfo,
            existingTokens: {
              accessToken: accessTokenData, // We know this is expired/missing
              refreshToken: refreshTokenData, // We already have this
              clientInfoToken: clientInfoData, // We already looked this up
            },
          });

          logger.info(`${logPrefix} Successfully refreshed and stored OAuth tokens`);
          return newTokens;
        } catch (refreshError) {
          logger.error(`${logPrefix} Failed to refresh tokens`, refreshError);
          // Check if it's an unauthorized_client error (refresh not supported)
          const errorMessage =
            refreshError instanceof Error ? refreshError.message : String(refreshError);
          if (errorMessage.includes('unauthorized_client')) {
            logger.info(
              `${logPrefix} Server does not support refresh tokens for this client. New authentication required.`,
            );
          }
          return null;
        }
      }

      // If we reach here, access token should exist and be valid
      if (!accessTokenData) {
        return null;
      }

      const decryptedAccessToken = await decryptV2(accessTokenData.token);

      /** Get refresh token if available */
      const refreshTokenData = await findToken({
        userId,
        type: 'mcp_oauth_refresh',
        identifier: `${identifier}:refresh`,
      });

      const tokens: MCPOAuthTokens = {
        access_token: decryptedAccessToken,
        token_type: 'Bearer',
        obtained_at: accessTokenData.createdAt.getTime(),
        expires_at: accessTokenData.expiresAt?.getTime(),
      };

      if (refreshTokenData) {
        tokens.refresh_token = await decryptV2(refreshTokenData.token);
      }

      logger.debug(`${logPrefix} Loaded existing OAuth tokens from storage`);
      return tokens;
    } catch (error) {
      logger.error(`${logPrefix} Failed to retrieve tokens`, error);
      return null;
    }
  }
}
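To make the retrieval path above concrete, here is a minimal, hypothetical sketch of how a caller might hand TokenMethods to MCPTokenStorage.getTokens. The import paths, the assumption that createMethods(mongoose) exposes findToken/createToken/updateToken, and the stub refreshTokens callback are illustrative only and not part of this diff.

import mongoose from 'mongoose';
import { createMethods } from '@librechat/data-schemas';
import { MCPTokenStorage } from '~/mcp/oauth'; // path assumed for illustration
import type { MCPOAuthTokens } from '~/mcp/oauth/types';

async function loadMCPTokens(userId: string, serverName: string): Promise<MCPOAuthTokens | null> {
  // Assumed: createMethods(mongoose) exposes the TokenMethods used throughout this PR.
  const { findToken, createToken, updateToken } = createMethods(mongoose);

  // getTokens returns decrypted tokens, refreshing them first when the access token
  // is missing/expired and both a refresh token and a refreshTokens callback exist.
  return MCPTokenStorage.getTokens({
    userId,
    serverName,
    findToken,
    createToken,
    updateToken,
    // Stub refresh callback: in the real flow this delegates to the OAuth handler
    // that calls the server's token endpoint.
    refreshTokens: async (refreshToken, metadata) => {
      throw new Error(`refresh not implemented for ${metadata.serverName}`);
    },
  });
}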
packages/api/src/mcp/oauth/types.ts (new file, 98 lines)
@@ -0,0 +1,98 @@
import type { OAuthTokens } from '@modelcontextprotocol/sdk/shared/auth.js';
import type { FlowMetadata } from '~/flow/types';

export interface OAuthMetadata {
  /** OAuth authorization endpoint */
  authorization_endpoint: string;
  /** OAuth token endpoint */
  token_endpoint: string;
  /** OAuth issuer */
  issuer?: string;
  /** Supported scopes */
  scopes_supported?: string[];
  /** Response types supported */
  response_types_supported?: string[];
  /** Grant types supported */
  grant_types_supported?: string[];
  /** Token endpoint auth methods supported */
  token_endpoint_auth_methods_supported?: string[];
  /** Code challenge methods supported */
  code_challenge_methods_supported?: string[];
}

export interface OAuthProtectedResourceMetadata {
  /** Resource identifier */
  resource: string;
  /** Authorization servers */
  authorization_servers?: string[];
  /** Scopes supported by the resource */
  scopes_supported?: string[];
}

export interface OAuthClientInformation {
  /** Client ID */
  client_id: string;
  /** Client secret (optional for public clients) */
  client_secret?: string;
  /** Client name */
  client_name?: string;
  /** Redirect URIs */
  redirect_uris?: string[];
  /** Grant types */
  grant_types?: string[];
  /** Response types */
  response_types?: string[];
  /** Scope */
  scope?: string;
  /** Token endpoint auth method */
  token_endpoint_auth_method?: string;
}

export interface MCPOAuthState {
  /** Current step in the OAuth flow */
  step: 'discovery' | 'registration' | 'authorization' | 'token_exchange' | 'complete' | 'error';
  /** Server name */
  serverName: string;
  /** User ID */
  userId: string;
  /** OAuth metadata from discovery */
  metadata?: OAuthMetadata;
  /** Resource metadata */
  resourceMetadata?: OAuthProtectedResourceMetadata;
  /** Client information */
  clientInfo?: OAuthClientInformation;
  /** Authorization URL */
  authorizationUrl?: string;
  /** Code verifier for PKCE */
  codeVerifier?: string;
  /** State parameter for OAuth flow */
  state?: string;
  /** Error information */
  error?: string;
  /** Timestamp */
  timestamp: number;
}

export interface MCPOAuthFlowMetadata extends FlowMetadata {
  serverName: string;
  userId: string;
  serverUrl: string;
  state: string;
  codeVerifier?: string;
  clientInfo?: OAuthClientInformation;
  metadata?: OAuthMetadata;
  resourceMetadata?: OAuthProtectedResourceMetadata;
}

export interface MCPOAuthTokens extends OAuthTokens {
  /** When the tokens were obtained */
  obtained_at: number;
  /** Calculated expiry time */
  expires_at?: number;
}

/** Extended OAuth tokens that may include refresh token expiry */
export interface ExtendedOAuthTokens extends OAuthTokens {
  /** Refresh token expiry in seconds (non-standard, some providers include this) */
  refresh_token_expires_in?: number;
}
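As a small illustration of the bookkeeping these token types imply, the sketch below widens a raw OAuthTokens response into MCPOAuthTokens by stamping obtained_at and deriving expires_at from expires_in; the helper name and placement are hypothetical and not part of this diff.

import type { OAuthTokens } from '@modelcontextprotocol/sdk/shared/auth.js';
import type { MCPOAuthTokens } from './types';

/** Hypothetical helper: record when tokens were obtained and pre-compute expiry (ms since epoch). */
function toMCPOAuthTokens(tokens: OAuthTokens): MCPOAuthTokens {
  const obtained_at = Date.now();
  return {
    ...tokens,
    obtained_at,
    // expires_in is in seconds; expires_at is an absolute timestamp.
    expires_at: tokens.expires_in != null ? obtained_at + tokens.expires_in * 1000 : undefined,
  };
}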
packages/api/src/oauth/index.ts (new file, 1 line)
@@ -0,0 +1 @@
export * from './tokens';
packages/api/src/oauth/tokens.ts (new file, 324 lines)
@@ -0,0 +1,324 @@
import axios from 'axios';
import { logger } from '@librechat/data-schemas';
import { TokenExchangeMethodEnum } from 'librechat-data-provider';
import type { TokenMethods } from '@librechat/data-schemas';
import type { AxiosError } from 'axios';
import { encryptV2, decryptV2 } from '~/crypto';
import { logAxiosError } from '~/utils';

export function createHandleOAuthToken({
  findToken,
  updateToken,
  createToken,
}: {
  findToken: TokenMethods['findToken'];
  updateToken: TokenMethods['updateToken'];
  createToken: TokenMethods['createToken'];
}) {
  /**
   * Handles the OAuth token by creating or updating the token.
   * @param fields
   * @param fields.userId - The user's ID.
   * @param fields.token - The full token to store.
   * @param fields.identifier - Unique, alternative identifier for the token.
   * @param fields.expiresIn - The number of seconds until the token expires.
   * @param fields.metadata - Additional metadata to store with the token.
   * @param [fields.type="oauth"] - The type of token. Default is 'oauth'.
   */
  return async function handleOAuthToken({
    token,
    userId,
    identifier,
    expiresIn,
    metadata,
    type = 'oauth',
  }: {
    token: string;
    userId: string;
    identifier: string;
    expiresIn?: number | string | null;
    metadata?: Record<string, unknown>;
    type?: string;
  }) {
    const encryptedToken = await encryptV2(token);
    let expiresInNumber = 3600;
    if (typeof expiresIn === 'number') {
      expiresInNumber = expiresIn;
    } else if (expiresIn != null) {
      expiresInNumber = parseInt(expiresIn, 10) || 3600;
    }
    const tokenData = {
      type,
      userId,
      metadata,
      identifier,
      token: encryptedToken,
      expiresIn: expiresInNumber,
    };

    const existingToken = await findToken({ userId, identifier });
    if (existingToken) {
      return await updateToken({ identifier }, tokenData);
    } else {
      return await createToken(tokenData);
    }
  };
}
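A minimal usage sketch for createHandleOAuthToken follows. The mongoose wiring, the assumption that createMethods from @librechat/data-schemas returns the TokenMethods, the import path, and all concrete values are illustrative only.

import mongoose from 'mongoose';
import { createMethods } from '@librechat/data-schemas';
import { createHandleOAuthToken } from '~/oauth/tokens'; // path assumed; external consumers would import from '@librechat/api'

async function storeProviderToken() {
  // Assumed: createMethods(mongoose) exposes the TokenMethods used throughout this PR.
  const { findToken, updateToken, createToken } = createMethods(mongoose);
  const handleOAuthToken = createHandleOAuthToken({ findToken, updateToken, createToken });

  // Upserts an encrypted token row keyed by (userId, identifier); values are hypothetical.
  await handleOAuthToken({
    userId: 'user-123',
    identifier: 'example-provider',
    token: 'example-access-token',
    expiresIn: 3600,
  });
}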
/**
 * Processes the access tokens and stores them in the database.
 * @param tokenData
 * @param tokenData.access_token
 * @param tokenData.expires_in
 * @param [tokenData.refresh_token]
 * @param [tokenData.refresh_token_expires_in]
 * @param metadata
 * @param metadata.userId
 * @param metadata.identifier
 */
async function processAccessTokens(
  tokenData: {
    access_token: string;
    expires_in: number;
    refresh_token?: string;
    refresh_token_expires_in?: number;
  },
  { userId, identifier }: { userId: string; identifier: string },
  {
    findToken,
    updateToken,
    createToken,
  }: {
    findToken: TokenMethods['findToken'];
    updateToken: TokenMethods['updateToken'];
    createToken: TokenMethods['createToken'];
  },
) {
  const { access_token, expires_in = 3600, refresh_token, refresh_token_expires_in } = tokenData;
  if (!access_token) {
    logger.error('Access token not found: ', tokenData);
    throw new Error('Access token not found');
  }
  const handleOAuthToken = createHandleOAuthToken({
    findToken,
    updateToken,
    createToken,
  });
  await handleOAuthToken({
    identifier,
    token: access_token,
    expiresIn: expires_in,
    userId,
  });

  if (refresh_token != null) {
    logger.debug('Processing refresh token');
    await handleOAuthToken({
      token: refresh_token,
      type: 'oauth_refresh',
      userId,
      identifier: `${identifier}:refresh`,
      expiresIn: refresh_token_expires_in ?? null,
    });
  }
  logger.debug('Access tokens processed');
}

/**
 * Refreshes the access token using the refresh token.
 * @param fields
 * @param fields.userId - The ID of the user.
 * @param fields.client_url - The URL of the OAuth provider.
 * @param fields.identifier - The identifier for the token.
 * @param fields.refresh_token - The refresh token to use.
 * @param fields.token_exchange_method - The token exchange method ('default_post' or 'basic_auth_header').
 * @param fields.encrypted_oauth_client_id - The client ID for the OAuth provider.
 * @param fields.encrypted_oauth_client_secret - The client secret for the OAuth provider.
 */
export async function refreshAccessToken(
  {
    userId,
    client_url,
    identifier,
    refresh_token,
    token_exchange_method,
    encrypted_oauth_client_id,
    encrypted_oauth_client_secret,
  }: {
    userId: string;
    client_url: string;
    identifier: string;
    refresh_token: string;
    token_exchange_method: TokenExchangeMethodEnum;
    encrypted_oauth_client_id: string;
    encrypted_oauth_client_secret: string;
  },
  {
    findToken,
    updateToken,
    createToken,
  }: {
    findToken: TokenMethods['findToken'];
    updateToken: TokenMethods['updateToken'];
    createToken: TokenMethods['createToken'];
  },
): Promise<{
  access_token: string;
  expires_in: number;
  refresh_token?: string;
  refresh_token_expires_in?: number;
}> {
  try {
    const oauth_client_id = await decryptV2(encrypted_oauth_client_id);
    const oauth_client_secret = await decryptV2(encrypted_oauth_client_secret);

    const headers: Record<string, string> = {
      'Content-Type': 'application/x-www-form-urlencoded',
      Accept: 'application/json',
    };

    const params = new URLSearchParams({
      grant_type: 'refresh_token',
      refresh_token,
    });

    if (token_exchange_method === TokenExchangeMethodEnum.BasicAuthHeader) {
      const basicAuth = Buffer.from(`${oauth_client_id}:${oauth_client_secret}`).toString('base64');
      headers['Authorization'] = `Basic ${basicAuth}`;
    } else {
      params.append('client_id', oauth_client_id);
      params.append('client_secret', oauth_client_secret);
    }

    const response = await axios({
      method: 'POST',
      url: client_url,
      headers,
      data: params.toString(),
    });
    await processAccessTokens(
      response.data,
      {
        userId,
        identifier,
      },
      {
        findToken,
        updateToken,
        createToken,
      },
    );
    logger.debug(`Access token refreshed successfully for ${identifier}`);
    return response.data;
  } catch (error) {
    const message = 'Error refreshing OAuth tokens';
    throw new Error(
      logAxiosError({
        message,
        error: error as AxiosError,
      }),
    );
  }
}

/**
 * Handles the OAuth callback and exchanges the authorization code for tokens.
 * @param {object} fields
 * @param {string} fields.code - The authorization code returned by the provider.
 * @param {string} fields.userId - The ID of the user.
 * @param {string} fields.identifier - The identifier for the token.
 * @param {string} fields.client_url - The URL of the OAuth provider.
 * @param {string} fields.redirect_uri - The redirect URI for the OAuth provider.
 * @param {string} fields.token_exchange_method - The token exchange method ('default_post' or 'basic_auth_header').
 * @param {string} fields.encrypted_oauth_client_id - The client ID for the OAuth provider.
 * @param {string} fields.encrypted_oauth_client_secret - The client secret for the OAuth provider.
 */
export async function getAccessToken(
  {
    code,
    userId,
    identifier,
    client_url,
    redirect_uri,
    token_exchange_method,
    encrypted_oauth_client_id,
    encrypted_oauth_client_secret,
  }: {
    code: string;
    userId: string;
    identifier: string;
    client_url: string;
    redirect_uri: string;
    token_exchange_method: TokenExchangeMethodEnum;
    encrypted_oauth_client_id: string;
    encrypted_oauth_client_secret: string;
  },
  {
    findToken,
    updateToken,
    createToken,
  }: {
    findToken: TokenMethods['findToken'];
    updateToken: TokenMethods['updateToken'];
    createToken: TokenMethods['createToken'];
  },
): Promise<{
  access_token: string;
  expires_in: number;
  refresh_token?: string;
  refresh_token_expires_in?: number;
}> {
  const oauth_client_id = await decryptV2(encrypted_oauth_client_id);
  const oauth_client_secret = await decryptV2(encrypted_oauth_client_secret);

  const headers: Record<string, string> = {
    'Content-Type': 'application/x-www-form-urlencoded',
    Accept: 'application/json',
  };

  const params = new URLSearchParams({
    code,
    grant_type: 'authorization_code',
    redirect_uri,
  });

  if (token_exchange_method === TokenExchangeMethodEnum.BasicAuthHeader) {
    const basicAuth = Buffer.from(`${oauth_client_id}:${oauth_client_secret}`).toString('base64');
    headers['Authorization'] = `Basic ${basicAuth}`;
  } else {
    params.append('client_id', oauth_client_id);
    params.append('client_secret', oauth_client_secret);
  }

  try {
    const response = await axios({
      method: 'POST',
      url: client_url,
      headers,
      data: params.toString(),
    });

    await processAccessTokens(
      response.data,
      {
        userId,
        identifier,
      },
      {
        findToken,
        updateToken,
        createToken,
      },
    );
    logger.debug(`Access tokens successfully created for ${identifier}`);
    return response.data;
  } catch (error) {
    const message = 'Error exchanging OAuth code';
    throw new Error(
      logAxiosError({
        message,
        error: error as AxiosError,
      }),
    );
  }
}
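Taken together, a callback handler could exchange the authorization code with getAccessToken and later renew it with refreshAccessToken. The sketch below is an assumption-heavy illustration: the endpoints, identifiers, encrypted client credentials, and import paths are placeholders, and real callers would reuse the TokenMethods wiring shown earlier.

import { TokenExchangeMethodEnum } from 'librechat-data-provider';
import { encryptV2 } from '~/crypto';
import { getAccessToken, refreshAccessToken } from '~/oauth'; // path assumed
import type { TokenMethods } from '@librechat/data-schemas';

async function exampleOAuthFlow(
  code: string,
  { findToken, updateToken, createToken }: Pick<TokenMethods, 'findToken' | 'updateToken' | 'createToken'>,
) {
  const tokenMethods = { findToken, updateToken, createToken };
  // Client credentials are expected in encrypted form; encryptV2 here only mimics that precondition.
  const encrypted_oauth_client_id = await encryptV2('example-client-id');
  const encrypted_oauth_client_secret = await encryptV2('example-client-secret');

  // 1) OAuth callback: exchange the code and persist encrypted access/refresh tokens.
  const tokens = await getAccessToken(
    {
      code,
      userId: 'user-123',
      identifier: 'example-provider',
      client_url: 'https://auth.example.com/oauth/token',
      redirect_uri: 'https://librechat.example.com/api/oauth/callback',
      token_exchange_method: TokenExchangeMethodEnum.BasicAuthHeader,
      encrypted_oauth_client_id,
      encrypted_oauth_client_secret,
    },
    tokenMethods,
  );

  // 2) Later, renew the access token with the stored refresh token.
  if (tokens.refresh_token) {
    await refreshAccessToken(
      {
        userId: 'user-123',
        identifier: 'example-provider',
        client_url: 'https://auth.example.com/oauth/token',
        refresh_token: tokens.refresh_token,
        token_exchange_method: TokenExchangeMethodEnum.BasicAuthHeader,
        encrypted_oauth_client_id,
        encrypted_oauth_client_secret,
      },
      tokenMethods,
    );
  }
}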
@@ -1136,6 +1136,10 @@ export enum CacheKeys {
    * Key for in-progress flow states.
    */
   FLOWS = 'flows',
+  /**
+   * Key for individual MCP Tool Manifests.
+   */
+  MCP_TOOLS = 'mcp_tools',
   /**
    * Key for pending chat requests (concurrency check)
    */
@@ -1,6 +1,7 @@
 import { z } from 'zod';
 import type { TUser } from './types';
 import { extractEnvVariable } from './utils';
+import { TokenExchangeMethodEnum } from './types/assistants';
 
 const BaseOptionsSchema = z.object({
   iconPath: z.string().optional(),

@@ -15,6 +16,29 @@ const BaseOptionsSchema = z.object({
    * - string: Use custom instructions (overrides server-provided)
    */
   serverInstructions: z.union([z.boolean(), z.string()]).optional(),
+  /**
+   * OAuth configuration for SSE and Streamable HTTP transports
+   * - Optional: OAuth can be auto-discovered on 401 responses
+   * - Pre-configured values will skip discovery steps
+   */
+  oauth: z
+    .object({
+      /** OAuth authorization endpoint (optional - can be auto-discovered) */
+      authorization_url: z.string().url().optional(),
+      /** OAuth token endpoint (optional - can be auto-discovered) */
+      token_url: z.string().url().optional(),
+      /** OAuth client ID (optional - can use dynamic registration) */
+      client_id: z.string().optional(),
+      /** OAuth client secret (optional - can use dynamic registration) */
+      client_secret: z.string().optional(),
+      /** OAuth scopes to request */
+      scope: z.string().optional(),
+      /** OAuth redirect URI (defaults to /api/mcp/{serverName}/oauth/callback) */
+      redirect_uri: z.string().url().optional(),
+      /** Token exchange method */
+      token_exchange_method: z.nativeEnum(TokenExchangeMethodEnum).optional(),
+    })
+    .optional(),
 });
 
 export const StdioOptionsSchema = BaseOptionsSchema.extend({
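For illustration, the sketch below re-declares the oauth sub-schema on its own (in the codebase it is nested inside BaseOptionsSchema, which this hunk does not show as exported) and checks that both a fully pre-configured block and an omitted block validate, matching the comments that discovery and dynamic registration can fill in anything left out. All concrete values are placeholders.

import { z } from 'zod';
import { TokenExchangeMethodEnum } from 'librechat-data-provider';

// Standalone copy of the `oauth` sub-schema above, re-declared purely for illustration.
const oauthSchema = z
  .object({
    authorization_url: z.string().url().optional(),
    token_url: z.string().url().optional(),
    client_id: z.string().optional(),
    client_secret: z.string().optional(),
    scope: z.string().optional(),
    redirect_uri: z.string().url().optional(),
    token_exchange_method: z.nativeEnum(TokenExchangeMethodEnum).optional(),
  })
  .optional();

// A fully pre-configured block skips discovery and dynamic registration entirely.
const preConfigured = oauthSchema.parse({
  authorization_url: 'https://auth.example.com/authorize',
  token_url: 'https://auth.example.com/token',
  client_id: 'example-client-id',
  client_secret: 'example-client-secret',
  scope: 'openid profile',
  token_exchange_method: TokenExchangeMethodEnum.BasicAuthHeader,
});

// Omitting the block is equally valid: every field is optional, and endpoints can be
// auto-discovered after a 401 response from the MCP server.
const autoDiscovered = oauthSchema.parse(undefined);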
@ -20,7 +20,7 @@ export function createMethods(mongoose: typeof import('mongoose')) {
|
|||
};
|
||||
}
|
||||
|
||||
export type { MemoryMethods, ShareMethods };
|
||||
export type { MemoryMethods, ShareMethods, TokenMethods };
|
||||
export type AllMethods = UserMethods &
|
||||
SessionMethods &
|
||||
TokenMethods &
|
||||
|
|