📦 refactor: Consolidate DB models, encapsulating Mongoose usage in data-schemas (#11830)

* chore: move database model methods to /packages/data-schemas

* chore: add TypeScript ESLint rule to warn on unused variables

* refactor: model imports to streamline access

- Consolidated model imports across various files to improve code organization and reduce redundancy.
- Updated imports for models such as Assistant, Message, Conversation, and others to a unified import path (see the sketch after this list).
- Adjusted middleware and service files to reflect the new import structure, ensuring functionality remains intact.
- Enhanced test files to align with the new import paths, maintaining test coverage and integrity.
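
For illustration, a minimal before/after sketch of the consolidation; both module paths here are hypothetical, not the literal paths used in the PR:

// Before: each consumer reached into individual model files (paths hypothetical)
// const { getAgent } = require('~/models/Agent');
// const { saveConvo } = require('~/models/Conversation');

// After: a single unified entry point re-exports the consolidated model methods
const { getAgent, saveConvo } = require('~/models');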

* chore: migrate database models to packages/data-schemas and refactor all direct Mongoose Model usage outside of data-schemas

* test: update agent model mocks in unit tests

- Added `getAgent` mock to `client.test.js` to enhance test coverage for agent-related functionality.
- Removed redundant `getAgent` and `getAgents` mocks from `openai.spec.js` and `responses.unit.spec.js` to streamline test setup and reduce duplication.
- Ensured consistency in agent mock implementations across test files.

* fix: update types in data-schemas

* refactor: enhance type definitions in transaction and spending methods

- Updated type definitions in `checkBalance.ts` to use specific request and response types.
- Refined `spendTokens.ts` to utilize a new `SpendTxData` interface for better clarity and type safety.
- Improved transaction handling in `transaction.ts` by introducing `TransactionResult` and `TxData` interfaces (sketched after this list), ensuring consistent data structures across methods.
- Adjusted unit tests in `transaction.spec.ts` to accommodate new type definitions and enhance robustness.
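
A rough sketch of what such interfaces might look like; the field lists are assumptions based on typical token-spend records, not the actual definitions in `transaction.ts`:

// Hypothetical shapes only; the real definitions live in packages/data-schemas
interface TxData {
  user: string;
  conversationId?: string;
  model?: string;
  context?: string;
  tokenType: 'prompt' | 'completion';
  rawAmount: number;
}

interface TransactionResult {
  rate: number;
  user: string;
  balance: number;
  [key: string]: unknown;
}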

* refactor: streamline model imports and enhance code organization

- Consolidated model imports across various controllers and services to a unified import path, improving code clarity and reducing redundancy.
- Updated multiple files to reflect the new import structure, ensuring all functionalities remain intact.
- Enhanced overall code organization by removing duplicate import statements and optimizing the usage of model methods.

* feat: implement loadAddedAgent and refactor agent loading logic

- Introduced the `loadAddedAgent` function to handle loading agents from added conversations, supporting parallel multi-convo execution (usage sketch after this list).
- Created a new `load.ts` file to encapsulate agent loading functionalities, including `loadEphemeralAgent` and `loadAgent`.
- Updated the `index.ts` file to export the new `load` module instead of the deprecated `loadAgent`.
- Enhanced type definitions and improved error handling in the agent loading process.
- Adjusted unit tests to reflect changes in the agent loading structure and ensure comprehensive coverage.
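
A hedged sketch of how the new `load.ts` module might be consumed; the call signatures, the `conversationId` parameter, and `prepareAgents` are assumptions for illustration:

// Hypothetical usage; real signatures are defined in the new load.ts module
import { loadAgent, loadAddedAgent } from './load';

async function prepareAgents(req: any, addedConvoIds: string[]) {
  // Primary agent for the request (handles ephemeral agent IDs internally)
  const agent = await loadAgent({
    req,
    agent_id: req.body.agent_id,
    endpoint: req.body.endpoint,
  });
  // Added-conversation agents resolved in parallel for multi-convo execution
  const added = await Promise.all(
    addedConvoIds.map((conversationId) => loadAddedAgent({ req, conversationId })),
  );
  return { agent, added };
}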

* refactor: enhance balance handling with new update interface

- Introduced the `IBalanceUpdate` interface (sketched after this list) to streamline balance update operations across the codebase.
- Updated `upsertBalanceFields` method signatures in `balance.ts`, `transaction.ts`, and related tests to utilize the new interface for improved type safety.
- Adjusted type imports in `balance.spec.ts` to include `IBalanceUpdate`, ensuring consistency in balance management functionalities.
- Enhanced overall code clarity and maintainability by refining type definitions related to balance operations.
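
A minimal sketch of the idea; the fields and the `upsertBalanceFields` signature below are assumptions, not the actual definitions in `balance.ts`:

// Hypothetical shape; the actual IBalanceUpdate is defined in packages/data-schemas
interface IBalanceUpdate {
  tokenCredits?: number;
  autoRefillEnabled?: boolean;
  lastRefill?: Date;
}

// upsertBalanceFields now takes a typed update object instead of a loose record
declare function upsertBalanceFields(userId: string, fields: IBalanceUpdate): Promise<void>;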

* feat: add unit tests for loadAgent functionality and enhance agent loading logic

- Introduced comprehensive unit tests for the `loadAgent` function, covering scenarios such as null and empty agent IDs, ephemeral agent loading, and permission checks (condensed sketch after this list).
- Moved `getConvoFiles` to its correct position in the database method exports used by `initializeClient`, ensuring proper functionality.
- Improved test coverage for agent loading, including handling of non-existent agents and user permissions.
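
A condensed sketch of the kind of cases described; `mockReq` and the `getAgent` mock are assumed fixtures, not the actual test setup:

// Hypothetical test shape mirroring the cases described above (Jest)
describe('loadAgent', () => {
  it('returns null for a null or empty agent ID', async () => {
    await expect(loadAgent({ req: mockReq, agent_id: null })).resolves.toBeNull();
    await expect(loadAgent({ req: mockReq, agent_id: '' })).resolves.toBeNull();
  });

  it('returns null when the agent does not exist', async () => {
    getAgent.mockResolvedValue(null);
    await expect(loadAgent({ req: mockReq, agent_id: 'agent_missing' })).resolves.toBeNull();
  });
});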

* chore: reorder memory method exports for consistency

- Moved `deleteAllUserMemories` to the correct position in the exported memory methods, ensuring a consistent and logical order of method exports in `memory.ts`.
Danny Avila 2026-02-17 18:23:44 -05:00
parent a85e99ff45
commit a6fb257bcf
182 changed files with 8548 additions and 8105 deletions

Deleted file (Action model methods):

@@ -1,77 +0,0 @@
const { Action } = require('~/db/models');
/**
* Update an action with new data without overwriting existing properties,
* or create a new action if it doesn't exist.
*
* @param {Object} searchParams - The search parameters to find the action to update.
* @param {string} searchParams.action_id - The ID of the action to update.
* @param {string} searchParams.user - The user ID of the action's author.
* @param {Object} updateData - An object containing the properties to update.
* @returns {Promise<Action>} The updated or newly created action document as a plain object.
*/
const updateAction = async (searchParams, updateData) => {
const options = { new: true, upsert: true };
return await Action.findOneAndUpdate(searchParams, updateData, options).lean();
};
/**
* Retrieves all actions that match the given search parameters.
*
* @param {Object} searchParams - The search parameters to find matching actions.
* @param {boolean} includeSensitive - Flag to include sensitive data in the metadata.
* @returns {Promise<Array<Action>>} A promise that resolves to an array of action documents as plain objects.
*/
const getActions = async (searchParams, includeSensitive = false) => {
const actions = await Action.find(searchParams).lean();
  if (!includeSensitive) {
    const sensitiveFields = ['api_key', 'oauth_client_id', 'oauth_client_secret'];
    for (const action of actions) {
      const metadata = action.metadata;
      if (!metadata) {
        continue;
      }
      for (const field of sensitiveFields) {
        if (metadata[field]) {
          delete metadata[field];
        }
      }
    }
  }
return actions;
};
/**
* Deletes an action by params.
*
* @param {Object} searchParams - The search parameters to find the action to delete.
* @param {string} searchParams.action_id - The ID of the action to delete.
* @param {string} searchParams.user - The user ID of the action's author.
* @returns {Promise<Action>} A promise that resolves to the deleted action document as a plain object, or null if no document was found.
*/
const deleteAction = async (searchParams) => {
return await Action.findOneAndDelete(searchParams).lean();
};
/**
* Deletes actions by params.
*
* @param {Object} searchParams - The search parameters to find the actions to delete.
* @param {string} searchParams.action_id - The ID of the action(s) to delete.
* @param {string} searchParams.user - The user ID of the action's author.
* @returns {Promise<Number>} A promise that resolves to the number of deleted action documents.
*/
const deleteActions = async (searchParams) => {
const result = await Action.deleteMany(searchParams);
return result.deletedCount;
};
module.exports = {
getActions,
updateAction,
deleteAction,
deleteActions,
};
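
For context on the helpers above, a small usage sketch; the IDs and metadata values are made up:

// Values are made up for illustration
const userId = 'user123';

// Upsert an action keyed by (action_id, user); upsert: true creates it if missing
const action = await updateAction(
  { action_id: 'act_1', user: userId },
  { metadata: { api_key: 'sk-xxxx', domain: 'example.com' } },
);

// Default read strips api_key / oauth_client_id / oauth_client_secret from metadata
const safeActions = await getActions({ user: userId });
// Pass includeSensitive = true only where raw credentials are genuinely needed
const rawActions = await getActions({ user: userId }, true);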

Deleted file (Agent model methods):

@@ -1,845 +0,0 @@
const mongoose = require('mongoose');
const crypto = require('node:crypto');
const { logger } = require('@librechat/data-schemas');
const { getCustomEndpointConfig } = require('@librechat/api');
const {
Tools,
ResourceType,
actionDelimiter,
isAgentsEndpoint,
isEphemeralAgentId,
encodeEphemeralAgentId,
} = require('librechat-data-provider');
const { mcp_all, mcp_delimiter } = require('librechat-data-provider').Constants;
const { removeAllPermissions } = require('~/server/services/PermissionService');
const { getMCPServerTools } = require('~/server/services/Config');
const { Agent, AclEntry, User } = require('~/db/models');
const { getActions } = require('./Action');
/**
* Extracts unique MCP server names from tools array
* Tools format: "toolName_mcp_serverName" or "sys__server__sys_mcp_serverName"
* @param {string[]} tools - Array of tool identifiers
* @returns {string[]} Array of unique MCP server names
*/
const extractMCPServerNames = (tools) => {
if (!tools || !Array.isArray(tools)) {
return [];
}
const serverNames = new Set();
for (const tool of tools) {
if (!tool || !tool.includes(mcp_delimiter)) {
continue;
}
const parts = tool.split(mcp_delimiter);
if (parts.length >= 2) {
serverNames.add(parts[parts.length - 1]);
}
}
return Array.from(serverNames);
};
/**
* Create an agent with the provided data.
* @param {Object} agentData - The agent data to create.
* @returns {Promise<Agent>} The created agent document as a plain object.
* @throws {Error} If the agent creation fails.
*/
const createAgent = async (agentData) => {
const { author: _author, ...versionData } = agentData;
const timestamp = new Date();
const initialAgentData = {
...agentData,
versions: [
{
...versionData,
createdAt: timestamp,
updatedAt: timestamp,
},
],
category: agentData.category || 'general',
mcpServerNames: extractMCPServerNames(agentData.tools),
};
return (await Agent.create(initialAgentData)).toObject();
};
/**
* Get an agent document based on the provided ID.
*
* @param {Object} searchParameter - The search parameters to find the agent to update.
* @param {string} searchParameter.id - The ID of the agent to update.
* @param {string} searchParameter.author - The user ID of the agent's author.
* @returns {Promise<Agent|null>} The agent document as a plain object, or null if not found.
*/
const getAgent = async (searchParameter) => await Agent.findOne(searchParameter).lean();
/**
* Get multiple agent documents based on the provided search parameters.
*
* @param {Object} searchParameter - The search parameters to find agents.
* @returns {Promise<Agent[]>} Array of agent documents as plain objects.
*/
const getAgents = async (searchParameter) => await Agent.find(searchParameter).lean();
/**
 * Build an ephemeral agent from the request context (no database lookup).
 *
 * @param {Object} params
 * @param {ServerRequest} params.req
 * @param {string} params.spec
 * @param {string} params.endpoint
 * @param {import('@librechat/agents').ClientOptions} [params.model_parameters]
 * @returns {Promise<Agent|null>} The ephemeral agent object.
 */
const loadEphemeralAgent = async ({ req, spec, endpoint, model_parameters: _m }) => {
const { model, ...model_parameters } = _m;
const modelSpecs = req.config?.modelSpecs?.list;
/** @type {TModelSpec | null} */
let modelSpec = null;
if (spec != null && spec !== '') {
modelSpec = modelSpecs?.find((s) => s.name === spec) || null;
}
/** @type {TEphemeralAgent | null} */
const ephemeralAgent = req.body.ephemeralAgent;
const mcpServers = new Set(ephemeralAgent?.mcp);
const userId = req.user?.id; // note: userId cannot be undefined at runtime
if (modelSpec?.mcpServers) {
for (const mcpServer of modelSpec.mcpServers) {
mcpServers.add(mcpServer);
}
}
/** @type {string[]} */
const tools = [];
if (ephemeralAgent?.execute_code === true || modelSpec?.executeCode === true) {
tools.push(Tools.execute_code);
}
if (ephemeralAgent?.file_search === true || modelSpec?.fileSearch === true) {
tools.push(Tools.file_search);
}
if (ephemeralAgent?.web_search === true || modelSpec?.webSearch === true) {
tools.push(Tools.web_search);
}
const addedServers = new Set();
if (mcpServers.size > 0) {
for (const mcpServer of mcpServers) {
if (addedServers.has(mcpServer)) {
continue;
}
const serverTools = await getMCPServerTools(userId, mcpServer);
if (!serverTools) {
tools.push(`${mcp_all}${mcp_delimiter}${mcpServer}`);
addedServers.add(mcpServer);
continue;
}
tools.push(...Object.keys(serverTools));
addedServers.add(mcpServer);
}
}
const instructions = req.body.promptPrefix;
// Get endpoint config for modelDisplayLabel fallback
const appConfig = req.config;
let endpointConfig = appConfig?.endpoints?.[endpoint];
if (!isAgentsEndpoint(endpoint) && !endpointConfig) {
try {
endpointConfig = getCustomEndpointConfig({ endpoint, appConfig });
} catch (err) {
logger.error('[loadEphemeralAgent] Error getting custom endpoint config', err);
}
}
// For ephemeral agents, use modelLabel if provided, then model spec's label,
// then modelDisplayLabel from endpoint config, otherwise empty string to show model name
const sender =
model_parameters?.modelLabel ?? modelSpec?.label ?? endpointConfig?.modelDisplayLabel ?? '';
// Encode ephemeral agent ID with endpoint, model, and computed sender for display
const ephemeralId = encodeEphemeralAgentId({ endpoint, model, sender });
const result = {
id: ephemeralId,
instructions,
provider: endpoint,
model_parameters,
model,
tools,
};
  if (ephemeralAgent?.artifacts) {
result.artifacts = ephemeralAgent.artifacts;
}
return result;
};
/**
* Load an agent based on the provided ID
*
* @param {Object} params
* @param {ServerRequest} params.req
* @param {string} params.spec
* @param {string} params.agent_id
* @param {string} params.endpoint
* @param {import('@librechat/agents').ClientOptions} [params.model_parameters]
* @returns {Promise<Agent|null>} The agent document as a plain object, or null if not found.
*/
const loadAgent = async ({ req, spec, agent_id, endpoint, model_parameters }) => {
if (!agent_id) {
return null;
}
if (isEphemeralAgentId(agent_id)) {
return await loadEphemeralAgent({ req, spec, endpoint, model_parameters });
}
const agent = await getAgent({
id: agent_id,
});
if (!agent) {
return null;
}
agent.version = agent.versions ? agent.versions.length : 0;
return agent;
};
/**
* Check if a version already exists in the versions array, excluding timestamp and author fields
* @param {Object} updateData - The update data to compare
* @param {Object} currentData - The current agent data
* @param {Array} versions - The existing versions array
* @param {string} [actionsHash] - Hash of current action metadata
* @returns {Object|null} - The matching version if found, null otherwise
*/
const isDuplicateVersion = (updateData, currentData, versions, actionsHash = null) => {
if (!versions || versions.length === 0) {
return null;
}
const excludeFields = [
'_id',
'id',
'createdAt',
'updatedAt',
'author',
'updatedBy',
'created_at',
'updated_at',
'__v',
'versions',
'actionsHash', // Exclude actionsHash from direct comparison
];
const { $push: _$push, $pull: _$pull, $addToSet: _$addToSet, ...directUpdates } = updateData;
if (Object.keys(directUpdates).length === 0 && !actionsHash) {
return null;
}
const wouldBeVersion = { ...currentData, ...directUpdates };
const lastVersion = versions[versions.length - 1];
if (actionsHash && lastVersion.actionsHash !== actionsHash) {
return null;
}
const allFields = new Set([...Object.keys(wouldBeVersion), ...Object.keys(lastVersion)]);
const importantFields = Array.from(allFields).filter((field) => !excludeFields.includes(field));
let isMatch = true;
for (const field of importantFields) {
const wouldBeValue = wouldBeVersion[field];
const lastVersionValue = lastVersion[field];
// Skip if both are undefined/null
if (!wouldBeValue && !lastVersionValue) {
continue;
}
// Handle arrays
if (Array.isArray(wouldBeValue) || Array.isArray(lastVersionValue)) {
// Normalize: treat undefined/null as empty array for comparison
let wouldBeArr;
if (Array.isArray(wouldBeValue)) {
wouldBeArr = wouldBeValue;
} else if (wouldBeValue == null) {
wouldBeArr = [];
} else {
wouldBeArr = [wouldBeValue];
}
let lastVersionArr;
if (Array.isArray(lastVersionValue)) {
lastVersionArr = lastVersionValue;
} else if (lastVersionValue == null) {
lastVersionArr = [];
} else {
lastVersionArr = [lastVersionValue];
}
if (wouldBeArr.length !== lastVersionArr.length) {
isMatch = false;
break;
}
// Handle arrays of objects
if (wouldBeArr.length > 0 && typeof wouldBeArr[0] === 'object' && wouldBeArr[0] !== null) {
const sortedWouldBe = [...wouldBeArr].map((item) => JSON.stringify(item)).sort();
const sortedVersion = [...lastVersionArr].map((item) => JSON.stringify(item)).sort();
if (!sortedWouldBe.every((item, i) => item === sortedVersion[i])) {
isMatch = false;
break;
}
} else {
const sortedWouldBe = [...wouldBeArr].sort();
const sortedVersion = [...lastVersionArr].sort();
if (!sortedWouldBe.every((item, i) => item === sortedVersion[i])) {
isMatch = false;
break;
}
}
}
// Handle objects
else if (typeof wouldBeValue === 'object' && wouldBeValue !== null) {
const lastVersionObj =
typeof lastVersionValue === 'object' && lastVersionValue !== null ? lastVersionValue : {};
// For empty objects, normalize the comparison
const wouldBeKeys = Object.keys(wouldBeValue);
const lastVersionKeys = Object.keys(lastVersionObj);
// If both are empty objects, they're equal
if (wouldBeKeys.length === 0 && lastVersionKeys.length === 0) {
continue;
}
// Otherwise do a deep comparison
if (JSON.stringify(wouldBeValue) !== JSON.stringify(lastVersionObj)) {
isMatch = false;
break;
}
}
// Handle primitive values
else {
// For primitives, handle the case where one is undefined and the other is a default value
if (wouldBeValue !== lastVersionValue) {
// Special handling for boolean false vs undefined
if (
typeof wouldBeValue === 'boolean' &&
wouldBeValue === false &&
lastVersionValue === undefined
) {
continue;
}
// Special handling for empty string vs undefined
if (
typeof wouldBeValue === 'string' &&
wouldBeValue === '' &&
lastVersionValue === undefined
) {
continue;
}
isMatch = false;
break;
}
}
}
return isMatch ? lastVersion : null;
};
/**
* Update an agent with new data without overwriting existing
* properties, or create a new agent if it doesn't exist.
* When an agent is updated, a copy of the current state will be saved to the versions array.
*
* @param {Object} searchParameter - The search parameters to find the agent to update.
* @param {string} searchParameter.id - The ID of the agent to update.
* @param {string} [searchParameter.author] - The user ID of the agent's author.
* @param {Object} updateData - An object containing the properties to update.
* @param {Object} [options] - Optional configuration object.
* @param {string} [options.updatingUserId] - The ID of the user performing the update (used for tracking non-author updates).
* @param {boolean} [options.forceVersion] - Force creation of a new version even if no fields changed.
* @param {boolean} [options.skipVersioning] - Skip version creation entirely (useful for isolated operations like sharing).
* @returns {Promise<Agent>} The updated or newly created agent document as a plain object.
* @throws {Error} If the update would create a duplicate version
*/
const updateAgent = async (searchParameter, updateData, options = {}) => {
const { updatingUserId = null, forceVersion = false, skipVersioning = false } = options;
const mongoOptions = { new: true, upsert: false };
const currentAgent = await Agent.findOne(searchParameter);
if (currentAgent) {
const {
__v,
_id,
id: __id,
versions,
author: _author,
...versionData
} = currentAgent.toObject();
const { $push, $pull, $addToSet, ...directUpdates } = updateData;
// Sync mcpServerNames when tools are updated
if (directUpdates.tools !== undefined) {
const mcpServerNames = extractMCPServerNames(directUpdates.tools);
directUpdates.mcpServerNames = mcpServerNames;
updateData.mcpServerNames = mcpServerNames; // Also update the original updateData
}
let actionsHash = null;
// Generate actions hash if agent has actions
if (currentAgent.actions && currentAgent.actions.length > 0) {
// Extract action IDs from the format "domain_action_id"
const actionIds = currentAgent.actions
.map((action) => {
const parts = action.split(actionDelimiter);
return parts[1]; // Get just the action ID part
})
.filter(Boolean);
if (actionIds.length > 0) {
try {
const actions = await getActions(
{
action_id: { $in: actionIds },
},
true,
); // Include sensitive data for hash
actionsHash = await generateActionMetadataHash(currentAgent.actions, actions);
} catch (error) {
logger.error('Error fetching actions for hash generation:', error);
}
}
}
const shouldCreateVersion =
!skipVersioning &&
(forceVersion || Object.keys(directUpdates).length > 0 || $push || $pull || $addToSet);
if (shouldCreateVersion) {
const duplicateVersion = isDuplicateVersion(updateData, versionData, versions, actionsHash);
if (duplicateVersion && !forceVersion) {
// No changes detected, return the current agent without creating a new version
const agentObj = currentAgent.toObject();
agentObj.version = versions.length;
return agentObj;
}
}
const versionEntry = {
...versionData,
...directUpdates,
updatedAt: new Date(),
};
// Include actions hash in version if available
if (actionsHash) {
versionEntry.actionsHash = actionsHash;
}
// Always store updatedBy field to track who made the change
if (updatingUserId) {
versionEntry.updatedBy = new mongoose.Types.ObjectId(updatingUserId);
}
if (shouldCreateVersion) {
updateData.$push = {
...($push || {}),
versions: versionEntry,
};
}
}
return Agent.findOneAndUpdate(searchParameter, updateData, mongoOptions).lean();
};
/**
* Modifies an agent with the resource file id.
* @param {object} params
* @param {ServerRequest} params.req
* @param {string} params.agent_id
* @param {string} params.tool_resource
* @param {string} params.file_id
* @returns {Promise<Agent>} The updated agent.
*/
const addAgentResourceFile = async ({ req, agent_id, tool_resource, file_id }) => {
const searchParameter = { id: agent_id };
let agent = await getAgent(searchParameter);
if (!agent) {
throw new Error('Agent not found for adding resource file');
}
const fileIdsPath = `tool_resources.${tool_resource}.file_ids`;
await Agent.updateOne(
{
id: agent_id,
[`${fileIdsPath}`]: { $exists: false },
},
{
$set: {
[`${fileIdsPath}`]: [],
},
},
);
const updateData = {
$addToSet: {
tools: tool_resource,
[fileIdsPath]: file_id,
},
};
const updatedAgent = await updateAgent(searchParameter, updateData, {
updatingUserId: req?.user?.id,
});
if (updatedAgent) {
return updatedAgent;
} else {
throw new Error('Agent not found for adding resource file');
}
};
/**
* Removes multiple resource files from an agent using atomic operations.
* @param {object} params
* @param {string} params.agent_id
* @param {Array<{tool_resource: string, file_id: string}>} params.files
* @returns {Promise<Agent>} The updated agent.
* @throws {Error} If the agent is not found or update fails.
*/
const removeAgentResourceFiles = async ({ agent_id, files }) => {
const searchParameter = { id: agent_id };
// Group files to remove by resource
const filesByResource = files.reduce((acc, { tool_resource, file_id }) => {
if (!acc[tool_resource]) {
acc[tool_resource] = [];
}
acc[tool_resource].push(file_id);
return acc;
}, {});
const pullAllOps = {};
const resourcesToCheck = new Set();
for (const [resource, fileIds] of Object.entries(filesByResource)) {
const fileIdsPath = `tool_resources.${resource}.file_ids`;
pullAllOps[fileIdsPath] = fileIds;
resourcesToCheck.add(resource);
}
const updatePullData = { $pullAll: pullAllOps };
const agentAfterPull = await Agent.findOneAndUpdate(searchParameter, updatePullData, {
new: true,
}).lean();
if (!agentAfterPull) {
// Agent might have been deleted concurrently, or never existed.
// Check if it existed before trying to throw.
const agentExists = await getAgent(searchParameter);
if (!agentExists) {
throw new Error('Agent not found for removing resource files');
}
// If it existed but findOneAndUpdate returned null, something else went wrong.
throw new Error('Failed to update agent during file removal (pull step)');
}
// Return the agent state directly after the $pull operation.
// Skipping the $unset step for now to simplify and test core $pull atomicity.
// Empty arrays might remain, but the removal itself should be correct.
return agentAfterPull;
};
/**
* Deletes an agent based on the provided ID.
*
* @param {Object} searchParameter - The search parameters to find the agent to delete.
* @param {string} searchParameter.id - The ID of the agent to delete.
* @param {string} [searchParameter.author] - The user ID of the agent's author.
* @returns {Promise<void>} Resolves when the agent has been successfully deleted.
*/
const deleteAgent = async (searchParameter) => {
const agent = await Agent.findOneAndDelete(searchParameter);
if (agent) {
await Promise.all([
removeAllPermissions({
resourceType: ResourceType.AGENT,
resourceId: agent._id,
}),
removeAllPermissions({
resourceType: ResourceType.REMOTE_AGENT,
resourceId: agent._id,
}),
]);
try {
await Agent.updateMany({ 'edges.to': agent.id }, { $pull: { edges: { to: agent.id } } });
} catch (error) {
logger.error('[deleteAgent] Error removing agent from handoff edges', error);
}
try {
await User.updateMany(
{ 'favorites.agentId': agent.id },
{ $pull: { favorites: { agentId: agent.id } } },
);
} catch (error) {
logger.error('[deleteAgent] Error removing agent from user favorites', error);
}
}
return agent;
};
/**
* Deletes all agents created by a specific user.
* @param {string} userId - The ID of the user whose agents should be deleted.
* @returns {Promise<void>} A promise that resolves when all user agents have been deleted.
*/
const deleteUserAgents = async (userId) => {
try {
const userAgents = await getAgents({ author: userId });
if (userAgents.length === 0) {
return;
}
const agentIds = userAgents.map((agent) => agent.id);
const agentObjectIds = userAgents.map((agent) => agent._id);
await AclEntry.deleteMany({
resourceType: { $in: [ResourceType.AGENT, ResourceType.REMOTE_AGENT] },
resourceId: { $in: agentObjectIds },
});
try {
await User.updateMany(
{ 'favorites.agentId': { $in: agentIds } },
{ $pull: { favorites: { agentId: { $in: agentIds } } } },
);
} catch (error) {
logger.error('[deleteUserAgents] Error removing agents from user favorites', error);
}
await Agent.deleteMany({ author: userId });
} catch (error) {
logger.error('[deleteUserAgents] General error:', error);
}
};
/**
* Get agents by accessible IDs with optional cursor-based pagination.
* @param {Object} params - The parameters for getting accessible agents.
* @param {Array} [params.accessibleIds] - Array of agent ObjectIds the user has ACL access to.
* @param {Object} [params.otherParams] - Additional query parameters (including author filter).
* @param {number} [params.limit] - Number of agents to return (max 100). If not provided, returns all agents.
 * @param {string} [params.after] - Pagination cursor; a base64-encoded JSON string containing updatedAt and _id. Returns agents after this cursor.
* @returns {Promise<Object>} A promise that resolves to an object containing the agents data and pagination info.
*/
const getListAgentsByAccess = async ({
accessibleIds = [],
otherParams = {},
limit = null,
after = null,
}) => {
const isPaginated = limit !== null && limit !== undefined;
const normalizedLimit = isPaginated ? Math.min(Math.max(1, parseInt(limit) || 20), 100) : null;
// Build base query combining ACL accessible agents with other filters
const baseQuery = { ...otherParams, _id: { $in: accessibleIds } };
// Add cursor condition
if (after) {
try {
const cursor = JSON.parse(Buffer.from(after, 'base64').toString('utf8'));
const { updatedAt, _id } = cursor;
const cursorCondition = {
$or: [
{ updatedAt: { $lt: new Date(updatedAt) } },
{ updatedAt: new Date(updatedAt), _id: { $gt: new mongoose.Types.ObjectId(_id) } },
],
};
// Merge cursor condition with base query
if (Object.keys(baseQuery).length > 0) {
baseQuery.$and = [{ ...baseQuery }, cursorCondition];
// Remove the original conditions from baseQuery to avoid duplication
Object.keys(baseQuery).forEach((key) => {
if (key !== '$and') delete baseQuery[key];
});
} else {
Object.assign(baseQuery, cursorCondition);
}
} catch (error) {
logger.warn('Invalid cursor:', error.message);
}
}
let query = Agent.find(baseQuery, {
id: 1,
_id: 1,
name: 1,
avatar: 1,
author: 1,
description: 1,
updatedAt: 1,
category: 1,
support_contact: 1,
is_promoted: 1,
}).sort({ updatedAt: -1, _id: 1 });
// Only apply limit if pagination is requested
if (isPaginated) {
query = query.limit(normalizedLimit + 1);
}
const agents = await query.lean();
const hasMore = isPaginated ? agents.length > normalizedLimit : false;
const data = (isPaginated ? agents.slice(0, normalizedLimit) : agents).map((agent) => {
if (agent.author) {
agent.author = agent.author.toString();
}
return agent;
});
// Generate next cursor only if paginated
let nextCursor = null;
if (isPaginated && hasMore && data.length > 0) {
const lastAgent = agents[normalizedLimit - 1];
nextCursor = Buffer.from(
JSON.stringify({
updatedAt: lastAgent.updatedAt.toISOString(),
_id: lastAgent._id.toString(),
}),
).toString('base64');
}
return {
object: 'list',
data,
first_id: data.length > 0 ? data[0].id : null,
last_id: data.length > 0 ? data[data.length - 1].id : null,
has_more: hasMore,
after: nextCursor,
};
};
/**
* Reverts an agent to a specific version in its version history.
* @param {Object} searchParameter - The search parameters to find the agent to revert.
* @param {string} searchParameter.id - The ID of the agent to revert.
* @param {string} [searchParameter.author] - The user ID of the agent's author.
* @param {number} versionIndex - The index of the version to revert to in the versions array.
* @returns {Promise<MongoAgent>} The updated agent document after reverting.
* @throws {Error} If the agent is not found or the specified version does not exist.
*/
const revertAgentVersion = async (searchParameter, versionIndex) => {
const agent = await Agent.findOne(searchParameter);
if (!agent) {
throw new Error('Agent not found');
}
if (!agent.versions || !agent.versions[versionIndex]) {
throw new Error(`Version ${versionIndex} not found`);
}
const revertToVersion = agent.versions[versionIndex];
const updateData = {
...revertToVersion,
};
delete updateData._id;
delete updateData.id;
delete updateData.versions;
delete updateData.author;
delete updateData.updatedBy;
return Agent.findOneAndUpdate(searchParameter, updateData, { new: true }).lean();
};
/**
* Generates a hash of action metadata for version comparison
* @param {string[]} actionIds - Array of action IDs in format "domain_action_id"
* @param {Action[]} actions - Array of action documents
* @returns {Promise<string>} - SHA256 hash of the action metadata
*/
const generateActionMetadataHash = async (actionIds, actions) => {
if (!actionIds || actionIds.length === 0) {
return '';
}
// Create a map of action_id to metadata for quick lookup
const actionMap = new Map();
actions.forEach((action) => {
actionMap.set(action.action_id, action.metadata);
});
// Sort action IDs for consistent hashing
const sortedActionIds = [...actionIds].sort();
// Build a deterministic string representation of all action metadata
const metadataString = sortedActionIds
.map((actionFullId) => {
// Extract just the action_id part (after the delimiter)
const parts = actionFullId.split(actionDelimiter);
const actionId = parts[1];
const metadata = actionMap.get(actionId);
if (!metadata) {
return `${actionId}:null`;
}
// Sort metadata keys for deterministic output
const sortedKeys = Object.keys(metadata).sort();
const metadataStr = sortedKeys
.map((key) => `${key}:${JSON.stringify(metadata[key])}`)
.join(',');
return `${actionId}:{${metadataStr}}`;
})
.join(';');
// Use Web Crypto API to generate hash
const encoder = new TextEncoder();
const data = encoder.encode(metadataString);
const hashBuffer = await crypto.webcrypto.subtle.digest('SHA-256', data);
const hashArray = Array.from(new Uint8Array(hashBuffer));
const hashHex = hashArray.map((b) => b.toString(16).padStart(2, '0')).join('');
return hashHex;
};
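// Example of the deterministic pre-hash string (hypothetical values):
// for agent.actions = [`example.com${actionDelimiter}act_1`] and stored metadata { domain: 'x' },
// metadataString === 'act_1:{domain:"x"}' (entries for multiple actions are joined by ';'),
// and the SHA-256 hex digest of that string becomes the version's actionsHash.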
/**
* Counts the number of promoted agents.
* @returns {Promise<number>} - The count of promoted agents
*/
const countPromotedAgents = async () => {
const count = await Agent.countDocuments({ is_promoted: true });
return count;
};
module.exports = {
getAgent,
getAgents,
loadAgent,
createAgent,
updateAgent,
deleteAgent,
deleteUserAgents,
revertAgentVersion,
countPromotedAgents,
addAgentResourceFile,
getListAgentsByAccess,
removeAgentResourceFiles,
generateActionMetadataHash,
};
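
To make the cursor format above concrete, a short sketch of paginating with `getListAgentsByAccess`; `accessibleIds` is an assumed input from the caller's ACL check:

// Hypothetical values; accessibleIds comes from the caller's ACL check
const page1 = await getListAgentsByAccess({ accessibleIds, limit: 20 });

// `after` is base64 JSON of the last returned agent's sort keys:
// Buffer.from(JSON.stringify({ updatedAt, _id })).toString('base64')
const page2 = await getListAgentsByAccess({
  accessibleIds,
  limit: 20,
  after: page1.after,
});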

File diff suppressed because it is too large.

Deleted file (Assistant model methods):

@@ -1,62 +0,0 @@
const { Assistant } = require('~/db/models');
/**
* Update an assistant with new data without overwriting existing properties,
* or create a new assistant if it doesn't exist.
*
* @param {Object} searchParams - The search parameters to find the assistant to update.
* @param {string} searchParams.assistant_id - The ID of the assistant to update.
* @param {string} searchParams.user - The user ID of the assistant's author.
* @param {Object} updateData - An object containing the properties to update.
* @returns {Promise<AssistantDocument>} The updated or newly created assistant document as a plain object.
*/
const updateAssistantDoc = async (searchParams, updateData) => {
const options = { new: true, upsert: true };
return await Assistant.findOneAndUpdate(searchParams, updateData, options).lean();
};
/**
* Retrieves an assistant document based on the provided ID.
*
* @param {Object} searchParams - The search parameters to find the assistant to update.
* @param {string} searchParams.assistant_id - The ID of the assistant to update.
* @param {string} searchParams.user - The user ID of the assistant's author.
* @returns {Promise<AssistantDocument|null>} The assistant document as a plain object, or null if not found.
*/
const getAssistant = async (searchParams) => await Assistant.findOne(searchParams).lean();
/**
* Retrieves all assistants that match the given search parameters.
*
* @param {Object} searchParams - The search parameters to find matching assistants.
* @param {Object} [select] - Optional. Specifies which document fields to include or exclude.
* @returns {Promise<Array<AssistantDocument>>} A promise that resolves to an array of assistant documents as plain objects.
*/
const getAssistants = async (searchParams, select = null) => {
let query = Assistant.find(searchParams);
if (select) {
query = query.select(select);
}
return await query.lean();
};
/**
* Deletes an assistant based on the provided ID.
*
* @param {Object} searchParams - The search parameters to find the assistant to delete.
* @param {string} searchParams.assistant_id - The ID of the assistant to delete.
* @param {string} searchParams.user - The user ID of the assistant's author.
* @returns {Promise<void>} Resolves when the assistant has been successfully deleted.
*/
const deleteAssistant = async (searchParams) => {
return await Assistant.findOneAndDelete(searchParams);
};
module.exports = {
updateAssistantDoc,
deleteAssistant,
getAssistants,
getAssistant,
};
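
A one-line usage sketch of the projection parameter above; the user ID and field list are made up:

// Fetch a light projection for listing; `select` accepts any Mongoose projection
const assistants = await getAssistants({ user: 'user123' }, 'assistant_id name createdAt');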

Deleted file (Banner model methods):

@@ -1,28 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { Banner } = require('~/db/models');
/**
 * Retrieves the current active banner.
 * @param {Object} [user] - The authenticated user, if any; non-public banners are only returned for authenticated users.
 * @returns {Promise<Object|null>} The active banner object or null if no active banner is found.
 */
const getBanner = async (user) => {
try {
const now = new Date();
const banner = await Banner.findOne({
displayFrom: { $lte: now },
$or: [{ displayTo: { $gte: now } }, { displayTo: null }],
type: 'banner',
}).lean();
if (!banner || banner.isPublic || user) {
return banner;
}
return null;
} catch (error) {
    logger.error('[getBanner] Error getting banners', error);
throw new Error('Error getting banners');
}
};
module.exports = { getBanner };
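
The query above matches banners whose display window covers the current time; a document it would match might look like this (values hypothetical):

// Hypothetical document the active-banner query would match right now
const exampleBanner = {
  type: 'banner',
  message: 'Scheduled maintenance this weekend.',
  displayFrom: new Date('2026-01-01T00:00:00Z'), // already started
  displayTo: null, // no end date, so it stays active
  isPublic: true, // returned even without an authenticated user
};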

Deleted file (prompt category options):

@@ -1,57 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const options = [
{
label: 'com_ui_idea',
value: 'idea',
},
{
label: 'com_ui_travel',
value: 'travel',
},
{
label: 'com_ui_teach_or_explain',
value: 'teach_or_explain',
},
{
label: 'com_ui_write',
value: 'write',
},
{
label: 'com_ui_shop',
value: 'shop',
},
{
label: 'com_ui_code',
value: 'code',
},
{
label: 'com_ui_misc',
value: 'misc',
},
{
label: 'com_ui_roleplay',
value: 'roleplay',
},
{
label: 'com_ui_finance',
value: 'finance',
},
];
module.exports = {
/**
* Retrieves the categories asynchronously.
* @returns {Promise<TGetCategoriesResponse>} An array of category objects.
* @throws {Error} If there is an error retrieving the categories.
*/
getCategories: async () => {
try {
// const categories = await Categories.find();
return options;
} catch (error) {
logger.error('Error getting categories', error);
return [];
}
},
};

Deleted file (Conversation model methods):

@@ -1,372 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { createTempChatExpirationDate } = require('@librechat/api');
const { getMessages, deleteMessages } = require('./Message');
const { Conversation } = require('~/db/models');
/**
* Searches for a conversation by conversationId and returns a lean document with only conversationId and user.
* @param {string} conversationId - The conversation's ID.
* @returns {Promise<{conversationId: string, user: string} | null>} The conversation object with selected fields or null if not found.
*/
const searchConversation = async (conversationId) => {
try {
return await Conversation.findOne({ conversationId }, 'conversationId user').lean();
} catch (error) {
logger.error('[searchConversation] Error searching conversation', error);
throw new Error('Error searching conversation');
}
};
/**
* Retrieves a single conversation for a given user and conversation ID.
* @param {string} user - The user's ID.
* @param {string} conversationId - The conversation's ID.
* @returns {Promise<TConversation>} The conversation object.
*/
const getConvo = async (user, conversationId) => {
try {
return await Conversation.findOne({ user, conversationId }).lean();
} catch (error) {
logger.error('[getConvo] Error getting single conversation', error);
throw new Error('Error getting single conversation');
}
};
const deleteNullOrEmptyConversations = async () => {
try {
const filter = {
$or: [
{ conversationId: null },
{ conversationId: '' },
{ conversationId: { $exists: false } },
],
};
const result = await Conversation.deleteMany(filter);
// Delete associated messages
const messageDeleteResult = await deleteMessages(filter);
logger.info(
`[deleteNullOrEmptyConversations] Deleted ${result.deletedCount} conversations and ${messageDeleteResult.deletedCount} messages`,
);
return {
conversations: result,
messages: messageDeleteResult,
};
} catch (error) {
logger.error('[deleteNullOrEmptyConversations] Error deleting conversations', error);
throw new Error('Error deleting conversations with null or empty conversationId');
}
};
/**
* Searches for a conversation by conversationId and returns associated file ids.
* @param {string} conversationId - The conversation's ID.
* @returns {Promise<string[] | null>}
*/
const getConvoFiles = async (conversationId) => {
try {
return (await Conversation.findOne({ conversationId }, 'files').lean())?.files ?? [];
} catch (error) {
logger.error('[getConvoFiles] Error getting conversation files', error);
throw new Error('Error getting conversation files');
}
};
module.exports = {
getConvoFiles,
searchConversation,
deleteNullOrEmptyConversations,
  /**
   * Saves a conversation to the database.
   * @param {Object} req - The request object.
   * @param {Object} convo - The conversation data, including `conversationId` and an optional `newConversationId`.
   * @param {Object} [metadata] - Additional metadata to log for the operation.
   * @returns {Promise<TConversation>} The conversation object.
   */
saveConvo: async (req, { conversationId, newConversationId, ...convo }, metadata) => {
try {
if (metadata?.context) {
logger.debug(`[saveConvo] ${metadata.context}`);
}
const messages = await getMessages({ conversationId }, '_id');
const update = { ...convo, messages, user: req.user.id };
if (newConversationId) {
update.conversationId = newConversationId;
}
if (req?.body?.isTemporary) {
try {
const appConfig = req.config;
update.expiredAt = createTempChatExpirationDate(appConfig?.interfaceConfig);
} catch (err) {
logger.error('Error creating temporary chat expiration date:', err);
logger.info(`---\`saveConvo\` context: ${metadata?.context}`);
update.expiredAt = null;
}
} else {
update.expiredAt = null;
}
/** @type {{ $set: Partial<TConversation>; $unset?: Record<keyof TConversation, number> }} */
const updateOperation = { $set: update };
if (metadata && metadata.unsetFields && Object.keys(metadata.unsetFields).length > 0) {
updateOperation.$unset = metadata.unsetFields;
}
/** Note: the resulting Model object is necessary for Meilisearch operations */
const conversation = await Conversation.findOneAndUpdate(
{ conversationId, user: req.user.id },
updateOperation,
{
new: true,
upsert: metadata?.noUpsert !== true,
},
);
if (!conversation) {
logger.debug('[saveConvo] Conversation not found, skipping update');
return null;
}
return conversation.toObject();
} catch (error) {
logger.error('[saveConvo] Error saving conversation', error);
if (metadata && metadata?.context) {
logger.info(`[saveConvo] ${metadata.context}`);
}
return { message: 'Error saving conversation' };
}
},
bulkSaveConvos: async (conversations) => {
try {
const bulkOps = conversations.map((convo) => ({
updateOne: {
filter: { conversationId: convo.conversationId, user: convo.user },
update: convo,
upsert: true,
timestamps: false,
},
}));
const result = await Conversation.bulkWrite(bulkOps);
return result;
} catch (error) {
logger.error('[bulkSaveConvos] Error saving conversations in bulk', error);
throw new Error('Failed to save conversations in bulk.');
}
},
getConvosByCursor: async (
user,
{
cursor,
limit = 25,
isArchived = false,
tags,
search,
sortBy = 'updatedAt',
sortDirection = 'desc',
} = {},
) => {
const filters = [{ user }];
if (isArchived) {
filters.push({ isArchived: true });
} else {
filters.push({ $or: [{ isArchived: false }, { isArchived: { $exists: false } }] });
}
if (Array.isArray(tags) && tags.length > 0) {
filters.push({ tags: { $in: tags } });
}
filters.push({ $or: [{ expiredAt: null }, { expiredAt: { $exists: false } }] });
if (search) {
try {
const meiliResults = await Conversation.meiliSearch(search, { filter: `user = "${user}"` });
const matchingIds = Array.isArray(meiliResults.hits)
? meiliResults.hits.map((result) => result.conversationId)
: [];
if (!matchingIds.length) {
return { conversations: [], nextCursor: null };
}
filters.push({ conversationId: { $in: matchingIds } });
} catch (error) {
logger.error('[getConvosByCursor] Error during meiliSearch', error);
throw new Error('Error during meiliSearch');
}
}
const validSortFields = ['title', 'createdAt', 'updatedAt'];
if (!validSortFields.includes(sortBy)) {
throw new Error(
`Invalid sortBy field: ${sortBy}. Must be one of ${validSortFields.join(', ')}`,
);
}
const finalSortBy = sortBy;
const finalSortDirection = sortDirection === 'asc' ? 'asc' : 'desc';
let cursorFilter = null;
if (cursor) {
try {
const decoded = JSON.parse(Buffer.from(cursor, 'base64').toString());
const { primary, secondary } = decoded;
const primaryValue = finalSortBy === 'title' ? primary : new Date(primary);
const secondaryValue = new Date(secondary);
const op = finalSortDirection === 'asc' ? '$gt' : '$lt';
cursorFilter = {
$or: [
{ [finalSortBy]: { [op]: primaryValue } },
{
[finalSortBy]: primaryValue,
updatedAt: { [op]: secondaryValue },
},
],
};
} catch (err) {
logger.warn('[getConvosByCursor] Invalid cursor format, starting from beginning');
}
if (cursorFilter) {
filters.push(cursorFilter);
}
}
const query = filters.length === 1 ? filters[0] : { $and: filters };
try {
const sortOrder = finalSortDirection === 'asc' ? 1 : -1;
const sortObj = { [finalSortBy]: sortOrder };
if (finalSortBy !== 'updatedAt') {
sortObj.updatedAt = sortOrder;
}
const convos = await Conversation.find(query)
.select(
'conversationId endpoint title createdAt updatedAt user model agent_id assistant_id spec iconURL',
)
.sort(sortObj)
.limit(limit + 1)
.lean();
let nextCursor = null;
if (convos.length > limit) {
convos.pop(); // Remove extra item used to detect next page
// Create cursor from the last RETURNED item (not the popped one)
const lastReturned = convos[convos.length - 1];
const primaryValue = lastReturned[finalSortBy];
const primaryStr = finalSortBy === 'title' ? primaryValue : primaryValue.toISOString();
const secondaryStr = lastReturned.updatedAt.toISOString();
const composite = { primary: primaryStr, secondary: secondaryStr };
nextCursor = Buffer.from(JSON.stringify(composite)).toString('base64');
}
return { conversations: convos, nextCursor };
} catch (error) {
logger.error('[getConvosByCursor] Error getting conversations', error);
throw new Error('Error getting conversations');
}
},
getConvosQueried: async (user, convoIds, cursor = null, limit = 25) => {
try {
if (!convoIds?.length) {
return { conversations: [], nextCursor: null, convoMap: {} };
}
const conversationIds = convoIds.map((convo) => convo.conversationId);
const results = await Conversation.find({
user,
conversationId: { $in: conversationIds },
$or: [{ expiredAt: { $exists: false } }, { expiredAt: null }],
}).lean();
results.sort((a, b) => new Date(b.updatedAt) - new Date(a.updatedAt));
let filtered = results;
if (cursor && cursor !== 'start') {
const cursorDate = new Date(cursor);
filtered = results.filter((convo) => new Date(convo.updatedAt) < cursorDate);
}
const limited = filtered.slice(0, limit + 1);
let nextCursor = null;
if (limited.length > limit) {
limited.pop(); // Remove extra item used to detect next page
// Create cursor from the last RETURNED item (not the popped one)
nextCursor = limited[limited.length - 1].updatedAt.toISOString();
}
const convoMap = {};
limited.forEach((convo) => {
convoMap[convo.conversationId] = convo;
});
return { conversations: limited, nextCursor, convoMap };
} catch (error) {
logger.error('[getConvosQueried] Error getting conversations', error);
throw new Error('Error fetching conversations');
}
},
getConvo,
/* chore: this method is not properly error handled */
getConvoTitle: async (user, conversationId) => {
try {
const convo = await getConvo(user, conversationId);
/* ChatGPT Browser was triggering error here due to convo being saved later */
if (convo && !convo.title) {
return null;
} else {
// TypeError: Cannot read properties of null (reading 'title')
return convo?.title || 'New Chat';
}
} catch (error) {
logger.error('[getConvoTitle] Error getting conversation title', error);
throw new Error('Error getting conversation title');
}
},
/**
* Asynchronously deletes conversations and associated messages for a given user and filter.
*
* @async
* @function
* @param {string|ObjectId} user - The user's ID.
* @param {Object} filter - Additional filter criteria for the conversations to be deleted.
* @returns {Promise<{ n: number, ok: number, deletedCount: number, messages: { n: number, ok: number, deletedCount: number } }>}
* An object containing the count of deleted conversations and associated messages.
* @throws {Error} Throws an error if there's an issue with the database operations.
*
* @example
* const user = 'someUserId';
* const filter = { someField: 'someValue' };
* const result = await deleteConvos(user, filter);
* logger.error(result); // { n: 5, ok: 1, deletedCount: 5, messages: { n: 10, ok: 1, deletedCount: 10 } }
*/
deleteConvos: async (user, filter) => {
try {
const userFilter = { ...filter, user };
const conversations = await Conversation.find(userFilter).select('conversationId');
const conversationIds = conversations.map((c) => c.conversationId);
if (!conversationIds.length) {
throw new Error('Conversation not found or already deleted.');
}
const deleteConvoResult = await Conversation.deleteMany(userFilter);
const deleteMessagesResult = await deleteMessages({
conversationId: { $in: conversationIds },
});
return { ...deleteConvoResult, messages: deleteMessagesResult };
} catch (error) {
logger.error('[deleteConvos] Error deleting conversations and messages', error);
throw error;
}
},
};
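
To illustrate the composite cursor used by `getConvosByCursor` above, a short round-trip sketch with made-up timestamps:

// Hypothetical round-trip of the composite cursor
const cursor = Buffer.from(
  JSON.stringify({
    primary: '2026-02-17T23:23:44.000Z', // value of the active sort field
    secondary: '2026-02-17T23:23:44.000Z', // updatedAt tiebreaker
  }),
).toString('base64');

const { primary, secondary } = JSON.parse(Buffer.from(cursor, 'base64').toString());
// getConvosByCursor turns these into the $or cursor filter on the sort field and updatedAt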

Deleted file (Conversation model unit tests):

@@ -1,870 +0,0 @@
const mongoose = require('mongoose');
const { v4: uuidv4 } = require('uuid');
const { EModelEndpoint } = require('librechat-data-provider');
const { MongoMemoryServer } = require('mongodb-memory-server');
const {
deleteNullOrEmptyConversations,
searchConversation,
getConvosByCursor,
getConvosQueried,
getConvoFiles,
getConvoTitle,
deleteConvos,
saveConvo,
getConvo,
} = require('./Conversation');
jest.mock('~/server/services/Config/app');
jest.mock('./Message');
const { getMessages, deleteMessages } = require('./Message');
const { Conversation } = require('~/db/models');
describe('Conversation Operations', () => {
let mongoServer;
let mockReq;
let mockConversationData;
beforeAll(async () => {
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
await mongoose.connect(mongoUri);
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
});
beforeEach(async () => {
// Clear database
await Conversation.deleteMany({});
// Reset mocks
jest.clearAllMocks();
// Default mock implementations
getMessages.mockResolvedValue([]);
deleteMessages.mockResolvedValue({ deletedCount: 0 });
mockReq = {
user: { id: 'user123' },
body: {},
config: {
interfaceConfig: {
temporaryChatRetention: 24, // Default 24 hours
},
},
};
mockConversationData = {
conversationId: uuidv4(),
title: 'Test Conversation',
endpoint: EModelEndpoint.openAI,
};
});
describe('saveConvo', () => {
it('should save a conversation for an authenticated user', async () => {
const result = await saveConvo(mockReq, mockConversationData);
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.user).toBe('user123');
expect(result.title).toBe('Test Conversation');
expect(result.endpoint).toBe(EModelEndpoint.openAI);
// Verify the conversation was actually saved to the database
const savedConvo = await Conversation.findOne({
conversationId: mockConversationData.conversationId,
user: 'user123',
});
expect(savedConvo).toBeTruthy();
expect(savedConvo.title).toBe('Test Conversation');
});
it('should query messages when saving a conversation', async () => {
      // Mock messages as ObjectIds (mongoose is already required at the top of the file)
      const mockMessages = [new mongoose.Types.ObjectId(), new mongoose.Types.ObjectId()];
getMessages.mockResolvedValue(mockMessages);
await saveConvo(mockReq, mockConversationData);
// Verify that getMessages was called with correct parameters
expect(getMessages).toHaveBeenCalledWith(
{ conversationId: mockConversationData.conversationId },
'_id',
);
});
it('should handle newConversationId when provided', async () => {
const newConversationId = uuidv4();
const result = await saveConvo(mockReq, {
...mockConversationData,
newConversationId,
});
expect(result.conversationId).toBe(newConversationId);
});
it('should not create a conversation when noUpsert is true and conversation does not exist', async () => {
const nonExistentId = uuidv4();
const result = await saveConvo(
mockReq,
{ conversationId: nonExistentId, title: 'Ghost Title' },
{ noUpsert: true },
);
expect(result).toBeNull();
const dbConvo = await Conversation.findOne({ conversationId: nonExistentId });
expect(dbConvo).toBeNull();
});
it('should update an existing conversation when noUpsert is true', async () => {
await saveConvo(mockReq, mockConversationData);
const result = await saveConvo(
mockReq,
{ conversationId: mockConversationData.conversationId, title: 'Updated Title' },
{ noUpsert: true },
);
expect(result).not.toBeNull();
expect(result.title).toBe('Updated Title');
expect(result.conversationId).toBe(mockConversationData.conversationId);
});
it('should still upsert by default when noUpsert is not provided', async () => {
const newId = uuidv4();
const result = await saveConvo(mockReq, {
conversationId: newId,
title: 'New Conversation',
endpoint: EModelEndpoint.openAI,
});
expect(result).not.toBeNull();
expect(result.conversationId).toBe(newId);
expect(result.title).toBe('New Conversation');
});
it('should handle unsetFields metadata', async () => {
const metadata = {
unsetFields: { someField: 1 },
};
await saveConvo(mockReq, mockConversationData, metadata);
const savedConvo = await Conversation.findOne({
conversationId: mockConversationData.conversationId,
});
expect(savedConvo.someField).toBeUndefined();
});
});
describe('isTemporary conversation handling', () => {
it('should save a conversation with expiredAt when isTemporary is true', async () => {
mockReq.config.interfaceConfig.temporaryChatRetention = 24;
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveConvo(mockReq, mockConversationData);
const afterSave = new Date();
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.expiredAt).toBeDefined();
expect(result.expiredAt).toBeInstanceOf(Date);
const expectedExpirationTime = new Date(beforeSave.getTime() + 24 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
new Date(afterSave.getTime() + 24 * 60 * 60 * 1000 + 1000).getTime(),
);
});
it('should save a conversation without expiredAt when isTemporary is false', async () => {
mockReq.body = { isTemporary: false };
const result = await saveConvo(mockReq, mockConversationData);
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.expiredAt).toBeNull();
});
it('should save a conversation without expiredAt when isTemporary is not provided', async () => {
mockReq.body = {};
const result = await saveConvo(mockReq, mockConversationData);
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.expiredAt).toBeNull();
});
it('should use custom retention period from config', async () => {
mockReq.config.interfaceConfig.temporaryChatRetention = 48;
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveConvo(mockReq, mockConversationData);
expect(result.expiredAt).toBeDefined();
// Verify expiredAt is approximately 48 hours in the future
const expectedExpirationTime = new Date(beforeSave.getTime() + 48 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should handle minimum retention period (1 hour)', async () => {
// Mock app config with less than minimum retention
mockReq.config.interfaceConfig.temporaryChatRetention = 0.5; // Half hour - should be clamped to 1 hour
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveConvo(mockReq, mockConversationData);
expect(result.expiredAt).toBeDefined();
// Verify expiredAt is approximately 1 hour in the future (minimum)
const expectedExpirationTime = new Date(beforeSave.getTime() + 1 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should handle maximum retention period (8760 hours)', async () => {
// Mock app config with more than maximum retention
mockReq.config.interfaceConfig.temporaryChatRetention = 10000; // Should be clamped to 8760 hours
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveConvo(mockReq, mockConversationData);
expect(result.expiredAt).toBeDefined();
// Verify expiredAt is approximately 8760 hours (1 year) in the future
const expectedExpirationTime = new Date(beforeSave.getTime() + 8760 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should handle missing config gracefully', async () => {
// Simulate missing config - should use default retention period
delete mockReq.config;
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveConvo(mockReq, mockConversationData);
const afterSave = new Date();
// Should still save the conversation with default retention period (30 days)
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.expiredAt).toBeDefined();
expect(result.expiredAt).toBeInstanceOf(Date);
// Verify expiredAt is approximately 30 days in the future (720 hours)
const expectedExpirationTime = new Date(beforeSave.getTime() + 720 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
new Date(afterSave.getTime() + 720 * 60 * 60 * 1000 + 1000).getTime(),
);
});
it('should use default retention when config is not provided', async () => {
// Empty config - should fall back to the default retention period
mockReq.config = {};
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveConvo(mockReq, mockConversationData);
expect(result.expiredAt).toBeDefined();
// Default retention is 30 days (720 hours)
const expectedExpirationTime = new Date(beforeSave.getTime() + 30 * 24 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should update expiredAt when saving existing temporary conversation', async () => {
// First save a temporary conversation
mockReq.config.interfaceConfig.temporaryChatRetention = 24;
mockReq.body = { isTemporary: true };
const firstSave = await saveConvo(mockReq, mockConversationData);
const originalExpiredAt = firstSave.expiredAt;
// Wait a bit to ensure time difference
await new Promise((resolve) => setTimeout(resolve, 100));
// Save again with same conversationId but different title
const updatedData = { ...mockConversationData, title: 'Updated Title' };
const secondSave = await saveConvo(mockReq, updatedData);
// Should update title and create new expiredAt
expect(secondSave.title).toBe('Updated Title');
expect(secondSave.expiredAt).toBeDefined();
expect(new Date(secondSave.expiredAt).getTime()).toBeGreaterThan(
new Date(originalExpiredAt).getTime(),
);
});
it('should not set expiredAt when updating non-temporary conversation', async () => {
// First save a non-temporary conversation
mockReq.body = { isTemporary: false };
const firstSave = await saveConvo(mockReq, mockConversationData);
expect(firstSave.expiredAt).toBeNull();
// Update without isTemporary flag
mockReq.body = {};
const updatedData = { ...mockConversationData, title: 'Updated Title' };
const secondSave = await saveConvo(mockReq, updatedData);
expect(secondSave.title).toBe('Updated Title');
expect(secondSave.expiredAt).toBeNull();
});
it('should filter out expired conversations in getConvosByCursor', async () => {
// Create some test conversations
const nonExpiredConvo = await Conversation.create({
conversationId: uuidv4(),
user: 'user123',
title: 'Non-expired',
endpoint: EModelEndpoint.openAI,
expiredAt: null,
updatedAt: new Date(),
});
await Conversation.create({
conversationId: uuidv4(),
user: 'user123',
title: 'Future expired',
endpoint: EModelEndpoint.openAI,
expiredAt: new Date(Date.now() + 24 * 60 * 60 * 1000), // 24 hours from now
updatedAt: new Date(),
});
// Mock Meili search
Conversation.meiliSearch = jest.fn().mockResolvedValue({ hits: [] });
const result = await getConvosByCursor('user123');
// Should only return conversations with null or non-existent expiredAt
expect(result.conversations).toHaveLength(1);
expect(result.conversations[0].conversationId).toBe(nonExpiredConvo.conversationId);
});
it('should filter out expired conversations in getConvosQueried', async () => {
// Create test conversations
const nonExpiredConvo = await Conversation.create({
conversationId: uuidv4(),
user: 'user123',
title: 'Non-expired',
endpoint: EModelEndpoint.openAI,
expiredAt: null,
});
const expiredConvo = await Conversation.create({
conversationId: uuidv4(),
user: 'user123',
title: 'Expired',
endpoint: EModelEndpoint.openAI,
expiredAt: new Date(Date.now() + 24 * 60 * 60 * 1000),
});
const convoIds = [
{ conversationId: nonExpiredConvo.conversationId },
{ conversationId: expiredConvo.conversationId },
];
const result = await getConvosQueried('user123', convoIds);
// Should only return the non-expired conversation
expect(result.conversations).toHaveLength(1);
expect(result.conversations[0].conversationId).toBe(nonExpiredConvo.conversationId);
expect(result.convoMap[nonExpiredConvo.conversationId]).toBeDefined();
expect(result.convoMap[expiredConvo.conversationId]).toBeUndefined();
});
});
describe('searchConversation', () => {
it('should find a conversation by conversationId', async () => {
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
title: 'Test',
endpoint: EModelEndpoint.openAI,
});
const result = await searchConversation(mockConversationData.conversationId);
expect(result).toBeTruthy();
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.user).toBe('user123');
expect(result.title).toBeUndefined(); // Only returns conversationId and user
});
it('should return null if conversation not found', async () => {
const result = await searchConversation('non-existent-id');
expect(result).toBeNull();
});
});
describe('getConvo', () => {
it('should retrieve a conversation for a user', async () => {
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
title: 'Test Conversation',
endpoint: EModelEndpoint.openAI,
});
const result = await getConvo('user123', mockConversationData.conversationId);
expect(result.conversationId).toBe(mockConversationData.conversationId);
expect(result.user).toBe('user123');
expect(result.title).toBe('Test Conversation');
});
it('should return null if conversation not found', async () => {
const result = await getConvo('user123', 'non-existent-id');
expect(result).toBeNull();
});
});
describe('getConvoTitle', () => {
it('should return the conversation title', async () => {
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
title: 'Test Title',
endpoint: EModelEndpoint.openAI,
});
const result = await getConvoTitle('user123', mockConversationData.conversationId);
expect(result).toBe('Test Title');
});
it('should return null if conversation has no title', async () => {
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
title: null,
endpoint: EModelEndpoint.openAI,
});
const result = await getConvoTitle('user123', mockConversationData.conversationId);
expect(result).toBeNull();
});
it('should return "New Chat" if conversation not found', async () => {
const result = await getConvoTitle('user123', 'non-existent-id');
expect(result).toBe('New Chat');
});
});
describe('getConvoFiles', () => {
it('should return conversation files', async () => {
const files = ['file1', 'file2'];
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
endpoint: EModelEndpoint.openAI,
files,
});
const result = await getConvoFiles(mockConversationData.conversationId);
expect(result).toEqual(files);
});
it('should return empty array if no files', async () => {
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
endpoint: EModelEndpoint.openAI,
});
const result = await getConvoFiles(mockConversationData.conversationId);
expect(result).toEqual([]);
});
it('should return empty array if conversation not found', async () => {
const result = await getConvoFiles('non-existent-id');
expect(result).toEqual([]);
});
});
describe('deleteConvos', () => {
it('should delete conversations and associated messages', async () => {
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user123',
title: 'To Delete',
endpoint: EModelEndpoint.openAI,
});
deleteMessages.mockResolvedValue({ deletedCount: 5 });
const result = await deleteConvos('user123', {
conversationId: mockConversationData.conversationId,
});
expect(result.deletedCount).toBe(1);
expect(result.messages.deletedCount).toBe(5);
expect(deleteMessages).toHaveBeenCalledWith({
conversationId: { $in: [mockConversationData.conversationId] },
});
// Verify conversation was deleted
const deletedConvo = await Conversation.findOne({
conversationId: mockConversationData.conversationId,
});
expect(deletedConvo).toBeNull();
});
it('should throw error if no conversations found', async () => {
await expect(deleteConvos('user123', { conversationId: 'non-existent' })).rejects.toThrow(
'Conversation not found or already deleted.',
);
});
});
describe('deleteNullOrEmptyConversations', () => {
it('should delete conversations with null, empty, or missing conversationIds', async () => {
// conversationId is required by the schema, so documents with null/missing IDs
// can only exist through data corruption and cannot be created here. Instead,
// create a valid conversation and verify the function leaves it untouched.
await Conversation.create({
conversationId: mockConversationData.conversationId,
user: 'user4',
endpoint: EModelEndpoint.openAI,
});
deleteMessages.mockResolvedValue({ deletedCount: 0 });
const result = await deleteNullOrEmptyConversations();
expect(result.conversations.deletedCount).toBe(0); // No invalid conversations to delete
expect(result.messages.deletedCount).toBe(0);
// Verify valid conversation remains
const remainingConvos = await Conversation.find({});
expect(remainingConvos).toHaveLength(1);
expect(remainingConvos[0].conversationId).toBe(mockConversationData.conversationId);
});
});
describe('Error Handling', () => {
it('should handle database errors in saveConvo', async () => {
// Force a database error by disconnecting
await mongoose.disconnect();
const result = await saveConvo(mockReq, mockConversationData);
expect(result).toEqual({ message: 'Error saving conversation' });
// Reconnect for other tests
await mongoose.connect(mongoServer.getUri());
});
});
describe('getConvosByCursor pagination', () => {
/**
* Helper to create conversations with specific timestamps
* Uses collection.insertOne to bypass Mongoose timestamps entirely
*/
const createConvoWithTimestamps = async (index, createdAt, updatedAt) => {
const conversationId = uuidv4();
// Use collection-level insert to bypass Mongoose timestamps
await Conversation.collection.insertOne({
conversationId,
user: 'user123',
title: `Conversation ${index}`,
endpoint: EModelEndpoint.openAI,
expiredAt: null,
isArchived: false,
createdAt,
updatedAt,
});
return Conversation.findOne({ conversationId }).lean();
};
it('should not skip conversations at page boundaries', async () => {
// Create 30 conversations to ensure pagination (limit is 25)
const baseTime = new Date('2026-01-01T00:00:00.000Z');
for (let i = 0; i < 30; i++) {
const updatedAt = new Date(baseTime.getTime() - i * 60000); // Each 1 minute apart
await createConvoWithTimestamps(i, updatedAt, updatedAt);
}
// Fetch first page
const page1 = await getConvosByCursor('user123', { limit: 25 });
expect(page1.conversations).toHaveLength(25);
expect(page1.nextCursor).toBeTruthy();
// Fetch second page using cursor
const page2 = await getConvosByCursor('user123', {
limit: 25,
cursor: page1.nextCursor,
});
// Should get remaining 5 conversations
expect(page2.conversations).toHaveLength(5);
expect(page2.nextCursor).toBeNull();
// Verify no duplicates and no gaps
const allIds = [
...page1.conversations.map((c) => c.conversationId),
...page2.conversations.map((c) => c.conversationId),
];
const uniqueIds = new Set(allIds);
expect(uniqueIds.size).toBe(30); // All 30 conversations accounted for
expect(allIds.length).toBe(30); // No duplicates
});
it('should include conversation at exact page boundary (item 26 bug fix)', async () => {
// This test specifically verifies the fix for the bug where item 26
// (the first item that should appear on page 2) was being skipped
const baseTime = new Date('2026-01-01T12:00:00.000Z');
// Create exactly 26 conversations
const convos = [];
for (let i = 0; i < 26; i++) {
const updatedAt = new Date(baseTime.getTime() - i * 60000);
const convo = await createConvoWithTimestamps(i, updatedAt, updatedAt);
convos.push(convo);
}
// The 26th conversation (index 25) should be on page 2
const item26 = convos[25];
// Fetch first page with limit 25
const page1 = await getConvosByCursor('user123', { limit: 25 });
expect(page1.conversations).toHaveLength(25);
expect(page1.nextCursor).toBeTruthy();
// Item 26 should NOT be in page 1
const page1Ids = page1.conversations.map((c) => c.conversationId);
expect(page1Ids).not.toContain(item26.conversationId);
// Fetch second page
const page2 = await getConvosByCursor('user123', {
limit: 25,
cursor: page1.nextCursor,
});
// Item 26 MUST be in page 2 (this was the bug - it was being skipped)
expect(page2.conversations).toHaveLength(1);
expect(page2.conversations[0].conversationId).toBe(item26.conversationId);
});
it('should sort by updatedAt DESC by default', async () => {
// Create conversations with different updatedAt times
// Note: createdAt is older but updatedAt varies
const convo1 = await createConvoWithTimestamps(
1,
new Date('2026-01-01T00:00:00.000Z'), // oldest created
new Date('2026-01-03T00:00:00.000Z'), // most recently updated
);
const convo2 = await createConvoWithTimestamps(
2,
new Date('2026-01-02T00:00:00.000Z'), // middle created
new Date('2026-01-02T00:00:00.000Z'), // middle updated
);
const convo3 = await createConvoWithTimestamps(
3,
new Date('2026-01-03T00:00:00.000Z'), // newest created
new Date('2026-01-01T00:00:00.000Z'), // oldest updated
);
const result = await getConvosByCursor('user123');
// Should be sorted by updatedAt DESC (most recent first)
expect(result.conversations).toHaveLength(3);
expect(result.conversations[0].conversationId).toBe(convo1.conversationId); // Jan 3 updatedAt
expect(result.conversations[1].conversationId).toBe(convo2.conversationId); // Jan 2 updatedAt
expect(result.conversations[2].conversationId).toBe(convo3.conversationId); // Jan 1 updatedAt
});
it('should handle conversations with same updatedAt (tie-breaker)', async () => {
const sameTime = new Date('2026-01-01T12:00:00.000Z');
// Create 3 conversations with exact same updatedAt
const convo1 = await createConvoWithTimestamps(1, sameTime, sameTime);
const convo2 = await createConvoWithTimestamps(2, sameTime, sameTime);
const convo3 = await createConvoWithTimestamps(3, sameTime, sameTime);
const result = await getConvosByCursor('user123');
// All 3 should be returned (no skipping due to same timestamps)
expect(result.conversations).toHaveLength(3);
const returnedIds = result.conversations.map((c) => c.conversationId);
expect(returnedIds).toContain(convo1.conversationId);
expect(returnedIds).toContain(convo2.conversationId);
expect(returnedIds).toContain(convo3.conversationId);
});
it('should handle cursor pagination with conversations updated during pagination', async () => {
// Simulate the scenario where a conversation is updated between page fetches
const baseTime = new Date('2026-01-01T00:00:00.000Z');
// Create 30 conversations
for (let i = 0; i < 30; i++) {
const updatedAt = new Date(baseTime.getTime() - i * 60000);
await createConvoWithTimestamps(i, updatedAt, updatedAt);
}
// Fetch first page
const page1 = await getConvosByCursor('user123', { limit: 25 });
expect(page1.conversations).toHaveLength(25);
// Now update one of the conversations that should be on page 2
// to have a newer updatedAt (simulating user activity during pagination)
const convosOnPage2 = await Conversation.find({ user: 'user123' })
.sort({ updatedAt: -1 })
.skip(25)
.limit(5);
if (convosOnPage2.length > 0) {
const updatedConvo = convosOnPage2[0];
await Conversation.updateOne(
{ _id: updatedConvo._id },
{ updatedAt: new Date('2026-01-02T00:00:00.000Z') }, // Much newer
);
}
// Fetch second page with original cursor
const page2 = await getConvosByCursor('user123', {
limit: 25,
cursor: page1.nextCursor,
});
// The updated conversation might not be in page 2 anymore
// (it moved to the front), but we should still get remaining items
// without errors and without infinite loops
expect(page2.conversations.length).toBeGreaterThanOrEqual(0);
});
it('should correctly decode and use cursor for pagination', async () => {
const baseTime = new Date('2026-01-01T00:00:00.000Z');
// Create 30 conversations
for (let i = 0; i < 30; i++) {
const updatedAt = new Date(baseTime.getTime() - i * 60000);
await createConvoWithTimestamps(i, updatedAt, updatedAt);
}
// Fetch first page
const page1 = await getConvosByCursor('user123', { limit: 25 });
// Decode the cursor to verify it's based on the last RETURNED item
const decodedCursor = JSON.parse(Buffer.from(page1.nextCursor, 'base64').toString());
// The cursor should match the last item in page1 (item at index 24)
const lastReturnedItem = page1.conversations[24];
expect(new Date(decodedCursor.primary).getTime()).toBe(
new Date(lastReturnedItem.updatedAt).getTime(),
);
});
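// Illustrative sketch (not necessarily the production implementation): a
// cursor of the shape asserted above round-trips as base64-encoded JSON,
// where `primary` holds the sort-key value (updatedAt) of the last returned item:
//   const encodeCursor = (primary) =>
//     Buffer.from(JSON.stringify({ primary })).toString('base64');
//   const decodeCursor = (cursor) =>
//     JSON.parse(Buffer.from(cursor, 'base64').toString());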
it('should support sortBy createdAt when explicitly requested', async () => {
// Create conversations with different timestamps
const convo1 = await createConvoWithTimestamps(
1,
new Date('2026-01-03T00:00:00.000Z'), // newest created
new Date('2026-01-01T00:00:00.000Z'), // oldest updated
);
const convo2 = await createConvoWithTimestamps(
2,
new Date('2026-01-01T00:00:00.000Z'), // oldest created
new Date('2026-01-03T00:00:00.000Z'), // newest updated
);
// Verify timestamps were set correctly
expect(new Date(convo1.createdAt).getTime()).toBe(
new Date('2026-01-03T00:00:00.000Z').getTime(),
);
expect(new Date(convo2.createdAt).getTime()).toBe(
new Date('2026-01-01T00:00:00.000Z').getTime(),
);
const result = await getConvosByCursor('user123', { sortBy: 'createdAt' });
// Should be sorted by createdAt DESC
expect(result.conversations).toHaveLength(2);
expect(result.conversations[0].conversationId).toBe(convo1.conversationId); // Jan 3 createdAt
expect(result.conversations[1].conversationId).toBe(convo2.conversationId); // Jan 1 createdAt
});
it('should handle empty result set gracefully', async () => {
const result = await getConvosByCursor('user123');
expect(result.conversations).toHaveLength(0);
expect(result.nextCursor).toBeNull();
});
it('should handle exactly limit number of conversations (no next page)', async () => {
const baseTime = new Date('2026-01-01T00:00:00.000Z');
// Create exactly 25 conversations (equal to default limit)
for (let i = 0; i < 25; i++) {
const updatedAt = new Date(baseTime.getTime() - i * 60000);
await createConvoWithTimestamps(i, updatedAt, updatedAt);
}
const result = await getConvosByCursor('user123', { limit: 25 });
expect(result.conversations).toHaveLength(25);
expect(result.nextCursor).toBeNull(); // No next page
});
});
});


@@ -1,284 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { ConversationTag, Conversation } = require('~/db/models');
/**
* Retrieves all conversation tags for a user.
* @param {string} user - The user ID.
* @returns {Promise<Array>} An array of conversation tags.
*/
const getConversationTags = async (user) => {
try {
return await ConversationTag.find({ user }).sort({ position: 1 }).lean();
} catch (error) {
logger.error('[getConversationTags] Error getting conversation tags', error);
throw new Error('Error getting conversation tags');
}
};
/**
* Creates a new conversation tag.
* @param {string} user - The user ID.
* @param {Object} data - The tag data.
* @param {string} data.tag - The tag name.
* @param {string} [data.description] - The tag description.
* @param {boolean} [data.addToConversation] - Whether to add the tag to a conversation.
* @param {string} [data.conversationId] - The conversation ID to add the tag to.
* @returns {Promise<Object>} The created tag.
*/
const createConversationTag = async (user, data) => {
try {
const { tag, description, addToConversation, conversationId } = data;
const existingTag = await ConversationTag.findOne({ user, tag }).lean();
if (existingTag) {
return existingTag;
}
const maxPosition = await ConversationTag.findOne({ user }).sort('-position').lean();
const position = (maxPosition?.position || 0) + 1;
const newTag = await ConversationTag.findOneAndUpdate(
{ tag, user },
{
tag,
user,
count: addToConversation ? 1 : 0,
position,
description,
$setOnInsert: { createdAt: new Date() },
},
{
new: true,
upsert: true,
lean: true,
},
);
if (addToConversation && conversationId) {
await Conversation.findOneAndUpdate(
{ user, conversationId },
{ $addToSet: { tags: tag } },
{ new: true },
);
}
return newTag;
} catch (error) {
logger.error('[createConversationTag] Error creating conversation tag', error);
throw new Error('Error creating conversation tag');
}
};
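// Usage sketch: create a tag and attach it to a conversation in one call
// (only `tag` is required; the remaining fields are optional):
//   await createConversationTag(userId, {
//     tag: 'work',
//     description: 'Work-related chats',
//     addToConversation: true,
//     conversationId,
//   });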
/**
* Updates an existing conversation tag.
* @param {string} user - The user ID.
* @param {string} oldTag - The current tag name.
* @param {Object} data - The updated tag data.
* @param {string} [data.tag] - The new tag name.
* @param {string} [data.description] - The updated description.
* @param {number} [data.position] - The new position.
* @returns {Promise<Object>} The updated tag.
*/
const updateConversationTag = async (user, oldTag, data) => {
try {
const { tag: newTag, description, position } = data;
const existingTag = await ConversationTag.findOne({ user, tag: oldTag }).lean();
if (!existingTag) {
return null;
}
if (newTag && newTag !== oldTag) {
const tagAlreadyExists = await ConversationTag.findOne({ user, tag: newTag }).lean();
if (tagAlreadyExists) {
throw new Error('Tag already exists');
}
await Conversation.updateMany({ user, tags: oldTag }, { $set: { 'tags.$': newTag } });
}
const updateData = {};
if (newTag) {
updateData.tag = newTag;
}
if (description !== undefined) {
updateData.description = description;
}
if (position !== undefined) {
await adjustPositions(user, existingTag.position, position);
updateData.position = position;
}
return await ConversationTag.findOneAndUpdate({ user, tag: oldTag }, updateData, {
new: true,
lean: true,
});
} catch (error) {
logger.error('[updateConversationTag] Error updating conversation tag', error);
throw new Error('Error updating conversation tag');
}
};
/**
* Adjusts positions of tags when a tag's position is changed.
* @param {string} user - The user ID.
* @param {number} oldPosition - The old position of the tag.
* @param {number} newPosition - The new position of the tag.
* @returns {Promise<void>}
*/
const adjustPositions = async (user, oldPosition, newPosition) => {
if (oldPosition === newPosition) {
return;
}
const update = oldPosition < newPosition ? { $inc: { position: -1 } } : { $inc: { position: 1 } };
const position =
oldPosition < newPosition
? {
$gt: Math.min(oldPosition, newPosition),
$lte: Math.max(oldPosition, newPosition),
}
: {
$gte: Math.min(oldPosition, newPosition),
$lt: Math.max(oldPosition, newPosition),
};
await ConversationTag.updateMany(
{
user,
position,
},
update,
);
};
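// Worked example: moving a tag from position 1 to position 3 decrements the
// tags previously at positions 2..3, opening a slot that the caller then
// assigns to the moved tag:
//   before: A(1) B(2) C(3)
//   adjustPositions(user, 1, 3)  ->  B(1), C(2); caller then writes A(3)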
/**
* Deletes a conversation tag.
* @param {string} user - The user ID.
* @param {string} tag - The tag to delete.
* @returns {Promise<Object>} The deleted tag.
*/
const deleteConversationTag = async (user, tag) => {
try {
const deletedTag = await ConversationTag.findOneAndDelete({ user, tag }).lean();
if (!deletedTag) {
return null;
}
await Conversation.updateMany({ user, tags: tag }, { $pullAll: { tags: [tag] } });
await ConversationTag.updateMany(
{ user, position: { $gt: deletedTag.position } },
{ $inc: { position: -1 } },
);
return deletedTag;
} catch (error) {
logger.error('[deleteConversationTag] Error deleting conversation tag', error);
throw new Error('Error deleting conversation tag');
}
};
/**
* Updates tags for a specific conversation.
* @param {string} user - The user ID.
* @param {string} conversationId - The conversation ID.
* @param {string[]} tags - The new set of tags for the conversation.
* @returns {Promise<string[]>} The updated list of tags for the conversation.
*/
const updateTagsForConversation = async (user, conversationId, tags) => {
try {
const conversation = await Conversation.findOne({ user, conversationId }).lean();
if (!conversation) {
throw new Error('Conversation not found');
}
const oldTags = new Set(conversation.tags);
const newTags = new Set(tags);
const addedTags = [...newTags].filter((tag) => !oldTags.has(tag));
const removedTags = [...oldTags].filter((tag) => !newTags.has(tag));
const bulkOps = [];
for (const tag of addedTags) {
bulkOps.push({
updateOne: {
filter: { user, tag },
update: { $inc: { count: 1 } },
upsert: true,
},
});
}
for (const tag of removedTags) {
bulkOps.push({
updateOne: {
filter: { user, tag },
update: { $inc: { count: -1 } },
},
});
}
if (bulkOps.length > 0) {
await ConversationTag.bulkWrite(bulkOps);
}
const updatedConversation = (
await Conversation.findOneAndUpdate(
{ user, conversationId },
{ $set: { tags: [...newTags] } },
{ new: true },
)
).toObject();
return updatedConversation.tags;
} catch (error) {
logger.error('[updateTagsForConversation] Error updating tags', error);
throw new Error('Error updating tags for conversation');
}
};
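// Note: tag counts stay consistent by diffing the old and new tag sets
// (added tags get +1, removed tags get -1) in a single bulkWrite, so a full
// replacement of `tags` costs O(changed tags) rather than a recount.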
/**
* Increments tag counts for existing tags only.
* @param {string} user - The user ID.
* @param {string[]} tags - Array of tag names to increment.
* @returns {Promise<void>}
*/
const bulkIncrementTagCounts = async (user, tags) => {
if (!tags || tags.length === 0) {
return;
}
try {
const uniqueTags = [...new Set(tags.filter(Boolean))];
if (uniqueTags.length === 0) {
return;
}
const bulkOps = uniqueTags.map((tag) => ({
updateOne: {
filter: { user, tag },
update: { $inc: { count: 1 } },
},
}));
const result = await ConversationTag.bulkWrite(bulkOps);
if (result && result.modifiedCount > 0) {
logger.debug(
`user: ${user} | Incremented tag counts - modified ${result.modifiedCount} tags`,
);
}
} catch (error) {
logger.error('[bulkIncrementTagCounts] Error incrementing tag counts', error);
}
};
module.exports = {
getConversationTags,
createConversationTag,
updateConversationTag,
deleteConversationTag,
bulkIncrementTagCounts,
updateTagsForConversation,
};


@@ -1,114 +0,0 @@
const mongoose = require('mongoose');
const { MongoMemoryServer } = require('mongodb-memory-server');
const { ConversationTag, Conversation } = require('~/db/models');
const { deleteConversationTag } = require('./ConversationTag');
let mongoServer;
beforeAll(async () => {
mongoServer = await MongoMemoryServer.create();
await mongoose.connect(mongoServer.getUri());
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
});
afterEach(async () => {
await ConversationTag.deleteMany({});
await Conversation.deleteMany({});
});
describe('ConversationTag model - $pullAll operations', () => {
const userId = new mongoose.Types.ObjectId().toString();
describe('deleteConversationTag', () => {
it('should remove the tag from all conversations that have it', async () => {
await ConversationTag.create({ tag: 'work', user: userId, position: 1 });
await Conversation.create([
{ conversationId: 'conv1', user: userId, endpoint: 'openAI', tags: ['work', 'important'] },
{ conversationId: 'conv2', user: userId, endpoint: 'openAI', tags: ['work'] },
{ conversationId: 'conv3', user: userId, endpoint: 'openAI', tags: ['personal'] },
]);
await deleteConversationTag(userId, 'work');
const convos = await Conversation.find({ user: userId }).sort({ conversationId: 1 }).lean();
expect(convos[0].tags).toEqual(['important']);
expect(convos[1].tags).toEqual([]);
expect(convos[2].tags).toEqual(['personal']);
});
it('should delete the tag document itself', async () => {
await ConversationTag.create({ tag: 'temp', user: userId, position: 1 });
const result = await deleteConversationTag(userId, 'temp');
expect(result).toBeDefined();
expect(result.tag).toBe('temp');
const remaining = await ConversationTag.find({ user: userId }).lean();
expect(remaining).toHaveLength(0);
});
it('should return null when the tag does not exist', async () => {
const result = await deleteConversationTag(userId, 'nonexistent');
expect(result).toBeNull();
});
it('should adjust positions of tags after the deleted one', async () => {
await ConversationTag.create([
{ tag: 'first', user: userId, position: 1 },
{ tag: 'second', user: userId, position: 2 },
{ tag: 'third', user: userId, position: 3 },
]);
await deleteConversationTag(userId, 'first');
const tags = await ConversationTag.find({ user: userId }).sort({ position: 1 }).lean();
expect(tags).toHaveLength(2);
expect(tags[0].tag).toBe('second');
expect(tags[0].position).toBe(1);
expect(tags[1].tag).toBe('third');
expect(tags[1].position).toBe(2);
});
it('should not affect conversations of other users', async () => {
const otherUser = new mongoose.Types.ObjectId().toString();
await ConversationTag.create({ tag: 'shared-name', user: userId, position: 1 });
await ConversationTag.create({ tag: 'shared-name', user: otherUser, position: 1 });
await Conversation.create([
{ conversationId: 'mine', user: userId, endpoint: 'openAI', tags: ['shared-name'] },
{ conversationId: 'theirs', user: otherUser, endpoint: 'openAI', tags: ['shared-name'] },
]);
await deleteConversationTag(userId, 'shared-name');
const myConvo = await Conversation.findOne({ conversationId: 'mine' }).lean();
const theirConvo = await Conversation.findOne({ conversationId: 'theirs' }).lean();
expect(myConvo.tags).toEqual([]);
expect(theirConvo.tags).toEqual(['shared-name']);
});
it('should handle duplicate tags in conversations correctly', async () => {
await ConversationTag.create({ tag: 'dup', user: userId, position: 1 });
const conv = await Conversation.create({
conversationId: 'conv-dup',
user: userId,
endpoint: 'openAI',
tags: ['dup', 'other', 'dup'],
});
await deleteConversationTag(userId, 'dup');
const updated = await Conversation.findById(conv._id).lean();
expect(updated.tags).toEqual(['other']);
});
});
});


@@ -1,250 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { EToolResources, FileContext } = require('librechat-data-provider');
const { File } = require('~/db/models');
/**
* Finds a file by its file_id with additional query options.
* @param {string} file_id - The unique identifier of the file.
* @param {object} options - Query options for filtering, projection, etc.
* @returns {Promise<MongoFile>} A promise that resolves to the file document or null.
*/
const findFileById = async (file_id, options = {}) => {
return await File.findOne({ file_id, ...options }).lean();
};
/**
* Retrieves files matching a given filter, sorted by the most recently updated.
* @param {Object} filter - The filter criteria to apply.
* @param {Object} [_sortOptions] - Optional sort parameters.
* @param {Object|String} [selectFields={ text: 0 }] - Fields to include/exclude in the query results.
* Default excludes the 'text' field.
* @returns {Promise<Array<MongoFile>>} A promise that resolves to an array of file documents.
*/
const getFiles = async (filter, _sortOptions, selectFields = { text: 0 }) => {
const sortOptions = { updatedAt: -1, ..._sortOptions };
return await File.find(filter).select(selectFields).sort(sortOptions).lean();
};
/**
* Retrieves tool files from an array of file IDs: context files with extracted text and/or embedded file_search files.
* Note: execute_code files are handled separately by getCodeGeneratedFiles.
* @param {string[]} fileIds - Array of file_id strings to search for
* @param {Set<EToolResources>} toolResourceSet - Optional filter for tool resources
* @returns {Promise<Array<MongoFile>>} Files that match the criteria
*/
const getToolFilesByIds = async (fileIds, toolResourceSet) => {
if (!fileIds || !fileIds.length || !toolResourceSet?.size) {
return [];
}
try {
const orConditions = [];
if (toolResourceSet.has(EToolResources.context)) {
orConditions.push({ text: { $exists: true, $ne: null }, context: FileContext.agents });
}
if (toolResourceSet.has(EToolResources.file_search)) {
orConditions.push({ embedded: true });
}
if (orConditions.length === 0) {
return [];
}
const filter = {
file_id: { $in: fileIds },
context: { $ne: FileContext.execute_code }, // Exclude code-generated files
$or: orConditions,
};
const selectFields = { text: 0 };
const sortOptions = { updatedAt: -1 };
return await getFiles(filter, sortOptions, selectFields);
} catch (error) {
logger.error('[getToolFilesByIds] Error retrieving tool files:', error);
throw new Error('Error retrieving tool files');
}
};
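// Usage sketch (caller shape assumed; the enum values are the ones used above):
//   const toolFiles = await getToolFilesByIds(
//     fileIds,
//     new Set([EToolResources.context, EToolResources.file_search]),
//   );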
/**
* Retrieves files generated by code execution for a given conversation.
* These files are stored locally with fileIdentifier metadata for code env re-upload.
* @param {string} conversationId - The conversation ID to search for
* @param {string[]} [messageIds] - Optional array of messageIds to filter by (for linear thread filtering)
* @returns {Promise<Array<MongoFile>>} Files generated by code execution in the conversation
*/
const getCodeGeneratedFiles = async (conversationId, messageIds) => {
if (!conversationId) {
return [];
}
/** messageIds are required for proper thread filtering of code-generated files */
if (!messageIds || messageIds.length === 0) {
return [];
}
try {
const filter = {
conversationId,
context: FileContext.execute_code,
messageId: { $exists: true, $in: messageIds },
'metadata.fileIdentifier': { $exists: true },
};
const selectFields = { text: 0 };
const sortOptions = { createdAt: 1 };
return await getFiles(filter, sortOptions, selectFields);
} catch (error) {
logger.error('[getCodeGeneratedFiles] Error retrieving code generated files:', error);
return [];
}
};
/**
* Retrieves user-uploaded execute_code files (as opposed to code-generated ones) by their file IDs.
* These are files with fileIdentifier metadata whose context is NOT execute_code (e.g., agents or message_attachment).
* File IDs should be collected from message.files arrays in the current thread.
* @param {string[]} fileIds - Array of file IDs to fetch (from message.files in the thread)
* @returns {Promise<Array<MongoFile>>} User-uploaded execute_code files
*/
const getUserCodeFiles = async (fileIds) => {
if (!fileIds || fileIds.length === 0) {
return [];
}
try {
const filter = {
file_id: { $in: fileIds },
context: { $ne: FileContext.execute_code },
'metadata.fileIdentifier': { $exists: true },
};
const selectFields = { text: 0 };
const sortOptions = { createdAt: 1 };
return await getFiles(filter, sortOptions, selectFields);
} catch (error) {
logger.error('[getUserCodeFiles] Error retrieving user code files:', error);
return [];
}
};
/**
* Creates a new file with a TTL of 1 hour.
* @param {MongoFile} data - The file data to be created, must contain file_id.
* @param {boolean} [disableTTL] - Whether to disable the TTL.
* @returns {Promise<MongoFile>} A promise that resolves to the created file document.
*/
const createFile = async (data, disableTTL) => {
const fileData = {
...data,
expiresAt: new Date(Date.now() + 3600 * 1000),
};
if (disableTTL) {
delete fileData.expiresAt;
}
return await File.findOneAndUpdate({ file_id: data.file_id }, fileData, {
new: true,
upsert: true,
}).lean();
};
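// Usage sketch: uploads start with a 1-hour TTL so abandoned files can be
// reaped; pass `disableTTL = true` (or later call updateFile/updateFileUsage,
// which $unset expiresAt) to make the file permanent:
//   const file = await createFile({ file_id, user, filename, filepath }, true);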
/**
* Updates a file identified by file_id with new data and removes the TTL.
* @param {MongoFile} data - The data to update, must contain file_id.
* @returns {Promise<MongoFile>} A promise that resolves to the updated file document.
*/
const updateFile = async (data) => {
const { file_id, ...update } = data;
const updateOperation = {
$set: update,
$unset: { expiresAt: '' }, // Remove the expiresAt field to prevent TTL
};
return await File.findOneAndUpdate({ file_id }, updateOperation, { new: true }).lean();
};
/**
* Increments the usage of a file identified by file_id.
* @param {MongoFile} data - The data to update, must contain file_id and the increment value for usage.
* @returns {Promise<MongoFile>} A promise that resolves to the updated file document.
*/
const updateFileUsage = async (data) => {
const { file_id, inc = 1 } = data;
const updateOperation = {
$inc: { usage: inc },
$unset: { expiresAt: '', temp_file_id: '' },
};
return await File.findOneAndUpdate({ file_id }, updateOperation, { new: true }).lean();
};
/**
* Deletes a file identified by file_id.
* @param {string} file_id - The unique identifier of the file to delete.
* @returns {Promise<MongoFile>} A promise that resolves to the deleted file document or null.
*/
const deleteFile = async (file_id) => {
return await File.findOneAndDelete({ file_id }).lean();
};
/**
* Deletes a file identified by a filter.
* @param {object} filter - The filter criteria to apply.
* @returns {Promise<MongoFile>} A promise that resolves to the deleted file document or null.
*/
const deleteFileByFilter = async (filter) => {
return await File.findOneAndDelete(filter).lean();
};
/**
* Deletes multiple files identified by an array of file_ids.
* @param {Array<string>} file_ids - The unique identifiers of the files to delete.
* @param {string} [user] - Optional user ID; when provided, the file_ids filter is replaced and all of that user's files are deleted.
* @returns {Promise<Object>} A promise that resolves to the result of the deletion operation.
*/
const deleteFiles = async (file_ids, user) => {
let deleteQuery = { file_id: { $in: file_ids } };
if (user) {
deleteQuery = { user: user };
}
return await File.deleteMany(deleteQuery);
};
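// Caveat: as documented above, passing `user` replaces the file_id filter
// entirely; callers that need "only these ids, scoped to this user" should
// not pass `user` here.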
/**
* Batch updates files with new signed URLs in MongoDB
*
* @param {MongoFile[]} updates - Array of updates in the format { file_id, filepath }
* @returns {Promise<void>}
*/
async function batchUpdateFiles(updates) {
if (!updates || updates.length === 0) {
return;
}
const bulkOperations = updates.map((update) => ({
updateOne: {
filter: { file_id: update.file_id },
update: { $set: { filepath: update.filepath } },
},
}));
const result = await File.bulkWrite(bulkOperations);
logger.info(`Updated ${result.modifiedCount} files with new S3 URLs`);
}
module.exports = {
findFileById,
getFiles,
getToolFilesByIds,
getCodeGeneratedFiles,
getUserCodeFiles,
createFile,
updateFile,
updateFileUsage,
deleteFile,
deleteFiles,
deleteFileByFilter,
batchUpdateFiles,
};


@@ -1,629 +0,0 @@
const mongoose = require('mongoose');
const { v4: uuidv4 } = require('uuid');
const { MongoMemoryServer } = require('mongodb-memory-server');
const { createModels, createMethods } = require('@librechat/data-schemas');
const {
SystemRoles,
ResourceType,
AccessRoleIds,
PrincipalType,
} = require('librechat-data-provider');
const { grantPermission } = require('~/server/services/PermissionService');
const { createAgent } = require('./Agent');
let File;
let Agent;
let AclEntry;
let User;
let modelsToCleanup = [];
let methods;
let getFiles;
let createFile;
let seedDefaultRoles;
describe('File Access Control', () => {
let mongoServer;
beforeAll(async () => {
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
await mongoose.connect(mongoUri);
// Initialize all models
const models = createModels(mongoose);
// Track which models we're adding
modelsToCleanup = Object.keys(models);
// Register models on mongoose.models so methods can access them
const dbModels = require('~/db/models');
Object.assign(mongoose.models, dbModels);
File = dbModels.File;
Agent = dbModels.Agent;
AclEntry = dbModels.AclEntry;
User = dbModels.User;
// Create methods from data-schemas (includes file methods)
methods = createMethods(mongoose);
getFiles = methods.getFiles;
createFile = methods.createFile;
seedDefaultRoles = methods.seedDefaultRoles;
// Seed default roles
await seedDefaultRoles();
});
afterAll(async () => {
// Clean up all collections before disconnecting
const collections = mongoose.connection.collections;
for (const key in collections) {
await collections[key].deleteMany({});
}
// Clear only the models we added
for (const modelName of modelsToCleanup) {
if (mongoose.models[modelName]) {
delete mongoose.models[modelName];
}
}
await mongoose.disconnect();
await mongoServer.stop();
});
beforeEach(async () => {
await File.deleteMany({});
await Agent.deleteMany({});
await AclEntry.deleteMany({});
await User.deleteMany({});
// Don't delete AccessRole as they are seeded defaults needed for tests
});
describe('hasAccessToFilesViaAgent', () => {
it('should efficiently check access for multiple files at once', async () => {
const userId = new mongoose.Types.ObjectId();
const authorId = new mongoose.Types.ObjectId();
const agentId = uuidv4();
const fileIds = [uuidv4(), uuidv4(), uuidv4(), uuidv4()];
// Create users
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
});
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create files
for (const fileId of fileIds) {
await createFile({
user: authorId,
file_id: fileId,
filename: `file-${fileId}.txt`,
filepath: `/uploads/${fileId}`,
});
}
// Create agent with only first two files attached
const agent = await createAgent({
id: agentId,
name: 'Test Agent',
author: authorId,
model: 'gpt-4',
provider: 'openai',
tool_resources: {
file_search: {
file_ids: [fileIds[0], fileIds[1]],
},
},
});
// Grant EDIT permission to user on the agent
await grantPermission({
principalType: PrincipalType.USER,
principalId: userId,
resourceType: ResourceType.AGENT,
resourceId: agent._id,
accessRoleId: AccessRoleIds.AGENT_EDITOR,
grantedBy: authorId,
});
// Check access for all files
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
const accessMap = await hasAccessToFilesViaAgent({
userId: userId,
role: SystemRoles.USER,
fileIds,
agentId: agent.id, // Use agent.id which is the custom UUID
});
// Should have access only to the first two files
expect(accessMap.get(fileIds[0])).toBe(true);
expect(accessMap.get(fileIds[1])).toBe(true);
expect(accessMap.get(fileIds[2])).toBe(false);
expect(accessMap.get(fileIds[3])).toBe(false);
});
it('should grant access to all files when user is the agent author', async () => {
const authorId = new mongoose.Types.ObjectId();
const agentId = uuidv4();
const fileIds = [uuidv4(), uuidv4(), uuidv4()];
// Create author user
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create agent
await createAgent({
id: agentId,
name: 'Test Agent',
author: authorId,
model: 'gpt-4',
provider: 'openai',
tool_resources: {
file_search: {
file_ids: [fileIds[0]], // Only one file attached
},
},
});
// Check access as the author
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
const accessMap = await hasAccessToFilesViaAgent({
userId: authorId,
role: SystemRoles.USER,
fileIds,
agentId,
});
// Author should have access to all files
expect(accessMap.get(fileIds[0])).toBe(true);
expect(accessMap.get(fileIds[1])).toBe(true);
expect(accessMap.get(fileIds[2])).toBe(true);
});
it('should handle non-existent agent gracefully', async () => {
const userId = new mongoose.Types.ObjectId();
const fileIds = [uuidv4(), uuidv4()];
// Create user
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
});
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
const accessMap = await hasAccessToFilesViaAgent({
userId: userId,
role: SystemRoles.USER,
fileIds,
agentId: 'non-existent-agent',
});
// Should have no access to any files
expect(accessMap.get(fileIds[0])).toBe(false);
expect(accessMap.get(fileIds[1])).toBe(false);
});
it('should deny access when user only has VIEW permission and needs access for deletion', async () => {
const userId = new mongoose.Types.ObjectId();
const authorId = new mongoose.Types.ObjectId();
const agentId = uuidv4();
const fileIds = [uuidv4(), uuidv4()];
// Create users
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
});
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create agent with files
const agent = await createAgent({
id: agentId,
name: 'View-Only Agent',
author: authorId,
model: 'gpt-4',
provider: 'openai',
tool_resources: {
file_search: {
file_ids: fileIds,
},
},
});
// Grant only VIEW permission to user on the agent
await grantPermission({
principalType: PrincipalType.USER,
principalId: userId,
resourceType: ResourceType.AGENT,
resourceId: agent._id,
accessRoleId: AccessRoleIds.AGENT_VIEWER,
grantedBy: authorId,
});
// Check access for files
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
const accessMap = await hasAccessToFilesViaAgent({
userId: userId,
role: SystemRoles.USER,
fileIds,
agentId,
isDelete: true,
});
// Should have no access to any files when only VIEW permission
expect(accessMap.get(fileIds[0])).toBe(false);
expect(accessMap.get(fileIds[1])).toBe(false);
});
it('should grant access when user has VIEW permission', async () => {
const userId = new mongoose.Types.ObjectId();
const authorId = new mongoose.Types.ObjectId();
const agentId = uuidv4();
const fileIds = [uuidv4(), uuidv4()];
// Create users
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
});
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create agent with files
const agent = await createAgent({
id: agentId,
name: 'View-Only Agent',
author: authorId,
model: 'gpt-4',
provider: 'openai',
tool_resources: {
file_search: {
file_ids: fileIds,
},
},
});
// Grant only VIEW permission to user on the agent
await grantPermission({
principalType: PrincipalType.USER,
principalId: userId,
resourceType: ResourceType.AGENT,
resourceId: agent._id,
accessRoleId: AccessRoleIds.AGENT_VIEWER,
grantedBy: authorId,
});
// Check access for files
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
const accessMap = await hasAccessToFilesViaAgent({
userId: userId,
role: SystemRoles.USER,
fileIds,
agentId,
});
expect(accessMap.get(fileIds[0])).toBe(true);
expect(accessMap.get(fileIds[1])).toBe(true);
});
});
describe('getFiles with agent access control', () => {
test('should return files owned by user and files accessible through agent', async () => {
const authorId = new mongoose.Types.ObjectId();
const userId = new mongoose.Types.ObjectId();
const agentId = `agent_${uuidv4()}`;
const ownedFileId = `file_${uuidv4()}`;
const sharedFileId = `file_${uuidv4()}`;
const inaccessibleFileId = `file_${uuidv4()}`;
// Create users
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
});
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create agent with shared file
const agent = await createAgent({
id: agentId,
name: 'Shared Agent',
provider: 'test',
model: 'test-model',
author: authorId,
tool_resources: {
file_search: {
file_ids: [sharedFileId],
},
},
});
// Grant EDIT permission to user on the agent
await grantPermission({
principalType: PrincipalType.USER,
principalId: userId,
resourceType: ResourceType.AGENT,
resourceId: agent._id,
accessRoleId: AccessRoleIds.AGENT_EDITOR,
grantedBy: authorId,
});
// Create files
await createFile({
file_id: ownedFileId,
user: userId,
filename: 'owned.txt',
filepath: '/uploads/owned.txt',
type: 'text/plain',
bytes: 100,
});
await createFile({
file_id: sharedFileId,
user: authorId,
filename: 'shared.txt',
filepath: '/uploads/shared.txt',
type: 'text/plain',
bytes: 200,
embedded: true,
});
await createFile({
file_id: inaccessibleFileId,
user: authorId,
filename: 'inaccessible.txt',
filepath: '/uploads/inaccessible.txt',
type: 'text/plain',
bytes: 300,
});
// Get all files first
const allFiles = await getFiles(
{ file_id: { $in: [ownedFileId, sharedFileId, inaccessibleFileId] } },
null,
{ text: 0 },
);
// Then filter by access control
const { filterFilesByAgentAccess } = require('~/server/services/Files/permissions');
const files = await filterFilesByAgentAccess({
files: allFiles,
userId: userId,
role: SystemRoles.USER,
agentId,
});
expect(files).toHaveLength(2);
expect(files.map((f) => f.file_id)).toContain(ownedFileId);
expect(files.map((f) => f.file_id)).toContain(sharedFileId);
expect(files.map((f) => f.file_id)).not.toContain(inaccessibleFileId);
});
test('should return all files when no userId/agentId provided', async () => {
const userId = new mongoose.Types.ObjectId();
const fileId1 = `file_${uuidv4()}`;
const fileId2 = `file_${uuidv4()}`;
await createFile({
file_id: fileId1,
user: userId,
filename: 'file1.txt',
filepath: '/uploads/file1.txt',
type: 'text/plain',
bytes: 100,
});
await createFile({
file_id: fileId2,
user: new mongoose.Types.ObjectId(),
filename: 'file2.txt',
filepath: '/uploads/file2.txt',
type: 'text/plain',
bytes: 200,
});
const files = await getFiles({ file_id: { $in: [fileId1, fileId2] } });
expect(files).toHaveLength(2);
});
});
describe('Role-based file permissions', () => {
it('should optimize permission checks when role is provided', async () => {
const userId = new mongoose.Types.ObjectId();
const authorId = new mongoose.Types.ObjectId();
const agentId = uuidv4();
const fileIds = [uuidv4(), uuidv4()];
// Create users
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
role: 'ADMIN', // User has ADMIN role
});
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create files
for (const fileId of fileIds) {
await createFile({
file_id: fileId,
user: authorId,
filename: `${fileId}.txt`,
filepath: `/uploads/${fileId}.txt`,
type: 'text/plain',
bytes: 100,
});
}
// Create agent with files
const agent = await createAgent({
id: agentId,
name: 'Test Agent',
author: authorId,
model: 'gpt-4',
provider: 'openai',
tool_resources: {
file_search: {
file_ids: fileIds,
},
},
});
// Grant permission to ADMIN role
await grantPermission({
principalType: PrincipalType.ROLE,
principalId: 'ADMIN',
resourceType: ResourceType.AGENT,
resourceId: agent._id,
accessRoleId: AccessRoleIds.AGENT_EDITOR,
grantedBy: authorId,
});
// Check access with role provided (should avoid DB query)
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
const accessMapWithRole = await hasAccessToFilesViaAgent({
userId: userId,
role: 'ADMIN',
fileIds,
agentId: agent.id,
});
// User should have access through their ADMIN role
expect(accessMapWithRole.get(fileIds[0])).toBe(true);
expect(accessMapWithRole.get(fileIds[1])).toBe(true);
// Check access without role (will query DB to get user's role)
const accessMapWithoutRole = await hasAccessToFilesViaAgent({
userId: userId,
fileIds,
agentId: agent.id,
});
// Should have same result
expect(accessMapWithoutRole.get(fileIds[0])).toBe(true);
expect(accessMapWithoutRole.get(fileIds[1])).toBe(true);
});
it('should deny access when user role changes', async () => {
const userId = new mongoose.Types.ObjectId();
const authorId = new mongoose.Types.ObjectId();
const agentId = uuidv4();
const fileId = uuidv4();
// Create users
await User.create({
_id: userId,
email: 'user@example.com',
emailVerified: true,
provider: 'local',
role: 'EDITOR',
});
await User.create({
_id: authorId,
email: 'author@example.com',
emailVerified: true,
provider: 'local',
});
// Create file
await createFile({
file_id: fileId,
user: authorId,
filename: 'test.txt',
filepath: '/uploads/test.txt',
type: 'text/plain',
bytes: 100,
});
// Create agent
const agent = await createAgent({
id: agentId,
name: 'Test Agent',
author: authorId,
model: 'gpt-4',
provider: 'openai',
tool_resources: {
file_search: {
file_ids: [fileId],
},
},
});
// Grant permission to EDITOR role only
await grantPermission({
principalType: PrincipalType.ROLE,
principalId: 'EDITOR',
resourceType: ResourceType.AGENT,
resourceId: agent._id,
accessRoleId: AccessRoleIds.AGENT_EDITOR,
grantedBy: authorId,
});
const { hasAccessToFilesViaAgent } = require('~/server/services/Files/permissions');
// Check with EDITOR role - should have access
const accessAsEditor = await hasAccessToFilesViaAgent({
userId: userId,
role: 'EDITOR',
fileIds: [fileId],
agentId: agent.id,
});
expect(accessAsEditor.get(fileId)).toBe(true);
// Simulate role change to USER - should lose access
const accessAsUser = await hasAccessToFilesViaAgent({
userId: userId,
role: SystemRoles.USER,
fileIds: [fileId],
agentId: agent.id,
});
expect(accessAsUser.get(fileId)).toBe(false);
});
});
});


@@ -1,372 +0,0 @@
const { z } = require('zod');
const { logger } = require('@librechat/data-schemas');
const { createTempChatExpirationDate } = require('@librechat/api');
const { Message } = require('~/db/models');
const idSchema = z.string().uuid();
/**
* Saves a message in the database.
*
* @async
* @function saveMessage
* @param {ServerRequest} req - The request object containing user information.
* @param {Object} params - The message data object.
* @param {string} params.endpoint - The endpoint where the message originated.
* @param {string} params.iconURL - The URL of the sender's icon.
* @param {string} params.messageId - The unique identifier for the message.
* @param {string} params.newMessageId - The new unique identifier for the message (if applicable).
* @param {string} params.conversationId - The identifier of the conversation.
* @param {string} [params.parentMessageId] - The identifier of the parent message, if any.
* @param {string} params.sender - The identifier of the sender.
* @param {string} params.text - The text content of the message.
* @param {boolean} params.isCreatedByUser - Indicates if the message was created by the user.
* @param {string} [params.error] - Any error associated with the message.
* @param {boolean} [params.unfinished] - Indicates if the message is unfinished.
* @param {Object[]} [params.files] - An array of files associated with the message.
* @param {string} [params.finish_reason] - Reason for finishing the message.
* @param {number} [params.tokenCount] - The number of tokens in the message.
* @param {string} [params.plugin] - Plugin associated with the message.
* @param {string[]} [params.plugins] - An array of plugins associated with the message.
* @param {string} [params.model] - The model used to generate the message.
* @param {Object} [metadata] - Additional metadata for this operation
* @param {string} [metadata.context] - The context of the operation
* @returns {Promise<TMessage>} The updated or newly inserted message document.
* @throws {Error} If there is an error in saving the message.
*/
async function saveMessage(req, params, metadata) {
if (!req?.user?.id) {
throw new Error('User not authenticated');
}
const validConvoId = idSchema.safeParse(params.conversationId);
if (!validConvoId.success) {
logger.warn(`Invalid conversation ID: ${params.conversationId}`);
logger.info(`---\`saveMessage\` context: ${metadata?.context}`);
logger.info(`---Invalid conversation ID Params: ${JSON.stringify(params, null, 2)}`);
return;
}
try {
const update = {
...params,
user: req.user.id,
messageId: params.newMessageId || params.messageId,
};
if (req?.body?.isTemporary) {
try {
const appConfig = req.config;
update.expiredAt = createTempChatExpirationDate(appConfig?.interfaceConfig);
} catch (err) {
logger.error('Error creating temporary chat expiration date:', err);
logger.info(`---\`saveMessage\` context: ${metadata?.context}`);
update.expiredAt = null;
}
} else {
update.expiredAt = null;
}
if (update.tokenCount != null && isNaN(update.tokenCount)) {
logger.warn(
`Resetting invalid \`tokenCount\` for message \`${params.messageId}\`: ${update.tokenCount}`,
);
logger.info(`---\`saveMessage\` context: ${metadata?.context}`);
update.tokenCount = 0;
}
const message = await Message.findOneAndUpdate(
{ messageId: params.messageId, user: req.user.id },
update,
{ upsert: true, new: true },
);
return message.toObject();
} catch (err) {
logger.error('Error saving message:', err);
logger.info(`---\`saveMessage\` context: ${metadata?.context}`);
// Check if this is a duplicate key error (MongoDB error code 11000)
if (err.code === 11000 && err.message.includes('duplicate key error')) {
// Log the duplicate key error but don't crash the application
logger.warn(`Duplicate messageId detected: ${params.messageId}. Continuing execution.`);
try {
// Try to find the existing message with this ID
const existingMessage = await Message.findOne({
messageId: params.messageId,
user: req.user.id,
});
// If we found it, return it
if (existingMessage) {
return existingMessage.toObject();
}
// If we can't find it (unlikely but possible in race conditions)
return {
...params,
messageId: params.messageId,
user: req.user.id,
};
} catch (findError) {
// If the findOne also fails, log it but don't crash
logger.warn(
`Could not retrieve existing message with ID ${params.messageId}: ${findError.message}`,
);
return {
...params,
messageId: params.messageId,
user: req.user.id,
};
}
}
throw err; // Re-throw other errors
}
}
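// Note: the duplicate-key fallback above exists because concurrent upserts on
// the same { messageId, user } pair can race; the losing writer receives
// E11000 and recovers by returning the winning document instead of throwing.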
/**
* Saves multiple messages in the database in bulk.
*
* @async
* @function bulkSaveMessages
* @param {Object[]} messages - An array of message objects to save.
* @param {boolean} [overrideTimestamp=false] - Indicates whether to override the timestamps of the messages. Defaults to false.
* @returns {Promise<Object>} The result of the bulk write operation.
* @throws {Error} If there is an error in saving messages in bulk.
*/
async function bulkSaveMessages(messages, overrideTimestamp = false) {
try {
const bulkOps = messages.map((message) => ({
updateOne: {
filter: { messageId: message.messageId },
update: message,
timestamps: !overrideTimestamp,
upsert: true,
},
}));
const result = await Message.bulkWrite(bulkOps);
return result;
} catch (err) {
logger.error('Error saving messages in bulk:', err);
throw err;
}
}
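/**
 * Example (illustrative sketch, assuming `conversationId`, `user`, and
 * `createdAt` are in scope): importing messages while preserving their
 * original timestamps. Passing `overrideTimestamp = true` disables Mongoose
 * timestamps on each bulk update, so the supplied `createdAt`/`updatedAt`
 * values are written as-is.
 *
 *   await bulkSaveMessages(
 *     [{ messageId: 'msg-1', conversationId, user, text: 'imported', createdAt }],
 *     true,
 *   );
 */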
/**
* Records a message in the database.
*
* @async
* @function recordMessage
* @param {Object} params - The message data object.
* @param {string} params.user - The identifier of the user.
* @param {string} params.endpoint - The endpoint where the message originated.
* @param {string} params.messageId - The unique identifier for the message.
* @param {string} params.conversationId - The identifier of the conversation.
* @param {string} [params.parentMessageId] - The identifier of the parent message, if any.
* @param {Partial<TMessage>} rest - Any additional properties from the TMessage typedef not explicitly listed.
* @returns {Promise<Object>} The updated or newly inserted message document.
* @throws {Error} If there is an error in saving the message.
*/
async function recordMessage({
user,
endpoint,
messageId,
conversationId,
parentMessageId,
...rest
}) {
try {
// No parsing of convoId as may use threadId
const message = {
user,
endpoint,
messageId,
conversationId,
parentMessageId,
...rest,
};
return await Message.findOneAndUpdate({ user, messageId }, message, {
upsert: true,
new: true,
});
} catch (err) {
logger.error('Error recording message:', err);
throw err;
}
}
/**
* Updates the text of a message.
*
* @async
* @function updateMessageText
 * @param {Object} req - The request object.
 * @param {Object} params - The update data object.
* @param {string} params.messageId - The unique identifier for the message.
* @param {string} params.text - The new text content of the message.
* @returns {Promise<void>}
* @throws {Error} If there is an error in updating the message text.
*/
async function updateMessageText(req, { messageId, text }) {
try {
await Message.updateOne({ messageId, user: req.user.id }, { text });
} catch (err) {
logger.error('Error updating message text:', err);
throw err;
}
}
/**
* Updates a message.
*
* @async
* @function updateMessage
* @param {Object} req - The request object.
* @param {Object} message - The message object containing update data.
* @param {string} message.messageId - The unique identifier for the message.
* @param {string} [message.text] - The new text content of the message.
* @param {Object[]} [message.files] - The files associated with the message.
* @param {boolean} [message.isCreatedByUser] - Indicates if the message was created by the user.
* @param {string} [message.sender] - The identifier of the sender.
* @param {number} [message.tokenCount] - The number of tokens in the message.
* @param {Object} [metadata] - The operation metadata
 * @param {string} [metadata.context] - The context of the operation
 * @returns {Promise<Partial<TMessage>>} A projection of the updated message document.
* @throws {Error} If there is an error in updating the message or if the message is not found.
*/
async function updateMessage(req, message, metadata) {
try {
const { messageId, ...update } = message;
const updatedMessage = await Message.findOneAndUpdate(
{ messageId, user: req.user.id },
update,
{
new: true,
},
);
if (!updatedMessage) {
throw new Error('Message not found or user not authorized.');
}
return {
messageId: updatedMessage.messageId,
conversationId: updatedMessage.conversationId,
parentMessageId: updatedMessage.parentMessageId,
sender: updatedMessage.sender,
text: updatedMessage.text,
isCreatedByUser: updatedMessage.isCreatedByUser,
tokenCount: updatedMessage.tokenCount,
feedback: updatedMessage.feedback,
};
} catch (err) {
logger.error('Error updating message:', err);
    if (metadata?.context) {
      logger.info(`---\`updateMessage\` context: ${metadata.context}`);
    }
throw err;
}
}
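/**
 * Example (illustrative sketch): `updateMessage` returns only the projected
 * fields listed above, not the full document, so fields such as `expiredAt`
 * are absent from the result even when set in the database.
 *
 *   const updated = await updateMessage(req, { messageId, text: 'edited' });
 *   // updated.expiredAt === undefined
 */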
/**
* Deletes messages in a conversation since a specific message.
*
* @async
* @function deleteMessagesSince
 * @param {Object} req - The request object.
 * @param {Object} params - The parameters object.
 * @param {string} params.messageId - The unique identifier for the message.
 * @param {string} params.conversationId - The identifier of the conversation.
 * @returns {Promise<import('mongoose').DeleteResult | undefined>} The result of the delete operation, or `undefined` if the anchor message is not found.
* @throws {Error} If there is an error in deleting messages.
*/
async function deleteMessagesSince(req, { messageId, conversationId }) {
try {
const message = await Message.findOne({ messageId, user: req.user.id }).lean();
if (message) {
const query = Message.find({ conversationId, user: req.user.id });
return await query.deleteMany({
createdAt: { $gt: message.createdAt },
});
}
return undefined;
} catch (err) {
logger.error('Error deleting messages:', err);
throw err;
}
}
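/**
 * Example (illustrative sketch): regenerating a response. Deleting "since"
 * `msg2` removes only messages created strictly after `msg2` in the same
 * conversation and owned by the requesting user; `msg2` itself is kept.
 *
 *   const result = await deleteMessagesSince(req, { messageId: 'msg2', conversationId });
 *   // result?.deletedCount -> number of newer messages removed
 */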
/**
* Retrieves messages from the database.
* @async
* @function getMessages
* @param {Record<string, unknown>} filter - The filter criteria.
* @param {string | undefined} [select] - The fields to select.
* @returns {Promise<TMessage[]>} The messages that match the filter criteria.
* @throws {Error} If there is an error in retrieving messages.
*/
async function getMessages(filter, select) {
try {
if (select) {
return await Message.find(filter).select(select).sort({ createdAt: 1 }).lean();
}
return await Message.find(filter).sort({ createdAt: 1 }).lean();
} catch (err) {
logger.error('Error getting messages:', err);
throw err;
}
}
/**
* Retrieves a single message from the database.
* @async
* @function getMessage
* @param {{ user: string, messageId: string }} params - The search parameters
* @returns {Promise<TMessage | null>} The message that matches the criteria or null if not found
* @throws {Error} If there is an error in retrieving the message
*/
async function getMessage({ user, messageId }) {
try {
return await Message.findOne({
user,
messageId,
}).lean();
} catch (err) {
logger.error('Error getting message:', err);
throw err;
}
}
/**
* Deletes messages from the database.
*
* @async
* @function deleteMessages
* @param {import('mongoose').FilterQuery<import('mongoose').Document>} filter - The filter criteria to find messages to delete.
* @returns {Promise<import('mongoose').DeleteResult>} The metadata with count of deleted messages.
* @throws {Error} If there is an error in deleting messages.
*/
async function deleteMessages(filter) {
try {
return await Message.deleteMany(filter);
} catch (err) {
logger.error('Error deleting messages:', err);
throw err;
}
}
module.exports = {
saveMessage,
bulkSaveMessages,
recordMessage,
updateMessageText,
updateMessage,
deleteMessagesSince,
getMessages,
getMessage,
deleteMessages,
};

@@ -1,898 +0,0 @@
const mongoose = require('mongoose');
const { v4: uuidv4 } = require('uuid');
const { messageSchema } = require('@librechat/data-schemas');
const { MongoMemoryServer } = require('mongodb-memory-server');
const {
saveMessage,
getMessages,
updateMessage,
deleteMessages,
bulkSaveMessages,
updateMessageText,
deleteMessagesSince,
} = require('./Message');
jest.mock('~/server/services/Config/app');
/**
* @type {import('mongoose').Model<import('@librechat/data-schemas').IMessage>}
*/
let Message;
describe('Message Operations', () => {
let mongoServer;
let mockReq;
let mockMessageData;
beforeAll(async () => {
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
Message = mongoose.models.Message || mongoose.model('Message', messageSchema);
await mongoose.connect(mongoUri);
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
});
beforeEach(async () => {
// Clear database
await Message.deleteMany({});
mockReq = {
user: { id: 'user123' },
config: {
interfaceConfig: {
temporaryChatRetention: 24, // Default 24 hours
},
},
};
mockMessageData = {
messageId: 'msg123',
conversationId: uuidv4(),
text: 'Hello, world!',
user: 'user123',
};
});
describe('saveMessage', () => {
it('should save a message for an authenticated user', async () => {
const result = await saveMessage(mockReq, mockMessageData);
expect(result.messageId).toBe('msg123');
expect(result.user).toBe('user123');
expect(result.text).toBe('Hello, world!');
// Verify the message was actually saved to the database
const savedMessage = await Message.findOne({ messageId: 'msg123', user: 'user123' });
expect(savedMessage).toBeTruthy();
expect(savedMessage.text).toBe('Hello, world!');
});
it('should throw an error for unauthenticated user', async () => {
mockReq.user = null;
await expect(saveMessage(mockReq, mockMessageData)).rejects.toThrow('User not authenticated');
});
it('should handle invalid conversation ID gracefully', async () => {
mockMessageData.conversationId = 'invalid-id';
const result = await saveMessage(mockReq, mockMessageData);
expect(result).toBeUndefined();
});
});
describe('updateMessageText', () => {
it('should update message text for the authenticated user', async () => {
// First save a message
await saveMessage(mockReq, mockMessageData);
// Then update it
await updateMessageText(mockReq, { messageId: 'msg123', text: 'Updated text' });
// Verify the update
const updatedMessage = await Message.findOne({ messageId: 'msg123', user: 'user123' });
expect(updatedMessage.text).toBe('Updated text');
});
});
describe('updateMessage', () => {
it('should update a message for the authenticated user', async () => {
// First save a message
await saveMessage(mockReq, mockMessageData);
const result = await updateMessage(mockReq, { messageId: 'msg123', text: 'Updated text' });
expect(result.messageId).toBe('msg123');
expect(result.text).toBe('Updated text');
// Verify in database
const updatedMessage = await Message.findOne({ messageId: 'msg123', user: 'user123' });
expect(updatedMessage.text).toBe('Updated text');
});
it('should throw an error if message is not found', async () => {
await expect(
updateMessage(mockReq, { messageId: 'nonexistent', text: 'Test' }),
).rejects.toThrow('Message not found or user not authorized.');
});
});
describe('deleteMessagesSince', () => {
it('should delete messages only for the authenticated user', async () => {
const conversationId = uuidv4();
// Create multiple messages in the same conversation
await saveMessage(mockReq, {
messageId: 'msg1',
conversationId,
text: 'First message',
user: 'user123',
});
await saveMessage(mockReq, {
messageId: 'msg2',
conversationId,
text: 'Second message',
user: 'user123',
});
await saveMessage(mockReq, {
messageId: 'msg3',
conversationId,
text: 'Third message',
user: 'user123',
});
// Delete messages since message2 (this should only delete messages created AFTER msg2)
await deleteMessagesSince(mockReq, {
messageId: 'msg2',
conversationId,
});
// Verify msg1 and msg2 remain, msg3 is deleted
const remainingMessages = await Message.find({ conversationId, user: 'user123' });
expect(remainingMessages).toHaveLength(2);
expect(remainingMessages.map((m) => m.messageId)).toContain('msg1');
expect(remainingMessages.map((m) => m.messageId)).toContain('msg2');
expect(remainingMessages.map((m) => m.messageId)).not.toContain('msg3');
});
it('should return undefined if no message is found', async () => {
const result = await deleteMessagesSince(mockReq, {
messageId: 'nonexistent',
conversationId: 'convo123',
});
expect(result).toBeUndefined();
});
});
describe('getMessages', () => {
it('should retrieve messages with the correct filter', async () => {
const conversationId = uuidv4();
// Save some messages
await saveMessage(mockReq, {
messageId: 'msg1',
conversationId,
text: 'First message',
user: 'user123',
});
await saveMessage(mockReq, {
messageId: 'msg2',
conversationId,
text: 'Second message',
user: 'user123',
});
const messages = await getMessages({ conversationId });
expect(messages).toHaveLength(2);
expect(messages[0].text).toBe('First message');
expect(messages[1].text).toBe('Second message');
});
});
describe('deleteMessages', () => {
it('should delete messages with the correct filter', async () => {
// Save some messages for different users
await saveMessage(mockReq, mockMessageData);
await saveMessage(
{ user: { id: 'user456' } },
{
messageId: 'msg456',
conversationId: uuidv4(),
text: 'Other user message',
user: 'user456',
},
);
await deleteMessages({ user: 'user123' });
// Verify only user123's messages were deleted
const user123Messages = await Message.find({ user: 'user123' });
const user456Messages = await Message.find({ user: 'user456' });
expect(user123Messages).toHaveLength(0);
expect(user456Messages).toHaveLength(1);
});
});
describe('Conversation Hijacking Prevention', () => {
it("should not allow editing a message in another user's conversation", async () => {
const attackerReq = { user: { id: 'attacker123' } };
const victimConversationId = uuidv4();
const victimMessageId = 'victim-msg-123';
// First, save a message as the victim (but we'll try to edit as attacker)
const victimReq = { user: { id: 'victim123' } };
await saveMessage(victimReq, {
messageId: victimMessageId,
conversationId: victimConversationId,
text: 'Victim message',
user: 'victim123',
});
// Attacker tries to edit the victim's message
await expect(
updateMessage(attackerReq, {
messageId: victimMessageId,
conversationId: victimConversationId,
text: 'Hacked message',
}),
).rejects.toThrow('Message not found or user not authorized.');
// Verify the original message is unchanged
const originalMessage = await Message.findOne({
messageId: victimMessageId,
user: 'victim123',
});
expect(originalMessage.text).toBe('Victim message');
});
it("should not allow deleting messages from another user's conversation", async () => {
const attackerReq = { user: { id: 'attacker123' } };
const victimConversationId = uuidv4();
const victimMessageId = 'victim-msg-123';
// Save a message as the victim
const victimReq = { user: { id: 'victim123' } };
await saveMessage(victimReq, {
messageId: victimMessageId,
conversationId: victimConversationId,
text: 'Victim message',
user: 'victim123',
});
// Attacker tries to delete from victim's conversation
const result = await deleteMessagesSince(attackerReq, {
messageId: victimMessageId,
conversationId: victimConversationId,
});
expect(result).toBeUndefined();
// Verify the victim's message still exists
const victimMessage = await Message.findOne({
messageId: victimMessageId,
user: 'victim123',
});
expect(victimMessage).toBeTruthy();
expect(victimMessage.text).toBe('Victim message');
});
it("should not allow inserting a new message into another user's conversation", async () => {
const attackerReq = { user: { id: 'attacker123' } };
const victimConversationId = uuidv4();
// Attacker tries to save a message - this should succeed but with attacker's user ID
const result = await saveMessage(attackerReq, {
conversationId: victimConversationId,
text: 'Inserted malicious message',
messageId: 'new-msg-123',
user: 'attacker123',
});
expect(result).toBeTruthy();
expect(result.user).toBe('attacker123');
// Verify the message was saved with the attacker's user ID, not as an anonymous message
const savedMessage = await Message.findOne({ messageId: 'new-msg-123' });
expect(savedMessage.user).toBe('attacker123');
expect(savedMessage.conversationId).toBe(victimConversationId);
});
it('should allow retrieving messages from any conversation', async () => {
const victimConversationId = uuidv4();
// Save a message in the victim's conversation
const victimReq = { user: { id: 'victim123' } };
await saveMessage(victimReq, {
messageId: 'victim-msg',
conversationId: victimConversationId,
text: 'Victim message',
user: 'victim123',
});
// Anyone should be able to retrieve messages by conversation ID
const messages = await getMessages({ conversationId: victimConversationId });
expect(messages).toHaveLength(1);
expect(messages[0].text).toBe('Victim message');
});
});
describe('isTemporary message handling', () => {
beforeEach(() => {
// Reset mocks before each test
jest.clearAllMocks();
});
it('should save a message with expiredAt when isTemporary is true', async () => {
// Mock app config with 24 hour retention
mockReq.config.interfaceConfig.temporaryChatRetention = 24;
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveMessage(mockReq, mockMessageData);
const afterSave = new Date();
expect(result.messageId).toBe('msg123');
expect(result.expiredAt).toBeDefined();
expect(result.expiredAt).toBeInstanceOf(Date);
// Verify expiredAt is approximately 24 hours in the future
const expectedExpirationTime = new Date(beforeSave.getTime() + 24 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
new Date(afterSave.getTime() + 24 * 60 * 60 * 1000 + 1000).getTime(),
);
});
it('should save a message without expiredAt when isTemporary is false', async () => {
mockReq.body = { isTemporary: false };
const result = await saveMessage(mockReq, mockMessageData);
expect(result.messageId).toBe('msg123');
expect(result.expiredAt).toBeNull();
});
it('should save a message without expiredAt when isTemporary is not provided', async () => {
// No isTemporary in body
mockReq.body = {};
const result = await saveMessage(mockReq, mockMessageData);
expect(result.messageId).toBe('msg123');
expect(result.expiredAt).toBeNull();
});
it('should use custom retention period from config', async () => {
// Mock app config with 48 hour retention
mockReq.config.interfaceConfig.temporaryChatRetention = 48;
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveMessage(mockReq, mockMessageData);
expect(result.expiredAt).toBeDefined();
// Verify expiredAt is approximately 48 hours in the future
const expectedExpirationTime = new Date(beforeSave.getTime() + 48 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should handle minimum retention period (1 hour)', async () => {
// Mock app config with less than minimum retention
mockReq.config.interfaceConfig.temporaryChatRetention = 0.5; // Half hour - should be clamped to 1 hour
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveMessage(mockReq, mockMessageData);
expect(result.expiredAt).toBeDefined();
// Verify expiredAt is approximately 1 hour in the future (minimum)
const expectedExpirationTime = new Date(beforeSave.getTime() + 1 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should handle maximum retention period (8760 hours)', async () => {
// Mock app config with more than maximum retention
mockReq.config.interfaceConfig.temporaryChatRetention = 10000; // Should be clamped to 8760 hours
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveMessage(mockReq, mockMessageData);
expect(result.expiredAt).toBeDefined();
// Verify expiredAt is approximately 8760 hours (1 year) in the future
const expectedExpirationTime = new Date(beforeSave.getTime() + 8760 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should handle missing config gracefully', async () => {
// Simulate missing config - should use default retention period
delete mockReq.config;
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveMessage(mockReq, mockMessageData);
const afterSave = new Date();
// Should still save the message with default retention period (30 days)
expect(result.messageId).toBe('msg123');
expect(result.expiredAt).toBeDefined();
expect(result.expiredAt).toBeInstanceOf(Date);
// Verify expiredAt is approximately 30 days in the future (720 hours)
const expectedExpirationTime = new Date(beforeSave.getTime() + 720 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
new Date(afterSave.getTime() + 720 * 60 * 60 * 1000 + 1000).getTime(),
);
});
    it('should use default retention when interface config is empty', async () => {
      // Provide an empty config object - should fall back to the default retention period
      mockReq.config = {};
mockReq.body = { isTemporary: true };
const beforeSave = new Date();
const result = await saveMessage(mockReq, mockMessageData);
expect(result.expiredAt).toBeDefined();
// Default retention is 30 days (720 hours)
const expectedExpirationTime = new Date(beforeSave.getTime() + 30 * 24 * 60 * 60 * 1000);
const actualExpirationTime = new Date(result.expiredAt);
expect(actualExpirationTime.getTime()).toBeGreaterThanOrEqual(
expectedExpirationTime.getTime() - 1000,
);
expect(actualExpirationTime.getTime()).toBeLessThanOrEqual(
expectedExpirationTime.getTime() + 1000,
);
});
it('should not update expiredAt on message update', async () => {
// First save a temporary message
mockReq.config.interfaceConfig.temporaryChatRetention = 24;
mockReq.body = { isTemporary: true };
const savedMessage = await saveMessage(mockReq, mockMessageData);
const originalExpiredAt = savedMessage.expiredAt;
// Now update the message without isTemporary flag
mockReq.body = {};
const updatedMessage = await updateMessage(mockReq, {
messageId: 'msg123',
text: 'Updated text',
});
// expiredAt should not be in the returned updated message object
expect(updatedMessage.expiredAt).toBeUndefined();
// Verify in database that expiredAt wasn't changed
const dbMessage = await Message.findOne({ messageId: 'msg123', user: 'user123' });
expect(dbMessage.expiredAt).toEqual(originalExpiredAt);
});
    it('should refresh expiredAt when saving an existing temporary message', async () => {
// First save a temporary message
mockReq.config.interfaceConfig.temporaryChatRetention = 24;
mockReq.body = { isTemporary: true };
const firstSave = await saveMessage(mockReq, mockMessageData);
const originalExpiredAt = firstSave.expiredAt;
// Wait a bit to ensure time difference
await new Promise((resolve) => setTimeout(resolve, 100));
// Save again with same messageId but different text
const updatedData = { ...mockMessageData, text: 'Updated text' };
const secondSave = await saveMessage(mockReq, updatedData);
      // Should update the text and set a fresh, later expiredAt
expect(secondSave.text).toBe('Updated text');
expect(secondSave.expiredAt).toBeDefined();
expect(new Date(secondSave.expiredAt).getTime()).toBeGreaterThan(
new Date(originalExpiredAt).getTime(),
);
});
it('should handle bulk operations with temporary messages', async () => {
// This test verifies bulkSaveMessages doesn't interfere with expiredAt
const messages = [
{
messageId: 'bulk1',
conversationId: uuidv4(),
text: 'Bulk message 1',
user: 'user123',
expiredAt: new Date(Date.now() + 24 * 60 * 60 * 1000),
},
{
messageId: 'bulk2',
conversationId: uuidv4(),
text: 'Bulk message 2',
user: 'user123',
expiredAt: null,
},
];
await bulkSaveMessages(messages);
const savedMessages = await Message.find({
messageId: { $in: ['bulk1', 'bulk2'] },
}).lean();
expect(savedMessages).toHaveLength(2);
const bulk1 = savedMessages.find((m) => m.messageId === 'bulk1');
const bulk2 = savedMessages.find((m) => m.messageId === 'bulk2');
expect(bulk1.expiredAt).toBeDefined();
expect(bulk2.expiredAt).toBeNull();
});
});
describe('Message cursor pagination', () => {
/**
* Helper to create messages with specific timestamps
* Uses collection.insertOne to bypass Mongoose timestamps
*/
const createMessageWithTimestamp = async (index, conversationId, createdAt) => {
const messageId = uuidv4();
await Message.collection.insertOne({
messageId,
conversationId,
user: 'user123',
text: `Message ${index}`,
isCreatedByUser: index % 2 === 0,
createdAt,
updatedAt: createdAt,
});
return Message.findOne({ messageId }).lean();
};
/**
* Simulates the pagination logic from api/server/routes/messages.js
* This tests the exact query pattern used in the route
*/
const getMessagesByCursor = async ({
conversationId,
user,
pageSize = 25,
cursor = null,
sortBy = 'createdAt',
sortDirection = 'desc',
}) => {
const sortOrder = sortDirection === 'asc' ? 1 : -1;
const sortField = ['createdAt', 'updatedAt'].includes(sortBy) ? sortBy : 'createdAt';
const cursorOperator = sortDirection === 'asc' ? '$gt' : '$lt';
const filter = { conversationId, user };
if (cursor) {
filter[sortField] = { [cursorOperator]: new Date(cursor) };
}
const messages = await Message.find(filter)
.sort({ [sortField]: sortOrder })
.limit(pageSize + 1)
.lean();
let nextCursor = null;
if (messages.length > pageSize) {
messages.pop(); // Remove extra item used to detect next page
// Create cursor from the last RETURNED item (not the popped one)
nextCursor = messages[messages.length - 1][sortField];
}
return { messages, nextCursor };
};
it('should return messages for a conversation with pagination', async () => {
const conversationId = uuidv4();
const baseTime = new Date('2026-01-01T00:00:00.000Z');
// Create 30 messages to test pagination
for (let i = 0; i < 30; i++) {
const createdAt = new Date(baseTime.getTime() - i * 60000); // Each 1 minute apart
await createMessageWithTimestamp(i, conversationId, createdAt);
}
// Fetch first page (pageSize 25)
const page1 = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 25,
});
expect(page1.messages).toHaveLength(25);
expect(page1.nextCursor).toBeTruthy();
// Fetch second page using cursor
const page2 = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 25,
cursor: page1.nextCursor,
});
// Should get remaining 5 messages
expect(page2.messages).toHaveLength(5);
expect(page2.nextCursor).toBeNull();
// Verify no duplicates and no gaps
const allMessageIds = [
...page1.messages.map((m) => m.messageId),
...page2.messages.map((m) => m.messageId),
];
const uniqueIds = new Set(allMessageIds);
expect(uniqueIds.size).toBe(30); // All 30 messages accounted for
expect(allMessageIds.length).toBe(30); // No duplicates
});
it('should not skip message at page boundary (item 26 bug fix)', async () => {
const conversationId = uuidv4();
const baseTime = new Date('2026-01-01T12:00:00.000Z');
// Create exactly 26 messages
const messages = [];
for (let i = 0; i < 26; i++) {
const createdAt = new Date(baseTime.getTime() - i * 60000);
const msg = await createMessageWithTimestamp(i, conversationId, createdAt);
messages.push(msg);
}
// The 26th message (index 25) should be on page 2
const item26 = messages[25];
// Fetch first page with pageSize 25
const page1 = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 25,
});
expect(page1.messages).toHaveLength(25);
expect(page1.nextCursor).toBeTruthy();
// Item 26 should NOT be in page 1
const page1Ids = page1.messages.map((m) => m.messageId);
expect(page1Ids).not.toContain(item26.messageId);
// Fetch second page
const page2 = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 25,
cursor: page1.nextCursor,
});
// Item 26 MUST be in page 2 (this was the bug - it was being skipped)
expect(page2.messages).toHaveLength(1);
expect(page2.messages[0].messageId).toBe(item26.messageId);
});
it('should sort by createdAt DESC by default', async () => {
const conversationId = uuidv4();
// Create messages with specific timestamps
const msg1 = await createMessageWithTimestamp(
1,
conversationId,
new Date('2026-01-01T00:00:00.000Z'),
);
const msg2 = await createMessageWithTimestamp(
2,
conversationId,
new Date('2026-01-02T00:00:00.000Z'),
);
const msg3 = await createMessageWithTimestamp(
3,
conversationId,
new Date('2026-01-03T00:00:00.000Z'),
);
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
});
// Should be sorted by createdAt DESC (newest first) by default
expect(result.messages).toHaveLength(3);
expect(result.messages[0].messageId).toBe(msg3.messageId);
expect(result.messages[1].messageId).toBe(msg2.messageId);
expect(result.messages[2].messageId).toBe(msg1.messageId);
});
it('should support ascending sort direction', async () => {
const conversationId = uuidv4();
const msg1 = await createMessageWithTimestamp(
1,
conversationId,
new Date('2026-01-01T00:00:00.000Z'),
);
const msg2 = await createMessageWithTimestamp(
2,
conversationId,
new Date('2026-01-02T00:00:00.000Z'),
);
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
sortDirection: 'asc',
});
// Should be sorted by createdAt ASC (oldest first)
expect(result.messages).toHaveLength(2);
expect(result.messages[0].messageId).toBe(msg1.messageId);
expect(result.messages[1].messageId).toBe(msg2.messageId);
});
it('should handle empty conversation', async () => {
const conversationId = uuidv4();
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
});
expect(result.messages).toHaveLength(0);
expect(result.nextCursor).toBeNull();
});
it('should only return messages for the specified user', async () => {
const conversationId = uuidv4();
const createdAt = new Date();
// Create a message for user123
await Message.collection.insertOne({
messageId: uuidv4(),
conversationId,
user: 'user123',
text: 'User message',
createdAt,
updatedAt: createdAt,
});
// Create a message for a different user
await Message.collection.insertOne({
messageId: uuidv4(),
conversationId,
user: 'otherUser',
text: 'Other user message',
createdAt,
updatedAt: createdAt,
});
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
});
// Should only return user123's message
expect(result.messages).toHaveLength(1);
expect(result.messages[0].user).toBe('user123');
});
it('should handle exactly pageSize number of messages (no next page)', async () => {
const conversationId = uuidv4();
const baseTime = new Date('2026-01-01T00:00:00.000Z');
// Create exactly 25 messages (equal to default pageSize)
for (let i = 0; i < 25; i++) {
const createdAt = new Date(baseTime.getTime() - i * 60000);
await createMessageWithTimestamp(i, conversationId, createdAt);
}
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 25,
});
expect(result.messages).toHaveLength(25);
expect(result.nextCursor).toBeNull(); // No next page
});
it('should handle pageSize of 1', async () => {
const conversationId = uuidv4();
const baseTime = new Date('2026-01-01T00:00:00.000Z');
// Create 3 messages
for (let i = 0; i < 3; i++) {
const createdAt = new Date(baseTime.getTime() - i * 60000);
await createMessageWithTimestamp(i, conversationId, createdAt);
}
// Fetch with pageSize 1
let cursor = null;
const allMessages = [];
for (let page = 0; page < 5; page++) {
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 1,
cursor,
});
allMessages.push(...result.messages);
cursor = result.nextCursor;
if (!cursor) {
break;
}
}
// Should get all 3 messages without duplicates
expect(allMessages).toHaveLength(3);
const uniqueIds = new Set(allMessages.map((m) => m.messageId));
expect(uniqueIds.size).toBe(3);
});
it('should handle messages with same createdAt timestamp', async () => {
const conversationId = uuidv4();
const sameTime = new Date('2026-01-01T12:00:00.000Z');
// Create multiple messages with the exact same timestamp
const messages = [];
for (let i = 0; i < 5; i++) {
const msg = await createMessageWithTimestamp(i, conversationId, sameTime);
messages.push(msg);
}
const result = await getMessagesByCursor({
conversationId,
user: 'user123',
pageSize: 10,
});
// All messages should be returned
expect(result.messages).toHaveLength(5);
});
});
});

@@ -1,82 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { Preset } = require('~/db/models');
const getPreset = async (user, presetId) => {
try {
return await Preset.findOne({ user, presetId }).lean();
} catch (error) {
logger.error('[getPreset] Error getting single preset', error);
return { message: 'Error getting single preset' };
}
};
module.exports = {
getPreset,
getPresets: async (user, filter) => {
try {
const presets = await Preset.find({ ...filter, user }).lean();
const defaultValue = 10000;
presets.sort((a, b) => {
let orderA = a.order !== undefined ? a.order : defaultValue;
let orderB = b.order !== undefined ? b.order : defaultValue;
if (orderA !== orderB) {
return orderA - orderB;
}
return b.updatedAt - a.updatedAt;
});
return presets;
} catch (error) {
logger.error('[getPresets] Error getting presets', error);
return { message: 'Error retrieving presets' };
}
},
savePreset: async (user, { presetId, newPresetId, defaultPreset, ...preset }) => {
try {
const setter = { $set: {} };
const { user: _, ...cleanPreset } = preset;
const update = { presetId, ...cleanPreset };
if (preset.tools && Array.isArray(preset.tools)) {
update.tools =
preset.tools
.map((tool) => tool?.pluginKey ?? tool)
.filter((toolName) => typeof toolName === 'string') ?? [];
}
if (newPresetId) {
update.presetId = newPresetId;
}
if (defaultPreset) {
update.defaultPreset = defaultPreset;
update.order = 0;
const currentDefault = await Preset.findOne({ defaultPreset: true, user });
if (currentDefault && currentDefault.presetId !== presetId) {
await Preset.findByIdAndUpdate(currentDefault._id, {
$unset: { defaultPreset: '', order: '' },
});
}
} else if (defaultPreset === false) {
update.defaultPreset = undefined;
update.order = undefined;
setter['$unset'] = { defaultPreset: '', order: '' };
}
setter.$set = update;
return await Preset.findOneAndUpdate({ presetId, user }, setter, { new: true, upsert: true });
} catch (error) {
logger.error('[savePreset] Error saving preset', error);
return { message: 'Error saving preset' };
}
},
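  /**
   * Example (illustrative sketch): promoting a preset to the user's default.
   * Passing `defaultPreset: true` sets `order: 0` and unsets the flag on any
   * previous default, so at most one default preset exists per user.
   *
   *   await savePreset(userId, { presetId: 'preset-1', defaultPreset: true });
   */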
  deletePresets: async (user, filter) => {
    return await Preset.deleteMany({ ...filter, user });
  },
};

@@ -1,565 +0,0 @@
const { ObjectId } = require('mongodb');
const { escapeRegExp } = require('@librechat/api');
const { logger } = require('@librechat/data-schemas');
const { SystemRoles, ResourceType, SystemCategories } = require('librechat-data-provider');
const { removeAllPermissions } = require('~/server/services/PermissionService');
const { PromptGroup, Prompt, AclEntry } = require('~/db/models');
/**
* Batch-fetches production prompts for an array of prompt groups
* and attaches them as `productionPrompt` field.
* Replaces $lookup aggregation for FerretDB compatibility.
*/
const attachProductionPrompts = async (groups) => {
const uniqueIds = [...new Set(groups.map((g) => g.productionId?.toString()).filter(Boolean))];
if (uniqueIds.length === 0) {
return groups.map((g) => ({ ...g, productionPrompt: null }));
}
const prompts = await Prompt.find({ _id: { $in: uniqueIds } })
.select('prompt')
.lean();
const promptMap = new Map(prompts.map((p) => [p._id.toString(), p]));
return groups.map((g) => ({
...g,
productionPrompt: g.productionId ? (promptMap.get(g.productionId.toString()) ?? null) : null,
}));
};
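/**
 * For reference (illustrative only): the aggregation stage this helper
 * replaces for FerretDB compatibility would look roughly like:
 *
 *   { $lookup: { from: 'prompts', localField: 'productionId',
 *                foreignField: '_id', as: 'productionPrompt' } }
 */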
/**
* Get all prompt groups with filters
* @param {ServerRequest} req
* @param {TPromptGroupsWithFilterRequest} filter
* @returns {Promise<PromptGroupListResponse>}
*/
const getAllPromptGroups = async (req, filter) => {
try {
const { name, ...query } = filter;
if (name) {
query.name = new RegExp(escapeRegExp(name), 'i');
}
if (!query.category) {
delete query.category;
} else if (query.category === SystemCategories.MY_PROMPTS) {
delete query.category;
} else if (query.category === SystemCategories.NO_CATEGORY) {
query.category = '';
} else if (query.category === SystemCategories.SHARED_PROMPTS) {
delete query.category;
}
let combinedQuery = query;
const groups = await PromptGroup.find(combinedQuery)
.sort({ createdAt: -1 })
.select('name oneliner category author authorName createdAt updatedAt command productionId')
.lean();
return await attachProductionPrompts(groups);
} catch (error) {
    logger.error('Error getting all prompt groups', error);
return { message: 'Error getting all prompt groups' };
}
};
/**
* Get prompt groups with filters
* @param {ServerRequest} req
* @param {TPromptGroupsWithFilterRequest} filter
* @returns {Promise<PromptGroupListResponse>}
*/
const getPromptGroups = async (req, filter) => {
try {
const { pageNumber = 1, pageSize = 10, name, ...query } = filter;
const validatedPageNumber = Math.max(parseInt(pageNumber, 10), 1);
const validatedPageSize = Math.max(parseInt(pageSize, 10), 1);
if (name) {
query.name = new RegExp(escapeRegExp(name), 'i');
}
if (!query.category) {
delete query.category;
} else if (query.category === SystemCategories.MY_PROMPTS) {
delete query.category;
} else if (query.category === SystemCategories.NO_CATEGORY) {
query.category = '';
} else if (query.category === SystemCategories.SHARED_PROMPTS) {
delete query.category;
}
let combinedQuery = query;
const skip = (validatedPageNumber - 1) * validatedPageSize;
const limit = validatedPageSize;
const [groups, totalPromptGroups] = await Promise.all([
PromptGroup.find(combinedQuery)
.sort({ createdAt: -1 })
.skip(skip)
.limit(limit)
.select(
'name numberOfGenerations oneliner category productionId author authorName createdAt updatedAt',
)
.lean(),
PromptGroup.countDocuments(combinedQuery),
]);
const promptGroups = await attachProductionPrompts(groups);
return {
promptGroups,
pageNumber: validatedPageNumber.toString(),
pageSize: validatedPageSize.toString(),
pages: Math.ceil(totalPromptGroups / validatedPageSize).toString(),
};
} catch (error) {
    logger.error('Error getting prompt groups', error);
return { message: 'Error getting prompt groups' };
}
};
/**
* @param {Object} fields
* @param {string} fields._id
* @param {string} fields.author
* @param {string} fields.role
* @returns {Promise<TDeletePromptGroupResponse>}
*/
const deletePromptGroup = async ({ _id, author, role }) => {
// Build query - with ACL, author is optional
const query = { _id };
const groupQuery = { groupId: new ObjectId(_id) };
// Legacy: Add author filter if provided (backward compatibility)
if (author && role !== SystemRoles.ADMIN) {
query.author = author;
groupQuery.author = author;
}
const response = await PromptGroup.deleteOne(query);
if (!response || response.deletedCount === 0) {
throw new Error('Prompt group not found');
}
await Prompt.deleteMany(groupQuery);
try {
await removeAllPermissions({ resourceType: ResourceType.PROMPTGROUP, resourceId: _id });
} catch (error) {
logger.error('Error removing promptGroup permissions:', error);
}
return { message: 'Prompt group deleted successfully' };
};
/**
* Get prompt groups by accessible IDs with optional cursor-based pagination.
* @param {Object} params - The parameters for getting accessible prompt groups.
* @param {Array} [params.accessibleIds] - Array of prompt group ObjectIds the user has ACL access to.
* @param {Object} [params.otherParams] - Additional query parameters (including author filter).
* @param {number} [params.limit] - Number of prompt groups to return (max 100). If not provided, returns all prompt groups.
 * @param {string} [params.after] - Pagination cursor; a base64-encoded JSON string containing `updatedAt` and `_id`, returning prompt groups after this cursor.
* @returns {Promise<Object>} A promise that resolves to an object containing the prompt groups data and pagination info.
*/
async function getListPromptGroupsByAccess({
accessibleIds = [],
otherParams = {},
limit = null,
after = null,
}) {
const isPaginated = limit !== null && limit !== undefined;
  const normalizedLimit = isPaginated ? Math.min(Math.max(1, parseInt(limit, 10) || 20), 100) : null;
const baseQuery = { ...otherParams, _id: { $in: accessibleIds } };
if (after && typeof after === 'string' && after !== 'undefined' && after !== 'null') {
try {
const cursor = JSON.parse(Buffer.from(after, 'base64').toString('utf8'));
const { updatedAt, _id } = cursor;
const cursorCondition = {
$or: [
{ updatedAt: { $lt: new Date(updatedAt) } },
{ updatedAt: new Date(updatedAt), _id: { $gt: new ObjectId(_id) } },
],
};
if (Object.keys(baseQuery).length > 0) {
baseQuery.$and = [{ ...baseQuery }, cursorCondition];
Object.keys(baseQuery).forEach((key) => {
if (key !== '$and') delete baseQuery[key];
});
} else {
Object.assign(baseQuery, cursorCondition);
}
} catch (error) {
logger.warn('Invalid cursor:', error.message);
}
}
const findQuery = PromptGroup.find(baseQuery)
.sort({ updatedAt: -1, _id: 1 })
.select(
'name numberOfGenerations oneliner category productionId author authorName createdAt updatedAt',
);
if (isPaginated) {
findQuery.limit(normalizedLimit + 1);
}
const groups = await findQuery.lean();
const promptGroups = await attachProductionPrompts(groups);
const hasMore = isPaginated ? promptGroups.length > normalizedLimit : false;
const data = (isPaginated ? promptGroups.slice(0, normalizedLimit) : promptGroups).map(
(group) => {
if (group.author) {
group.author = group.author.toString();
}
return group;
},
);
let nextCursor = null;
if (isPaginated && hasMore && data.length > 0) {
const lastGroup = promptGroups[normalizedLimit - 1];
nextCursor = Buffer.from(
JSON.stringify({
updatedAt: lastGroup.updatedAt.toISOString(),
_id: lastGroup._id.toString(),
}),
).toString('base64');
}
return {
object: 'list',
data,
first_id: data.length > 0 ? data[0]._id.toString() : null,
last_id: data.length > 0 ? data[data.length - 1]._id.toString() : null,
has_more: hasMore,
after: nextCursor,
};
}
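/**
 * Example (illustrative sketch, assuming `group` is a lean PromptGroup
 * document): the pagination cursor used above is a base64-encoded JSON object
 * carrying the last returned item's sort keys. Building and decoding one:
 *
 *   const after = Buffer.from(
 *     JSON.stringify({ updatedAt: group.updatedAt.toISOString(), _id: group._id.toString() }),
 *   ).toString('base64');
 *   const { updatedAt, _id } = JSON.parse(Buffer.from(after, 'base64').toString('utf8'));
 */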
module.exports = {
getPromptGroups,
deletePromptGroup,
getAllPromptGroups,
getListPromptGroupsByAccess,
/**
* Create a prompt and its respective group
* @param {TCreatePromptRecord} saveData
* @returns {Promise<TCreatePromptResponse>}
*/
createPromptGroup: async (saveData) => {
try {
const { prompt, group, author, authorName } = saveData;
let newPromptGroup = await PromptGroup.findOneAndUpdate(
{ ...group, author, authorName, productionId: null },
{ $setOnInsert: { ...group, author, authorName, productionId: null } },
{ new: true, upsert: true },
)
.lean()
.select('-__v')
.exec();
const newPrompt = await Prompt.findOneAndUpdate(
{ ...prompt, author, groupId: newPromptGroup._id },
{ $setOnInsert: { ...prompt, author, groupId: newPromptGroup._id } },
{ new: true, upsert: true },
)
.lean()
.select('-__v')
.exec();
newPromptGroup = await PromptGroup.findByIdAndUpdate(
newPromptGroup._id,
{ productionId: newPrompt._id },
{ new: true },
)
.lean()
.select('-__v')
.exec();
return {
prompt: newPrompt,
group: {
...newPromptGroup,
productionPrompt: { prompt: newPrompt.prompt },
},
};
} catch (error) {
logger.error('Error saving prompt group', error);
throw new Error('Error saving prompt group');
}
},
/**
* Save a prompt
* @param {TCreatePromptRecord} saveData
* @returns {Promise<TCreatePromptResponse>}
*/
savePrompt: async (saveData) => {
try {
const { prompt, author } = saveData;
const newPromptData = {
...prompt,
author,
};
/** @type {TPrompt} */
let newPrompt;
try {
newPrompt = await Prompt.create(newPromptData);
} catch (error) {
if (error?.message?.includes('groupId_1_version_1')) {
await Prompt.db.collection('prompts').dropIndex('groupId_1_version_1');
} else {
throw error;
}
newPrompt = await Prompt.create(newPromptData);
}
return { prompt: newPrompt };
} catch (error) {
logger.error('Error saving prompt', error);
return { message: 'Error saving prompt' };
}
},
getPrompts: async (filter) => {
try {
return await Prompt.find(filter).sort({ createdAt: -1 }).lean();
} catch (error) {
logger.error('Error getting prompts', error);
return { message: 'Error getting prompts' };
}
},
getPrompt: async (filter) => {
try {
if (filter.groupId) {
filter.groupId = new ObjectId(filter.groupId);
}
return await Prompt.findOne(filter).lean();
} catch (error) {
logger.error('Error getting prompt', error);
return { message: 'Error getting prompt' };
}
},
/**
* Get prompt groups with filters
* @param {TGetRandomPromptsRequest} filter
* @returns {Promise<TGetRandomPromptsResponse>}
*/
getRandomPromptGroups: async (filter) => {
try {
const categories = await PromptGroup.distinct('category', { category: { $ne: '' } });
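      // Fisher-Yates shuffle to randomize the order of categories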
for (let i = categories.length - 1; i > 0; i--) {
const j = Math.floor(Math.random() * (i + 1));
[categories[i], categories[j]] = [categories[j], categories[i]];
}
const skip = +filter.skip;
const limit = +filter.limit;
const selectedCategories = categories.slice(skip, skip + limit);
if (selectedCategories.length === 0) {
return { prompts: [] };
}
const groups = await PromptGroup.find({ category: { $in: selectedCategories } }).lean();
const groupByCategory = new Map();
for (const group of groups) {
if (!groupByCategory.has(group.category)) {
groupByCategory.set(group.category, group);
}
}
const prompts = selectedCategories.map((cat) => groupByCategory.get(cat)).filter(Boolean);
return { prompts };
} catch (error) {
logger.error('Error getting prompt groups', error);
return { message: 'Error getting prompt groups' };
}
},
getPromptGroupsWithPrompts: async (filter) => {
try {
return await PromptGroup.findOne(filter)
.populate({
path: 'prompts',
select: '-_id -__v -user',
})
.select('-_id -__v -user')
.lean();
} catch (error) {
logger.error('Error getting prompt groups', error);
return { message: 'Error getting prompt groups' };
}
},
getPromptGroup: async (filter) => {
try {
return await PromptGroup.findOne(filter).lean();
} catch (error) {
logger.error('Error getting prompt group', error);
return { message: 'Error getting prompt group' };
}
},
/**
* Deletes a prompt and its corresponding prompt group if it is the last prompt in the group.
*
* @param {Object} options - The options for deleting the prompt.
* @param {ObjectId|string} options.promptId - The ID of the prompt to delete.
* @param {ObjectId|string} options.groupId - The ID of the prompt's group.
* @param {ObjectId|string} options.author - The ID of the prompt's author.
* @param {string} options.role - The role of the prompt's author.
* @return {Promise<TDeletePromptResponse>} An object containing the result of the deletion.
* If the prompt was deleted successfully, the object will have a property 'prompt' with the value 'Prompt deleted successfully'.
* If the prompt group was deleted successfully, the object will have a property 'promptGroup' with the message 'Prompt group deleted successfully' and id of the deleted group.
* If there was an error deleting the prompt, the object will have a property 'message' with the value 'Error deleting prompt'.
*/
deletePrompt: async ({ promptId, groupId, author, role }) => {
const query = { _id: promptId, groupId, author };
if (role === SystemRoles.ADMIN) {
delete query.author;
}
const { deletedCount } = await Prompt.deleteOne(query);
if (deletedCount === 0) {
throw new Error('Failed to delete the prompt');
}
const remainingPrompts = await Prompt.find({ groupId })
.select('_id')
.sort({ createdAt: 1 })
.lean();
if (remainingPrompts.length === 0) {
// Remove all ACL entries for the promptGroup when deleting the last prompt
try {
await removeAllPermissions({
resourceType: ResourceType.PROMPTGROUP,
resourceId: groupId,
});
} catch (error) {
logger.error('Error removing promptGroup permissions:', error);
}
await PromptGroup.deleteOne({ _id: groupId });
return {
prompt: 'Prompt deleted successfully',
promptGroup: {
message: 'Prompt group deleted successfully',
id: groupId,
},
};
} else {
const promptGroup = await PromptGroup.findById(groupId).lean();
if (promptGroup.productionId.toString() === promptId.toString()) {
await PromptGroup.updateOne(
{ _id: groupId },
{ productionId: remainingPrompts[remainingPrompts.length - 1]._id },
);
}
return { prompt: 'Prompt deleted successfully' };
}
},
/**
* Delete all prompts and prompt groups created by a specific user.
* @param {ServerRequest} req - The server request object.
* @param {string} userId - The ID of the user whose prompts and prompt groups are to be deleted.
*/
deleteUserPrompts: async (req, userId) => {
try {
const promptGroups = await getAllPromptGroups(req, { author: new ObjectId(userId) });
if (promptGroups.length === 0) {
return;
}
const groupIds = promptGroups.map((group) => group._id);
await AclEntry.deleteMany({
resourceType: ResourceType.PROMPTGROUP,
resourceId: { $in: groupIds },
});
await PromptGroup.deleteMany({ author: new ObjectId(userId) });
await Prompt.deleteMany({ author: new ObjectId(userId) });
} catch (error) {
logger.error('[deleteUserPrompts] General error:', error);
}
},
/**
* Update prompt group
* @param {Partial<MongoPromptGroup>} filter - Filter to find prompt group
* @param {Partial<MongoPromptGroup>} data - Data to update
* @returns {Promise<TUpdatePromptGroupResponse>}
*/
updatePromptGroup: async (filter, data) => {
try {
      const updatedDoc = await PromptGroup.findOneAndUpdate(filter, data, {
new: true,
upsert: false,
});
if (!updatedDoc) {
throw new Error('Prompt group not found');
}
return updatedDoc;
} catch (error) {
logger.error('Error updating prompt group', error);
return { message: 'Error updating prompt group' };
}
},
/**
* Function to make a prompt production based on its ID.
* @param {String} promptId - The ID of the prompt to make production.
* @returns {Object} The result of the production operation.
*/
makePromptProduction: async (promptId) => {
try {
const prompt = await Prompt.findById(promptId).lean();
if (!prompt) {
throw new Error('Prompt not found');
}
await PromptGroup.findByIdAndUpdate(
prompt.groupId,
{ productionId: prompt._id },
{ new: true },
)
.lean()
.exec();
return {
message: 'Prompt production made successfully',
};
} catch (error) {
logger.error('Error making prompt production', error);
return { message: 'Error making prompt production' };
}
},
updatePromptLabels: async (_id, labels) => {
try {
const response = await Prompt.updateOne({ _id }, { $set: { labels } });
if (response.matchedCount === 0) {
return { message: 'Prompt not found' };
}
return { message: 'Prompt labels updated successfully' };
} catch (error) {
logger.error('Error updating prompt labels', error);
return { message: 'Error updating prompt labels' };
}
},
};

@@ -1,557 +0,0 @@
const mongoose = require('mongoose');
const { ObjectId } = require('mongodb');
const { logger } = require('@librechat/data-schemas');
const { MongoMemoryServer } = require('mongodb-memory-server');
const {
SystemRoles,
ResourceType,
AccessRoleIds,
PrincipalType,
PermissionBits,
} = require('librechat-data-provider');
// Mock the config/connect module to prevent connection attempts during tests
jest.mock('../../config/connect', () => jest.fn().mockResolvedValue(true));
const dbModels = require('~/db/models');
// Disable console for tests
logger.silent = true;
let mongoServer;
let Prompt, PromptGroup, AclEntry, AccessRole, User, Group;
let promptFns, permissionService;
let testUsers, testGroups, testRoles;
beforeAll(async () => {
// Set up MongoDB memory server
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
await mongoose.connect(mongoUri);
// Initialize models
Prompt = dbModels.Prompt;
PromptGroup = dbModels.PromptGroup;
AclEntry = dbModels.AclEntry;
AccessRole = dbModels.AccessRole;
User = dbModels.User;
Group = dbModels.Group;
promptFns = require('~/models/Prompt');
permissionService = require('~/server/services/PermissionService');
// Create test data
await setupTestData();
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
jest.clearAllMocks();
});
async function setupTestData() {
// Create access roles for promptGroups
testRoles = {
viewer: await AccessRole.create({
accessRoleId: AccessRoleIds.PROMPTGROUP_VIEWER,
name: 'Viewer',
description: 'Can view promptGroups',
resourceType: ResourceType.PROMPTGROUP,
permBits: PermissionBits.VIEW,
}),
editor: await AccessRole.create({
accessRoleId: AccessRoleIds.PROMPTGROUP_EDITOR,
name: 'Editor',
description: 'Can view and edit promptGroups',
resourceType: ResourceType.PROMPTGROUP,
permBits: PermissionBits.VIEW | PermissionBits.EDIT,
}),
owner: await AccessRole.create({
accessRoleId: AccessRoleIds.PROMPTGROUP_OWNER,
name: 'Owner',
description: 'Full control over promptGroups',
resourceType: ResourceType.PROMPTGROUP,
permBits:
PermissionBits.VIEW | PermissionBits.EDIT | PermissionBits.DELETE | PermissionBits.SHARE,
}),
};
// Create test users
testUsers = {
owner: await User.create({
name: 'Prompt Owner',
email: 'owner@example.com',
role: SystemRoles.USER,
}),
editor: await User.create({
name: 'Prompt Editor',
email: 'editor@example.com',
role: SystemRoles.USER,
}),
viewer: await User.create({
name: 'Prompt Viewer',
email: 'viewer@example.com',
role: SystemRoles.USER,
}),
admin: await User.create({
name: 'Admin User',
email: 'admin@example.com',
role: SystemRoles.ADMIN,
}),
noAccess: await User.create({
name: 'No Access User',
email: 'noaccess@example.com',
role: SystemRoles.USER,
}),
};
// Create test groups
testGroups = {
editors: await Group.create({
name: 'Prompt Editors',
description: 'Group with editor access',
}),
viewers: await Group.create({
name: 'Prompt Viewers',
description: 'Group with viewer access',
}),
};
}
describe('Prompt ACL Permissions', () => {
describe('Creating Prompts with Permissions', () => {
it('should grant owner permissions when creating a prompt', async () => {
// First create a group
const testGroup = await PromptGroup.create({
name: 'Test Group',
category: 'testing',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new mongoose.Types.ObjectId(),
});
const promptData = {
prompt: {
prompt: 'Test prompt content',
name: 'Test Prompt',
type: 'text',
groupId: testGroup._id,
},
author: testUsers.owner._id,
};
await promptFns.savePrompt(promptData);
// Manually grant permissions as would happen in the route
await permissionService.grantPermission({
principalType: PrincipalType.USER,
principalId: testUsers.owner._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_OWNER,
grantedBy: testUsers.owner._id,
});
// Check ACL entry
const aclEntry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: testGroup._id,
principalType: PrincipalType.USER,
principalId: testUsers.owner._id,
});
expect(aclEntry).toBeTruthy();
expect(aclEntry.permBits).toBe(testRoles.owner.permBits);
});
});
describe('Accessing Prompts', () => {
let testPromptGroup;
beforeEach(async () => {
// Create a prompt group
testPromptGroup = await PromptGroup.create({
name: 'Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
// Create a prompt
await Prompt.create({
prompt: 'Test prompt for access control',
name: 'Access Test Prompt',
author: testUsers.owner._id,
groupId: testPromptGroup._id,
type: 'text',
});
// Grant owner permissions
await permissionService.grantPermission({
principalType: PrincipalType.USER,
principalId: testUsers.owner._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_OWNER,
grantedBy: testUsers.owner._id,
});
});
afterEach(async () => {
await Prompt.deleteMany({});
await PromptGroup.deleteMany({});
await AclEntry.deleteMany({});
});
it('owner should have full access to their prompt', async () => {
const hasAccess = await permissionService.checkPermission({
userId: testUsers.owner._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.VIEW,
});
expect(hasAccess).toBe(true);
const canEdit = await permissionService.checkPermission({
userId: testUsers.owner._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.EDIT,
});
expect(canEdit).toBe(true);
});
it('user with viewer role should only have view access', async () => {
// Grant viewer permissions
await permissionService.grantPermission({
principalType: PrincipalType.USER,
principalId: testUsers.viewer._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_VIEWER,
grantedBy: testUsers.owner._id,
});
const canView = await permissionService.checkPermission({
userId: testUsers.viewer._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.VIEW,
});
const canEdit = await permissionService.checkPermission({
userId: testUsers.viewer._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.EDIT,
});
expect(canView).toBe(true);
expect(canEdit).toBe(false);
});
it('user without permissions should have no access', async () => {
const hasAccess = await permissionService.checkPermission({
userId: testUsers.noAccess._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.VIEW,
});
expect(hasAccess).toBe(false);
});
it('admin should not bypass permissions at the permission service layer', async () => {
// Admin users should work through normal permission system
// The middleware layer handles admin bypass, not the permission service
const hasAccess = await permissionService.checkPermission({
userId: testUsers.admin._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.VIEW,
});
// Without explicit permissions, even admin won't have access at this layer
expect(hasAccess).toBe(false);
// The actual admin bypass happens in the middleware layer (`canAccessPromptViaGroup`/`canAccessPromptGroupResource`)
// which checks req.user.role === SystemRoles.ADMIN
});
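// Hypothetical sketch (not the actual middleware) of the bypass referenced
// above; the real checks live in `canAccessPromptViaGroup` and
// `canAccessPromptGroupResource`:
//
//   function adminBypass(req, res, next) {
//     if (req.user.role === SystemRoles.ADMIN) {
//       return next(); // admins skip the ACL permission check entirely
//     }
//     return checkAclPermission(req, res, next); // placeholder for the real ACL check
//   }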
});
describe('Group-based Access', () => {
let testPromptGroup;
beforeEach(async () => {
// Create a prompt group first
testPromptGroup = await PromptGroup.create({
name: 'Group Access Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
await Prompt.create({
prompt: 'Group access test prompt',
name: 'Group Test',
author: testUsers.owner._id,
groupId: testPromptGroup._id,
type: 'text',
});
// Add users to groups
await User.findByIdAndUpdate(testUsers.editor._id, {
$push: { groups: testGroups.editors._id },
});
await User.findByIdAndUpdate(testUsers.viewer._id, {
$push: { groups: testGroups.viewers._id },
});
});
afterEach(async () => {
await Prompt.deleteMany({});
await PromptGroup.deleteMany({});
await AclEntry.deleteMany({});
await User.updateMany({}, { $set: { groups: [] } });
});
it('group members should inherit group permissions', async () => {
// Create a prompt group
const testPromptGroup = await PromptGroup.create({
name: 'Group Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
const { addUserToGroup } = require('~/models');
await addUserToGroup(testUsers.editor._id, testGroups.editors._id);
const prompt = await promptFns.savePrompt({
author: testUsers.owner._id,
prompt: {
prompt: 'Group test prompt',
name: 'Group Test',
groupId: testPromptGroup._id,
type: 'text',
},
});
// Check if savePrompt returned an error
if (!prompt || !prompt.prompt) {
throw new Error(`Failed to save prompt: ${prompt?.message || 'Unknown error'}`);
}
// Grant edit permissions to the group
await permissionService.grantPermission({
principalType: PrincipalType.GROUP,
principalId: testGroups.editors._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_EDITOR,
grantedBy: testUsers.owner._id,
});
// Check if group member has access
const hasAccess = await permissionService.checkPermission({
userId: testUsers.editor._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.EDIT,
});
expect(hasAccess).toBe(true);
// Check that non-member doesn't have access
const nonMemberAccess = await permissionService.checkPermission({
userId: testUsers.viewer._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
requiredPermission: PermissionBits.EDIT,
});
expect(nonMemberAccess).toBe(false);
});
});
describe('Public Access', () => {
let publicPromptGroup, privatePromptGroup;
beforeEach(async () => {
// Create separate prompt groups for public and private access
publicPromptGroup = await PromptGroup.create({
name: 'Public Access Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
privatePromptGroup = await PromptGroup.create({
name: 'Private Access Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
// Create prompts in their respective groups
await Prompt.create({
prompt: 'Public prompt',
name: 'Public',
author: testUsers.owner._id,
groupId: publicPromptGroup._id,
type: 'text',
});
await Prompt.create({
prompt: 'Private prompt',
name: 'Private',
author: testUsers.owner._id,
groupId: privatePromptGroup._id,
type: 'text',
});
// Grant public view access to publicPromptGroup
await permissionService.grantPermission({
principalType: PrincipalType.PUBLIC,
principalId: null,
resourceType: ResourceType.PROMPTGROUP,
resourceId: publicPromptGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_VIEWER,
grantedBy: testUsers.owner._id,
});
// Grant only owner access to privatePromptGroup
await permissionService.grantPermission({
principalType: PrincipalType.USER,
principalId: testUsers.owner._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: privatePromptGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_OWNER,
grantedBy: testUsers.owner._id,
});
});
afterEach(async () => {
await Prompt.deleteMany({});
await PromptGroup.deleteMany({});
await AclEntry.deleteMany({});
});
it('public prompt should be accessible to any user', async () => {
const hasAccess = await permissionService.checkPermission({
userId: testUsers.noAccess._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: publicPromptGroup._id,
requiredPermission: PermissionBits.VIEW,
includePublic: true,
});
expect(hasAccess).toBe(true);
});
it('private prompt should not be accessible to unauthorized users', async () => {
const hasAccess = await permissionService.checkPermission({
userId: testUsers.noAccess._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: privatePromptGroup._id,
requiredPermission: PermissionBits.VIEW,
includePublic: true,
});
expect(hasAccess).toBe(false);
});
});
describe('Prompt Deletion', () => {
let testPromptGroup;
it('should remove ACL entries when prompt is deleted', async () => {
testPromptGroup = await PromptGroup.create({
name: 'Deletion Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
const prompt = await promptFns.savePrompt({
author: testUsers.owner._id,
prompt: {
prompt: 'To be deleted',
name: 'Delete Test',
groupId: testPromptGroup._id,
type: 'text',
},
});
// Check if savePrompt returned an error
if (!prompt || !prompt.prompt) {
throw new Error(`Failed to save prompt: ${prompt?.message || 'Unknown error'}`);
}
const testPromptId = prompt.prompt._id;
const promptGroupId = testPromptGroup._id;
// Grant permission
await permissionService.grantPermission({
principalType: PrincipalType.USER,
principalId: testUsers.owner._id,
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
accessRoleId: AccessRoleIds.PROMPTGROUP_OWNER,
grantedBy: testUsers.owner._id,
});
// Verify ACL entry exists
const beforeDelete = await AclEntry.find({
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
});
expect(beforeDelete).toHaveLength(1);
// Delete the prompt
await promptFns.deletePrompt({
promptId: testPromptId,
groupId: promptGroupId,
author: testUsers.owner._id,
role: SystemRoles.USER,
});
// Verify ACL entries are removed
const aclEntries = await AclEntry.find({
resourceType: ResourceType.PROMPTGROUP,
resourceId: testPromptGroup._id,
});
expect(aclEntries).toHaveLength(0);
});
});
describe('Backwards Compatibility', () => {
it('should handle prompts without ACL entries gracefully', async () => {
// Create a prompt group first
const promptGroup = await PromptGroup.create({
name: 'Legacy Test Group',
author: testUsers.owner._id,
authorName: testUsers.owner.name,
productionId: new ObjectId(),
});
// Create a prompt without ACL entries (legacy prompt)
const legacyPrompt = await Prompt.create({
prompt: 'Legacy prompt without ACL',
name: 'Legacy',
author: testUsers.owner._id,
groupId: promptGroup._id,
type: 'text',
});
// The system should handle this gracefully
const prompt = await promptFns.getPrompt({ _id: legacyPrompt._id });
expect(prompt).toBeTruthy();
expect(prompt._id.toString()).toBe(legacyPrompt._id.toString());
});
});
});

@@ -1,279 +0,0 @@
const mongoose = require('mongoose');
const { ObjectId } = require('mongodb');
const { logger } = require('@librechat/data-schemas');
const { MongoMemoryServer } = require('mongodb-memory-server');
const {
ResourceType,
AccessRoleIds,
PrincipalType,
PrincipalModel,
PermissionBits,
} = require('librechat-data-provider');
// Mock the config/connect module to prevent connection attempts during tests
jest.mock('../../config/connect', () => jest.fn().mockResolvedValue(true));
// Disable console for tests
logger.silent = true;
describe('PromptGroup Migration Script', () => {
let mongoServer;
let Prompt, PromptGroup, AclEntry, AccessRole, User;
let migrateToPromptGroupPermissions;
let testOwner;
let ownerRole, viewerRole;
beforeAll(async () => {
// Set up MongoDB memory server
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
await mongoose.connect(mongoUri);
// Initialize models
const dbModels = require('~/db/models');
Prompt = dbModels.Prompt;
PromptGroup = dbModels.PromptGroup;
AclEntry = dbModels.AclEntry;
AccessRole = dbModels.AccessRole;
User = dbModels.User;
// Create test user
testOwner = await User.create({
name: 'Test Owner',
email: 'owner@test.com',
role: 'USER',
});
// Create test project document in the raw `projects` collection
const projectName = 'instance';
await mongoose.connection.db.collection('projects').insertOne({
name: projectName,
promptGroupIds: [],
});
// Create promptGroup access roles
ownerRole = await AccessRole.create({
accessRoleId: AccessRoleIds.PROMPTGROUP_OWNER,
name: 'Owner',
description: 'Full control over promptGroups',
resourceType: ResourceType.PROMPTGROUP,
permBits:
PermissionBits.VIEW | PermissionBits.EDIT | PermissionBits.DELETE | PermissionBits.SHARE,
});
viewerRole = await AccessRole.create({
accessRoleId: AccessRoleIds.PROMPTGROUP_VIEWER,
name: 'Viewer',
description: 'Can view promptGroups',
resourceType: ResourceType.PROMPTGROUP,
permBits: PermissionBits.VIEW,
});
await AccessRole.create({
accessRoleId: AccessRoleIds.PROMPTGROUP_EDITOR,
name: 'Editor',
description: 'Can view and edit promptGroups',
resourceType: ResourceType.PROMPTGROUP,
permBits: PermissionBits.VIEW | PermissionBits.EDIT,
});
// Import migration function
const migration = require('../../config/migrate-prompt-permissions');
migrateToPromptGroupPermissions = migration.migrateToPromptGroupPermissions;
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
});
beforeEach(async () => {
// Clean up before each test
await Prompt.deleteMany({});
await PromptGroup.deleteMany({});
await AclEntry.deleteMany({});
await mongoose.connection.db
.collection('projects')
.updateOne({ name: 'instance' }, { $set: { promptGroupIds: [] } });
});
it('should categorize promptGroups correctly in dry run', async () => {
// Create global prompt group (in Global project)
const globalPromptGroup = await PromptGroup.create({
name: 'Global Group',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
// Create private prompt group (not in any project)
await PromptGroup.create({
name: 'Private Group',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
// Add global group to project's promptGroupIds array
await mongoose.connection.db
.collection('projects')
.updateOne({ name: 'instance' }, { $set: { promptGroupIds: [globalPromptGroup._id] } });
const result = await migrateToPromptGroupPermissions({ dryRun: true });
expect(result.dryRun).toBe(true);
expect(result.summary.total).toBe(2);
expect(result.summary.globalViewAccess).toBe(1);
expect(result.summary.privateGroups).toBe(1);
});
it('should grant appropriate permissions during migration', async () => {
// Create prompt groups
const globalPromptGroup = await PromptGroup.create({
name: 'Global Group',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
const privatePromptGroup = await PromptGroup.create({
name: 'Private Group',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
// Add global group to project's promptGroupIds array
await mongoose.connection.db
.collection('projects')
.updateOne({ name: 'instance' }, { $set: { promptGroupIds: [globalPromptGroup._id] } });
const result = await migrateToPromptGroupPermissions({ dryRun: false });
expect(result.migrated).toBe(2);
expect(result.errors).toBe(0);
expect(result.ownerGrants).toBe(2);
expect(result.publicViewGrants).toBe(1);
// Check global promptGroup permissions
const globalOwnerEntry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: globalPromptGroup._id,
principalType: PrincipalType.USER,
principalId: testOwner._id,
});
expect(globalOwnerEntry).toBeTruthy();
expect(globalOwnerEntry.permBits).toBe(ownerRole.permBits);
const globalPublicEntry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: globalPromptGroup._id,
principalType: PrincipalType.PUBLIC,
});
expect(globalPublicEntry).toBeTruthy();
expect(globalPublicEntry.permBits).toBe(viewerRole.permBits);
// Check private promptGroup permissions
const privateOwnerEntry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: privatePromptGroup._id,
principalType: PrincipalType.USER,
principalId: testOwner._id,
});
expect(privateOwnerEntry).toBeTruthy();
expect(privateOwnerEntry.permBits).toBe(ownerRole.permBits);
const privatePublicEntry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: privatePromptGroup._id,
principalType: PrincipalType.PUBLIC,
});
expect(privatePublicEntry).toBeNull();
});
it('should skip promptGroups that already have ACL entries', async () => {
// Create prompt groups
const promptGroup1 = await PromptGroup.create({
name: 'Group 1',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
const promptGroup2 = await PromptGroup.create({
name: 'Group 2',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
// Grant permission to one promptGroup manually (simulating it already has ACL)
await AclEntry.create({
principalType: PrincipalType.USER,
principalId: testOwner._id,
principalModel: PrincipalModel.USER,
resourceType: ResourceType.PROMPTGROUP,
resourceId: promptGroup1._id,
permBits: ownerRole.permBits,
roleId: ownerRole._id,
grantedBy: testOwner._id,
grantedAt: new Date(),
});
const result = await migrateToPromptGroupPermissions({ dryRun: false });
// Should only migrate promptGroup2, skip promptGroup1
expect(result.migrated).toBe(1);
expect(result.errors).toBe(0);
// Verify promptGroup2 now has permissions
const group2Entry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: promptGroup2._id,
});
expect(group2Entry).toBeTruthy();
});
it('should handle promptGroups with prompts correctly', async () => {
// Create a promptGroup with some prompts
const promptGroup = await PromptGroup.create({
name: 'Group with Prompts',
author: testOwner._id,
authorName: testOwner.name,
productionId: new ObjectId(),
});
// Create some prompts in this group
await Prompt.create({
prompt: 'First prompt',
author: testOwner._id,
groupId: promptGroup._id,
type: 'text',
});
await Prompt.create({
prompt: 'Second prompt',
author: testOwner._id,
groupId: promptGroup._id,
type: 'text',
});
const result = await migrateToPromptGroupPermissions({ dryRun: false });
expect(result.migrated).toBe(1);
expect(result.errors).toBe(0);
// Verify the promptGroup has permissions
const groupEntry = await AclEntry.findOne({
resourceType: ResourceType.PROMPTGROUP,
resourceId: promptGroup._id,
});
expect(groupEntry).toBeTruthy();
// Verify no prompt-level permissions were created
const promptEntries = await AclEntry.find({
resourceType: 'prompt',
});
expect(promptEntries).toHaveLength(0);
});
});

@@ -1,256 +0,0 @@
const {
CacheKeys,
SystemRoles,
roleDefaults,
permissionsSchema,
removeNullishValues,
} = require('librechat-data-provider');
const { logger } = require('@librechat/data-schemas');
const getLogStores = require('~/cache/getLogStores');
const { Role } = require('~/db/models');
/**
* Retrieve a role by name and convert the found role document to a plain object.
* If the role with the given name doesn't exist and the name is a system-defined role,
* create it and return the lean version.
*
* @param {string} roleName - The name of the role to find or create.
* @param {string|string[]} [fieldsToSelect] - The fields to include or exclude in the returned document.
* @returns {Promise<IRole>} Role document.
*/
const getRoleByName = async function (roleName, fieldsToSelect = null) {
const cache = getLogStores(CacheKeys.ROLES);
try {
const cachedRole = await cache.get(roleName);
if (cachedRole) {
return cachedRole;
}
let query = Role.findOne({ name: roleName });
if (fieldsToSelect) {
query = query.select(fieldsToSelect);
}
let role = await query.lean().exec();
if (!role && SystemRoles[roleName]) {
role = await new Role(roleDefaults[roleName]).save();
await cache.set(roleName, role);
return role.toObject();
}
await cache.set(roleName, role);
return role;
} catch (error) {
throw new Error(`Failed to retrieve or create role: ${error.message}`);
}
};
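// Example (illustrative): fetch the USER role, selecting only the name and
// permissions fields:
//   const role = await getRoleByName(SystemRoles.USER, 'name permissions');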
/**
* Update role values by name.
*
* @param {string} roleName - The name of the role to update.
* @param {Partial<TRole>} updates - The fields to update.
* @returns {Promise<TRole>} Updated role document.
*/
const updateRoleByName = async function (roleName, updates) {
const cache = getLogStores(CacheKeys.ROLES);
try {
const role = await Role.findOneAndUpdate(
{ name: roleName },
{ $set: updates },
{ new: true, lean: true },
)
.select('-__v')
.lean()
.exec();
await cache.set(roleName, role);
return role;
} catch (error) {
throw new Error(`Failed to update role: ${error.message}`);
}
};
/**
* Updates access permissions for a specific role and multiple permission types.
* @param {string} roleName - The role to update.
* @param {Object.<PermissionTypes, Object.<Permissions, boolean>>} permissionsUpdate - Permissions to update and their values.
* @param {IRole} [roleData] - Optional role data to use instead of fetching from the database.
*/
async function updateAccessPermissions(roleName, permissionsUpdate, roleData) {
// Filter and clean the permission updates based on our schema definition.
const updates = {};
for (const [permissionType, permissions] of Object.entries(permissionsUpdate)) {
if (permissionsSchema.shape && permissionsSchema.shape[permissionType]) {
updates[permissionType] = removeNullishValues(permissions);
}
}
if (!Object.keys(updates).length) {
return;
}
try {
const role = roleData ?? (await getRoleByName(roleName));
if (!role) {
return;
}
const currentPermissions = role.permissions || {};
const updatedPermissions = { ...currentPermissions };
let hasChanges = false;
const unsetFields = {};
const permissionTypes = Object.keys(permissionsSchema.shape || {});
for (const permType of permissionTypes) {
if (role[permType] && typeof role[permType] === 'object') {
logger.info(
`Migrating '${roleName}' role from old schema: found '${permType}' at top level`,
);
updatedPermissions[permType] = {
...updatedPermissions[permType],
...role[permType],
};
unsetFields[permType] = 1;
hasChanges = true;
}
}
for (const [permissionType, permissions] of Object.entries(updates)) {
const currentTypePermissions = currentPermissions[permissionType] || {};
updatedPermissions[permissionType] = { ...currentTypePermissions };
for (const [permission, value] of Object.entries(permissions)) {
if (currentTypePermissions[permission] !== value) {
updatedPermissions[permissionType][permission] = value;
hasChanges = true;
logger.info(
`Updating '${roleName}' role permission '${permissionType}' '${permission}' from ${currentTypePermissions[permission]} to: ${value}`,
);
}
}
}
if (hasChanges) {
const updateObj = { permissions: updatedPermissions };
if (Object.keys(unsetFields).length > 0) {
logger.info(
`Unsetting old schema fields for '${roleName}' role: ${Object.keys(unsetFields).join(', ')}`,
);
try {
await Role.updateOne(
{ name: roleName },
{
$set: updateObj,
$unset: unsetFields,
},
);
const cache = getLogStores(CacheKeys.ROLES);
const updatedRole = await Role.findOne({ name: roleName }).select('-__v').lean().exec();
await cache.set(roleName, updatedRole);
logger.info(`Updated role '${roleName}' and removed old schema fields`);
} catch (updateError) {
logger.error(`Error during role migration update: ${updateError.message}`);
throw updateError;
}
} else {
// Standard update if no migration needed
await updateRoleByName(roleName, updateObj);
}
logger.info(`Updated '${roleName}' role permissions`);
} else {
logger.info(`No changes needed for '${roleName}' role permissions`);
}
} catch (error) {
logger.error(`Failed to update ${roleName} role permissions:`, error);
}
}
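// Example (illustrative) of the `permissionsUpdate` shape this function expects;
// `PermissionTypes` and `Permissions` come from librechat-data-provider:
//   await updateAccessPermissions(SystemRoles.USER, {
//     [PermissionTypes.PROMPTS]: { [Permissions.SHARE]: true },
//   });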
/**
* Migrates roles from old schema to new schema structure.
* This can be called directly to fix existing roles.
*
* @param {string} [roleName] - Optional specific role to migrate. If not provided, migrates all roles.
* @returns {Promise<number>} Number of roles migrated.
*/
const migrateRoleSchema = async function (roleName) {
try {
// Get roles to migrate
let roles;
if (roleName) {
const role = await Role.findOne({ name: roleName });
roles = role ? [role] : [];
} else {
roles = await Role.find({});
}
logger.info(`Migrating ${roles.length} roles to new schema structure`);
let migratedCount = 0;
for (const role of roles) {
const permissionTypes = Object.keys(permissionsSchema.shape || {});
const unsetFields = {};
let hasOldSchema = false;
// Check for old schema fields
for (const permType of permissionTypes) {
if (role[permType] && typeof role[permType] === 'object') {
hasOldSchema = true;
// Ensure permissions object exists
role.permissions = role.permissions || {};
// Migrate permissions from old location to new
role.permissions[permType] = {
...role.permissions[permType],
...role[permType],
};
// Mark field for removal
unsetFields[permType] = 1;
}
}
if (hasOldSchema) {
try {
logger.info(`Migrating role '${role.name}' from old schema structure`);
// Simple update operation
await Role.updateOne(
{ _id: role._id },
{
$set: { permissions: role.permissions },
$unset: unsetFields,
},
);
// Refresh cache
const cache = getLogStores(CacheKeys.ROLES);
const updatedRole = await Role.findById(role._id).lean().exec();
await cache.set(role.name, updatedRole);
migratedCount++;
logger.info(`Migrated role '${role.name}'`);
} catch (error) {
logger.error(`Failed to migrate role '${role.name}': ${error.message}`);
}
}
}
logger.info(`Migration complete: ${migratedCount} roles migrated`);
return migratedCount;
} catch (error) {
logger.error(`Role schema migration failed: ${error.message}`);
throw error;
}
};
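// Example (illustrative): migrate a single role, or omit the argument to
// migrate every role in the collection:
//   const migratedCount = await migrateRoleSchema(SystemRoles.ADMIN);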
module.exports = {
getRoleByName,
updateRoleByName,
migrateRoleSchema,
updateAccessPermissions,
};

@@ -1,405 +0,0 @@
const mongoose = require('mongoose');
const { MongoMemoryServer } = require('mongodb-memory-server');
const {
SystemRoles,
Permissions,
roleDefaults,
PermissionTypes,
} = require('librechat-data-provider');
const { getRoleByName, updateAccessPermissions } = require('~/models/Role');
const getLogStores = require('~/cache/getLogStores');
const { initializeRoles } = require('~/models');
const { Role } = require('~/db/models');
// Mock the cache
jest.mock('~/cache/getLogStores', () =>
jest.fn().mockReturnValue({
get: jest.fn(),
set: jest.fn(),
del: jest.fn(),
}),
);
let mongoServer;
beforeAll(async () => {
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
await mongoose.connect(mongoUri);
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
});
beforeEach(async () => {
await Role.deleteMany({});
getLogStores.mockClear();
});
describe('updateAccessPermissions', () => {
it('should update permissions when changes are needed', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: {
CREATE: true,
USE: true,
SHARE: false,
},
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: {
CREATE: true,
USE: true,
SHARE: true,
},
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: true,
SHARE: true,
});
});
it('should not update permissions when no changes are needed', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: {
CREATE: true,
USE: true,
SHARE: false,
},
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: {
CREATE: true,
USE: true,
SHARE: false,
},
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: true,
SHARE: false,
});
});
it('should handle non-existent roles', async () => {
await updateAccessPermissions('NON_EXISTENT_ROLE', {
[PermissionTypes.PROMPTS]: { CREATE: true },
});
const role = await Role.findOne({ name: 'NON_EXISTENT_ROLE' });
expect(role).toBeNull();
});
it('should update only specified permissions', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: {
CREATE: true,
USE: true,
SHARE: false,
},
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { SHARE: true },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: true,
SHARE: true,
});
});
it('should handle partial updates', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: {
CREATE: true,
USE: true,
SHARE: false,
},
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { USE: false },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: false,
SHARE: false,
});
});
it('should update multiple permission types at once', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: { CREATE: true, USE: true, SHARE: false },
[PermissionTypes.BOOKMARKS]: { USE: true },
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { USE: false, SHARE: true },
[PermissionTypes.BOOKMARKS]: { USE: false },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: false,
SHARE: true,
});
expect(updatedRole.permissions[PermissionTypes.BOOKMARKS]).toEqual({ USE: false });
});
it('should handle updates for a single permission type', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: { CREATE: true, USE: true, SHARE: false },
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { USE: false, SHARE: true },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: false,
SHARE: true,
});
});
it('should update MULTI_CONVO permissions', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.MULTI_CONVO]: { USE: false },
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.MULTI_CONVO]: { USE: true },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.MULTI_CONVO]).toEqual({ USE: true });
});
it('should update MULTI_CONVO permissions along with other permission types', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: { CREATE: true, USE: true, SHARE: false },
[PermissionTypes.MULTI_CONVO]: { USE: false },
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { SHARE: true },
[PermissionTypes.MULTI_CONVO]: { USE: true },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.PROMPTS]).toEqual({
CREATE: true,
USE: true,
SHARE: true,
});
expect(updatedRole.permissions[PermissionTypes.MULTI_CONVO]).toEqual({ USE: true });
});
it('should not update MULTI_CONVO permissions when no changes are needed', async () => {
await new Role({
name: SystemRoles.USER,
permissions: {
[PermissionTypes.MULTI_CONVO]: { USE: true },
},
}).save();
await updateAccessPermissions(SystemRoles.USER, {
[PermissionTypes.MULTI_CONVO]: { USE: true },
});
const updatedRole = await getRoleByName(SystemRoles.USER);
expect(updatedRole.permissions[PermissionTypes.MULTI_CONVO]).toEqual({ USE: true });
});
});
describe('initializeRoles', () => {
beforeEach(async () => {
await Role.deleteMany({});
});
it('should create default roles if they do not exist', async () => {
await initializeRoles();
const adminRole = await getRoleByName(SystemRoles.ADMIN);
const userRole = await getRoleByName(SystemRoles.USER);
expect(adminRole).toBeTruthy();
expect(userRole).toBeTruthy();
// Check if all permission types exist in the permissions field
Object.values(PermissionTypes).forEach((permType) => {
expect(adminRole.permissions[permType]).toBeDefined();
expect(userRole.permissions[permType]).toBeDefined();
});
// Example: Check default values for ADMIN role
expect(adminRole.permissions[PermissionTypes.PROMPTS].SHARE).toBe(true);
expect(adminRole.permissions[PermissionTypes.BOOKMARKS].USE).toBe(true);
expect(adminRole.permissions[PermissionTypes.AGENTS].CREATE).toBe(true);
});
it('should not modify existing permissions for existing roles', async () => {
const customUserRole = {
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]: {
[Permissions.USE]: false,
[Permissions.CREATE]: true,
[Permissions.SHARE]: true,
},
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: false },
},
};
await new Role(customUserRole).save();
await initializeRoles();
const userRole = await getRoleByName(SystemRoles.USER);
expect(userRole.permissions[PermissionTypes.PROMPTS]).toEqual(
customUserRole.permissions[PermissionTypes.PROMPTS],
);
expect(userRole.permissions[PermissionTypes.BOOKMARKS]).toEqual(
customUserRole.permissions[PermissionTypes.BOOKMARKS],
);
expect(userRole.permissions[PermissionTypes.AGENTS]).toBeDefined();
});
it('should add new permission types to existing roles', async () => {
const partialUserRole = {
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]:
roleDefaults[SystemRoles.USER].permissions[PermissionTypes.PROMPTS],
[PermissionTypes.BOOKMARKS]:
roleDefaults[SystemRoles.USER].permissions[PermissionTypes.BOOKMARKS],
},
};
await new Role(partialUserRole).save();
await initializeRoles();
const userRole = await getRoleByName(SystemRoles.USER);
expect(userRole.permissions[PermissionTypes.AGENTS]).toBeDefined();
expect(userRole.permissions[PermissionTypes.AGENTS].CREATE).toBeDefined();
expect(userRole.permissions[PermissionTypes.AGENTS].USE).toBeDefined();
expect(userRole.permissions[PermissionTypes.AGENTS].SHARE).toBeDefined();
});
it('should handle multiple runs without duplicating or modifying data', async () => {
await initializeRoles();
await initializeRoles();
const adminRoles = await Role.find({ name: SystemRoles.ADMIN });
const userRoles = await Role.find({ name: SystemRoles.USER });
expect(adminRoles).toHaveLength(1);
expect(userRoles).toHaveLength(1);
const adminPerms = adminRoles[0].toObject().permissions;
const userPerms = userRoles[0].toObject().permissions;
Object.values(PermissionTypes).forEach((permType) => {
expect(adminPerms[permType]).toBeDefined();
expect(userPerms[permType]).toBeDefined();
});
});
it('should update roles with missing permission types from roleDefaults', async () => {
const partialAdminRole = {
name: SystemRoles.ADMIN,
permissions: {
[PermissionTypes.PROMPTS]: {
[Permissions.USE]: false,
[Permissions.CREATE]: false,
[Permissions.SHARE]: false,
},
[PermissionTypes.BOOKMARKS]:
roleDefaults[SystemRoles.ADMIN].permissions[PermissionTypes.BOOKMARKS],
},
};
await new Role(partialAdminRole).save();
await initializeRoles();
const adminRole = await getRoleByName(SystemRoles.ADMIN);
expect(adminRole.permissions[PermissionTypes.PROMPTS]).toEqual(
partialAdminRole.permissions[PermissionTypes.PROMPTS],
);
expect(adminRole.permissions[PermissionTypes.AGENTS]).toBeDefined();
expect(adminRole.permissions[PermissionTypes.AGENTS].CREATE).toBeDefined();
expect(adminRole.permissions[PermissionTypes.AGENTS].USE).toBeDefined();
expect(adminRole.permissions[PermissionTypes.AGENTS].SHARE).toBeDefined();
});
it('should include MULTI_CONVO permissions when creating default roles', async () => {
await initializeRoles();
const adminRole = await getRoleByName(SystemRoles.ADMIN);
const userRole = await getRoleByName(SystemRoles.USER);
expect(adminRole.permissions[PermissionTypes.MULTI_CONVO]).toBeDefined();
expect(userRole.permissions[PermissionTypes.MULTI_CONVO]).toBeDefined();
expect(adminRole.permissions[PermissionTypes.MULTI_CONVO].USE).toBe(
roleDefaults[SystemRoles.ADMIN].permissions[PermissionTypes.MULTI_CONVO].USE,
);
expect(userRole.permissions[PermissionTypes.MULTI_CONVO].USE).toBe(
roleDefaults[SystemRoles.USER].permissions[PermissionTypes.MULTI_CONVO].USE,
);
});
it('should add MULTI_CONVO permissions to existing roles without them', async () => {
const partialUserRole = {
name: SystemRoles.USER,
permissions: {
[PermissionTypes.PROMPTS]:
roleDefaults[SystemRoles.USER].permissions[PermissionTypes.PROMPTS],
[PermissionTypes.BOOKMARKS]:
roleDefaults[SystemRoles.USER].permissions[PermissionTypes.BOOKMARKS],
},
};
await new Role(partialUserRole).save();
await initializeRoles();
const userRole = await getRoleByName(SystemRoles.USER);
expect(userRole.permissions[PermissionTypes.MULTI_CONVO]).toBeDefined();
expect(userRole.permissions[PermissionTypes.MULTI_CONVO].USE).toBeDefined();
});
});

@@ -1,96 +0,0 @@
const { ToolCall } = require('~/db/models');
/**
* Create a new tool call
* @param {IToolCallData} toolCallData - The tool call data
* @returns {Promise<IToolCallData>} The created tool call document
*/
async function createToolCall(toolCallData) {
try {
return await ToolCall.create(toolCallData);
} catch (error) {
throw new Error(`Error creating tool call: ${error.message}`);
}
}
/**
* Get a tool call by ID
* @param {string} id - The tool call document ID
* @returns {Promise<IToolCallData|null>} The tool call document or null if not found
*/
async function getToolCallById(id) {
try {
return await ToolCall.findById(id).lean();
} catch (error) {
throw new Error(`Error fetching tool call: ${error.message}`);
}
}
/**
* Get tool calls by message ID and user
* @param {string} messageId - The message ID
* @param {string} userId - The user's ObjectId
* @returns {Promise<Array>} Array of tool call documents
*/
async function getToolCallsByMessage(messageId, userId) {
try {
return await ToolCall.find({ messageId, user: userId }).lean();
} catch (error) {
throw new Error(`Error fetching tool calls: ${error.message}`);
}
}
/**
* Get tool calls by conversation ID and user
* @param {string} conversationId - The conversation ID
* @param {string} userId - The user's ObjectId
* @returns {Promise<IToolCallData[]>} Array of tool call documents
*/
async function getToolCallsByConvo(conversationId, userId) {
try {
return await ToolCall.find({ conversationId, user: userId }).lean();
} catch (error) {
throw new Error(`Error fetching tool calls: ${error.message}`);
}
}
/**
* Update a tool call
* @param {string} id - The tool call document ID
* @param {Partial<IToolCallData>} updateData - The data to update
* @returns {Promise<IToolCallData|null>} The updated tool call document or null if not found
*/
async function updateToolCall(id, updateData) {
try {
return await ToolCall.findByIdAndUpdate(id, updateData, { new: true }).lean();
} catch (error) {
throw new Error(`Error updating tool call: ${error.message}`);
}
}
/**
* Delete a tool call
* @param {string} userId - The related user's ObjectId
* @param {string} [conversationId] - The tool call conversation ID
* @returns {Promise<{ ok?: number; n?: number; deletedCount?: number }>} The result of the delete operation
*/
async function deleteToolCalls(userId, conversationId) {
try {
const query = { user: userId };
if (conversationId) {
query.conversationId = conversationId;
}
return await ToolCall.deleteMany(query);
} catch (error) {
throw new Error(`Error deleting tool call: ${error.message}`);
}
}
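// Example (illustrative): omitting conversationId deletes all of a user's
// tool calls; passing it scopes the deletion to one conversation:
//   await deleteToolCalls(userId, conversationId);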
module.exports = {
createToolCall,
updateToolCall,
deleteToolCalls,
getToolCallById,
getToolCallsByConvo,
getToolCallsByMessage,
};

@@ -1,356 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { getMultiplier, getCacheMultiplier } = require('./tx');
const { Transaction, Balance } = require('~/db/models');
const cancelRate = 1.15;
/**
* Updates a user's token balance based on a transaction using optimistic concurrency control
* without schema changes. Compatible with DocumentDB.
* @async
* @function
* @param {Object} params - The function parameters.
* @param {string|mongoose.Types.ObjectId} params.user - The user ID.
* @param {number} params.incrementValue - The value to increment the balance by (can be negative).
* @param {import('mongoose').UpdateQuery<import('@librechat/data-schemas').IBalance>['$set']} [params.setValues] - Optional additional fields to set.
* @returns {Promise<Object>} Returns the updated balance document (lean).
* @throws {Error} Throws an error if the update fails after multiple retries.
*/
const updateBalance = async ({ user, incrementValue, setValues }) => {
const maxRetries = 10; // Number of times to retry on conflict
let delay = 50; // Initial retry delay in ms
let lastError = null;
for (let attempt = 1; attempt <= maxRetries; attempt++) {
let currentBalanceDoc;
try {
// 1. Read the current document state
currentBalanceDoc = await Balance.findOne({ user }).lean();
const currentCredits = currentBalanceDoc ? currentBalanceDoc.tokenCredits : 0;
// 2. Calculate the desired new state
const potentialNewCredits = currentCredits + incrementValue;
const newCredits = Math.max(0, potentialNewCredits); // Ensure balance doesn't go below zero
// 3. Prepare the update payload
const updatePayload = {
$set: {
tokenCredits: newCredits,
...(setValues || {}), // Merge other values to set
},
};
// 4. Attempt the conditional update or upsert
let updatedBalance = null;
if (currentBalanceDoc) {
// --- Document Exists: Perform Conditional Update ---
// Try to update only if the tokenCredits match the value we read (currentCredits)
updatedBalance = await Balance.findOneAndUpdate(
{
user: user,
tokenCredits: currentCredits, // Optimistic lock: condition based on the read value
},
updatePayload,
{
new: true, // Return the modified document
// .lean() is chained on the query below rather than passed as an option
},
).lean(); // Use lean() for plain JS object
if (updatedBalance) {
// Success! The update was applied based on the expected current state.
return updatedBalance;
}
// If updatedBalance is null, it means tokenCredits changed between read and write (conflict).
lastError = new Error(`Concurrency conflict for user ${user} on attempt ${attempt}.`);
// Proceed to retry logic below.
} else {
// --- Document Does Not Exist: Perform Conditional Upsert ---
// Try to insert the document if it still doesn't exist. A duplicate key
// error (E11000) signals that another process created it first, which the
// retry loop below treats as a concurrency conflict.
try {
updatedBalance = await Balance.findOneAndUpdate(
{
user: user,
// A stricter filter (e.g. tokenCredits: { $exists: false }) could guard
// against creation races, but it would wrongly exclude documents that
// exist with 0 credits; a plain { user } filter plus the retry loop
// handles conflicts instead.
},
updatePayload,
{
upsert: true, // Create if doesn't exist
new: true, // Return the created/updated document
// setDefaultsOnInsert: true, // Ensure schema defaults are applied on insert
// lean: true,
},
).lean();
if (updatedBalance) {
// Upsert succeeded (likely created the document)
return updatedBalance;
}
// If null, potentially a rare race condition during upsert. Retry should handle it.
lastError = new Error(
`Upsert race condition suspected for user ${user} on attempt ${attempt}.`,
);
} catch (error) {
if (error.code === 11000) {
// E11000 duplicate key error on index
// This means another process created the document *just* before our upsert.
// It's a concurrency conflict during creation. We should retry.
lastError = error; // Store the error
// Proceed to retry logic below.
} else {
// Different error, rethrow
throw error;
}
}
} // End if/else (document exists?)
} catch (error) {
// Catch errors from findOne or unexpected findOneAndUpdate errors
logger.error(`[updateBalance] Error during attempt ${attempt} for user ${user}:`, error);
lastError = error; // Store the error
// Consider stopping retries for non-transient errors, but for now, we retry.
}
// If we reached here, it means the update failed (conflict or error), wait and retry
if (attempt < maxRetries) {
const jitter = Math.random() * delay * 0.5; // Add jitter to delay
await new Promise((resolve) => setTimeout(resolve, delay + jitter));
delay = Math.min(delay * 2, 2000); // Exponential backoff with cap
}
} // End for loop (retries)
// If loop finishes without success, throw the last encountered error or a generic one
logger.error(
`[updateBalance] Failed to update balance for user ${user} after ${maxRetries} attempts.`,
);
throw (
lastError ||
new Error(
`Failed to update balance for user ${user} after maximum retries due to persistent conflicts.`,
)
);
};
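// Example (illustrative): atomically deduct 500 token credits; the optimistic
// retry loop above transparently handles concurrent writers:
//   const balance = await updateBalance({ user: userId, incrementValue: -500 });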
/** Method to calculate and set the tokenValue for a transaction */
function calculateTokenValue(txn) {
const { valueKey, tokenType, model, endpointTokenConfig, inputTokenCount } = txn;
const multiplier = Math.abs(
getMultiplier({ valueKey, tokenType, model, endpointTokenConfig, inputTokenCount }),
);
txn.rate = multiplier;
txn.tokenValue = txn.rawAmount * multiplier;
if (txn.context && txn.tokenType === 'completion' && txn.context === 'incomplete') {
txn.tokenValue = Math.ceil(txn.tokenValue * cancelRate);
txn.rate *= cancelRate;
}
}
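// Worked example (illustrative numbers): rawAmount = -1000 completion tokens at
// a multiplier of 15 gives tokenValue = -15000; for a cancelled ('incomplete')
// completion this is scaled by cancelRate: Math.ceil(-15000 * 1.15) = -17250.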
/**
 * Creates an auto-refill transaction and applies the refill directly to the
 * user's balance, recording the refill time. Unlike `createTransaction`, this
 * method does not consult the `balance`/`transactions` config flags.
 * @param {object} txData - Transaction data.
 * @param {string} txData.user - The user ID.
 * @param {string} txData.tokenType - The type of token.
 * @param {string} txData.context - The context of the transaction.
 * @param {number} txData.rawAmount - The raw amount of tokens.
 * @returns {Promise<object>} Result containing the rate, user, updated balance, and the created transaction.
 */
async function createAutoRefillTransaction(txData) {
if (txData.rawAmount != null && isNaN(txData.rawAmount)) {
return;
}
const transaction = new Transaction(txData);
transaction.endpointTokenConfig = txData.endpointTokenConfig;
transaction.inputTokenCount = txData.inputTokenCount;
calculateTokenValue(transaction);
await transaction.save();
const balanceResponse = await updateBalance({
user: transaction.user,
incrementValue: txData.rawAmount,
setValues: { lastRefill: new Date() },
});
const result = {
rate: transaction.rate,
user: transaction.user.toString(),
balance: balanceResponse.tokenCredits,
};
logger.debug('[Balance.check] Auto-refill performed', result);
result.transaction = transaction;
return result;
}
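// Example (illustrative field values) of triggering an auto-refill:
//   await createAutoRefillTransaction({
//     user: userId,
//     tokenType: 'credits',
//     context: 'autoRefill',
//     rawAmount: 1000000, // credits to add
//   });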
/**
 * Static method to create a transaction and update the balance when balance tracking is enabled.
 * @param {object} _txData - Transaction data, including optional `balance` and `transactions` config.
 */
async function createTransaction(_txData) {
const { balance, transactions, ...txData } = _txData;
if (txData.rawAmount != null && isNaN(txData.rawAmount)) {
return;
}
if (transactions?.enabled === false) {
return;
}
const transaction = new Transaction(txData);
transaction.endpointTokenConfig = txData.endpointTokenConfig;
transaction.inputTokenCount = txData.inputTokenCount;
calculateTokenValue(transaction);
await transaction.save();
if (!balance?.enabled) {
return;
}
let incrementValue = transaction.tokenValue;
const balanceResponse = await updateBalance({
user: transaction.user,
incrementValue,
});
return {
rate: transaction.rate,
user: transaction.user.toString(),
balance: balanceResponse.tokenCredits,
[transaction.tokenType]: incrementValue,
};
}
/**
 * Static method to create a structured (cache-aware) transaction and update the balance when enabled.
 * @param {object} _txData - Transaction data, including optional `balance` and `transactions` config.
 */
async function createStructuredTransaction(_txData) {
const { balance, transactions, ...txData } = _txData;
if (transactions?.enabled === false) {
return;
}
const transaction = new Transaction(txData);
transaction.endpointTokenConfig = txData.endpointTokenConfig;
transaction.inputTokenCount = txData.inputTokenCount;
calculateStructuredTokenValue(transaction);
await transaction.save();
if (!balance?.enabled) {
return;
}
let incrementValue = transaction.tokenValue;
const balanceResponse = await updateBalance({
user: transaction.user,
incrementValue,
});
return {
rate: transaction.rate,
user: transaction.user.toString(),
balance: balanceResponse.tokenCredits,
[transaction.tokenType]: incrementValue,
};
}
/** Method to calculate token value for structured tokens */
function calculateStructuredTokenValue(txn) {
if (!txn.tokenType) {
txn.tokenValue = txn.rawAmount;
return;
}
const { model, endpointTokenConfig, inputTokenCount } = txn;
if (txn.tokenType === 'prompt') {
const inputMultiplier = getMultiplier({
tokenType: 'prompt',
model,
endpointTokenConfig,
inputTokenCount,
});
const writeMultiplier =
getCacheMultiplier({ cacheType: 'write', model, endpointTokenConfig }) ?? inputMultiplier;
const readMultiplier =
getCacheMultiplier({ cacheType: 'read', model, endpointTokenConfig }) ?? inputMultiplier;
txn.rateDetail = {
input: inputMultiplier,
write: writeMultiplier,
read: readMultiplier,
};
const totalPromptTokens =
Math.abs(txn.inputTokens || 0) +
Math.abs(txn.writeTokens || 0) +
Math.abs(txn.readTokens || 0);
if (totalPromptTokens > 0) {
txn.rate =
(Math.abs(inputMultiplier * (txn.inputTokens || 0)) +
Math.abs(writeMultiplier * (txn.writeTokens || 0)) +
Math.abs(readMultiplier * (txn.readTokens || 0))) /
totalPromptTokens;
} else {
txn.rate = Math.abs(inputMultiplier); // Default to input rate if no tokens
}
txn.tokenValue = -(
Math.abs(txn.inputTokens || 0) * inputMultiplier +
Math.abs(txn.writeTokens || 0) * writeMultiplier +
Math.abs(txn.readTokens || 0) * readMultiplier
);
txn.rawAmount = -totalPromptTokens;
} else if (txn.tokenType === 'completion') {
const multiplier = getMultiplier({
tokenType: txn.tokenType,
model,
endpointTokenConfig,
inputTokenCount,
});
txn.rate = Math.abs(multiplier);
txn.tokenValue = -Math.abs(txn.rawAmount) * multiplier;
txn.rawAmount = -Math.abs(txn.rawAmount);
}
if (txn.context && txn.tokenType === 'completion' && txn.context === 'incomplete') {
txn.tokenValue = Math.ceil(txn.tokenValue * cancelRate);
txn.rate *= cancelRate;
if (txn.rateDetail) {
txn.rateDetail = Object.fromEntries(
Object.entries(txn.rateDetail).map(([k, v]) => [k, v * cancelRate]),
);
}
}
}
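// Worked example (illustrative multipliers): inputTokens = 10, writeTokens = 100,
// readTokens = 5 with input/write/read rates of 3 / 3.75 / 0.3 yields
// tokenValue = -(10*3 + 100*3.75 + 5*0.3) = -406.5, rawAmount = -115, and a
// blended rate of 406.5 / 115 ≈ 3.53.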
/**
* Queries and retrieves transactions based on a given filter.
* @async
* @function getTransactions
* @param {Object} filter - MongoDB filter object to apply when querying transactions.
* @returns {Promise<Array>} A promise that resolves to an array of matched transactions.
* @throws {Error} Throws an error if querying the database fails.
*/
async function getTransactions(filter) {
try {
return await Transaction.find(filter).lean();
} catch (error) {
logger.error('Error querying transactions:', error);
throw error;
}
}
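// Example (illustrative): fetch all transactions for one user and conversation:
//   const txns = await getTransactions({ user: userId, conversationId });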
module.exports = {
getTransactions,
createTransaction,
createAutoRefillTransaction,
createStructuredTransaction,
};

@@ -1,854 +0,0 @@
const mongoose = require('mongoose');
const { MongoMemoryServer } = require('mongodb-memory-server');
const { spendTokens, spendStructuredTokens } = require('./spendTokens');
const { getMultiplier, getCacheMultiplier, premiumTokenValues, tokenValues } = require('./tx');
const { createTransaction, createStructuredTransaction } = require('./Transaction');
const { Balance, Transaction } = require('~/db/models');
let mongoServer;
beforeAll(async () => {
mongoServer = await MongoMemoryServer.create();
const mongoUri = mongoServer.getUri();
await mongoose.connect(mongoUri);
});
afterAll(async () => {
await mongoose.disconnect();
await mongoServer.stop();
});
beforeEach(async () => {
await mongoose.connection.dropDatabase();
});
describe('Regular Token Spending Tests', () => {
test('Balance should decrease when spending tokens with spendTokens', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000; // $10.00
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: 100,
completionTokens: 50,
};
// Act
await spendTokens(txData, tokenUsage);
// Assert
const updatedBalance = await Balance.findOne({ user: userId });
const promptMultiplier = getMultiplier({ model, tokenType: 'prompt' });
const completionMultiplier = getMultiplier({ model, tokenType: 'completion' });
const expectedTotalCost = 100 * promptMultiplier + 50 * completionMultiplier;
const expectedBalance = initialBalance - expectedTotalCost;
expect(updatedBalance.tokenCredits).toBeCloseTo(expectedBalance, 0);
});
test('spendTokens should handle zero completion tokens', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: 100,
completionTokens: 0,
};
// Act
await spendTokens(txData, tokenUsage);
// Assert
const updatedBalance = await Balance.findOne({ user: userId });
const promptMultiplier = getMultiplier({ model, tokenType: 'prompt' });
const expectedCost = 100 * promptMultiplier;
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedCost, 0);
});
test('spendTokens should handle undefined token counts', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = {};
// Act
const result = await spendTokens(txData, tokenUsage);
// Assert: No transaction should be created
expect(result).toBeUndefined();
});
test('spendTokens should handle only prompt tokens', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = { promptTokens: 100 };
// Act
await spendTokens(txData, tokenUsage);
// Assert
const updatedBalance = await Balance.findOne({ user: userId });
const promptMultiplier = getMultiplier({ model, tokenType: 'prompt' });
const expectedCost = 100 * promptMultiplier;
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedCost, 0);
});
test('spendTokens should not update balance when balance feature is disabled', async () => {
// Arrange: Balance config is now passed directly in txData
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: false },
};
const tokenUsage = {
promptTokens: 100,
completionTokens: 50,
};
// Act
await spendTokens(txData, tokenUsage);
// Assert: Balance should remain unchanged.
const updatedBalance = await Balance.findOne({ user: userId });
expect(updatedBalance.tokenCredits).toBe(initialBalance);
});
});
describe('Structured Token Spending Tests', () => {
test('Balance should decrease and rawAmount should be set when spending a large number of structured tokens', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 17613154.55; // $17.61
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'c23a18da-706c-470a-ac28-ec87ed065199',
model,
context: 'message',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: {
input: 11,
write: 140522,
read: 0,
},
completionTokens: 5,
};
const promptMultiplier = getMultiplier({ model, tokenType: 'prompt' });
const completionMultiplier = getMultiplier({ model, tokenType: 'completion' });
const writeMultiplier = getCacheMultiplier({ model, cacheType: 'write' });
const readMultiplier = getCacheMultiplier({ model, cacheType: 'read' });
// Act
const result = await spendStructuredTokens(txData, tokenUsage);
// Calculate expected costs.
const expectedPromptCost =
tokenUsage.promptTokens.input * promptMultiplier +
tokenUsage.promptTokens.write * writeMultiplier +
tokenUsage.promptTokens.read * readMultiplier;
const expectedCompletionCost = tokenUsage.completionTokens * completionMultiplier;
const expectedTotalCost = expectedPromptCost + expectedCompletionCost;
const expectedBalance = initialBalance - expectedTotalCost;
// Assert
expect(result.completion.balance).toBeLessThan(initialBalance);
const allowedDifference = 100;
expect(Math.abs(result.completion.balance - expectedBalance)).toBeLessThan(allowedDifference);
const balanceDecrease = initialBalance - result.completion.balance;
expect(balanceDecrease).toBeCloseTo(expectedTotalCost, 0);
const expectedPromptTokenValue = -expectedPromptCost;
const expectedCompletionTokenValue = -expectedCompletionCost;
expect(result.prompt.prompt).toBeCloseTo(expectedPromptTokenValue, 1);
expect(result.completion.completion).toBe(expectedCompletionTokenValue);
});
test('should handle zero completion tokens in structured spending', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 17613154.55;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'test-convo',
model,
context: 'message',
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: {
input: 10,
write: 100,
read: 5,
},
completionTokens: 0,
};
// Act
const result = await spendStructuredTokens(txData, tokenUsage);
// Assert
expect(result.prompt).toBeDefined();
expect(result.completion).toBeUndefined();
expect(result.prompt.prompt).toBeLessThan(0);
});
test('should handle only prompt tokens in structured spending', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 17613154.55;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'test-convo',
model,
context: 'message',
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: {
input: 10,
write: 100,
read: 5,
},
};
// Act
const result = await spendStructuredTokens(txData, tokenUsage);
// Assert
expect(result.prompt).toBeDefined();
expect(result.completion).toBeUndefined();
expect(result.prompt.prompt).toBeLessThan(0);
});
test('should handle undefined token counts in structured spending', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 17613154.55;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'test-convo',
model,
context: 'message',
balance: { enabled: true },
};
const tokenUsage = {};
// Act
const result = await spendStructuredTokens(txData, tokenUsage);
// Assert
expect(result).toEqual({
prompt: undefined,
completion: undefined,
});
});
test('should handle incomplete context for completion tokens', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 17613154.55;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'test-convo',
model,
context: 'incomplete',
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: {
input: 10,
write: 100,
read: 5,
},
completionTokens: 50,
};
// Act
const result = await spendStructuredTokens(txData, tokenUsage);
// Assert
// (Assumes a completion multiplier of 15 and the 1.15 cancel rate applied when context is 'incomplete', as noted in the original test.)
expect(result.completion.completion).toBeCloseTo(-50 * 15 * 1.15, 0);
});
});
describe('NaN Handling Tests', () => {
test('should skip transaction creation when rawAmount is NaN', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
rawAmount: NaN,
tokenType: 'prompt',
balance: { enabled: true },
};
// Act
const result = await createTransaction(txData);
// Assert: No transaction should be created and balance remains unchanged.
expect(result).toBeUndefined();
const balance = await Balance.findOne({ user: userId });
expect(balance.tokenCredits).toBe(initialBalance);
});
});
describe('Transactions Config Tests', () => {
test('createTransaction should not save when transactions.enabled is false', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
rawAmount: -100,
tokenType: 'prompt',
transactions: { enabled: false },
};
// Act
const result = await createTransaction(txData);
// Assert: No transaction should be created
expect(result).toBeUndefined();
const transactions = await Transaction.find({ user: userId });
expect(transactions).toHaveLength(0);
const balance = await Balance.findOne({ user: userId });
expect(balance.tokenCredits).toBe(initialBalance);
});
test('createTransaction should save when transactions.enabled is true', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
rawAmount: -100,
tokenType: 'prompt',
transactions: { enabled: true },
balance: { enabled: true },
};
// Act
const result = await createTransaction(txData);
// Assert: Transaction should be created
expect(result).toBeDefined();
expect(result.balance).toBeLessThan(initialBalance);
const transactions = await Transaction.find({ user: userId });
expect(transactions).toHaveLength(1);
expect(transactions[0].rawAmount).toBe(-100);
});
test('createTransaction should save when balance.enabled is true even if transactions config is missing', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
rawAmount: -100,
tokenType: 'prompt',
balance: { enabled: true },
// No transactions config provided
};
// Act
const result = await createTransaction(txData);
// Assert: Transaction should be created (backward compatibility)
expect(result).toBeDefined();
expect(result.balance).toBeLessThan(initialBalance);
const transactions = await Transaction.find({ user: userId });
expect(transactions).toHaveLength(1);
});
test('createTransaction should save transaction but not update balance when balance is disabled but transactions enabled', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-3.5-turbo';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'test',
endpointTokenConfig: null,
rawAmount: -100,
tokenType: 'prompt',
transactions: { enabled: true },
balance: { enabled: false },
};
// Act
const result = await createTransaction(txData);
// Assert: Transaction should be created but balance unchanged
expect(result).toBeUndefined();
const transactions = await Transaction.find({ user: userId });
expect(transactions).toHaveLength(1);
expect(transactions[0].rawAmount).toBe(-100);
const balance = await Balance.findOne({ user: userId });
expect(balance.tokenCredits).toBe(initialBalance);
});
test('createStructuredTransaction should not save when transactions.enabled is false', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'message',
tokenType: 'prompt',
inputTokens: -10,
writeTokens: -100,
readTokens: -5,
transactions: { enabled: false },
};
// Act
const result = await createStructuredTransaction(txData);
// Assert: No transaction should be created
expect(result).toBeUndefined();
const transactions = await Transaction.find({ user: userId });
expect(transactions).toHaveLength(0);
const balance = await Balance.findOne({ user: userId });
expect(balance.tokenCredits).toBe(initialBalance);
});
test('createStructuredTransaction should save transaction but not update balance when balance is disabled but transactions enabled', async () => {
// Arrange
const userId = new mongoose.Types.ObjectId();
const initialBalance = 10000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-3-5-sonnet';
const txData = {
user: userId,
conversationId: 'test-conversation-id',
model,
context: 'message',
tokenType: 'prompt',
inputTokens: -10,
writeTokens: -100,
readTokens: -5,
transactions: { enabled: true },
balance: { enabled: false },
};
// Act
const result = await createStructuredTransaction(txData);
// Assert: Transaction should be created but balance unchanged
expect(result).toBeUndefined();
const transactions = await Transaction.find({ user: userId });
expect(transactions).toHaveLength(1);
expect(transactions[0].inputTokens).toBe(-10);
expect(transactions[0].writeTokens).toBe(-100);
expect(transactions[0].readTokens).toBe(-5);
const balance = await Balance.findOne({ user: userId });
expect(balance.tokenCredits).toBe(initialBalance);
});
});
describe('calculateTokenValue Edge Cases', () => {
test('should derive multiplier from model when valueKey is not provided', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'gpt-4';
const promptTokens = 1000;
const result = await createTransaction({
user: userId,
conversationId: 'test-no-valuekey',
model,
tokenType: 'prompt',
rawAmount: -promptTokens,
context: 'test',
balance: { enabled: true },
});
const expectedRate = getMultiplier({ model, tokenType: 'prompt' });
expect(result.rate).toBe(expectedRate);
const tx = await Transaction.findOne({ user: userId });
expect(tx.tokenValue).toBe(-promptTokens * expectedRate);
expect(tx.rate).toBe(expectedRate);
});
test('should derive valueKey and apply correct rate for an unknown model with tokenType', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
await createTransaction({
user: userId,
conversationId: 'test-unknown-model',
model: 'some-unrecognized-model-xyz',
tokenType: 'prompt',
rawAmount: -500,
context: 'test',
balance: { enabled: true },
});
const tx = await Transaction.findOne({ user: userId });
expect(tx.rate).toBeDefined();
expect(tx.rate).toBeGreaterThan(0);
expect(tx.tokenValue).toBe(tx.rawAmount * tx.rate);
});
test('should correctly apply model-derived multiplier without valueKey for completion', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-6';
const completionTokens = 500;
const result = await createTransaction({
user: userId,
conversationId: 'test-completion-no-valuekey',
model,
tokenType: 'completion',
rawAmount: -completionTokens,
context: 'test',
balance: { enabled: true },
});
const expectedRate = getMultiplier({ model, tokenType: 'completion' });
expect(expectedRate).toBe(tokenValues[model].completion);
expect(result.rate).toBe(expectedRate);
const updatedBalance = await Balance.findOne({ user: userId });
expect(updatedBalance.tokenCredits).toBeCloseTo(
initialBalance - completionTokens * expectedRate,
0,
);
});
});
describe('Premium Token Pricing Integration Tests', () => {
test('spendTokens should apply standard pricing when prompt tokens are below premium threshold', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-6';
const promptTokens = 100000;
const completionTokens = 500;
const txData = {
user: userId,
conversationId: 'test-premium-below',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
await spendTokens(txData, { promptTokens, completionTokens });
const standardPromptRate = tokenValues[model].prompt;
const standardCompletionRate = tokenValues[model].completion;
const expectedCost =
promptTokens * standardPromptRate + completionTokens * standardCompletionRate;
const updatedBalance = await Balance.findOne({ user: userId });
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedCost, 0);
});
test('spendTokens should apply premium pricing when prompt tokens exceed premium threshold', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-6';
const promptTokens = 250000;
const completionTokens = 500;
const txData = {
user: userId,
conversationId: 'test-premium-above',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
await spendTokens(txData, { promptTokens, completionTokens });
const premiumPromptRate = premiumTokenValues[model].prompt;
const premiumCompletionRate = premiumTokenValues[model].completion;
const expectedCost =
promptTokens * premiumPromptRate + completionTokens * premiumCompletionRate;
const updatedBalance = await Balance.findOne({ user: userId });
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedCost, 0);
});
test('spendTokens should apply standard pricing at exactly the premium threshold', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-6';
const promptTokens = premiumTokenValues[model].threshold;
const completionTokens = 500;
const txData = {
user: userId,
conversationId: 'test-premium-exact',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
await spendTokens(txData, { promptTokens, completionTokens });
const standardPromptRate = tokenValues[model].prompt;
const standardCompletionRate = tokenValues[model].completion;
const expectedCost =
promptTokens * standardPromptRate + completionTokens * standardCompletionRate;
const updatedBalance = await Balance.findOne({ user: userId });
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedCost, 0);
});
test('spendStructuredTokens should apply premium pricing when total input tokens exceed threshold', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-6';
const txData = {
user: userId,
conversationId: 'test-structured-premium',
model,
context: 'message',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: {
input: 200000,
write: 10000,
read: 5000,
},
completionTokens: 1000,
};
const totalInput =
tokenUsage.promptTokens.input + tokenUsage.promptTokens.write + tokenUsage.promptTokens.read;
await spendStructuredTokens(txData, tokenUsage);
const premiumPromptRate = premiumTokenValues[model].prompt;
const premiumCompletionRate = premiumTokenValues[model].completion;
const writeMultiplier = getCacheMultiplier({ model, cacheType: 'write' });
const readMultiplier = getCacheMultiplier({ model, cacheType: 'read' });
const expectedPromptCost =
tokenUsage.promptTokens.input * premiumPromptRate +
tokenUsage.promptTokens.write * writeMultiplier +
tokenUsage.promptTokens.read * readMultiplier;
const expectedCompletionCost = tokenUsage.completionTokens * premiumCompletionRate;
const expectedTotalCost = expectedPromptCost + expectedCompletionCost;
const updatedBalance = await Balance.findOne({ user: userId });
expect(totalInput).toBeGreaterThan(premiumTokenValues[model].threshold);
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedTotalCost, 0);
});
test('spendStructuredTokens should apply standard pricing when total input tokens are below threshold', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-6';
const txData = {
user: userId,
conversationId: 'test-structured-standard',
model,
context: 'message',
endpointTokenConfig: null,
balance: { enabled: true },
};
const tokenUsage = {
promptTokens: {
input: 50000,
write: 10000,
read: 5000,
},
completionTokens: 1000,
};
const totalInput =
tokenUsage.promptTokens.input + tokenUsage.promptTokens.write + tokenUsage.promptTokens.read;
await spendStructuredTokens(txData, tokenUsage);
const standardPromptRate = tokenValues[model].prompt;
const standardCompletionRate = tokenValues[model].completion;
const writeMultiplier = getCacheMultiplier({ model, cacheType: 'write' });
const readMultiplier = getCacheMultiplier({ model, cacheType: 'read' });
const expectedPromptCost =
tokenUsage.promptTokens.input * standardPromptRate +
tokenUsage.promptTokens.write * writeMultiplier +
tokenUsage.promptTokens.read * readMultiplier;
const expectedCompletionCost = tokenUsage.completionTokens * standardCompletionRate;
const expectedTotalCost = expectedPromptCost + expectedCompletionCost;
const updatedBalance = await Balance.findOne({ user: userId });
expect(totalInput).toBeLessThanOrEqual(premiumTokenValues[model].threshold);
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedTotalCost, 0);
});
test('non-premium models should not be affected by inputTokenCount regardless of prompt size', async () => {
const userId = new mongoose.Types.ObjectId();
const initialBalance = 100000000;
await Balance.create({ user: userId, tokenCredits: initialBalance });
const model = 'claude-opus-4-5';
const promptTokens = 300000;
const completionTokens = 500;
const txData = {
user: userId,
conversationId: 'test-no-premium',
model,
context: 'test',
endpointTokenConfig: null,
balance: { enabled: true },
};
await spendTokens(txData, { promptTokens, completionTokens });
const standardPromptRate = getMultiplier({ model, tokenType: 'prompt' });
const standardCompletionRate = getMultiplier({ model, tokenType: 'completion' });
const expectedCost =
promptTokens * standardPromptRate + completionTokens * standardCompletionRate;
const updatedBalance = await Balance.findOne({ user: userId });
expect(updatedBalance.tokenCredits).toBeCloseTo(initialBalance - expectedCost, 0);
});
});


@@ -1,156 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { ViolationTypes } = require('librechat-data-provider');
const { createAutoRefillTransaction } = require('./Transaction');
const { logViolation } = require('~/cache');
const { getMultiplier } = require('./tx');
const { Balance } = require('~/db/models');
function isInvalidDate(date) {
return isNaN(date);
}
/**
* Simple check method that calculates token cost and returns balance info.
* The auto-refill logic has been moved to balanceMethods.js to prevent circular dependencies.
*/
const checkBalanceRecord = async function ({
user,
model,
endpoint,
valueKey,
tokenType,
amount,
endpointTokenConfig,
}) {
const multiplier = getMultiplier({ valueKey, tokenType, model, endpoint, endpointTokenConfig });
const tokenCost = amount * multiplier;
// Retrieve the balance record
let record = await Balance.findOne({ user }).lean();
if (!record) {
logger.debug('[Balance.check] No balance record found for user', { user });
return {
canSpend: false,
balance: 0,
tokenCost,
};
}
let balance = record.tokenCredits;
logger.debug('[Balance.check] Initial state', {
user,
model,
endpoint,
valueKey,
tokenType,
amount,
balance,
multiplier,
endpointTokenConfig: !!endpointTokenConfig,
});
// Only perform auto-refill if spending would bring the balance to 0 or below
if (balance - tokenCost <= 0 && record.autoRefillEnabled && record.refillAmount > 0) {
const lastRefillDate = new Date(record.lastRefill);
const now = new Date();
if (
isInvalidDate(lastRefillDate) ||
now >=
addIntervalToDate(lastRefillDate, record.refillIntervalValue, record.refillIntervalUnit)
) {
try {
/** @type {{ rate: number, user: string, balance: number, transaction: import('@librechat/data-schemas').ITransaction}} */
const result = await createAutoRefillTransaction({
user: user,
tokenType: 'credits',
context: 'autoRefill',
rawAmount: record.refillAmount,
});
balance = result.balance;
} catch (error) {
logger.error('[Balance.check] Failed to record transaction for auto-refill', error);
}
}
}
logger.debug('[Balance.check] Token cost', { tokenCost });
return { canSpend: balance >= tokenCost, balance, tokenCost };
};
/**
* Adds a time interval to a given date.
* @param {Date} date - The starting date.
* @param {number} value - The numeric value of the interval.
* @param {'seconds'|'minutes'|'hours'|'days'|'weeks'|'months'} unit - The unit of time.
* @returns {Date} A new Date representing the starting date plus the interval.
*/
const addIntervalToDate = (date, value, unit) => {
const result = new Date(date);
switch (unit) {
case 'seconds':
result.setSeconds(result.getSeconds() + value);
break;
case 'minutes':
result.setMinutes(result.getMinutes() + value);
break;
case 'hours':
result.setHours(result.getHours() + value);
break;
case 'days':
result.setDate(result.getDate() + value);
break;
case 'weeks':
result.setDate(result.getDate() + value * 7);
break;
case 'months':
result.setMonth(result.getMonth() + value);
break;
default:
break;
}
return result;
};
/**
* Checks the balance for a user and determines if they can spend a certain amount.
* If the user cannot spend the amount, it logs a violation and denies the request.
*
* @async
* @function
* @param {Object} params - The function parameters.
* @param {ServerRequest} params.req - The Express request object.
* @param {Express.Response} params.res - The Express response object.
* @param {Object} params.txData - The transaction data.
* @param {string} params.txData.user - The user ID or identifier.
* @param {('prompt' | 'completion')} params.txData.tokenType - The type of token.
* @param {number} params.txData.amount - The amount of tokens.
* @param {string} params.txData.model - The model name or identifier.
* @param {Object} [params.txData.endpointTokenConfig] - The token configuration for the endpoint.
* @returns {Promise<boolean>} Resolves to true if the user can spend the amount.
* @throws {Error} Throws an error if there's an issue with the balance check.
*/
const checkBalance = async ({ req, res, txData }) => {
const { canSpend, balance, tokenCost } = await checkBalanceRecord(txData);
if (canSpend) {
return true;
}
const type = ViolationTypes.TOKEN_BALANCE;
const errorMessage = {
type,
balance,
tokenCost,
promptTokens: txData.amount,
};
if (txData.generations && txData.generations.length > 0) {
errorMessage.generations = txData.generations;
}
await logViolation(req, res, type, errorMessage, 0);
throw new Error(JSON.stringify(errorMessage));
};
module.exports = {
checkBalance,
};
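
A minimal usage sketch, assuming the txData shape documented above; the middleware name, require path, and token estimate are hypothetical and not part of the diff.

const { checkBalance } = require('./checkBalance'); // path assumed for illustration
async function requireBalance(req, res, next) {
  try {
    // Throws (after logging a TOKEN_BALANCE violation) when the user cannot spend.
    await checkBalance({
      req,
      res,
      txData: {
        user: req.user.id,
        tokenType: 'prompt',
        amount: 1000, // hypothetical estimated prompt tokens for this request
        model: req.body?.model,
      },
    });
    next();
  } catch (err) {
    res.status(403).json({ error: err.message });
  }
}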


@@ -1,275 +0,0 @@
const mongoose = require('mongoose');
const { buildTree } = require('librechat-data-provider');
const { MongoMemoryServer } = require('mongodb-memory-server');
const { getMessages, bulkSaveMessages } = require('./Message');
const { Message } = require('~/db/models');
let mongod;
beforeAll(async () => {
mongod = await MongoMemoryServer.create();
const uri = mongod.getUri();
await mongoose.connect(uri);
});
afterAll(async () => {
await mongoose.disconnect();
await mongod.stop();
});
beforeEach(async () => {
await Message.deleteMany({});
});
describe('Conversation Structure Tests', () => {
test('Conversation folding/corrupting with inconsistent timestamps', async () => {
const userId = 'testUser';
const conversationId = 'testConversation';
// Create messages with inconsistent timestamps
const messages = [
{
messageId: 'message0',
parentMessageId: null,
text: 'Message 0',
createdAt: new Date('2023-01-01T00:00:00Z'),
},
{
messageId: 'message1',
parentMessageId: 'message0',
text: 'Message 1',
createdAt: new Date('2023-01-01T00:02:00Z'),
},
{
messageId: 'message2',
parentMessageId: 'message1',
text: 'Message 2',
createdAt: new Date('2023-01-01T00:01:00Z'),
}, // Note: Earlier than its parent
{
messageId: 'message3',
parentMessageId: 'message1',
text: 'Message 3',
createdAt: new Date('2023-01-01T00:03:00Z'),
},
{
messageId: 'message4',
parentMessageId: 'message2',
text: 'Message 4',
createdAt: new Date('2023-01-01T00:04:00Z'),
},
];
// Add common properties to all messages
messages.forEach((msg) => {
msg.conversationId = conversationId;
msg.user = userId;
msg.isCreatedByUser = false;
msg.error = false;
msg.unfinished = false;
});
// Save messages with overrideTimestamp set to true (preserve the original, inconsistent timestamps)
await bulkSaveMessages(messages, true);
// Retrieve messages (this will sort by createdAt)
const retrievedMessages = await getMessages({ conversationId, user: userId });
// Build tree
const tree = buildTree({ messages: retrievedMessages });
// Check if the tree is incorrect (folded/corrupted)
expect(tree.length).toBeGreaterThan(1); // Should have multiple root messages, indicating corruption
});
test('Fix: Conversation structure maintained with more than 16 messages', async () => {
const userId = 'testUser';
const conversationId = 'testConversation';
// Create more than 16 messages
const messages = Array.from({ length: 20 }, (_, i) => ({
messageId: `message${i}`,
parentMessageId: i === 0 ? null : `message${i - 1}`,
conversationId,
user: userId,
text: `Message ${i}`,
createdAt: new Date(Date.now() + (i % 2 === 0 ? i * 500000 : -i * 500000)),
}));
// Save messages without overrideTimestamp, so new timestamps are generated (the timestamps on the message objects are ignored)
await bulkSaveMessages(messages);
// Retrieve messages (this will sort by createdAt, but it shouldn't matter now)
const retrievedMessages = await getMessages({ conversationId, user: userId });
// Build tree
const tree = buildTree({ messages: retrievedMessages });
// Check if the tree is correct
expect(tree.length).toBe(1); // Should have only one root message
let currentNode = tree[0];
for (let i = 1; i < 20; i++) {
expect(currentNode.children.length).toBe(1);
currentNode = currentNode.children[0];
expect(currentNode.text).toBe(`Message ${i}`);
}
expect(currentNode.children.length).toBe(0); // Last message should have no children
});
test('Simulate MongoDB ordering issue with more than 16 messages and close timestamps', async () => {
const userId = 'testUser';
const conversationId = 'testConversation';
// Create more than 16 messages with very close timestamps
const messages = Array.from({ length: 20 }, (_, i) => ({
messageId: `message${i}`,
parentMessageId: i === 0 ? null : `message${i - 1}`,
conversationId,
user: userId,
text: `Message ${i}`,
createdAt: new Date(Date.now() + (i % 2 === 0 ? i * 1 : -i * 1)),
}));
// Add common properties to all messages
messages.forEach((msg) => {
msg.isCreatedByUser = false;
msg.error = false;
msg.unfinished = false;
});
await bulkSaveMessages(messages, true);
const retrievedMessages = await getMessages({ conversationId, user: userId });
const tree = buildTree({ messages: retrievedMessages });
expect(tree.length).toBeGreaterThan(1);
});
test('Fix: Preserve order with more than 16 messages by maintaining original timestamps', async () => {
const userId = 'testUser';
const conversationId = 'testConversation';
// Create more than 16 messages with distinct timestamps
const messages = Array.from({ length: 20 }, (_, i) => ({
messageId: `message${i}`,
parentMessageId: i === 0 ? null : `message${i - 1}`,
conversationId,
user: userId,
text: `Message ${i}`,
createdAt: new Date(Date.now() + i * 1000), // Ensure each message has a distinct timestamp
}));
// Add common properties to all messages
messages.forEach((msg) => {
msg.isCreatedByUser = false;
msg.error = false;
msg.unfinished = false;
});
// Save messages with overriding timestamps (preserve original timestamps)
await bulkSaveMessages(messages, true);
// Retrieve messages (this will sort by createdAt)
const retrievedMessages = await getMessages({ conversationId, user: userId });
// Build tree
const tree = buildTree({ messages: retrievedMessages });
// Check if the tree is correct
expect(tree.length).toBe(1); // Should have only one root message
let currentNode = tree[0];
for (let i = 1; i < 20; i++) {
expect(currentNode.children.length).toBe(1);
currentNode = currentNode.children[0];
expect(currentNode.text).toBe(`Message ${i}`);
}
expect(currentNode.children.length).toBe(0); // Last message should have no children
});
test('Random order dates between parent and children messages', async () => {
const userId = 'testUser';
const conversationId = 'testConversation';
// Create messages with deliberately out-of-order timestamps but sequential creation
const messages = [
{
messageId: 'parent',
parentMessageId: null,
text: 'Parent Message',
createdAt: new Date('2023-01-01T00:00:00Z'), // Make parent earliest
},
{
messageId: 'child1',
parentMessageId: 'parent',
text: 'Child Message 1',
createdAt: new Date('2023-01-01T00:01:00Z'),
},
{
messageId: 'child2',
parentMessageId: 'parent',
text: 'Child Message 2',
createdAt: new Date('2023-01-01T00:02:00Z'),
},
{
messageId: 'grandchild1',
parentMessageId: 'child1',
text: 'Grandchild Message 1',
createdAt: new Date('2023-01-01T00:03:00Z'),
},
];
// Add common properties to all messages
messages.forEach((msg) => {
msg.conversationId = conversationId;
msg.user = userId;
msg.isCreatedByUser = false;
msg.error = false;
msg.unfinished = false;
});
// Save messages with overrideTimestamp set to true
await bulkSaveMessages(messages, true);
// Retrieve messages
const retrievedMessages = await getMessages({ conversationId, user: userId });
// Debug log to see what's being returned
console.log(
'Retrieved Messages:',
retrievedMessages.map((msg) => ({
messageId: msg.messageId,
parentMessageId: msg.parentMessageId,
createdAt: msg.createdAt,
})),
);
// Build tree
const tree = buildTree({ messages: retrievedMessages });
// Debug log to see the tree structure
console.log(
'Tree structure:',
tree.map((root) => ({
messageId: root.messageId,
children: root.children.map((child) => ({
messageId: child.messageId,
children: child.children.map((grandchild) => ({
messageId: grandchild.messageId,
})),
})),
})),
);
// Verify the structure before making assertions
expect(retrievedMessages.length).toBe(4); // Should have all 4 messages
// Check if messages are properly linked
const parentMsg = retrievedMessages.find((msg) => msg.messageId === 'parent');
expect(parentMsg.parentMessageId).toBeNull(); // Parent should have null parentMessageId
const childMsg1 = retrievedMessages.find((msg) => msg.messageId === 'child1');
expect(childMsg1.parentMessageId).toBe('parent');
// Then check tree structure
expect(tree.length).toBe(1); // Should have only one root message
expect(tree[0].messageId).toBe('parent');
expect(tree[0].children.length).toBe(2); // Should have two children
});
});


@@ -1,19 +1,13 @@
const mongoose = require('mongoose');
const { createMethods } = require('@librechat/data-schemas');
const methods = createMethods(mongoose);
const { comparePassword } = require('./userMethods');
const {
getMessage,
getMessages,
saveMessage,
recordMessage,
updateMessage,
deleteMessagesSince,
deleteMessages,
} = require('./Message');
const { getConvoTitle, getConvo, saveConvo, deleteConvos } = require('./Conversation');
const { getPreset, getPresets, savePreset, deletePresets } = require('./Preset');
const { File } = require('~/db/models');
const { matchModelName, findMatchingPattern } = require('@librechat/api');
const getLogStores = require('~/cache/getLogStores');
const methods = createMethods(mongoose, {
matchModelName,
findMatchingPattern,
getCache: getLogStores,
});
const seedDatabase = async () => {
await methods.initializeRoles();
@@ -24,25 +18,4 @@ const seedDatabase = async () => {
module.exports = {
...methods,
seedDatabase,
comparePassword,
getMessage,
getMessages,
saveMessage,
recordMessage,
updateMessage,
deleteMessagesSince,
deleteMessages,
getConvoTitle,
getConvo,
saveConvo,
deleteConvos,
getPreset,
getPresets,
savePreset,
deletePresets,
Files: File,
};


@@ -1,24 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { updateInterfacePermissions: updateInterfacePerms } = require('@librechat/api');
const { getRoleByName, updateAccessPermissions } = require('./Role');
/**
* Update interface permissions based on app configuration.
* Must be done independently from loading the app config.
* @param {AppConfig} appConfig
*/
async function updateInterfacePermissions(appConfig) {
try {
await updateInterfacePerms({
appConfig,
getRoleByName,
updateAccessPermissions,
});
} catch (error) {
logger.error('Error updating interface permissions:', error);
}
}
module.exports = {
updateInterfacePermissions,
};


@@ -1,68 +0,0 @@
const mongoose = require('mongoose');
const { logger, hashToken, getRandomValues } = require('@librechat/data-schemas');
const { createToken, findToken } = require('~/models');
/**
* @module inviteUser
* @description This module provides functions to create and get user invites
*/
/**
* @function createInvite
* @description This function creates a new user invite
* @param {string} email - The email of the user to invite
* @returns {Promise<Object>} A promise that resolves to the saved invite document
* @throws {Error} If there is an error creating the invite
*/
const createInvite = async (email) => {
try {
const token = await getRandomValues(32);
const hash = await hashToken(token);
const encodedToken = encodeURIComponent(token);
const fakeUserId = new mongoose.Types.ObjectId();
await createToken({
userId: fakeUserId,
email,
token: hash,
createdAt: Date.now(),
expiresIn: 604800,
});
return encodedToken;
} catch (error) {
logger.error('[createInvite] Error creating invite', error);
return { message: 'Error creating invite' };
}
};
/**
* @function getInvite
* @description This function retrieves a user invite
* @param {string} encodedToken - The token of the invite to retrieve
* @param {string} email - The email of the user to validate
* @returns {Promise<Object>} A promise that resolves to the retrieved invite document
* @throws {Error} If there is an error retrieving the invite, if the invite does not exist, or if the email does not match
*/
const getInvite = async (encodedToken, email) => {
try {
const token = decodeURIComponent(encodedToken);
const hash = await hashToken(token);
const invite = await findToken({ token: hash, email });
if (!invite) {
throw new Error('Invite not found or email does not match');
}
return invite;
} catch (error) {
logger.error('[getInvite] Error getting invite:', error);
return { error: true, message: error.message };
}
};
module.exports = {
createInvite,
getInvite,
};
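
A hedged round-trip sketch of the invite flow; the require path and email address are illustrative only.

const { createInvite, getInvite } = require('./inviteUser'); // path assumed
async function inviteFlowExample() {
  // createInvite resolves to the URL-encoded raw token on success,
  // or to an object with a message on failure.
  const encodedToken = await createInvite('new.user@example.com');
  if (typeof encodedToken !== 'string') {
    throw new Error(encodedToken.message);
  }
  // The token is emailed to the invitee; at signup it is validated:
  const invite = await getInvite(encodedToken, 'new.user@example.com');
  if (invite.error) {
    throw new Error(invite.message);
  }
  return invite;
}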


@@ -1,218 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { getCustomEndpointConfig } = require('@librechat/api');
const {
Tools,
Constants,
isAgentsEndpoint,
isEphemeralAgentId,
appendAgentIdSuffix,
encodeEphemeralAgentId,
} = require('librechat-data-provider');
const { getMCPServerTools } = require('~/server/services/Config');
const { mcp_all, mcp_delimiter } = Constants;
/**
* Constant for added conversation agent ID
*/
const ADDED_AGENT_ID = 'added_agent';
/**
* Get an agent document based on the provided ID.
* @param {Object} searchParameter - The search parameters to find the agent.
* @param {string} searchParameter.id - The ID of the agent.
* @returns {Promise<import('librechat-data-provider').Agent|null>}
*/
let getAgent;
/**
* Set the getAgent function (dependency injection to avoid circular imports)
* @param {Function} fn
*/
const setGetAgent = (fn) => {
getAgent = fn;
};
/**
* Load an agent from an added conversation (TConversation).
* Used for multi-convo parallel agent execution.
*
* @param {Object} params
* @param {import('express').Request} params.req
* @param {import('librechat-data-provider').TConversation} params.conversation - The added conversation
* @param {import('librechat-data-provider').Agent} [params.primaryAgent] - The primary agent (used to duplicate tools when both are ephemeral)
* @returns {Promise<import('librechat-data-provider').Agent|null>} The agent config as a plain object, or null if invalid.
*/
const loadAddedAgent = async ({ req, conversation, primaryAgent }) => {
if (!conversation) {
return null;
}
// If there's an agent_id, load the existing agent
if (conversation.agent_id && !isEphemeralAgentId(conversation.agent_id)) {
if (!getAgent) {
throw new Error('getAgent not initialized - call setGetAgent first');
}
const agent = await getAgent({
id: conversation.agent_id,
});
if (!agent) {
logger.warn(`[loadAddedAgent] Agent ${conversation.agent_id} not found`);
return null;
}
agent.version = agent.versions ? agent.versions.length : 0;
// Append suffix to distinguish from primary agent (matches ephemeral format)
// This is needed when both agents have the same ID or for consistent parallel content attribution
agent.id = appendAgentIdSuffix(agent.id, 1);
return agent;
}
// Otherwise, create an ephemeral agent config from the conversation
const { model, endpoint, promptPrefix, spec, ...rest } = conversation;
if (!endpoint || !model) {
logger.warn('[loadAddedAgent] Missing required endpoint or model for ephemeral agent');
return null;
}
// If both primary and added agents are ephemeral, duplicate tools from primary agent
const primaryIsEphemeral = primaryAgent && isEphemeralAgentId(primaryAgent.id);
if (primaryIsEphemeral && Array.isArray(primaryAgent.tools)) {
// Get endpoint config and model spec for display name fallbacks
const appConfig = req.config;
let endpointConfig = appConfig?.endpoints?.[endpoint];
if (!isAgentsEndpoint(endpoint) && !endpointConfig) {
try {
endpointConfig = getCustomEndpointConfig({ endpoint, appConfig });
} catch (err) {
logger.error('[loadAddedAgent] Error getting custom endpoint config', err);
}
}
// Look up model spec for label fallback
const modelSpecs = appConfig?.modelSpecs?.list;
const modelSpec = spec != null && spec !== '' ? modelSpecs?.find((s) => s.name === spec) : null;
// For ephemeral agents, use modelLabel if provided, then model spec's label,
// then modelDisplayLabel from endpoint config, otherwise empty string to show model name
const sender = rest.modelLabel ?? modelSpec?.label ?? endpointConfig?.modelDisplayLabel ?? '';
const ephemeralId = encodeEphemeralAgentId({ endpoint, model, sender, index: 1 });
return {
id: ephemeralId,
instructions: promptPrefix || '',
provider: endpoint,
model_parameters: {},
model,
tools: [...primaryAgent.tools],
};
}
// Extract ephemeral agent options from conversation if present
const ephemeralAgent = rest.ephemeralAgent;
const mcpServers = new Set(ephemeralAgent?.mcp);
const userId = req.user?.id;
// Check model spec for MCP servers
const modelSpecs = req.config?.modelSpecs?.list;
let modelSpec = null;
if (spec != null && spec !== '') {
modelSpec = modelSpecs?.find((s) => s.name === spec) || null;
}
if (modelSpec?.mcpServers) {
for (const mcpServer of modelSpec.mcpServers) {
mcpServers.add(mcpServer);
}
}
/** @type {string[]} */
const tools = [];
if (ephemeralAgent?.execute_code === true || modelSpec?.executeCode === true) {
tools.push(Tools.execute_code);
}
if (ephemeralAgent?.file_search === true || modelSpec?.fileSearch === true) {
tools.push(Tools.file_search);
}
if (ephemeralAgent?.web_search === true || modelSpec?.webSearch === true) {
tools.push(Tools.web_search);
}
const addedServers = new Set();
if (mcpServers.size > 0) {
for (const mcpServer of mcpServers) {
if (addedServers.has(mcpServer)) {
continue;
}
const serverTools = await getMCPServerTools(userId, mcpServer);
if (!serverTools) {
tools.push(`${mcp_all}${mcp_delimiter}${mcpServer}`);
addedServers.add(mcpServer);
continue;
}
tools.push(...Object.keys(serverTools));
addedServers.add(mcpServer);
}
}
// Build model_parameters from conversation fields
const model_parameters = {};
const paramKeys = [
'temperature',
'top_p',
'topP',
'topK',
'presence_penalty',
'frequency_penalty',
'maxOutputTokens',
'maxTokens',
'max_tokens',
];
for (const key of paramKeys) {
if (rest[key] != null) {
model_parameters[key] = rest[key];
}
}
// Get endpoint config for modelDisplayLabel fallback
const appConfig = req.config;
let endpointConfig = appConfig?.endpoints?.[endpoint];
if (!isAgentsEndpoint(endpoint) && !endpointConfig) {
try {
endpointConfig = getCustomEndpointConfig({ endpoint, appConfig });
} catch (err) {
logger.error('[loadAddedAgent] Error getting custom endpoint config', err);
}
}
// For ephemeral agents, use modelLabel if provided, then model spec's label,
// then modelDisplayLabel from endpoint config, otherwise empty string to show model name
const sender = rest.modelLabel ?? modelSpec?.label ?? endpointConfig?.modelDisplayLabel ?? '';
/** Encoded ephemeral agent ID with endpoint, model, sender, and index=1 to distinguish from primary */
const ephemeralId = encodeEphemeralAgentId({ endpoint, model, sender, index: 1 });
const result = {
id: ephemeralId,
instructions: promptPrefix || '',
provider: endpoint,
model_parameters,
model,
tools,
};
if (ephemeralAgent?.artifacts) {
result.artifacts = ephemeralAgent.artifacts;
}
return result;
};
module.exports = {
ADDED_AGENT_ID,
loadAddedAgent,
setGetAgent,
};
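
A wiring sketch for the dependency injection above, assuming getAgent is re-exported from the models index; both require paths are illustrative. setGetAgent must run before the first loadAddedAgent call, or loadAddedAgent throws for persisted agent IDs.

const { setGetAgent, loadAddedAgent } = require('./load'); // path assumed
const { getAgent } = require('~/models'); // assumed source of getAgent
setGetAgent(getAgent);
// Later, for each added conversation in a multi-convo request:
// const addedAgent = await loadAddedAgent({ req, conversation, primaryAgent });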


@@ -1,140 +0,0 @@
const { logger } = require('@librechat/data-schemas');
const { createTransaction, createStructuredTransaction } = require('./Transaction');
/**
* Creates up to two transactions to record the spending of tokens.
*
* @function
* @async
* @param {txData} txData - Transaction data.
* @param {Object} tokenUsage - The number of tokens used.
* @param {Number} tokenUsage.promptTokens - The number of prompt tokens used.
* @param {Number} tokenUsage.completionTokens - The number of completion tokens used.
* @returns {Promise<void>} - Returns nothing.
* @throws {Error} - Throws an error if there's an issue creating the transactions.
*/
const spendTokens = async (txData, tokenUsage) => {
const { promptTokens, completionTokens } = tokenUsage;
logger.debug(
`[spendTokens] conversationId: ${txData.conversationId}${
txData?.context ? ` | Context: ${txData?.context}` : ''
} | Token usage: `,
{
promptTokens,
completionTokens,
},
);
let prompt, completion;
const normalizedPromptTokens = Math.max(promptTokens ?? 0, 0);
try {
if (promptTokens !== undefined) {
prompt = await createTransaction({
...txData,
tokenType: 'prompt',
rawAmount: promptTokens === 0 ? 0 : -normalizedPromptTokens,
inputTokenCount: normalizedPromptTokens,
});
}
if (completionTokens !== undefined) {
completion = await createTransaction({
...txData,
tokenType: 'completion',
rawAmount: completionTokens === 0 ? 0 : -Math.max(completionTokens, 0),
inputTokenCount: normalizedPromptTokens,
});
}
if (prompt || completion) {
logger.debug('[spendTokens] Transaction data record against balance:', {
user: txData.user,
prompt: prompt?.prompt,
promptRate: prompt?.rate,
completion: completion?.completion,
completionRate: completion?.rate,
balance: completion?.balance ?? prompt?.balance,
});
} else {
logger.debug('[spendTokens] No transactions incurred against balance');
}
} catch (err) {
logger.error('[spendTokens]', err);
}
};
/**
* Creates transactions to record the spending of structured tokens.
*
* @function
* @async
* @param {txData} txData - Transaction data.
* @param {Object} tokenUsage - The number of tokens used.
* @param {Object} tokenUsage.promptTokens - The number of prompt tokens used.
* @param {Number} tokenUsage.promptTokens.input - The number of input tokens.
* @param {Number} tokenUsage.promptTokens.write - The number of write tokens.
* @param {Number} tokenUsage.promptTokens.read - The number of read tokens.
* @param {Number} tokenUsage.completionTokens - The number of completion tokens used.
* @returns {Promise<void>} - Returns nothing.
* @throws {Error} - Throws an error if there's an issue creating the transactions.
*/
const spendStructuredTokens = async (txData, tokenUsage) => {
const { promptTokens, completionTokens } = tokenUsage;
logger.debug(
`[spendStructuredTokens] conversationId: ${txData.conversationId}${
txData?.context ? ` | Context: ${txData?.context}` : ''
} | Token usage: `,
{
promptTokens,
completionTokens,
},
);
let prompt, completion;
try {
if (promptTokens) {
const input = Math.max(promptTokens.input ?? 0, 0);
const write = Math.max(promptTokens.write ?? 0, 0);
const read = Math.max(promptTokens.read ?? 0, 0);
const totalInputTokens = input + write + read;
prompt = await createStructuredTransaction({
...txData,
tokenType: 'prompt',
inputTokens: -input,
writeTokens: -write,
readTokens: -read,
inputTokenCount: totalInputTokens,
});
}
if (completionTokens) {
const totalInputTokens = promptTokens
? Math.max(promptTokens.input ?? 0, 0) +
Math.max(promptTokens.write ?? 0, 0) +
Math.max(promptTokens.read ?? 0, 0)
: undefined;
completion = await createTransaction({
...txData,
tokenType: 'completion',
rawAmount: -Math.max(completionTokens, 0),
inputTokenCount: totalInputTokens,
});
}
if (prompt || completion) {
logger.debug('[spendStructuredTokens] Transaction data record against balance:', {
user: txData.user,
prompt: prompt?.prompt,
promptRate: prompt?.rate,
completion: completion?.completion,
completionRate: completion?.rate,
balance: completion?.balance ?? prompt?.balance,
});
} else {
logger.debug('[spendStructuredTokens] No transactions incurred against balance');
}
} catch (err) {
logger.error('[spendStructuredTokens]', err);
}
return { prompt, completion };
};
module.exports = { spendTokens, spendStructuredTokens };
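
A hedged usage sketch of spendTokens with illustrative values; in practice the caller supplies the real conversation ID and balance config.

const { spendTokens } = require('./spendTokens'); // path assumed
async function recordUsageExample(userId) {
  // Creates up to two transactions (prompt and completion), each with a negative rawAmount.
  await spendTokens(
    {
      user: userId,
      conversationId: 'convo-123',
      model: 'gpt-4o',
      context: 'message',
      balance: { enabled: true },
    },
    { promptTokens: 1200, completionTokens: 350 },
  );
}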

File diff suppressed because it is too large.


@@ -1,503 +0,0 @@
const { matchModelName, findMatchingPattern } = require('@librechat/api');
const defaultRate = 6;
/**
* Token Pricing Configuration
*
* IMPORTANT: Key Ordering for Pattern Matching
* ============================================
* The `findMatchingPattern` function iterates through object keys in REVERSE order
* (last-defined keys are checked first) and uses `modelName.includes(key)` for matching.
*
* This means:
* 1. BASE PATTERNS must be defined FIRST (e.g., "kimi", "moonshot")
* 2. SPECIFIC PATTERNS must be defined AFTER their base patterns (e.g., "kimi-k2", "kimi-k2.5")
*
* Example ordering for Kimi models:
* kimi: { prompt: 0.6, completion: 2.5 }, // Base pattern - checked last
* 'kimi-k2': { prompt: 0.6, completion: 2.5 }, // More specific - checked before "kimi"
* 'kimi-k2.5': { prompt: 0.6, completion: 3.0 }, // Most specific - checked first
*
* Why this matters:
* - Model name "kimi-k2.5" contains both "kimi" and "kimi-k2" as substrings
* - If "kimi" were checked first, it would incorrectly match and return wrong pricing
* - By defining specific patterns AFTER base patterns, they're checked first in reverse iteration
*
* This applies to BOTH `tokenValues` and `cacheTokenValues` objects.
*
* When adding new model families:
* 1. Define the base/generic pattern first
* 2. Define increasingly specific patterns after
* 3. Ensure no pattern is a substring of another that should match differently
*/
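// Illustrative sketch (a local stand-in, not the real @librechat/api implementation):
// a minimal reverse-iteration matcher demonstrating the key-ordering rule above.
const demoFindMatchingPattern = (modelName, valueMap) => {
  const keys = Object.keys(valueMap);
  // Iterate in reverse so later-defined (more specific) keys win.
  for (let i = keys.length - 1; i >= 0; i--) {
    if (modelName.includes(keys[i])) {
      return keys[i];
    }
  }
  return undefined;
};
// demoFindMatchingPattern('kimi-k2.5', { kimi: 1, 'kimi-k2': 2, 'kimi-k2.5': 3 })
// returns 'kimi-k2.5' rather than 'kimi', because the most specific key is checked first.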
/**
* AWS Bedrock pricing
* source: https://aws.amazon.com/bedrock/pricing/
*/
const bedrockValues = {
// Basic llama2 patterns (base defaults to smallest variant)
llama2: { prompt: 0.75, completion: 1.0 },
'llama-2': { prompt: 0.75, completion: 1.0 },
'llama2-13b': { prompt: 0.75, completion: 1.0 },
'llama2:70b': { prompt: 1.95, completion: 2.56 },
'llama2-70b': { prompt: 1.95, completion: 2.56 },
// Basic llama3 patterns (base defaults to smallest variant)
llama3: { prompt: 0.3, completion: 0.6 },
'llama-3': { prompt: 0.3, completion: 0.6 },
'llama3-8b': { prompt: 0.3, completion: 0.6 },
'llama3:8b': { prompt: 0.3, completion: 0.6 },
'llama3-70b': { prompt: 2.65, completion: 3.5 },
'llama3:70b': { prompt: 2.65, completion: 3.5 },
// llama3-x-Nb pattern (base defaults to smallest variant)
'llama3-1': { prompt: 0.22, completion: 0.22 },
'llama3-1-8b': { prompt: 0.22, completion: 0.22 },
'llama3-1-70b': { prompt: 0.72, completion: 0.72 },
'llama3-1-405b': { prompt: 2.4, completion: 2.4 },
'llama3-2': { prompt: 0.1, completion: 0.1 },
'llama3-2-1b': { prompt: 0.1, completion: 0.1 },
'llama3-2-3b': { prompt: 0.15, completion: 0.15 },
'llama3-2-11b': { prompt: 0.16, completion: 0.16 },
'llama3-2-90b': { prompt: 0.72, completion: 0.72 },
'llama3-3': { prompt: 2.65, completion: 3.5 },
'llama3-3-70b': { prompt: 2.65, completion: 3.5 },
// llama3.x:Nb pattern (base defaults to smallest variant)
'llama3.1': { prompt: 0.22, completion: 0.22 },
'llama3.1:8b': { prompt: 0.22, completion: 0.22 },
'llama3.1:70b': { prompt: 0.72, completion: 0.72 },
'llama3.1:405b': { prompt: 2.4, completion: 2.4 },
'llama3.2': { prompt: 0.1, completion: 0.1 },
'llama3.2:1b': { prompt: 0.1, completion: 0.1 },
'llama3.2:3b': { prompt: 0.15, completion: 0.15 },
'llama3.2:11b': { prompt: 0.16, completion: 0.16 },
'llama3.2:90b': { prompt: 0.72, completion: 0.72 },
'llama3.3': { prompt: 2.65, completion: 3.5 },
'llama3.3:70b': { prompt: 2.65, completion: 3.5 },
// llama-3.x-Nb pattern (base defaults to smallest variant)
'llama-3.1': { prompt: 0.22, completion: 0.22 },
'llama-3.1-8b': { prompt: 0.22, completion: 0.22 },
'llama-3.1-70b': { prompt: 0.72, completion: 0.72 },
'llama-3.1-405b': { prompt: 2.4, completion: 2.4 },
'llama-3.2': { prompt: 0.1, completion: 0.1 },
'llama-3.2-1b': { prompt: 0.1, completion: 0.1 },
'llama-3.2-3b': { prompt: 0.15, completion: 0.15 },
'llama-3.2-11b': { prompt: 0.16, completion: 0.16 },
'llama-3.2-90b': { prompt: 0.72, completion: 0.72 },
'llama-3.3': { prompt: 2.65, completion: 3.5 },
'llama-3.3-70b': { prompt: 2.65, completion: 3.5 },
'mistral-7b': { prompt: 0.15, completion: 0.2 },
'mistral-small': { prompt: 0.15, completion: 0.2 },
'mixtral-8x7b': { prompt: 0.45, completion: 0.7 },
'mistral-large-2402': { prompt: 4.0, completion: 12.0 },
'mistral-large-2407': { prompt: 3.0, completion: 9.0 },
'command-text': { prompt: 1.5, completion: 2.0 },
'command-light': { prompt: 0.3, completion: 0.6 },
// AI21 models
'j2-mid': { prompt: 12.5, completion: 12.5 },
'j2-ultra': { prompt: 18.8, completion: 18.8 },
'jamba-instruct': { prompt: 0.5, completion: 0.7 },
// Amazon Titan models
'titan-text-lite': { prompt: 0.15, completion: 0.2 },
'titan-text-express': { prompt: 0.2, completion: 0.6 },
'titan-text-premier': { prompt: 0.5, completion: 1.5 },
// Amazon Nova models
'nova-micro': { prompt: 0.035, completion: 0.14 },
'nova-lite': { prompt: 0.06, completion: 0.24 },
'nova-pro': { prompt: 0.8, completion: 3.2 },
'nova-premier': { prompt: 2.5, completion: 12.5 },
'deepseek.r1': { prompt: 1.35, completion: 5.4 },
// Moonshot/Kimi models on Bedrock
'moonshot.kimi': { prompt: 0.6, completion: 2.5 },
'moonshot.kimi-k2': { prompt: 0.6, completion: 2.5 },
'moonshot.kimi-k2.5': { prompt: 0.6, completion: 3.0 },
'moonshot.kimi-k2-thinking': { prompt: 0.6, completion: 2.5 },
};
/**
* Mapping of model token sizes to their respective multipliers for prompt and completion.
* The rates are 1 USD per 1M tokens.
* @type {Object.<string, {prompt: number, completion: number}>}
*/
const tokenValues = Object.assign(
{
// Legacy token size mappings (generic patterns - check LAST)
'8k': { prompt: 30, completion: 60 },
'32k': { prompt: 60, completion: 120 },
'4k': { prompt: 1.5, completion: 2 },
'16k': { prompt: 3, completion: 4 },
// Generic fallback patterns (check LAST)
'claude-': { prompt: 0.8, completion: 2.4 },
deepseek: { prompt: 0.28, completion: 0.42 },
command: { prompt: 0.38, completion: 0.38 },
gemma: { prompt: 0.02, completion: 0.04 }, // Base pattern (using gemma-3n-e4b pricing)
gemini: { prompt: 0.5, completion: 1.5 },
'gpt-oss': { prompt: 0.05, completion: 0.2 },
// Specific model variants (check FIRST - more specific patterns at end)
'gpt-3.5-turbo-1106': { prompt: 1, completion: 2 },
'gpt-3.5-turbo-0125': { prompt: 0.5, completion: 1.5 },
'gpt-4-1106': { prompt: 10, completion: 30 },
'gpt-4.1': { prompt: 2, completion: 8 },
'gpt-4.1-nano': { prompt: 0.1, completion: 0.4 },
'gpt-4.1-mini': { prompt: 0.4, completion: 1.6 },
'gpt-4.5': { prompt: 75, completion: 150 },
'gpt-4o': { prompt: 2.5, completion: 10 },
'gpt-4o-2024-05-13': { prompt: 5, completion: 15 },
'gpt-4o-mini': { prompt: 0.15, completion: 0.6 },
'gpt-5': { prompt: 1.25, completion: 10 },
'gpt-5.1': { prompt: 1.25, completion: 10 },
'gpt-5.2': { prompt: 1.75, completion: 14 },
'gpt-5-nano': { prompt: 0.05, completion: 0.4 },
'gpt-5-mini': { prompt: 0.25, completion: 2 },
'gpt-5-pro': { prompt: 15, completion: 120 },
o1: { prompt: 15, completion: 60 },
'o1-mini': { prompt: 1.1, completion: 4.4 },
'o1-preview': { prompt: 15, completion: 60 },
o3: { prompt: 2, completion: 8 },
'o3-mini': { prompt: 1.1, completion: 4.4 },
'o4-mini': { prompt: 1.1, completion: 4.4 },
'claude-instant': { prompt: 0.8, completion: 2.4 },
'claude-2': { prompt: 8, completion: 24 },
'claude-2.1': { prompt: 8, completion: 24 },
'claude-3-haiku': { prompt: 0.25, completion: 1.25 },
'claude-3-sonnet': { prompt: 3, completion: 15 },
'claude-3-opus': { prompt: 15, completion: 75 },
'claude-3-5-haiku': { prompt: 0.8, completion: 4 },
'claude-3.5-haiku': { prompt: 0.8, completion: 4 },
'claude-3-5-sonnet': { prompt: 3, completion: 15 },
'claude-3.5-sonnet': { prompt: 3, completion: 15 },
'claude-3-7-sonnet': { prompt: 3, completion: 15 },
'claude-3.7-sonnet': { prompt: 3, completion: 15 },
'claude-haiku-4-5': { prompt: 1, completion: 5 },
'claude-opus-4': { prompt: 15, completion: 75 },
'claude-opus-4-5': { prompt: 5, completion: 25 },
'claude-opus-4-6': { prompt: 5, completion: 25 },
'claude-sonnet-4': { prompt: 3, completion: 15 },
'claude-sonnet-4-6': { prompt: 3, completion: 15 },
'command-r': { prompt: 0.5, completion: 1.5 },
'command-r-plus': { prompt: 3, completion: 15 },
'command-text': { prompt: 1.5, completion: 2.0 },
'deepseek-chat': { prompt: 0.28, completion: 0.42 },
'deepseek-reasoner': { prompt: 0.28, completion: 0.42 },
'deepseek-r1': { prompt: 0.4, completion: 2.0 },
'deepseek-v3': { prompt: 0.2, completion: 0.8 },
'gemma-2': { prompt: 0.01, completion: 0.03 }, // Base pattern (using gemma-2-9b pricing)
'gemma-3': { prompt: 0.02, completion: 0.04 }, // Base pattern (using gemma-3n-e4b pricing)
'gemma-3-27b': { prompt: 0.09, completion: 0.16 },
'gemini-1.5': { prompt: 2.5, completion: 10 },
'gemini-1.5-flash': { prompt: 0.15, completion: 0.6 },
'gemini-1.5-flash-8b': { prompt: 0.075, completion: 0.3 },
'gemini-2.0': { prompt: 0.1, completion: 0.4 }, // Base pattern (using 2.0-flash pricing)
'gemini-2.0-flash': { prompt: 0.1, completion: 0.4 },
'gemini-2.0-flash-lite': { prompt: 0.075, completion: 0.3 },
'gemini-2.5': { prompt: 0.3, completion: 2.5 }, // Base pattern (using 2.5-flash pricing)
'gemini-2.5-flash': { prompt: 0.3, completion: 2.5 },
'gemini-2.5-flash-lite': { prompt: 0.1, completion: 0.4 },
'gemini-2.5-pro': { prompt: 1.25, completion: 10 },
'gemini-2.5-flash-image': { prompt: 0.15, completion: 30 },
'gemini-3': { prompt: 2, completion: 12 },
'gemini-3-pro-image': { prompt: 2, completion: 120 },
'gemini-pro-vision': { prompt: 0.5, completion: 1.5 },
grok: { prompt: 2.0, completion: 10.0 }, // Base pattern defaults to grok-2
'grok-beta': { prompt: 5.0, completion: 15.0 },
'grok-vision-beta': { prompt: 5.0, completion: 15.0 },
'grok-2': { prompt: 2.0, completion: 10.0 },
'grok-2-1212': { prompt: 2.0, completion: 10.0 },
'grok-2-latest': { prompt: 2.0, completion: 10.0 },
'grok-2-vision': { prompt: 2.0, completion: 10.0 },
'grok-2-vision-1212': { prompt: 2.0, completion: 10.0 },
'grok-2-vision-latest': { prompt: 2.0, completion: 10.0 },
'grok-3': { prompt: 3.0, completion: 15.0 },
'grok-3-fast': { prompt: 5.0, completion: 25.0 },
'grok-3-mini': { prompt: 0.3, completion: 0.5 },
'grok-3-mini-fast': { prompt: 0.6, completion: 4 },
'grok-4': { prompt: 3.0, completion: 15.0 },
'grok-4-fast': { prompt: 0.2, completion: 0.5 },
'grok-4-1-fast': { prompt: 0.2, completion: 0.5 }, // covers reasoning & non-reasoning variants
'grok-code-fast': { prompt: 0.2, completion: 1.5 },
codestral: { prompt: 0.3, completion: 0.9 },
'ministral-3b': { prompt: 0.04, completion: 0.04 },
'ministral-8b': { prompt: 0.1, completion: 0.1 },
'mistral-nemo': { prompt: 0.15, completion: 0.15 },
'mistral-saba': { prompt: 0.2, completion: 0.6 },
'pixtral-large': { prompt: 2.0, completion: 6.0 },
'mistral-large': { prompt: 2.0, completion: 6.0 },
'mixtral-8x22b': { prompt: 0.65, completion: 0.65 },
// Moonshot/Kimi models (base patterns first, specific patterns last for correct matching)
kimi: { prompt: 0.6, completion: 2.5 }, // Base pattern
moonshot: { prompt: 2.0, completion: 5.0 }, // Base pattern (using 128k pricing)
'kimi-latest': { prompt: 0.2, completion: 2.0 }, // Uses 8k/32k/128k pricing dynamically
'kimi-k2': { prompt: 0.6, completion: 2.5 },
'kimi-k2.5': { prompt: 0.6, completion: 3.0 },
'kimi-k2-turbo': { prompt: 1.15, completion: 8.0 },
'kimi-k2-turbo-preview': { prompt: 1.15, completion: 8.0 },
'kimi-k2-0905': { prompt: 0.6, completion: 2.5 },
'kimi-k2-0905-preview': { prompt: 0.6, completion: 2.5 },
'kimi-k2-0711': { prompt: 0.6, completion: 2.5 },
'kimi-k2-0711-preview': { prompt: 0.6, completion: 2.5 },
'kimi-k2-thinking': { prompt: 0.6, completion: 2.5 },
'kimi-k2-thinking-turbo': { prompt: 1.15, completion: 8.0 },
'moonshot-v1': { prompt: 2.0, completion: 5.0 },
'moonshot-v1-auto': { prompt: 2.0, completion: 5.0 },
'moonshot-v1-8k': { prompt: 0.2, completion: 2.0 },
'moonshot-v1-8k-vision': { prompt: 0.2, completion: 2.0 },
'moonshot-v1-8k-vision-preview': { prompt: 0.2, completion: 2.0 },
'moonshot-v1-32k': { prompt: 1.0, completion: 3.0 },
'moonshot-v1-32k-vision': { prompt: 1.0, completion: 3.0 },
'moonshot-v1-32k-vision-preview': { prompt: 1.0, completion: 3.0 },
'moonshot-v1-128k': { prompt: 2.0, completion: 5.0 },
'moonshot-v1-128k-vision': { prompt: 2.0, completion: 5.0 },
'moonshot-v1-128k-vision-preview': { prompt: 2.0, completion: 5.0 },
// GPT-OSS models (specific sizes)
'gpt-oss:20b': { prompt: 0.05, completion: 0.2 },
'gpt-oss-20b': { prompt: 0.05, completion: 0.2 },
'gpt-oss:120b': { prompt: 0.15, completion: 0.6 },
'gpt-oss-120b': { prompt: 0.15, completion: 0.6 },
// GLM models (Zhipu AI) - general to specific
glm4: { prompt: 0.1, completion: 0.1 },
'glm-4': { prompt: 0.1, completion: 0.1 },
'glm-4-32b': { prompt: 0.1, completion: 0.1 },
'glm-4.5': { prompt: 0.35, completion: 1.55 },
'glm-4.5-air': { prompt: 0.14, completion: 0.86 },
'glm-4.5v': { prompt: 0.6, completion: 1.8 },
'glm-4.6': { prompt: 0.5, completion: 1.75 },
// Qwen models
qwen: { prompt: 0.08, completion: 0.33 }, // Qwen base pattern (using qwen2.5-72b pricing)
'qwen2.5': { prompt: 0.08, completion: 0.33 }, // Qwen 2.5 base pattern
'qwen-turbo': { prompt: 0.05, completion: 0.2 },
'qwen-plus': { prompt: 0.4, completion: 1.2 },
'qwen-max': { prompt: 1.6, completion: 6.4 },
'qwq-32b': { prompt: 0.15, completion: 0.4 },
// Qwen3 models
qwen3: { prompt: 0.035, completion: 0.138 }, // Qwen3 base pattern (using qwen3-4b pricing)
'qwen3-8b': { prompt: 0.035, completion: 0.138 },
'qwen3-14b': { prompt: 0.05, completion: 0.22 },
'qwen3-30b-a3b': { prompt: 0.06, completion: 0.22 },
'qwen3-32b': { prompt: 0.05, completion: 0.2 },
'qwen3-235b-a22b': { prompt: 0.08, completion: 0.55 },
// Qwen3 VL (Vision-Language) models
'qwen3-vl-8b-thinking': { prompt: 0.18, completion: 2.1 },
'qwen3-vl-8b-instruct': { prompt: 0.18, completion: 0.69 },
'qwen3-vl-30b-a3b': { prompt: 0.29, completion: 1.0 },
'qwen3-vl-235b-a22b': { prompt: 0.3, completion: 1.2 },
// Qwen3 specialized models
'qwen3-max': { prompt: 1.2, completion: 6 },
'qwen3-coder': { prompt: 0.22, completion: 0.95 },
'qwen3-coder-30b-a3b': { prompt: 0.06, completion: 0.25 },
'qwen3-coder-plus': { prompt: 1, completion: 5 },
'qwen3-coder-flash': { prompt: 0.3, completion: 1.5 },
'qwen3-next-80b-a3b': { prompt: 0.1, completion: 0.8 },
},
bedrockValues,
);
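/*
 * Example (illustrative sketch, not part of this module): rates in tokenValues
 * are USD per 1M tokens, so the dollar cost of a request can be estimated as:
 *
 *   const rates = tokenValues['kimi-k2']; // { prompt: 0.6, completion: 2.5 }
 *   // 10,000 prompt tokens + 2,000 completion tokens:
 *   const cost = (10000 / 1e6) * rates.prompt + (2000 / 1e6) * rates.completion;
 *   // => 0.006 + 0.005 = $0.011
 */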
/**
 * Mapping of model keys to their prompt-caching multipliers for cache writes and reads.
 * See Anthropic's documentation on prompt caching: https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching#pricing
 * Rates are in USD per 1M tokens.
 * @type {Object.<string, {write: number, read: number}>}
*/
const cacheTokenValues = {
'claude-3.7-sonnet': { write: 3.75, read: 0.3 },
'claude-3-7-sonnet': { write: 3.75, read: 0.3 },
'claude-3.5-sonnet': { write: 3.75, read: 0.3 },
'claude-3-5-sonnet': { write: 3.75, read: 0.3 },
'claude-3.5-haiku': { write: 1, read: 0.08 },
'claude-3-5-haiku': { write: 1, read: 0.08 },
'claude-3-haiku': { write: 0.3, read: 0.03 },
'claude-haiku-4-5': { write: 1.25, read: 0.1 },
'claude-sonnet-4': { write: 3.75, read: 0.3 },
'claude-sonnet-4-6': { write: 3.75, read: 0.3 },
'claude-opus-4': { write: 18.75, read: 1.5 },
'claude-opus-4-5': { write: 6.25, read: 0.5 },
'claude-opus-4-6': { write: 6.25, read: 0.5 },
// DeepSeek models - cache hit: $0.028/1M, cache miss: $0.28/1M
deepseek: { write: 0.28, read: 0.028 },
'deepseek-chat': { write: 0.28, read: 0.028 },
'deepseek-reasoner': { write: 0.28, read: 0.028 },
// Moonshot/Kimi models - cache hit: $0.15/1M (k2) or $0.10/1M (k2.5), cache miss: $0.60/1M
kimi: { write: 0.6, read: 0.15 },
'kimi-k2': { write: 0.6, read: 0.15 },
'kimi-k2.5': { write: 0.6, read: 0.1 },
'kimi-k2-turbo': { write: 1.15, read: 0.15 },
'kimi-k2-turbo-preview': { write: 1.15, read: 0.15 },
'kimi-k2-0905': { write: 0.6, read: 0.15 },
'kimi-k2-0905-preview': { write: 0.6, read: 0.15 },
'kimi-k2-0711': { write: 0.6, read: 0.15 },
'kimi-k2-0711-preview': { write: 0.6, read: 0.15 },
'kimi-k2-thinking': { write: 0.6, read: 0.15 },
'kimi-k2-thinking-turbo': { write: 1.15, read: 0.15 },
};
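/*
 * Worked example (illustrative): for 'kimi-k2', a cache read is billed at 0.15
 * vs. the standard 0.6 prompt rate, so reading 100k cached tokens costs
 * (100000 / 1e6) * 0.15 = $0.015 instead of $0.06, a 4x saving on cache hits.
 */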
/**
* Premium (tiered) pricing for models whose rates change based on prompt size.
* Each entry specifies the token threshold and the rates that apply above it.
* @type {Object.<string, {threshold: number, prompt: number, completion: number}>}
*/
const premiumTokenValues = {
'claude-opus-4-6': { threshold: 200000, prompt: 10, completion: 37.5 },
'claude-sonnet-4-6': { threshold: 200000, prompt: 6, completion: 22.5 },
};
/**
* Retrieves the key associated with a given model name.
*
* @param {string} model - The model name to match.
* @param {string} endpoint - The endpoint name to match.
* @returns {string|undefined} The key corresponding to the model name, or undefined if no match is found.
*/
const getValueKey = (model, endpoint) => {
if (!model || typeof model !== 'string') {
return undefined;
}
// Use findMatchingPattern directly against tokenValues for efficient lookup
if (!endpoint || (typeof endpoint === 'string' && !tokenValues[endpoint])) {
const matchedKey = findMatchingPattern(model, tokenValues);
if (matchedKey) {
return matchedKey;
}
}
// Fallback: use matchModelName for edge cases and legacy handling
const modelName = matchModelName(model, endpoint);
if (!modelName) {
return undefined;
}
// Legacy token size mappings and aliases for older models
if (modelName.includes('gpt-3.5-turbo-16k')) {
return '16k';
} else if (modelName.includes('gpt-3.5')) {
return '4k';
} else if (modelName.includes('gpt-4-vision')) {
return 'gpt-4-1106'; // Alias for gpt-4-vision
} else if (modelName.includes('gpt-4-0125')) {
return 'gpt-4-1106'; // Alias for gpt-4-0125
} else if (modelName.includes('gpt-4-turbo')) {
return 'gpt-4-1106'; // Alias for gpt-4-turbo
} else if (modelName.includes('gpt-4-32k')) {
return '32k';
} else if (modelName.includes('gpt-4')) {
return '8k';
}
return undefined;
};
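/*
 * Example usage (illustrative; exact results depend on findMatchingPattern,
 * matchModelName, and the full tokenValues table above):
 *
 *   getValueKey('kimi-k2-0905-preview'); // => 'kimi-k2-0905-preview' (exact key)
 *   getValueKey('gpt-4-turbo-2024-04-09'); // => 'gpt-4-1106' (legacy alias branch)
 *   getValueKey(undefined); // => undefined (non-string guard)
 */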
/**
* Retrieves the multiplier for a given value key and token type. If no value key is provided,
* it attempts to derive it from the model name.
*
* @param {Object} params - The parameters for the function.
* @param {string} [params.valueKey] - The key corresponding to the model name.
* @param {'prompt' | 'completion'} [params.tokenType] - The type of token (e.g., 'prompt' or 'completion').
* @param {string} [params.model] - The model name to derive the value key from if not provided.
* @param {string} [params.endpoint] - The endpoint name to derive the value key from if not provided.
* @param {EndpointTokenConfig} [params.endpointTokenConfig] - The token configuration for the endpoint.
* @param {number} [params.inputTokenCount] - Total input token count for tiered pricing.
* @returns {number} The multiplier for the given parameters, or a default value if not found.
*/
const getMultiplier = ({
model,
valueKey,
endpoint,
tokenType,
inputTokenCount,
endpointTokenConfig,
}) => {
if (endpointTokenConfig) {
return endpointTokenConfig?.[model]?.[tokenType] ?? defaultRate;
}
if (valueKey && tokenType) {
const premiumRate = getPremiumRate(valueKey, tokenType, inputTokenCount);
if (premiumRate != null) {
return premiumRate;
}
return tokenValues[valueKey]?.[tokenType] ?? defaultRate;
}
  if (!tokenType || !model) {
    // Nothing to price without both a token type and a model; use a neutral multiplier
    return 1;
  }
valueKey = getValueKey(model, endpoint);
if (!valueKey) {
return defaultRate;
}
const premiumRate = getPremiumRate(valueKey, tokenType, inputTokenCount);
if (premiumRate != null) {
return premiumRate;
}
return tokenValues[valueKey]?.[tokenType] ?? defaultRate;
};
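/*
 * Example usage (illustrative; results depend on how getValueKey resolves the
 * model name against the tokenValues table):
 *
 *   getMultiplier({ model: 'kimi-k2', tokenType: 'completion' }); // => 2.5
 *   getMultiplier({ model: 'unknown-model', tokenType: 'prompt' }); // => defaultRate
 *   getMultiplier({ model: 'kimi-k2' }); // => 1 (no tokenType, neutral multiplier)
 */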
/**
* Checks if premium (tiered) pricing applies and returns the premium rate.
* Each model defines its own threshold in `premiumTokenValues`.
* @param {string} valueKey
* @param {string} tokenType
* @param {number} [inputTokenCount]
* @returns {number|null}
*/
const getPremiumRate = (valueKey, tokenType, inputTokenCount) => {
if (inputTokenCount == null) {
return null;
}
const premiumEntry = premiumTokenValues[valueKey];
if (!premiumEntry || inputTokenCount <= premiumEntry.threshold) {
return null;
}
return premiumEntry[tokenType] ?? null;
};
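/*
 * Worked example (illustrative): 'claude-opus-4-6' defines
 * { threshold: 200000, prompt: 10, completion: 37.5 }, so only requests whose
 * input exceeds 200k tokens are billed at the premium rates:
 *
 *   getPremiumRate('claude-opus-4-6', 'prompt', 250000); // => 10
 *   getPremiumRate('claude-opus-4-6', 'prompt', 150000); // => null (at/below threshold)
 *   getPremiumRate('claude-opus-4-6', 'prompt', undefined); // => null (no count provided)
 */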
/**
* Retrieves the cache multiplier for a given value key and token type. If no value key is provided,
* it attempts to derive it from the model name.
*
* @param {Object} params - The parameters for the function.
* @param {string} [params.valueKey] - The key corresponding to the model name.
* @param {'write' | 'read'} [params.cacheType] - The type of token (e.g., 'write' or 'read').
* @param {string} [params.model] - The model name to derive the value key from if not provided.
* @param {string} [params.endpoint] - The endpoint name to derive the value key from if not provided.
* @param {EndpointTokenConfig} [params.endpointTokenConfig] - The token configuration for the endpoint.
* @returns {number | null} The multiplier for the given parameters, or `null` if not found.
*/
const getCacheMultiplier = ({ valueKey, cacheType, model, endpoint, endpointTokenConfig }) => {
if (endpointTokenConfig) {
return endpointTokenConfig?.[model]?.[cacheType] ?? null;
}
if (valueKey && cacheType) {
return cacheTokenValues[valueKey]?.[cacheType] ?? null;
}
if (!cacheType || !model) {
return null;
}
valueKey = getValueKey(model, endpoint);
if (!valueKey) {
return null;
}
  // No cache pricing is defined for this valueKey/cacheType; return null (cache rates have no default fallback)
return cacheTokenValues[valueKey]?.[cacheType] ?? null;
};
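/*
 * Example usage (illustrative; matching behavior mirrors getMultiplier, but
 * cache lookups return null rather than a default rate when no cache pricing
 * is defined):
 *
 *   getCacheMultiplier({ model: 'kimi-k2', cacheType: 'read' }); // => 0.15
 *   getCacheMultiplier({ model: 'mistral-large', cacheType: 'read' }); // => null (no cache entry)
 */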
module.exports = {
tokenValues,
premiumTokenValues,
getValueKey,
getMultiplier,
getPremiumRate,
getCacheMultiplier,
defaultRate,
cacheTokenValues,
};

@@ -1,31 +0,0 @@
const bcrypt = require('bcryptjs');
/**
* Compares the provided password with the user's password.
*
* @param {IUser} user - The user to compare the password for.
* @param {string} candidatePassword - The password to test against the user's password.
* @returns {Promise<boolean>} A promise that resolves to a boolean indicating if the password matches.
*/
const comparePassword = async (user, candidatePassword) => {
if (!user) {
throw new Error('No user provided');
}
if (!user.password) {
throw new Error('No password, likely an email first registered via Social/OIDC login');
}
  return new Promise((resolve, reject) => {
    bcrypt.compare(candidatePassword, user.password, (err, isMatch) => {
      if (err) {
        reject(err);
        // Return so we don't also call resolve after rejecting
        return;
      }
      resolve(isMatch);
    });
  });
};
module.exports = {
comparePassword,
};
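/*
 * Example usage (illustrative; `findUserByEmail` and the require path are
 * hypothetical):
 *
 *   const { comparePassword } = require('./userMethods');
 *   const user = await findUserByEmail(email);
 *   const isMatch = await comparePassword(user, submittedPassword);
 *   if (!isMatch) {
 *     throw new Error('Invalid credentials');
 *   }
 */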