🧠 feat: User Memories for Conversational Context (#7760)

* 🧠 feat: User Memories for Conversational Context

chore: mcp typing, use `t`

WIP: first pass, Memories UI

- Added MemoryViewer component for displaying, editing, and deleting user memories.
- Integrated data provider hooks for fetching, updating, and deleting memories.
- Implemented pagination and loading states for better user experience.
- Created unit tests for MemoryViewer to ensure functionality and interaction with data provider.
- Updated translation files to include new UI strings related to memories.
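The pagination described above can be sketched as a small pure helper. The page size and return shape here are illustrative, not the MemoryViewer component's actual props:

```typescript
interface TMemory {
  key: string;
  value: string;
}

// Client-side pagination: clamp the requested page into range and slice the list.
function paginate(memories: TMemory[], page: number, pageSize = 10) {
  const pages = Math.max(1, Math.ceil(memories.length / pageSize));
  const current = Math.min(Math.max(1, page), pages);
  const start = (current - 1) * pageSize;
  return { items: memories.slice(start, start + pageSize), page: current, pages };
}
```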

chore: move mcp-related files to own directory

chore: rename librechat-mcp to librechat-api

WIP: first pass, memory processing and data schemas

chore: linting in fileSearch.js query description

chore: rename librechat-api to @librechat/api across the project

WIP: first pass, functional memory agent

feat: add MemoryEditDialog and MemoryViewer components for managing user memories

- Introduced MemoryEditDialog for editing memory entries with validation and toast notifications.
- Updated MemoryViewer to support editing and deleting memories, including pagination and loading states.
- Enhanced data provider to handle memory updates with optional original key for better management.
- Added new localization strings for memory-related UI elements.

feat: add memory permissions management

- Implemented memory permissions in the backend, allowing roles to have specific permissions for using, creating, updating, and reading memories.
- Added new API endpoints for updating memory permissions associated with roles.
- Created a new AdminSettings component for managing memory permissions in the frontend.
- Integrated memory permissions into the existing roles and permissions schemas.
- Updated the interface to include memory settings and permissions.
- Enhanced the MemoryViewer component to conditionally render admin settings based on user roles.
- Added localization support for memory permissions in the translation files.
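A minimal sketch of how such role-scoped checks can work. The action names follow the commit description (use, create, update, read), but the exact shapes in librechat-data-provider may differ:

```typescript
type MemoryAction = 'USE' | 'CREATE' | 'UPDATE' | 'READ';

interface Role {
  name: string;
  permissions: { MEMORIES: Partial<Record<MemoryAction, boolean>> };
}

// A role may perform a memory action only if the flag is explicitly true.
function hasMemoryPermission(role: Role, action: MemoryAction): boolean {
  return role.permissions.MEMORIES[action] === true;
}

const admin: Role = {
  name: 'ADMIN',
  permissions: { MEMORIES: { USE: true, CREATE: true, UPDATE: true, READ: true } },
};
const user: Role = {
  name: 'USER',
  permissions: { MEMORIES: { USE: true, READ: true } },
};
```

The frontend can use the same check to conditionally render admin-only controls, as the MemoryViewer does.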

feat: move AdminSettings component to a new position in MemoryViewer for better visibility

refactor: clean up commented code in MemoryViewer component

feat: enhance MemoryViewer with search functionality and improve MemoryEditDialog integration

- Added a search input to filter memories in the MemoryViewer component.
- Refactored MemoryEditDialog to accept children for better customization.
- Updated MemoryViewer to utilize the new EditMemoryButton and DeleteMemoryButton components for editing and deleting memories.
- Improved localization support by adding new strings for memory filtering and deletion confirmation.

refactor: optimize memory filtering in MemoryViewer using match-sorter

- Replaced manual filtering logic with match-sorter for improved search functionality.
- Enhanced performance and readability of the filteredMemories computation.
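A self-contained approximation of this ranking behavior — the real component calls match-sorter with `{ keys: ['key', 'value'] }`; the two-tier ranking below only mimics its prefix-over-substring ordering:

```typescript
interface TMemory {
  key: string;
  value: string;
}

function filterMemories(memories: TMemory[], query: string): TMemory[] {
  if (!query) {
    return memories;
  }
  const q = query.toLowerCase();
  return memories
    .map((m) => {
      const haystack = `${m.key} ${m.value}`.toLowerCase();
      // Rank 0: prefix match; rank 1: substring match; -1: no match.
      const rank = haystack.startsWith(q) ? 0 : haystack.includes(q) ? 1 : -1;
      return { m, rank };
    })
    .filter((r) => r.rank >= 0)
    .sort((a, b) => a.rank - b.rank) // stable sort keeps original order within ranks
    .map((r) => r.m);
}
```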

feat: enhance MemoryEditDialog with triggerRef and improve updateMemory mutation handling

feat: implement access control for MemoryEditDialog and MemoryViewer components

refactor: remove commented out code and create runMemory method

refactor: rename role-based files

feat: implement access control for memory usage in AgentClient

refactor: simplify checkVisionRequest method in AgentClient by removing commented-out code

refactor: make `agents` dir in api package

refactor: migrate Azure utilities to TypeScript and consolidate imports

refactor: move sanitizeFilename function to a new file and update imports, add related tests
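A hedged sketch of what a `sanitizeFilename` helper like this typically does — the replacement rules and 255-character cap are assumptions, not the package's exact implementation:

```typescript
function sanitizeFilename(input: string, maxLength = 255): string {
  const sanitized = input
    // Replace path separators and characters illegal on common filesystems.
    .replace(/[/\\?%*:|"<>]/g, '_')
    // Avoid hidden/relative names like ".." or ".env".
    .replace(/^\.+/, '_');
  // Cap the length and never return an empty name.
  return sanitized.slice(0, maxLength) || '_';
}
```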

refactor: update LLM configuration types and consolidate Azure options in the API package

chore: linting

chore: import order

refactor: replace getLLMConfig with getOpenAIConfig and remove unused LLM configuration file

chore: update winston-daily-rotate-file to version 5.0.0 and add object-hash dependency in package-lock.json

refactor: move primeResources and optionalChainWithEmptyCheck functions to resources.ts and update imports

refactor: move createRun function to a new run.ts file and update related imports

fix: ensure safeAttachments is correctly typed as an array of TFile

chore: add node-fetch dependency and refactor fetch-related functions into packages/api/utils, removing the old generators file

refactor: enhance TEndpointOption type by using Pick to streamline endpoint fields and add new properties for model parameters and client options

feat: implement initializeOpenAIOptions function and update OpenAI types for enhanced configuration handling

fix: update types due to new TEndpointOption typing

fix: ensure safe access to group parameters in initializeOpenAIOptions function

fix: remove redundant API key validation comment in initializeOpenAIOptions function

refactor: rename initializeOpenAIOptions to initializeOpenAI for consistency and update related documentation

refactor: decouple req.body fields and tool loading from initializeAgentOptions

chore: linting

refactor: adjust column widths in MemoryViewer for improved layout

refactor: simplify agent initialization by creating loadAgent function and removing unused code

feat: add memory configuration loading and validation functions

WIP: first pass, memory processing with config

feat: implement memory callback and artifact handling

feat: implement memory artifacts display and processing updates

feat: add memory configuration options and schema validation for validKeys
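A sketch of validating the `validKeys` and `tokenLimit` options — the project validates with a zod schema, approximated here with plain checks:

```typescript
interface MemoryConfig {
  validKeys?: string[];
  tokenLimit?: number;
}

function loadMemoryConfig(raw: unknown): MemoryConfig {
  const cfg = (raw ?? {}) as MemoryConfig;
  if (cfg.validKeys !== undefined) {
    const ok =
      Array.isArray(cfg.validKeys) &&
      cfg.validKeys.every((k) => typeof k === 'string' && k.length > 0);
    if (!ok) {
      throw new Error('memory.validKeys must be an array of non-empty strings');
    }
  }
  if (cfg.tokenLimit !== undefined && cfg.tokenLimit <= 0) {
    throw new Error('memory.tokenLimit must be a positive number');
  }
  return cfg;
}
```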

fix: update MemoryEditDialog and MemoryViewer to handle memory state and display improvements

refactor: remove padding from BookmarkTable and MemoryViewer headers for consistent styling

WIP: initial tokenLimit config and move Tokenizer to @librechat/api

refactor: update mongoMeili plugin methods to use callback for better error handling

feat: enhance memory management with token tracking and usage metrics

- Added token counting for memory entries to enforce limits and provide usage statistics.
- Updated memory retrieval and update routes to include total token usage and limit.
- Enhanced MemoryEditDialog and MemoryViewer components to display memory usage and token information.
- Refactored memory processing functions to handle token limits and provide feedback on memory capacity.
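The token accounting can be sketched as follows — the real code counts tokens with the Tokenizer from @librechat/api; `tokenCount` here is assumed to be precomputed:

```typescript
interface MemoryEntry {
  key: string;
  value: string;
  tokenCount: number;
}

// Aggregate usage across all entries against the configured limit.
function getMemoryUsage(entries: MemoryEntry[], tokenLimit: number) {
  const totalTokens = entries.reduce((sum, e) => sum + e.tokenCount, 0);
  return { totalTokens, tokenLimit, remaining: Math.max(0, tokenLimit - totalTokens) };
}

// Reject a new entry that would push usage past the limit.
function canStoreMemory(entries: MemoryEntry[], tokenLimit: number, newTokens: number): boolean {
  return getMemoryUsage(entries, tokenLimit).remaining >= newTokens;
}
```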

feat: implement memory artifact handling in attachment handler

- Enhanced useAttachmentHandler to process memory artifacts when receiving updates.
- Introduced handleMemoryArtifact utility to manage memory updates and deletions.
- Updated query client to reflect changes in memory state based on incoming data.
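A cache-free sketch of the merge logic — the real `handleMemoryArtifact` applies the same update/delete semantics to the react-query cache, and the artifact shape below is an assumption:

```typescript
interface TMemory {
  key: string;
  value: string;
}

type MemoryArtifact = { type: 'update' | 'delete'; key: string; value?: string };

// Pure function: returns the memory list after applying one artifact.
function applyMemoryArtifact(memories: TMemory[], artifact: MemoryArtifact): TMemory[] {
  if (artifact.type === 'delete') {
    return memories.filter((m) => m.key !== artifact.key);
  }
  // Upsert: drop any existing entry for the key, then append the new value.
  const next = memories.filter((m) => m.key !== artifact.key);
  next.push({ key: artifact.key, value: artifact.value ?? '' });
  return next;
}
```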

refactor: restructure web search key extraction logic

- Moved the logic for extracting API keys from the webSearchAuth configuration into a dedicated function, getWebSearchKeys.
- Updated webSearchKeys to utilize the new function for improved clarity and maintainability.
- Prevents build-time errors.
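Illustrative shape of that extraction — the `webSearchAuth` structure shown here is an assumption, not the actual config:

```typescript
type WebSearchAuth = { providers: Record<string, string[]> };

// Hypothetical per-provider auth config mapping providers to env-var key names.
const webSearchAuth: WebSearchAuth = {
  providers: { serper: ['SERPER_API_KEY'], firecrawl: ['FIRECRAWL_API_KEY'] },
};

// Flatten every provider's key names into a single list.
function getWebSearchKeys(auth: WebSearchAuth): string[] {
  return Object.values(auth.providers).flat();
}
```

Keeping this in a function (rather than evaluating at module scope) is what avoids the build-time failures mentioned above.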

feat: add personalization settings and memory preferences management

- Introduced a new Personalization tab in settings to manage user memory preferences.
- Implemented API endpoints and client-side logic for updating memory preferences.
- Enhanced user interface components to reflect personalization options and memory usage.
- Updated permissions to allow users to opt out of memory features.
- Added localization support for new settings and messages related to personalization.

style: personalization switch class

feat: add PersonalizationIcon and align Side Panel UI

feat: implement memory creation functionality

- Added a new API endpoint for creating memory entries, including validation for key and value.
- Introduced MemoryCreateDialog component for user interface to facilitate memory creation.
- Integrated token limit checks to prevent exceeding user memory capacity.
- Updated MemoryViewer to include a button for opening the memory creation dialog.
- Enhanced localization support for new messages related to memory creation.
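The endpoint's input validation can be sketched as follows — the error messages and the `validKeys` rule are illustrative:

```typescript
// Throws on invalid input; a route handler would map these errors to 400 responses.
function validateMemoryInput(key: string, value: string, validKeys?: string[]): void {
  if (!key || !key.trim()) {
    throw new Error('Memory key is required');
  }
  if (!value || !value.trim()) {
    throw new Error('Memory value is required');
  }
  // When the config restricts keys, reject anything outside the allowed set.
  if (validKeys && validKeys.length > 0 && !validKeys.includes(key)) {
    throw new Error(`Invalid memory key "${key}"; allowed: ${validKeys.join(', ')}`);
  }
}
```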

feat: enhance message processing with configurable window size

- Updated AgentClient to use a configurable message window size for processing messages.
- Introduced messageWindowSize option in memory configuration schema with a default value of 5.
- Improved logic for selecting messages to process based on the configured window size.
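The window selection above reduces to taking the last N messages, with the default of 5 coming from the `messageWindowSize` schema default:

```typescript
// Return only the most recent `messageWindowSize` messages for memory processing.
function selectMessageWindow<T>(messages: T[], messageWindowSize = 5): T[] {
  if (messageWindowSize <= 0) {
    return [];
  }
  return messages.slice(-messageWindowSize);
}
```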

chore: update librechat-data-provider version to 0.7.87 in package.json and package-lock.json

chore: remove OpenAPIPlugin and its associated tests

chore: remove MIGRATION_README.md as migration tasks are completed

ci: fix backend tests

chore: remove unused translation keys from localization file

chore: remove problematic test file and unused var in AgentClient

chore: remove unused import and import directly for JSDoc

* feat: add api package build stage in Dockerfile for improved modularity

* docs: reorder build steps in contributing guide for clarity
Danny Avila, 2025-06-07 18:52:22 -04:00, committed by GitHub
parent cd7dd576c1
commit 29ef91b4dd
GPG key ID: B5690EEEBB952194
170 changed files with 5700 additions and 3632 deletions


@@ -30,8 +30,8 @@ Project maintainers have the right and responsibility to remove, edit, or reject
2. Install typescript globally: `npm i -g typescript`.
3. Run `npm ci` to install dependencies.
4. Build the data provider: `npm run build:data-provider`.
5. Build MCP: `npm run build:mcp`.
6. Build data schemas: `npm run build:data-schemas`.
5. Build data schemas: `npm run build:data-schemas`.
6. Build API methods: `npm run build:api`.
7. Setup and run unit tests:
- Copy `.env.test`: `cp api/test/.env.test.example api/test/.env.test`.
- Run backend unit tests: `npm run test:api`.


@@ -7,6 +7,7 @@ on:
- release/*
paths:
- 'api/**'
- 'packages/api/**'
jobs:
tests_Backend:
name: Run Backend unit tests
@@ -36,12 +37,12 @@ jobs:
- name: Install Data Provider Package
run: npm run build:data-provider
- name: Install MCP Package
run: npm run build:mcp
- name: Install Data Schemas Package
run: npm run build:data-schemas
- name: Install API Package
run: npm run build:api
- name: Create empty auth.json file
run: |
mkdir -p api/data
@@ -66,5 +67,5 @@ jobs:
- name: Run librechat-data-provider unit tests
run: cd packages/data-provider && npm run test:ci
- name: Run librechat-mcp unit tests
run: cd packages/mcp && npm run test:ci
- name: Run librechat-api unit tests
run: cd packages/api && npm run test:ci


@@ -14,7 +14,7 @@ RUN npm config set fetch-retry-maxtimeout 600000 && \
npm config set fetch-retry-mintimeout 15000
COPY package*.json ./
COPY packages/data-provider/package*.json ./packages/data-provider/
COPY packages/mcp/package*.json ./packages/mcp/
COPY packages/api/package*.json ./packages/api/
COPY packages/data-schemas/package*.json ./packages/data-schemas/
COPY client/package*.json ./client/
COPY api/package*.json ./api/
@@ -24,26 +24,27 @@ FROM base-min AS base
WORKDIR /app
RUN npm ci
# Build data-provider
# Build `data-provider` package
FROM base AS data-provider-build
WORKDIR /app/packages/data-provider
COPY packages/data-provider ./
RUN npm run build
# Build mcp package
FROM base AS mcp-build
WORKDIR /app/packages/mcp
COPY packages/mcp ./
COPY --from=data-provider-build /app/packages/data-provider/dist /app/packages/data-provider/dist
RUN npm run build
# Build data-schemas
# Build `data-schemas` package
FROM base AS data-schemas-build
WORKDIR /app/packages/data-schemas
COPY packages/data-schemas ./
COPY --from=data-provider-build /app/packages/data-provider/dist /app/packages/data-provider/dist
RUN npm run build
# Build `api` package
FROM base AS api-build
WORKDIR /app/packages/api
COPY packages/api ./
COPY --from=data-provider-build /app/packages/data-provider/dist /app/packages/data-provider/dist
COPY --from=data-schemas-build /app/packages/data-schemas/dist /app/packages/data-schemas/dist
RUN npm run build
# Client build
FROM base AS client-build
WORKDIR /app/client
@@ -63,8 +64,8 @@ RUN npm ci --omit=dev
COPY api ./api
COPY config ./config
COPY --from=data-provider-build /app/packages/data-provider/dist ./packages/data-provider/dist
COPY --from=mcp-build /app/packages/mcp/dist ./packages/mcp/dist
COPY --from=data-schemas-build /app/packages/data-schemas/dist ./packages/data-schemas/dist
COPY --from=api-build /app/packages/api/dist ./packages/api/dist
COPY --from=client-build /app/client/dist ./client/dist
WORKDIR /app/api
EXPOSE 3080


@@ -10,6 +10,7 @@ const {
validateVisionModel,
} = require('librechat-data-provider');
const { SplitStreamHandler: _Handler } = require('@librechat/agents');
const { Tokenizer, createFetch, createStreamEventHandlers } = require('@librechat/api');
const {
truncateText,
formatMessage,
@@ -26,8 +27,6 @@ const {
const { getModelMaxTokens, getModelMaxOutputTokens, matchModelName } = require('~/utils');
const { spendTokens, spendStructuredTokens } = require('~/models/spendTokens');
const { encodeAndFormat } = require('~/server/services/Files/images/encode');
const { createFetch, createStreamEventHandlers } = require('./generators');
const Tokenizer = require('~/server/services/Tokenizer');
const { sleep } = require('~/server/utils');
const BaseClient = require('./BaseClient');
const { logger } = require('~/config');


@ -2,6 +2,7 @@ const { Keyv } = require('keyv');
const crypto = require('crypto');
const { CohereClient } = require('cohere-ai');
const { fetchEventSource } = require('@waylaidwanderer/fetch-event-source');
const { constructAzureURL, genAzureChatCompletion } = require('@librechat/api');
const { encoding_for_model: encodingForModel, get_encoding: getEncoding } = require('tiktoken');
const {
ImageDetail,
@@ -10,9 +11,9 @@ const {
CohereConstants,
mapModelToAzureConfig,
} = require('librechat-data-provider');
const { extractBaseURL, constructAzureURL, genAzureChatCompletion } = require('~/utils');
const { createContextHandlers } = require('./prompts');
const { createCoherePayload } = require('./llm');
const { extractBaseURL } = require('~/utils');
const BaseClient = require('./BaseClient');
const { logger } = require('~/config');


@@ -1,4 +1,5 @@
const { google } = require('googleapis');
const { Tokenizer } = require('@librechat/api');
const { concat } = require('@langchain/core/utils/stream');
const { ChatVertexAI } = require('@langchain/google-vertexai');
const { ChatGoogleGenerativeAI } = require('@langchain/google-genai');
@@ -19,7 +20,6 @@ const {
} = require('librechat-data-provider');
const { getSafetySettings } = require('~/server/services/Endpoints/google/llm');
const { encodeAndFormat } = require('~/server/services/Files/images');
const Tokenizer = require('~/server/services/Tokenizer');
const { spendTokens } = require('~/models/spendTokens');
const { getModelMaxTokens } = require('~/utils');
const { sleep } = require('~/server/utils');


@@ -1,6 +1,14 @@
const { OllamaClient } = require('./OllamaClient');
const { HttpsProxyAgent } = require('https-proxy-agent');
const { SplitStreamHandler, CustomOpenAIClient: OpenAI } = require('@librechat/agents');
const {
isEnabled,
Tokenizer,
createFetch,
constructAzureURL,
genAzureChatCompletion,
createStreamEventHandlers,
} = require('@librechat/api');
const {
Constants,
ImageDetail,
@@ -16,13 +24,6 @@ const {
validateVisionModel,
mapModelToAzureConfig,
} = require('librechat-data-provider');
const {
extractBaseURL,
constructAzureURL,
getModelMaxTokens,
genAzureChatCompletion,
getModelMaxOutputTokens,
} = require('~/utils');
const {
truncateText,
formatMessage,
@@ -30,10 +31,9 @@ const {
titleInstruction,
createContextHandlers,
} = require('./prompts');
const { extractBaseURL, getModelMaxTokens, getModelMaxOutputTokens } = require('~/utils');
const { encodeAndFormat } = require('~/server/services/Files/images/encode');
const { createFetch, createStreamEventHandlers } = require('./generators');
const { addSpaceIfNeeded, isEnabled, sleep } = require('~/server/utils');
const Tokenizer = require('~/server/services/Tokenizer');
const { addSpaceIfNeeded, sleep } = require('~/server/utils');
const { spendTokens } = require('~/models/spendTokens');
const { handleOpenAIErrors } = require('./tools/util');
const { createLLM, RunManager } = require('./llm');


@@ -1,71 +0,0 @@
const fetch = require('node-fetch');
const { GraphEvents } = require('@librechat/agents');
const { logger, sendEvent } = require('~/config');
const { sleep } = require('~/server/utils');
/**
* Makes a function to make HTTP request and logs the process.
* @param {Object} params
* @param {boolean} [params.directEndpoint] - Whether to use a direct endpoint.
* @param {string} [params.reverseProxyUrl] - The reverse proxy URL to use for the request.
* @returns {Promise<Response>} - A promise that resolves to the response of the fetch request.
*/
function createFetch({ directEndpoint = false, reverseProxyUrl = '' }) {
/**
* Makes an HTTP request and logs the process.
* @param {RequestInfo} url - The URL to make the request to. Can be a string or a Request object.
* @param {RequestInit} [init] - Optional init options for the request.
* @returns {Promise<Response>} - A promise that resolves to the response of the fetch request.
*/
return async (_url, init) => {
let url = _url;
if (directEndpoint) {
url = reverseProxyUrl;
}
logger.debug(`Making request to ${url}`);
if (typeof Bun !== 'undefined') {
return await fetch(url, init);
}
return await fetch(url, init);
};
}
// Add this at the module level outside the class
/**
* Creates event handlers for stream events that don't capture client references
* @param {Object} res - The response object to send events to
* @returns {Object} Object containing handler functions
*/
function createStreamEventHandlers(res) {
return {
[GraphEvents.ON_RUN_STEP]: (event) => {
if (res) {
sendEvent(res, event);
}
},
[GraphEvents.ON_MESSAGE_DELTA]: (event) => {
if (res) {
sendEvent(res, event);
}
},
[GraphEvents.ON_REASONING_DELTA]: (event) => {
if (res) {
sendEvent(res, event);
}
},
};
}
function createHandleLLMNewToken(streamRate) {
return async () => {
if (streamRate) {
await sleep(streamRate);
}
};
}
module.exports = {
createFetch,
createHandleLLMNewToken,
createStreamEventHandlers,
};


@@ -1,6 +1,5 @@
const { ChatOpenAI } = require('@langchain/openai');
const { sanitizeModelName, constructAzureURL } = require('~/utils');
const { isEnabled } = require('~/server/utils');
const { isEnabled, sanitizeModelName, constructAzureURL } = require('@librechat/api');
/**
* Creates a new instance of a language model (LLM) for chat interactions.


@@ -33,7 +33,9 @@ jest.mock('~/models', () => ({
const { getConvo, saveConvo } = require('~/models');
jest.mock('@librechat/agents', () => {
const { Providers } = jest.requireActual('@librechat/agents');
return {
Providers,
ChatOpenAI: jest.fn().mockImplementation(() => {
return {};
}),


@@ -1,184 +0,0 @@
require('dotenv').config();
const fs = require('fs');
const { z } = require('zod');
const path = require('path');
const yaml = require('js-yaml');
const { createOpenAPIChain } = require('langchain/chains');
const { DynamicStructuredTool } = require('@langchain/core/tools');
const { ChatPromptTemplate, HumanMessagePromptTemplate } = require('@langchain/core/prompts');
const { logger } = require('~/config');
function addLinePrefix(text, prefix = '// ') {
return text
.split('\n')
.map((line) => prefix + line)
.join('\n');
}
function createPrompt(name, functions) {
const prefix = `// The ${name} tool has the following functions. Determine the desired or most optimal function for the user's query:`;
const functionDescriptions = functions
.map((func) => `// - ${func.name}: ${func.description}`)
.join('\n');
return `${prefix}\n${functionDescriptions}
// You are an expert manager and scrum master. You must provide a detailed intent to better execute the function.
// Always format as such: {{"func": "function_name", "intent": "intent and expected result"}}`;
}
const AuthBearer = z
.object({
type: z.string().includes('service_http'),
authorization_type: z.string().includes('bearer'),
verification_tokens: z.object({
openai: z.string(),
}),
})
.catch(() => false);
const AuthDefinition = z
.object({
type: z.string(),
authorization_type: z.string(),
verification_tokens: z.object({
openai: z.string(),
}),
})
.catch(() => false);
async function readSpecFile(filePath) {
try {
const fileContents = await fs.promises.readFile(filePath, 'utf8');
if (path.extname(filePath) === '.json') {
return JSON.parse(fileContents);
}
return yaml.load(fileContents);
} catch (e) {
logger.error('[readSpecFile] error', e);
return false;
}
}
async function getSpec(url) {
const RegularUrl = z
.string()
.url()
.catch(() => false);
if (RegularUrl.parse(url) && path.extname(url) === '.json') {
const response = await fetch(url);
return await response.json();
}
const ValidSpecPath = z
.string()
.url()
.catch(async () => {
const spec = path.join(__dirname, '..', '.well-known', 'openapi', url);
if (!fs.existsSync(spec)) {
return false;
}
return await readSpecFile(spec);
});
return ValidSpecPath.parse(url);
}
async function createOpenAPIPlugin({ data, llm, user, message, memory, signal }) {
let spec;
try {
spec = await getSpec(data.api.url);
} catch (error) {
logger.error('[createOpenAPIPlugin] getSpec error', error);
return null;
}
if (!spec) {
logger.warn('[createOpenAPIPlugin] No spec found');
return null;
}
const headers = {};
const { auth, name_for_model, description_for_model, description_for_human } = data;
if (auth && AuthDefinition.parse(auth)) {
logger.debug('[createOpenAPIPlugin] auth detected', auth);
const { openai } = auth.verification_tokens;
if (AuthBearer.parse(auth)) {
headers.authorization = `Bearer ${openai}`;
logger.debug('[createOpenAPIPlugin] added auth bearer', headers);
}
}
const chainOptions = { llm };
if (data.headers && data.headers['librechat_user_id']) {
logger.debug('[createOpenAPIPlugin] id detected', headers);
headers[data.headers['librechat_user_id']] = user;
}
if (Object.keys(headers).length > 0) {
logger.debug('[createOpenAPIPlugin] headers detected', headers);
chainOptions.headers = headers;
}
if (data.params) {
logger.debug('[createOpenAPIPlugin] params detected', data.params);
chainOptions.params = data.params;
}
let history = '';
if (memory) {
logger.debug('[createOpenAPIPlugin] openAPI chain: memory detected', memory);
const { history: chat_history } = await memory.loadMemoryVariables({});
history = chat_history?.length > 0 ? `\n\n## Chat History:\n${chat_history}\n` : '';
}
chainOptions.prompt = ChatPromptTemplate.fromMessages([
HumanMessagePromptTemplate.fromTemplate(
`# Use the provided API's to respond to this query:\n\n{query}\n\n## Instructions:\n${addLinePrefix(
description_for_model,
)}${history}`,
),
]);
const chain = await createOpenAPIChain(spec, chainOptions);
const { functions } = chain.chains[0].lc_kwargs.llmKwargs;
return new DynamicStructuredTool({
name: name_for_model,
description_for_model: `${addLinePrefix(description_for_human)}${createPrompt(
name_for_model,
functions,
)}`,
description: `${description_for_human}`,
schema: z.object({
func: z
.string()
.describe(
`The function to invoke. The functions available are: ${functions
.map((func) => func.name)
.join(', ')}`,
),
intent: z
.string()
.describe('Describe your intent with the function and your expected result'),
}),
func: async ({ func = '', intent = '' }) => {
const filteredFunctions = functions.filter((f) => f.name === func);
chain.chains[0].lc_kwargs.llmKwargs.functions = filteredFunctions;
const query = `${message}${func?.length > 0 ? `\n// Intent: ${intent}` : ''}`;
const result = await chain.call({
query,
signal,
});
return result.response;
},
});
}
module.exports = {
getSpec,
readSpecFile,
createOpenAPIPlugin,
};


@@ -1,72 +0,0 @@
const fs = require('fs');
const { createOpenAPIPlugin, getSpec, readSpecFile } = require('./OpenAPIPlugin');
global.fetch = jest.fn().mockImplementationOnce(() => {
return new Promise((resolve) => {
resolve({
ok: true,
json: () => Promise.resolve({ key: 'value' }),
});
});
});
jest.mock('fs', () => ({
promises: {
readFile: jest.fn(),
},
existsSync: jest.fn(),
}));
describe('readSpecFile', () => {
it('reads JSON file correctly', async () => {
fs.promises.readFile.mockResolvedValue(JSON.stringify({ test: 'value' }));
const result = await readSpecFile('test.json');
expect(result).toEqual({ test: 'value' });
});
it('reads YAML file correctly', async () => {
fs.promises.readFile.mockResolvedValue('test: value');
const result = await readSpecFile('test.yaml');
expect(result).toEqual({ test: 'value' });
});
it('handles error correctly', async () => {
fs.promises.readFile.mockRejectedValue(new Error('test error'));
const result = await readSpecFile('test.json');
expect(result).toBe(false);
});
});
describe('getSpec', () => {
it('fetches spec from url correctly', async () => {
const parsedJson = await getSpec('https://www.instacart.com/.well-known/ai-plugin.json');
const isObject = typeof parsedJson === 'object';
expect(isObject).toEqual(true);
});
it('reads spec from file correctly', async () => {
fs.existsSync.mockReturnValue(true);
fs.promises.readFile.mockResolvedValue(JSON.stringify({ test: 'value' }));
const result = await getSpec('test.json');
expect(result).toEqual({ test: 'value' });
});
it('returns false when file does not exist', async () => {
fs.existsSync.mockReturnValue(false);
const result = await getSpec('test.json');
expect(result).toBe(false);
});
});
describe('createOpenAPIPlugin', () => {
it('returns null when getSpec throws an error', async () => {
const result = await createOpenAPIPlugin({ data: { api: { url: 'invalid' } } });
expect(result).toBe(null);
});
it('returns null when no spec is found', async () => {
const result = await createOpenAPIPlugin({});
expect(result).toBe(null);
});
// Add more tests here for different scenarios
});


@@ -8,10 +8,10 @@ const { HttpsProxyAgent } = require('https-proxy-agent');
const { FileContext, ContentTypes } = require('librechat-data-provider');
const { getImageBasename } = require('~/server/services/Files/images');
const extractBaseURL = require('~/utils/extractBaseURL');
const { logger } = require('~/config');
const logger = require('~/config/winston');
const displayMessage =
'DALL-E displayed an image. All generated images are already plainly visible, so don\'t repeat the descriptions in detail. Do not list download links as they are available in the UI already. The user may download the images by clicking on them, but do not mention anything about downloading to the user.';
"DALL-E displayed an image. All generated images are already plainly visible, so don't repeat the descriptions in detail. Do not list download links as they are available in the UI already. The user may download the images by clicking on them, but do not mention anything about downloading to the user.";
class DALLE3 extends Tool {
constructor(fields = {}) {
super();


@@ -1,10 +1,29 @@
const OpenAI = require('openai');
const DALLE3 = require('../DALLE3');
const { logger } = require('~/config');
const logger = require('~/config/winston');
jest.mock('openai');
jest.mock('@librechat/data-schemas', () => {
return {
logger: {
info: jest.fn(),
warn: jest.fn(),
debug: jest.fn(),
error: jest.fn(),
},
};
});
jest.mock('tiktoken', () => {
return {
encoding_for_model: jest.fn().mockReturnValue({
encode: jest.fn(),
decode: jest.fn(),
}),
};
});
const processFileURL = jest.fn();
jest.mock('~/server/services/Files/images', () => ({
@@ -37,6 +56,11 @@ jest.mock('fs', () => {
return {
existsSync: jest.fn(),
mkdirSync: jest.fn(),
promises: {
writeFile: jest.fn(),
readFile: jest.fn(),
unlink: jest.fn(),
},
};
});


@@ -135,7 +135,7 @@ const createFileSearchTool = async ({ req, files, entity_id }) => {
query: z
.string()
.describe(
'A natural language query to search for relevant information in the files. Be specific and use keywords related to the information you\'re looking for. The query will be used for semantic similarity matching against the file contents.',
"A natural language query to search for relevant information in the files. Be specific and use keywords related to the information you're looking for. The query will be used for semantic similarity matching against the file contents.",
),
}),
},


@@ -1,7 +1,7 @@
const axios = require('axios');
const { EventSource } = require('eventsource');
const { Time, CacheKeys } = require('librechat-data-provider');
const { MCPManager, FlowStateManager } = require('librechat-mcp');
const { Time } = require('librechat-data-provider');
const { MCPManager, FlowStateManager } = require('@librechat/api');
const logger = require('./winston');
global.EventSource = EventSource;


@@ -49,6 +49,7 @@
"@langchain/google-vertexai": "^0.2.9",
"@langchain/textsplitters": "^0.1.0",
"@librechat/agents": "^2.4.38",
"@librechat/api": "*",
"@librechat/data-schemas": "*",
"@node-saml/passport-saml": "^5.0.0",
"@waylaidwanderer/fetch-event-source": "^3.0.1",
@@ -81,7 +82,6 @@
"keyv-file": "^5.1.2",
"klona": "^2.0.6",
"librechat-data-provider": "*",
"librechat-mcp": "*",
"lodash": "^4.17.21",
"meilisearch": "^0.38.0",
"memorystore": "^1.6.7",
@@ -90,6 +90,7 @@
"mongoose": "^8.12.1",
"multer": "^2.0.0",
"nanoid": "^3.3.7",
"node-fetch": "^2.7.0",
"nodemailer": "^6.9.15",
"ollama": "^0.5.0",
"openai": "^4.96.2",
@@ -110,7 +111,7 @@
"traverse": "^0.6.7",
"ua-parser-js": "^1.0.36",
"winston": "^3.11.0",
"winston-daily-rotate-file": "^4.7.1",
"winston-daily-rotate-file": "^5.0.0",
"youtube-transcript": "^1.2.1",
"zod": "^3.22.4"
},


@@ -220,6 +220,9 @@ function disposeClient(client) {
if (client.maxResponseTokens) {
client.maxResponseTokens = null;
}
if (client.processMemory) {
client.processMemory = null;
}
if (client.run) {
// Break circular references in run
if (client.run.Graph) {


@@ -1,4 +1,6 @@
const { nanoid } = require('nanoid');
const { sendEvent } = require('@librechat/api');
const { logger } = require('@librechat/data-schemas');
const { Tools, StepTypes, FileContext } = require('librechat-data-provider');
const {
EnvVar,
@@ -12,7 +14,6 @@ const {
const { processCodeOutput } = require('~/server/services/Files/Code/process');
const { loadAuthValues } = require('~/server/services/Tools/credentials');
const { saveBase64Image } = require('~/server/services/Files/process');
const { logger, sendEvent } = require('~/config');
class ModelEndHandler {
/**
@@ -240,9 +241,7 @@ function createToolEndCallback({ req, res, artifactPromises }) {
if (output.artifact[Tools.web_search]) {
artifactPromises.push(
(async () => {
const name = `${output.name}_${output.tool_call_id}_${nanoid()}`;
const attachment = {
name,
type: Tools.web_search,
messageId: metadata.run_id,
toolCallId: output.tool_call_id,


@@ -1,13 +1,12 @@
// const { HttpsProxyAgent } = require('https-proxy-agent');
// const {
// Constants,
// ImageDetail,
// EModelEndpoint,
// resolveHeaders,
// validateVisionModel,
// mapModelToAzureConfig,
// } = require('librechat-data-provider');
require('events').EventEmitter.defaultMaxListeners = 100;
const { logger } = require('@librechat/data-schemas');
const {
sendEvent,
createRun,
Tokenizer,
memoryInstructions,
createMemoryProcessor,
} = require('@librechat/api');
const {
Callback,
GraphEvents,
@@ -19,26 +18,30 @@
} = require('@librechat/agents');
const {
Constants,
Permissions,
VisionModes,
ContentTypes,
EModelEndpoint,
KnownEndpoints,
PermissionTypes,
isAgentsEndpoint,
AgentCapabilities,
bedrockInputSchema,
removeNullishValues,
} = require('librechat-data-provider');
const { DynamicStructuredTool } = require('@langchain/core/tools');
const { getBufferString, HumanMessage } = require('@langchain/core/messages');
const { getCustomEndpointConfig, checkCapability } = require('~/server/services/Config');
const { addCacheControl, createContextHandlers } = require('~/app/clients/prompts');
const { initializeAgent } = require('~/server/services/Endpoints/agents/agent');
const { spendTokens, spendStructuredTokens } = require('~/models/spendTokens');
const { getBufferString, HumanMessage } = require('@langchain/core/messages');
const { DynamicStructuredTool } = require('@langchain/core/tools');
const { setMemory, deleteMemory, getFormattedMemories } = require('~/models');
const { encodeAndFormat } = require('~/server/services/Files/images/encode');
const initOpenAI = require('~/server/services/Endpoints/openAI/initialize');
const Tokenizer = require('~/server/services/Tokenizer');
const { checkAccess } = require('~/server/middleware/roles/access');
const BaseClient = require('~/app/clients/BaseClient');
const { logger, sendEvent, getMCPManager } = require('~/config');
const { createRun } = require('./run');
const { loadAgent } = require('~/models/Agent');
const { getMCPManager } = require('~/config');
/**
* @param {ServerRequest} req
@@ -58,12 +61,8 @@ const legacyContentEndpoints = new Set([KnownEndpoints.groq, KnownEndpoints.deep
const noSystemModelRegex = [/\b(o1-preview|o1-mini|amazon\.titan-text)\b/gi];
// const { processMemory, memoryInstructions } = require('~/server/services/Endpoints/agents/memory');
// const { getFormattedMemories } = require('~/models/Memory');
// const { getCurrentDateTime } = require('~/utils');
function createTokenCounter(encoding) {
return (message) => {
return function (message) {
const countTokens = (text) => Tokenizer.getTokenCount(text, encoding);
return getTokenCountForMessage(message, countTokens);
};
@@ -124,6 +123,8 @@ class AgentClient extends BaseClient {
this.usage;
/** @type {Record<string, number>} */
this.indexTokenCountMap = {};
/** @type {(messages: BaseMessage[]) => Promise<void>} */
this.processMemory;
}
/**
@@ -138,55 +139,10 @@
}
/**
*
* Checks if the model is a vision model based on request attachments and sets the appropriate options:
* - Sets `this.modelOptions.model` to `gpt-4-vision-preview` if the request is a vision request.
* - Sets `this.isVisionModel` to `true` if vision request.
* - Deletes `this.modelOptions.stop` if vision request.
* `AgentClient` is not opinionated about vision requests, so we don't do anything here
* @param {MongoFile[]} attachments
*/
checkVisionRequest(attachments) {
// if (!attachments) {
// return;
// }
// const availableModels = this.options.modelsConfig?.[this.options.endpoint];
// if (!availableModels) {
// return;
// }
// let visionRequestDetected = false;
// for (const file of attachments) {
// if (file?.type?.includes('image')) {
// visionRequestDetected = true;
// break;
// }
// }
// if (!visionRequestDetected) {
// return;
// }
// this.isVisionModel = validateVisionModel({ model: this.modelOptions.model, availableModels });
// if (this.isVisionModel) {
// delete this.modelOptions.stop;
// return;
// }
// for (const model of availableModels) {
// if (!validateVisionModel({ model, availableModels })) {
// continue;
// }
// this.modelOptions.model = model;
// this.isVisionModel = true;
// delete this.modelOptions.stop;
// return;
// }
// if (!availableModels.includes(this.defaultVisionModel)) {
// return;
// }
// if (!validateVisionModel({ model: this.defaultVisionModel, availableModels })) {
// return;
// }
// this.modelOptions.model = this.defaultVisionModel;
// this.isVisionModel = true;
// delete this.modelOptions.stop;
}
checkVisionRequest() {}
getSaveOptions() {
// TODO:
@@ -270,24 +226,6 @@
.filter(Boolean)
.join('\n')
.trim();
// this.systemMessage = getCurrentDateTime();
// const { withKeys, withoutKeys } = await getFormattedMemories({
// userId: this.options.req.user.id,
// });
// processMemory({
// userId: this.options.req.user.id,
// message: this.options.req.body.text,
// parentMessageId,
// memory: withKeys,
// thread_id: this.conversationId,
// }).catch((error) => {
// logger.error('Memory Agent failed to process memory', error);
// });
// this.systemMessage += '\n\n' + memoryInstructions;
// if (withoutKeys) {
// this.systemMessage += `\n\n# Existing memory about the user:\n${withoutKeys}`;
// }
if (this.options.attachments) {
const attachments = await this.options.attachments;
@@ -431,9 +369,150 @@
opts.getReqData({ promptTokens });
}
const withoutKeys = await this.useMemory();
if (withoutKeys) {
systemContent += `${memoryInstructions}\n\n# Existing memory about the user:\n${withoutKeys}`;
}
if (systemContent) {
this.options.agent.instructions = systemContent;
}
return result;
}
/**
* @returns {Promise<string | undefined>}
*/
async useMemory() {
const user = this.options.req.user;
if (user.personalization?.memories === false) {
return;
}
const hasAccess = await checkAccess(user, PermissionTypes.MEMORIES, [Permissions.USE]);
if (!hasAccess) {
logger.debug(
`[api/server/controllers/agents/client.js #useMemory] User ${user.id} does not have USE permission for memories`,
);
return;
}
/** @type {TCustomConfig['memory']} */
const memoryConfig = this.options.req?.app?.locals?.memory;
if (!memoryConfig || memoryConfig.disabled === true) {
return;
}
/** @type {Agent} */
let prelimAgent;
const allowedProviders = new Set(
this.options.req?.app?.locals?.[EModelEndpoint.agents]?.allowedProviders,
);
try {
if (memoryConfig.agent?.id != null && memoryConfig.agent.id !== this.options.agent.id) {
prelimAgent = await loadAgent({
req: this.options.req,
agent_id: memoryConfig.agent.id,
endpoint: EModelEndpoint.agents,
});
} else if (
memoryConfig.agent?.id == null &&
memoryConfig.agent?.model != null &&
memoryConfig.agent?.provider != null
) {
prelimAgent = { id: Constants.EPHEMERAL_AGENT_ID, ...memoryConfig.agent };
}
} catch (error) {
logger.error(
'[api/server/controllers/agents/client.js #useMemory] Error loading agent for memory',
error,
);
}
const agent = await initializeAgent({
req: this.options.req,
res: this.options.res,
agent: prelimAgent,
allowedProviders,
});
if (!agent) {
logger.warn(
'[api/server/controllers/agents/client.js #useMemory] No agent found for memory',
memoryConfig,
);
return;
}
const llmConfig = Object.assign(
{
provider: agent.provider,
model: agent.model,
},
agent.model_parameters,
);
/** @type {import('@librechat/api').MemoryConfig} */
const config = {
validKeys: memoryConfig.validKeys,
instructions: agent.instructions,
llmConfig,
tokenLimit: memoryConfig.tokenLimit,
};
const userId = this.options.req.user.id + '';
const messageId = this.responseMessageId + '';
const conversationId = this.conversationId + '';
const [withoutKeys, processMemory] = await createMemoryProcessor({
userId,
config,
messageId,
conversationId,
memoryMethods: {
setMemory,
deleteMemory,
getFormattedMemories,
},
res: this.options.res,
});
this.processMemory = processMemory;
return withoutKeys;
}
/**
* @param {BaseMessage[]} messages
* @returns {Promise<void | (TAttachment | null)[]>}
*/
async runMemory(messages) {
try {
if (this.processMemory == null) {
return;
}
/** @type {TCustomConfig['memory']} */
const memoryConfig = this.options.req?.app?.locals?.memory;
const messageWindowSize = memoryConfig?.messageWindowSize ?? 5;
let messagesToProcess = [...messages];
if (messages.length > messageWindowSize) {
for (let i = messages.length - messageWindowSize; i >= 0; i--) {
const potentialWindow = messages.slice(i, i + messageWindowSize);
if (potentialWindow[0]?.role === 'user') {
messagesToProcess = [...potentialWindow];
break;
}
}
if (messagesToProcess.length === messages.length) {
messagesToProcess = [...messages.slice(-messageWindowSize)];
}
}
return await this.processMemory(messagesToProcess);
} catch (error) {
logger.error('Memory Agent failed to process memory', error);
}
}
/** @type {sendCompletion} */
async sendCompletion(payload, opts = {}) {
await this.chatCompletion({
@@ -576,100 +655,13 @@
let config;
/** @type {ReturnType<createRun>} */
let run;
/** @type {Promise<(TAttachment | null)[] | undefined>} */
let memoryPromise;
try {
if (!abortController) {
abortController = new AbortController();
}
// if (this.options.headers) {
// opts.defaultHeaders = { ...opts.defaultHeaders, ...this.options.headers };
// }
// if (this.options.proxy) {
// opts.httpAgent = new HttpsProxyAgent(this.options.proxy);
// }
// if (this.isVisionModel) {
// modelOptions.max_tokens = 4000;
// }
// /** @type {TAzureConfig | undefined} */
// const azureConfig = this.options?.req?.app?.locals?.[EModelEndpoint.azureOpenAI];
// if (
// (this.azure && this.isVisionModel && azureConfig) ||
// (azureConfig && this.isVisionModel && this.options.endpoint === EModelEndpoint.azureOpenAI)
// ) {
// const { modelGroupMap, groupMap } = azureConfig;
// const {
// azureOptions,
// baseURL,
// headers = {},
// serverless,
// } = mapModelToAzureConfig({
// modelName: modelOptions.model,
// modelGroupMap,
// groupMap,
// });
// opts.defaultHeaders = resolveHeaders(headers);
// this.langchainProxy = extractBaseURL(baseURL);
// this.apiKey = azureOptions.azureOpenAIApiKey;
// const groupName = modelGroupMap[modelOptions.model].group;
// this.options.addParams = azureConfig.groupMap[groupName].addParams;
// this.options.dropParams = azureConfig.groupMap[groupName].dropParams;
// // Note: `forcePrompt` not re-assigned as only chat models are vision models
// this.azure = !serverless && azureOptions;
// this.azureEndpoint =
// !serverless && genAzureChatCompletion(this.azure, modelOptions.model, this);
// }
// if (this.azure || this.options.azure) {
// /* Azure Bug, extremely short default `max_tokens` response */
// if (!modelOptions.max_tokens && modelOptions.model === 'gpt-4-vision-preview') {
// modelOptions.max_tokens = 4000;
// }
// /* Azure does not accept `model` in the body, so we need to remove it. */
// delete modelOptions.model;
// opts.baseURL = this.langchainProxy
// ? constructAzureURL({
// baseURL: this.langchainProxy,
// azureOptions: this.azure,
// })
// : this.azureEndpoint.split(/(?<!\/)\/(chat|completion)\//)[0];
// opts.defaultQuery = { 'api-version': this.azure.azureOpenAIApiVersion };
// opts.defaultHeaders = { ...opts.defaultHeaders, 'api-key': this.apiKey };
// }
// if (process.env.OPENAI_ORGANIZATION) {
// opts.organization = process.env.OPENAI_ORGANIZATION;
// }
// if (this.options.addParams && typeof this.options.addParams === 'object') {
// modelOptions = {
// ...modelOptions,
// ...this.options.addParams,
// };
// logger.debug('[api/server/controllers/agents/client.js #chatCompletion] added params', {
// addParams: this.options.addParams,
// modelOptions,
// });
// }
// if (this.options.dropParams && Array.isArray(this.options.dropParams)) {
// this.options.dropParams.forEach((param) => {
// delete modelOptions[param];
// });
// logger.debug('[api/server/controllers/agents/client.js #chatCompletion] dropped params', {
// dropParams: this.options.dropParams,
// modelOptions,
// });
// }
/** @type {TCustomConfig['endpoints']['agents']} */
const agentsEConfig = this.options.req.app.locals[EModelEndpoint.agents];
@@ -766,6 +758,10 @@
messages = addCacheControl(messages);
}
if (i === 0) {
memoryPromise = this.runMemory(messages);
}
run = await createRun({
agent,
req: this.options.req,
@@ -801,10 +797,9 @@
run.Graph.contentData = contentData;
}
const encoding = this.getEncoding();
await run.processStream({ messages }, config, {
keepContent: i !== 0,
tokenCounter: createTokenCounter(encoding),
tokenCounter: createTokenCounter(this.getEncoding()),
indexTokenCountMap: currentIndexCountMap,
maxContextTokens: agent.maxContextTokens,
callbacks: {
@@ -919,6 +914,12 @@
});
try {
if (memoryPromise) {
const attachments = await memoryPromise;
if (attachments && attachments.length > 0) {
this.artifactPromises.push(...attachments);
}
}
await this.recordCollectedUsage({ context: 'message' });
} catch (err) {
logger.error(
@@ -927,6 +928,12 @@
);
}
} catch (err) {
if (memoryPromise) {
const attachments = await memoryPromise;
if (attachments && attachments.length > 0) {
this.artifactPromises.push(...attachments);
}
}
logger.error(
'[api/server/controllers/agents/client.js #sendCompletion] Operation aborted',
err,

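The `runMemory` window selection above can be extracted as a standalone helper. This is a simplified sketch, not the PR's code verbatim: it assumes messages expose a `role` field, and the real method reads the window size from the memory config (`messageWindowSize`, default 5).

```javascript
// Prefer the most recent window of `size` messages that begins with a
// user message; otherwise fall back to the last `size` messages.
function selectMessageWindow(messages, size = 5) {
  if (messages.length <= size) {
    return [...messages];
  }
  for (let i = messages.length - size; i >= 0; i--) {
    const window = messages.slice(i, i + size);
    if (window[0]?.role === 'user') {
      return window;
    }
  }
  return messages.slice(-size);
}
```

Anchoring the window on a user message keeps the memory agent's context aligned with a user turn rather than starting mid-reply.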

@@ -1,94 +0,0 @@
const { Run, Providers } = require('@librechat/agents');
const { providerEndpointMap, KnownEndpoints } = require('librechat-data-provider');
/**
* @typedef {import('@librechat/agents').t} t
* @typedef {import('@librechat/agents').StandardGraphConfig} StandardGraphConfig
* @typedef {import('@librechat/agents').StreamEventData} StreamEventData
* @typedef {import('@librechat/agents').EventHandler} EventHandler
* @typedef {import('@librechat/agents').GraphEvents} GraphEvents
* @typedef {import('@librechat/agents').LLMConfig} LLMConfig
* @typedef {import('@librechat/agents').IState} IState
*/
const customProviders = new Set([
Providers.XAI,
Providers.OLLAMA,
Providers.DEEPSEEK,
Providers.OPENROUTER,
]);
/**
* Creates a new Run instance with custom handlers and configuration.
*
* @param {Object} options - The options for creating the Run instance.
* @param {ServerRequest} [options.req] - The server request.
* @param {string | undefined} [options.runId] - Optional run ID; otherwise, a new run ID will be generated.
* @param {Agent} options.agent - The agent for this run.
* @param {AbortSignal} options.signal - The signal for this run.
* @param {Record<GraphEvents, EventHandler> | undefined} [options.customHandlers] - Custom event handlers.
* @param {boolean} [options.streaming=true] - Whether to use streaming.
* @param {boolean} [options.streamUsage=true] - Whether to stream usage information.
* @returns {Promise<Run<IState>>} A promise that resolves to a new Run instance.
*/
async function createRun({
runId,
agent,
signal,
customHandlers,
streaming = true,
streamUsage = true,
}) {
const provider = providerEndpointMap[agent.provider] ?? agent.provider;
/** @type {LLMConfig} */
const llmConfig = Object.assign(
{
provider,
streaming,
streamUsage,
},
agent.model_parameters,
);
/** Resolves issues with new OpenAI usage field */
if (
customProviders.has(agent.provider) ||
(agent.provider === Providers.OPENAI && agent.endpoint !== agent.provider)
) {
llmConfig.streamUsage = false;
llmConfig.usage = true;
}
/** @type {'reasoning_content' | 'reasoning'} */
let reasoningKey;
if (
llmConfig.configuration?.baseURL?.includes(KnownEndpoints.openrouter) ||
(agent.endpoint && agent.endpoint.toLowerCase().includes(KnownEndpoints.openrouter))
) {
reasoningKey = 'reasoning';
}
/** @type {StandardGraphConfig} */
const graphConfig = {
signal,
llmConfig,
reasoningKey,
tools: agent.tools,
instructions: agent.instructions,
additional_instructions: agent.additional_instructions,
// toolEnd: agent.end_after_tools,
};
// TEMPORARY FOR TESTING
if (agent.provider === Providers.ANTHROPIC || agent.provider === Providers.BEDROCK) {
graphConfig.streamBuffer = 2000;
}
return Run.create({
runId,
graphConfig,
customHandlers,
});
}
module.exports = { createRun };

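The stream-usage workaround in `createRun` can be isolated as below. This is a sketch under the assumption stated in the source comment (certain providers misreport the usage field when streaming); the PR mutates `llmConfig` in place, whereas this version returns a copy, and the string `'openai'` stands in for `Providers.OPENAI`.

```javascript
// Disable streamed usage reporting and request the plain usage field for
// providers known to misreport usage in streaming mode, including
// OpenAI-compatible custom endpoints (endpoint differs from provider).
function adjustUsageOptions(llmConfig, provider, endpoint, customProviders) {
  if (customProviders.has(provider) || (provider === 'openai' && endpoint !== provider)) {
    return { ...llmConfig, streamUsage: false, usage: true };
  }
  return llmConfig;
}
```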

@@ -117,7 +117,7 @@ const startServer = async () => {
app.use('/api/agents', routes.agents);
app.use('/api/banner', routes.banner);
app.use('/api/bedrock', routes.bedrock);
app.use('/api/memories', routes.memories);
app.use('/api/tags', routes.tags);
app.use((req, res) => {


@@ -1,5 +1,5 @@
const checkAdmin = require('./checkAdmin');
const { checkAccess, generateCheckAccess } = require('./generateCheckAccess');
const checkAdmin = require('./admin');
const { checkAccess, generateCheckAccess } = require('./access');
module.exports = {
checkAdmin,


@@ -2,8 +2,8 @@ const fs = require('fs');
const path = require('path');
const crypto = require('crypto');
const multer = require('multer');
const { sanitizeFilename } = require('@librechat/api');
const { fileConfig: defaultFileConfig, mergeFileConfig } = require('librechat-data-provider');
const { sanitizeFilename } = require('~/server/utils/handleText');
const { getCustomConfig } = require('~/server/services/Config');
const storage = multer.diskStorage({


@@ -4,6 +4,7 @@ const tokenizer = require('./tokenizer');
const endpoints = require('./endpoints');
const staticRoute = require('./static');
const messages = require('./messages');
const memories = require('./memories');
const presets = require('./presets');
const prompts = require('./prompts');
const balance = require('./balance');
@@ -51,6 +52,7 @@ module.exports = {
presets,
balance,
messages,
memories,
endpoints,
tokenizer,
assistants,


@@ -0,0 +1,231 @@
const express = require('express');
const { Tokenizer } = require('@librechat/api');
const { PermissionTypes, Permissions } = require('librechat-data-provider');
const {
getAllUserMemories,
toggleUserMemories,
createMemory,
setMemory,
deleteMemory,
} = require('~/models');
const { requireJwtAuth, generateCheckAccess } = require('~/server/middleware');
const router = express.Router();
const checkMemoryRead = generateCheckAccess(PermissionTypes.MEMORIES, [
Permissions.USE,
Permissions.READ,
]);
const checkMemoryCreate = generateCheckAccess(PermissionTypes.MEMORIES, [
Permissions.USE,
Permissions.CREATE,
]);
const checkMemoryUpdate = generateCheckAccess(PermissionTypes.MEMORIES, [
Permissions.USE,
Permissions.UPDATE,
]);
const checkMemoryDelete = generateCheckAccess(PermissionTypes.MEMORIES, [
Permissions.USE,
Permissions.UPDATE,
]);
const checkMemoryOptOut = generateCheckAccess(PermissionTypes.MEMORIES, [
Permissions.USE,
Permissions.OPT_OUT,
]);
router.use(requireJwtAuth);
/**
* GET /memories
* Returns all memories for the authenticated user, sorted by updated_at (newest first).
* Also includes memory usage percentage based on token limit.
*/
router.get('/', checkMemoryRead, async (req, res) => {
try {
const memories = await getAllUserMemories(req.user.id);
const sortedMemories = memories.sort(
(a, b) => new Date(b.updated_at).getTime() - new Date(a.updated_at).getTime(),
);
const totalTokens = memories.reduce((sum, memory) => {
return sum + (memory.tokenCount || 0);
}, 0);
const memoryConfig = req.app.locals?.memory;
const tokenLimit = memoryConfig?.tokenLimit;
let usagePercentage = null;
if (tokenLimit && tokenLimit > 0) {
usagePercentage = Math.min(100, Math.round((totalTokens / tokenLimit) * 100));
}
res.json({
memories: sortedMemories,
totalTokens,
tokenLimit: tokenLimit || null,
usagePercentage,
});
} catch (error) {
res.status(500).json({ error: error.message });
}
});
/**
* POST /memories
* Creates a new memory entry for the authenticated user.
* Body: { key: string, value: string }
* Returns 201 and { created: true, memory: <createdDoc> } when successful.
*/
router.post('/', checkMemoryCreate, async (req, res) => {
const { key, value } = req.body;
if (typeof key !== 'string' || key.trim() === '') {
return res.status(400).json({ error: 'Key is required and must be a non-empty string.' });
}
if (typeof value !== 'string' || value.trim() === '') {
return res.status(400).json({ error: 'Value is required and must be a non-empty string.' });
}
try {
const tokenCount = Tokenizer.getTokenCount(value, 'o200k_base');
const memories = await getAllUserMemories(req.user.id);
// Check token limit
const memoryConfig = req.app.locals?.memory;
const tokenLimit = memoryConfig?.tokenLimit;
if (tokenLimit) {
const currentTotalTokens = memories.reduce(
(sum, memory) => sum + (memory.tokenCount || 0),
0,
);
if (currentTotalTokens + tokenCount > tokenLimit) {
return res.status(400).json({
error: `Adding this memory would exceed the token limit of ${tokenLimit}. Current usage: ${currentTotalTokens} tokens.`,
});
}
}
const result = await createMemory({
userId: req.user.id,
key: key.trim(),
value: value.trim(),
tokenCount,
});
if (!result.ok) {
return res.status(500).json({ error: 'Failed to create memory.' });
}
const updatedMemories = await getAllUserMemories(req.user.id);
const newMemory = updatedMemories.find((m) => m.key === key.trim());
res.status(201).json({ created: true, memory: newMemory });
} catch (error) {
if (error.message && error.message.includes('already exists')) {
return res.status(409).json({ error: 'Memory with this key already exists.' });
}
res.status(500).json({ error: error.message });
}
});
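The pre-create limit check in the POST handler above reduces to a small predicate. A sketch for clarity only; field names mirror the route's shape.

```javascript
// Reject a new memory when existing token usage plus the new entry's
// tokens would exceed the configured limit; no limit means no rejection.
function wouldExceedLimit(memories, newTokenCount, tokenLimit) {
  if (!tokenLimit) {
    return false;
  }
  const currentTotalTokens = memories.reduce((sum, m) => sum + (m.tokenCount || 0), 0);
  return currentTotalTokens + newTokenCount > tokenLimit;
}
```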
/**
* PATCH /memories/preferences
* Updates the user's memory preferences (e.g., enabling/disabling memories).
* Body: { memories: boolean }
* Returns 200 and { updated: true, preferences: { memories: boolean } } when successful.
*/
router.patch('/preferences', checkMemoryOptOut, async (req, res) => {
const { memories } = req.body;
if (typeof memories !== 'boolean') {
return res.status(400).json({ error: 'memories must be a boolean value.' });
}
try {
const updatedUser = await toggleUserMemories(req.user.id, memories);
if (!updatedUser) {
return res.status(404).json({ error: 'User not found.' });
}
res.json({
updated: true,
preferences: {
memories: updatedUser.personalization?.memories ?? true,
},
});
} catch (error) {
res.status(500).json({ error: error.message });
}
});
/**
* PATCH /memories/:key
* Updates the value of an existing memory entry for the authenticated user.
* Body: { value: string }
* Returns 200 and { updated: true, memory: <updatedDoc> } when successful.
*/
router.patch('/:key', checkMemoryUpdate, async (req, res) => {
const { key } = req.params;
const { value } = req.body || {};
if (typeof value !== 'string' || value.trim() === '') {
return res.status(400).json({ error: 'Value is required and must be a non-empty string.' });
}
try {
const tokenCount = Tokenizer.getTokenCount(value, 'o200k_base');
const memories = await getAllUserMemories(req.user.id);
const existingMemory = memories.find((m) => m.key === key);
if (!existingMemory) {
return res.status(404).json({ error: 'Memory not found.' });
}
const result = await setMemory({
userId: req.user.id,
key,
value,
tokenCount,
});
if (!result.ok) {
return res.status(500).json({ error: 'Failed to update memory.' });
}
const updatedMemories = await getAllUserMemories(req.user.id);
const updatedMemory = updatedMemories.find((m) => m.key === key);
res.json({ updated: true, memory: updatedMemory });
} catch (error) {
res.status(500).json({ error: error.message });
}
});
/**
* DELETE /memories/:key
* Deletes a memory entry for the authenticated user.
* Returns 200 and { deleted: true } when successful.
*/
router.delete('/:key', checkMemoryDelete, async (req, res) => {
const { key } = req.params;
try {
const result = await deleteMemory({ userId: req.user.id, key });
if (!result.ok) {
return res.status(404).json({ error: 'Memory not found.' });
}
res.json({ deleted: true });
} catch (error) {
res.status(500).json({ error: error.message });
}
});
module.exports = router;

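The usage figures returned by GET /memories follow the arithmetic below, extracted into a standalone sketch of the handler's computation.

```javascript
// Sum per-memory token counts and express usage as a whole-number
// percentage of the configured limit, capped at 100; without a limit,
// usagePercentage is null.
function computeMemoryUsage(memories, tokenLimit) {
  const totalTokens = memories.reduce((sum, m) => sum + (m.tokenCount || 0), 0);
  let usagePercentage = null;
  if (tokenLimit && tokenLimit > 0) {
    usagePercentage = Math.min(100, Math.round((totalTokens / tokenLimit) * 100));
  }
  return { totalTokens, tokenLimit: tokenLimit || null, usagePercentage };
}
```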

@@ -1,6 +1,7 @@
const express = require('express');
const {
promptPermissionsSchema,
memoryPermissionsSchema,
agentPermissionsSchema,
PermissionTypes,
roleDefaults,
@@ -118,4 +119,43 @@ router.put('/:roleName/agents', checkAdmin, async (req, res) => {
}
});
/**
* PUT /api/roles/:roleName/memories
* Update memory permissions for a specific role
*/
router.put('/:roleName/memories', checkAdmin, async (req, res) => {
const { roleName: _r } = req.params;
// TODO: temporary; use more robust parsing for roleName
const roleName = _r.toUpperCase();
/** @type {TRole['permissions']['MEMORIES']} */
const updates = req.body;
try {
const parsedUpdates = memoryPermissionsSchema.partial().parse(updates);
const role = await getRoleByName(roleName);
if (!role) {
return res.status(404).send({ message: 'Role not found' });
}
const currentPermissions =
role.permissions?.[PermissionTypes.MEMORIES] || role[PermissionTypes.MEMORIES] || {};
const mergedUpdates = {
permissions: {
...role.permissions,
[PermissionTypes.MEMORIES]: {
...currentPermissions,
...parsedUpdates,
},
},
};
const updatedRole = await updateRoleByName(roleName, mergedUpdates);
res.status(200).send(updatedRole);
} catch (error) {
return res.status(400).send({ message: 'Invalid memory permissions.', error: error.errors });
}
});
module.exports = router;

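The permission merge performed by PUT /:roleName/memories amounts to the sketch below. It is simplified: the literal key `MEMORIES` stands in for `PermissionTypes.MEMORIES`, and the legacy `role[PermissionTypes.MEMORIES]` fallback from the handler is omitted.

```javascript
// Layer partial permission updates over the role's current MEMORIES
// permissions, leaving other permission types untouched.
function mergeMemoryPermissions(role, parsedUpdates) {
  const currentPermissions = role.permissions?.MEMORIES || {};
  return {
    permissions: {
      ...role.permissions,
      MEMORIES: { ...currentPermissions, ...parsedUpdates },
    },
  };
}
```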

@@ -1,6 +1,8 @@
const jwt = require('jsonwebtoken');
const { nanoid } = require('nanoid');
const { sendEvent } = require('@librechat/api');
const { tool } = require('@langchain/core/tools');
const { logger } = require('@librechat/data-schemas');
const { GraphEvents, sleep } = require('@librechat/agents');
const {
Time,
@@ -13,10 +15,10 @@
actionDomainSeparator,
} = require('librechat-data-provider');
const { refreshAccessToken } = require('~/server/services/TokenService');
const { logger, getFlowStateManager, sendEvent } = require('~/config');
const { encryptV2, decryptV2 } = require('~/server/utils/crypto');
const { getActions, deleteActions } = require('~/models/Action');
const { deleteAssistant } = require('~/models/Assistant');
const { getFlowStateManager } = require('~/config');
const { logAxiosError } = require('~/utils');
const { getLogStores } = require('~/cache');
const { findToken } = require('~/models');


@@ -3,6 +3,7 @@ const {
loadOCRConfig,
processMCPEnv,
EModelEndpoint,
loadMemoryConfig,
getConfigDefaults,
loadWebSearchConfig,
} = require('librechat-data-provider');
@@ -44,6 +45,7 @@ const AppService = async (app) => {
const ocr = loadOCRConfig(config.ocr);
const webSearch = loadWebSearchConfig(config.webSearch);
checkWebSearchConfig(webSearch);
const memory = loadMemoryConfig(config.memory);
const filteredTools = config.filteredTools;
const includedTools = config.includedTools;
const fileStrategy = config.fileStrategy ?? configDefaults.fileStrategy;
@@ -88,6 +90,7 @@ const AppService = async (app) => {
const defaultLocals = {
ocr,
paths,
memory,
webSearch,
fileStrategy,
socialLogins,


@@ -0,0 +1,196 @@
const { Providers } = require('@librechat/agents');
const { primeResources, optionalChainWithEmptyCheck } = require('@librechat/api');
const {
ErrorTypes,
EModelEndpoint,
EToolResources,
replaceSpecialVars,
providerEndpointMap,
} = require('librechat-data-provider');
const initAnthropic = require('~/server/services/Endpoints/anthropic/initialize');
const getBedrockOptions = require('~/server/services/Endpoints/bedrock/options');
const initOpenAI = require('~/server/services/Endpoints/openAI/initialize');
const initCustom = require('~/server/services/Endpoints/custom/initialize');
const initGoogle = require('~/server/services/Endpoints/google/initialize');
const generateArtifactsPrompt = require('~/app/clients/prompts/artifacts');
const { getCustomEndpointConfig } = require('~/server/services/Config');
const { processFiles } = require('~/server/services/Files/process');
const { getConvoFiles } = require('~/models/Conversation');
const { getToolFilesByIds } = require('~/models/File');
const { getModelMaxTokens } = require('~/utils');
const { getFiles } = require('~/models/File');
const providerConfigMap = {
[Providers.XAI]: initCustom,
[Providers.OLLAMA]: initCustom,
[Providers.DEEPSEEK]: initCustom,
[Providers.OPENROUTER]: initCustom,
[EModelEndpoint.openAI]: initOpenAI,
[EModelEndpoint.google]: initGoogle,
[EModelEndpoint.azureOpenAI]: initOpenAI,
[EModelEndpoint.anthropic]: initAnthropic,
[EModelEndpoint.bedrock]: getBedrockOptions,
};
/**
* @param {object} params
* @param {ServerRequest} params.req
* @param {ServerResponse} params.res
* @param {Agent} params.agent
* @param {string | null} [params.conversationId]
* @param {Array<IMongoFile>} [params.requestFiles]
* @param {typeof import('~/server/services/ToolService').loadAgentTools | undefined} [params.loadTools]
* @param {TEndpointOption} [params.endpointOption]
* @param {Set<string>} [params.allowedProviders]
* @param {boolean} [params.isInitialAgent]
* @returns {Promise<Agent & { tools: StructuredTool[], attachments: Array<MongoFile>, toolContextMap: Record<string, unknown>, maxContextTokens: number }>}
*/
const initializeAgent = async ({
req,
res,
agent,
loadTools,
requestFiles,
conversationId,
endpointOption,
allowedProviders,
isInitialAgent = false,
}) => {
if (allowedProviders.size > 0 && !allowedProviders.has(agent.provider)) {
throw new Error(
`{ "type": "${ErrorTypes.INVALID_AGENT_PROVIDER}", "info": "${agent.provider}" }`,
);
}
let currentFiles;
if (
isInitialAgent &&
conversationId != null &&
(agent.model_parameters?.resendFiles ?? true) === true
) {
const fileIds = (await getConvoFiles(conversationId)) ?? [];
/** @type {Set<EToolResources>} */
const toolResourceSet = new Set();
for (const tool of agent.tools) {
if (EToolResources[tool]) {
toolResourceSet.add(EToolResources[tool]);
}
}
const toolFiles = await getToolFilesByIds(fileIds, toolResourceSet);
if (requestFiles.length || toolFiles.length) {
currentFiles = await processFiles(requestFiles.concat(toolFiles));
}
} else if (isInitialAgent && requestFiles.length) {
currentFiles = await processFiles(requestFiles);
}
const { attachments, tool_resources } = await primeResources({
req,
getFiles,
attachments: currentFiles,
tool_resources: agent.tool_resources,
requestFileSet: new Set(requestFiles?.map((file) => file.file_id)),
});
const provider = agent.provider;
const { tools, toolContextMap } =
(await loadTools?.({
req,
res,
provider,
agentId: agent.id,
tools: agent.tools,
model: agent.model,
tool_resources,
})) ?? {};
agent.endpoint = provider;
let getOptions = providerConfigMap[provider];
if (!getOptions && providerConfigMap[provider.toLowerCase()] != null) {
agent.provider = provider.toLowerCase();
getOptions = providerConfigMap[agent.provider];
} else if (!getOptions) {
const customEndpointConfig = await getCustomEndpointConfig(provider);
if (!customEndpointConfig) {
throw new Error(`Provider ${provider} not supported`);
}
getOptions = initCustom;
agent.provider = Providers.OPENAI;
}
const model_parameters = Object.assign(
{},
agent.model_parameters ?? { model: agent.model },
isInitialAgent === true ? endpointOption?.model_parameters : {},
);
const _endpointOption =
isInitialAgent === true
? Object.assign({}, endpointOption, { model_parameters })
: { model_parameters };
const options = await getOptions({
req,
res,
optionsOnly: true,
overrideEndpoint: provider,
overrideModel: agent.model,
endpointOption: _endpointOption,
});
if (
agent.endpoint === EModelEndpoint.azureOpenAI &&
options.llmConfig?.azureOpenAIApiInstanceName == null
) {
agent.provider = Providers.OPENAI;
}
if (options.provider != null) {
agent.provider = options.provider;
}
/** @type {import('@librechat/agents').ClientOptions} */
agent.model_parameters = Object.assign(model_parameters, options.llmConfig);
if (options.configOptions) {
agent.model_parameters.configuration = options.configOptions;
}
if (!agent.model_parameters.model) {
agent.model_parameters.model = agent.model;
}
if (agent.instructions && agent.instructions !== '') {
agent.instructions = replaceSpecialVars({
text: agent.instructions,
user: req.user,
});
}
if (typeof agent.artifacts === 'string' && agent.artifacts !== '') {
agent.additional_instructions = generateArtifactsPrompt({
endpoint: agent.provider,
artifacts: agent.artifacts,
});
}
const tokensModel =
agent.provider === EModelEndpoint.azureOpenAI ? agent.model : agent.model_parameters.model;
const maxTokens = optionalChainWithEmptyCheck(
agent.model_parameters.maxOutputTokens,
agent.model_parameters.maxTokens,
0,
);
const maxContextTokens = optionalChainWithEmptyCheck(
agent.model_parameters.maxContextTokens,
agent.max_context_tokens,
getModelMaxTokens(tokensModel, providerEndpointMap[provider]),
4096,
);
return {
...agent,
tools,
attachments,
toolContextMap,
maxContextTokens: (maxContextTokens - maxTokens) * 0.9,
};
};
module.exports = { initializeAgent };

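The provider resolution at the heart of `initializeAgent` follows the shape below. This is a sketch: the real code also consults `getCustomEndpointConfig` and throws for genuinely unsupported providers rather than falling back unconditionally, and `'openai'` stands in for `Providers.OPENAI`.

```javascript
// Resolve an initializer: exact provider match first, then a lowercase
// match, then treat the agent as an OpenAI-compatible custom endpoint.
function resolveProviderInit(provider, providerConfigMap, initCustom) {
  if (providerConfigMap[provider]) {
    return { provider, init: providerConfigMap[provider] };
  }
  const lowered = provider.toLowerCase();
  if (providerConfigMap[lowered]) {
    return { provider: lowered, init: providerConfigMap[lowered] };
  }
  return { provider: 'openai', init: initCustom };
}
```

The lowercase retry keeps user-entered provider names (e.g. `OpenAI`) from bypassing the config map.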

@@ -1,294 +1,41 @@
const { createContentAggregator, Providers } = require('@librechat/agents');
const {
Constants,
ErrorTypes,
EModelEndpoint,
EToolResources,
getResponseSender,
AgentCapabilities,
replaceSpecialVars,
providerEndpointMap,
} = require('librechat-data-provider');
const { logger } = require('@librechat/data-schemas');
const { createContentAggregator } = require('@librechat/agents');
const { Constants, EModelEndpoint, getResponseSender } = require('librechat-data-provider');
const {
getDefaultHandlers,
createToolEndCallback,
} = require('~/server/controllers/agents/callbacks');
const initAnthropic = require('~/server/services/Endpoints/anthropic/initialize');
const getBedrockOptions = require('~/server/services/Endpoints/bedrock/options');
const initOpenAI = require('~/server/services/Endpoints/openAI/initialize');
const initCustom = require('~/server/services/Endpoints/custom/initialize');
const initGoogle = require('~/server/services/Endpoints/google/initialize');
const generateArtifactsPrompt = require('~/app/clients/prompts/artifacts');
const { getCustomEndpointConfig } = require('~/server/services/Config');
const { processFiles } = require('~/server/services/Files/process');
const { initializeAgent } = require('~/server/services/Endpoints/agents/agent');
const { loadAgentTools } = require('~/server/services/ToolService');
const AgentClient = require('~/server/controllers/agents/client');
const { getConvoFiles } = require('~/models/Conversation');
const { getToolFilesByIds } = require('~/models/File');
const { getModelMaxTokens } = require('~/utils');
const { getAgent } = require('~/models/Agent');
const { getFiles } = require('~/models/File');
const { logger } = require('~/config');
const providerConfigMap = {
[Providers.XAI]: initCustom,
[Providers.OLLAMA]: initCustom,
[Providers.DEEPSEEK]: initCustom,
[Providers.OPENROUTER]: initCustom,
[EModelEndpoint.openAI]: initOpenAI,
[EModelEndpoint.google]: initGoogle,
[EModelEndpoint.azureOpenAI]: initOpenAI,
[EModelEndpoint.anthropic]: initAnthropic,
[EModelEndpoint.bedrock]: getBedrockOptions,
};
/**
* @param {Object} params
* @param {ServerRequest} params.req
* @param {Promise<Array<MongoFile | null>> | undefined} [params.attachments]
* @param {Set<string>} params.requestFileSet
* @param {AgentToolResources | undefined} [params.tool_resources]
* @returns {Promise<{ attachments: Array<MongoFile | undefined> | undefined, tool_resources: AgentToolResources | undefined }>}
*/
const primeResources = async ({
req,
attachments: _attachments,
tool_resources: _tool_resources,
requestFileSet,
}) => {
try {
/** @type {Array<MongoFile | undefined> | undefined} */
let attachments;
const tool_resources = _tool_resources ?? {};
const isOCREnabled = (req.app.locals?.[EModelEndpoint.agents]?.capabilities ?? []).includes(
AgentCapabilities.ocr,
);
if (tool_resources[EToolResources.ocr]?.file_ids && isOCREnabled) {
const context = await getFiles(
{
file_id: { $in: tool_resources.ocr.file_ids },
},
{},
{},
);
attachments = (attachments ?? []).concat(context);
}
if (!_attachments) {
return { attachments, tool_resources };
}
/** @type {Array<MongoFile | undefined> | undefined} */
const files = await _attachments;
if (!attachments) {
/** @type {Array<MongoFile | undefined>} */
attachments = [];
}
for (const file of files) {
if (!file) {
continue;
}
if (file.metadata?.fileIdentifier) {
const execute_code = tool_resources[EToolResources.execute_code] ?? {};
if (!execute_code.files) {
tool_resources[EToolResources.execute_code] = { ...execute_code, files: [] };
}
tool_resources[EToolResources.execute_code].files.push(file);
} else if (file.embedded === true) {
const file_search = tool_resources[EToolResources.file_search] ?? {};
if (!file_search.files) {
tool_resources[EToolResources.file_search] = { ...file_search, files: [] };
}
tool_resources[EToolResources.file_search].files.push(file);
} else if (
requestFileSet.has(file.file_id) &&
file.type.startsWith('image') &&
file.height &&
file.width
) {
const image_edit = tool_resources[EToolResources.image_edit] ?? {};
if (!image_edit.files) {
tool_resources[EToolResources.image_edit] = { ...image_edit, files: [] };
}
tool_resources[EToolResources.image_edit].files.push(file);
}
attachments.push(file);
}
return { attachments, tool_resources };
} catch (error) {
logger.error('Error priming resources', error);
return { attachments: _attachments, tool_resources: _tool_resources };
}
};
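`primeResources` above routes each attachment into at most one tool-resource bucket: code-execution files (a `fileIdentifier` in metadata), embedded files for file search, and request images for image editing. The decision order can be sketched as a pure function (a simplification; the real code also pushes the file into `tool_resources` and `attachments`):

```javascript
// Routes an uploaded file to the matching tool-resource bucket,
// mirroring the branching order in primeResources above.
function routeFile(file, requestFileSet) {
  if (file.metadata && file.metadata.fileIdentifier) {
    return 'execute_code';
  }
  if (file.embedded === true) {
    return 'file_search';
  }
  if (
    requestFileSet.has(file.file_id) &&
    file.type.startsWith('image') &&
    file.height &&
    file.width
  ) {
    return 'image_edit';
  }
  return null; // attached only, no tool resource
}
```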
/**
* @param {...string | number} values
* @returns {string | number | undefined}
*/
function optionalChainWithEmptyCheck(...values) {
for (const value of values) {
if (value !== undefined && value !== null && value !== '') {
return value;
}
}
return values[values.length - 1];
}
function createToolLoader() {
/**
* @param {object} params
* @param {ServerRequest} params.req
* @param {ServerResponse} params.res
* @param {Agent} params.agent
* @param {Set<string>} [params.allowedProviders]
* @param {object} [params.endpointOption]
* @param {boolean} [params.isInitialAgent]
* @returns {Promise<Agent>}
* @param {string} params.agentId
* @param {string[]} params.tools
* @param {string} params.provider
* @param {string} params.model
* @param {AgentToolResources} params.tool_resources
* @returns {Promise<{ tools: StructuredTool[], toolContextMap: Record<string, unknown> } | undefined>}
*/
const initializeAgentOptions = async ({
return async function loadTools({ req, res, agentId, tools, provider, model, tool_resources }) {
const agent = { id: agentId, tools, provider, model };
try {
return await loadAgentTools({
req,
res,
agent,
endpointOption,
allowedProviders,
isInitialAgent = false,
}) => {
if (allowedProviders.size > 0 && !allowedProviders.has(agent.provider)) {
throw new Error(
`{ "type": "${ErrorTypes.INVALID_AGENT_PROVIDER}", "info": "${agent.provider}" }`,
);
}
let currentFiles;
/** @type {Array<MongoFile>} */
const requestFiles = req.body.files ?? [];
if (
isInitialAgent &&
req.body.conversationId != null &&
(agent.model_parameters?.resendFiles ?? true) === true
) {
const fileIds = (await getConvoFiles(req.body.conversationId)) ?? [];
/** @type {Set<EToolResources>} */
const toolResourceSet = new Set();
for (const tool of agent.tools) {
if (EToolResources[tool]) {
toolResourceSet.add(EToolResources[tool]);
}
}
const toolFiles = await getToolFilesByIds(fileIds, toolResourceSet);
if (requestFiles.length || toolFiles.length) {
currentFiles = await processFiles(requestFiles.concat(toolFiles));
}
} else if (isInitialAgent && requestFiles.length) {
currentFiles = await processFiles(requestFiles);
}
const { attachments, tool_resources } = await primeResources({
req,
attachments: currentFiles,
tool_resources: agent.tool_resources,
requestFileSet: new Set(requestFiles.map((file) => file.file_id)),
});
const provider = agent.provider;
const { tools, toolContextMap } = await loadAgentTools({
req,
res,
agent: {
id: agent.id,
tools: agent.tools,
provider,
model: agent.model,
},
tool_resources,
});
agent.endpoint = provider;
let getOptions = providerConfigMap[provider];
if (!getOptions && providerConfigMap[provider.toLowerCase()] != null) {
agent.provider = provider.toLowerCase();
getOptions = providerConfigMap[agent.provider];
} else if (!getOptions) {
const customEndpointConfig = await getCustomEndpointConfig(provider);
if (!customEndpointConfig) {
throw new Error(`Provider ${provider} not supported`);
} catch (error) {
logger.error('Error loading tools for agent ' + agentId, error);
}
getOptions = initCustom;
agent.provider = Providers.OPENAI;
}
const model_parameters = Object.assign(
{},
agent.model_parameters ?? { model: agent.model },
isInitialAgent === true ? endpointOption?.model_parameters : {},
);
const _endpointOption =
isInitialAgent === true
? Object.assign({}, endpointOption, { model_parameters })
: { model_parameters };
const options = await getOptions({
req,
res,
optionsOnly: true,
overrideEndpoint: provider,
overrideModel: agent.model,
endpointOption: _endpointOption,
});
if (
agent.endpoint === EModelEndpoint.azureOpenAI &&
options.llmConfig?.azureOpenAIApiInstanceName == null
) {
agent.provider = Providers.OPENAI;
}
if (options.provider != null) {
agent.provider = options.provider;
}
/** @type {import('@librechat/agents').ClientOptions} */
agent.model_parameters = Object.assign(model_parameters, options.llmConfig);
if (options.configOptions) {
agent.model_parameters.configuration = options.configOptions;
}
if (!agent.model_parameters.model) {
agent.model_parameters.model = agent.model;
}
if (agent.instructions && agent.instructions !== '') {
agent.instructions = replaceSpecialVars({
text: agent.instructions,
user: req.user,
});
}
if (typeof agent.artifacts === 'string' && agent.artifacts !== '') {
agent.additional_instructions = generateArtifactsPrompt({
endpoint: agent.provider,
artifacts: agent.artifacts,
});
}
const tokensModel =
agent.provider === EModelEndpoint.azureOpenAI ? agent.model : agent.model_parameters.model;
const maxTokens = optionalChainWithEmptyCheck(
agent.model_parameters.maxOutputTokens,
agent.model_parameters.maxTokens,
0,
);
const maxContextTokens = optionalChainWithEmptyCheck(
agent.model_parameters.maxContextTokens,
agent.max_context_tokens,
getModelMaxTokens(tokensModel, providerEndpointMap[provider]),
4096,
);
return {
...agent,
tools,
attachments,
toolContextMap,
maxContextTokens: (maxContextTokens - maxTokens) * 0.9,
};
};
}
const initializeClient = async ({ req, res, endpointOption }) => {
if (!endpointOption) {
@@ -313,7 +60,6 @@ const initializeClient = async ({ req, res, endpointOption }) => {
throw new Error('No agent promise provided');
}
// Initialize primary agent
const primaryAgent = await endpointOption.agent;
if (!primaryAgent) {
throw new Error('Agent not found');
@@ -323,10 +69,18 @@ const initializeClient = async ({ req, res, endpointOption }) => {
/** @type {Set<string>} */
const allowedProviders = new Set(req?.app?.locals?.[EModelEndpoint.agents]?.allowedProviders);
// Handle primary agent
const primaryConfig = await initializeAgentOptions({
const loadTools = createToolLoader();
/** @type {Array<MongoFile>} */
const requestFiles = req.body.files ?? [];
/** @type {string} */
const conversationId = req.body.conversationId;
const primaryConfig = await initializeAgent({
req,
res,
loadTools,
requestFiles,
conversationId,
agent: primaryAgent,
endpointOption,
allowedProviders,
@@ -340,10 +94,13 @@ const initializeClient = async ({ req, res, endpointOption }) => {
if (!agent) {
throw new Error(`Agent ${agentId} not found`);
}
const config = await initializeAgentOptions({
const config = await initializeAgent({
req,
res,
agent,
loadTools,
requestFiles,
conversationId,
endpointOption,
allowedProviders,
});
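This refactor replaces the monolithic `initializeAgentOptions` with `initializeAgent` plus a `createToolLoader` factory: one loader closure is created per request and shared by the primary agent and any additional agents, and a tool-loading failure is logged per agent rather than aborting the request. A minimal sketch of that closure pattern, with `loadAgentTools` and `logger` injected for illustration (the real factory pulls them from module scope):

```javascript
// Factory returns a single loader closure that every agent in the
// request reuses; per-agent errors are logged, not thrown.
function createToolLoader(loadAgentTools, logger) {
  return async function loadTools({ agentId, tools, provider, model }) {
    try {
      return await loadAgentTools({ id: agentId, tools, provider, model });
    } catch (error) {
      logger.error('Error loading tools for agent ' + agentId, error);
    }
  };
}
```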


@@ -1,5 +1,6 @@
const OpenAI = require('openai');
const { HttpsProxyAgent } = require('https-proxy-agent');
const { constructAzureURL, isUserProvided } = require('@librechat/api');
const {
ErrorTypes,
EModelEndpoint,
@@ -12,8 +13,6 @@ const {
checkUserKeyExpiry,
} = require('~/server/services/UserService');
const OpenAIClient = require('~/app/clients/OpenAIClient');
const { isUserProvided } = require('~/server/utils');
const { constructAzureURL } = require('~/utils');
class Files {
constructor(client) {


@@ -1,4 +1,5 @@
const { HttpsProxyAgent } = require('https-proxy-agent');
const { createHandleLLMNewToken } = require('@librechat/api');
const {
AuthType,
Constants,
@@ -8,7 +9,6 @@ const {
removeNullishValues,
} = require('librechat-data-provider');
const { getUserKey, checkUserKeyExpiry } = require('~/server/services/UserService');
const { createHandleLLMNewToken } = require('~/app/clients/generators');
const getOptions = async ({ req, overrideModel, endpointOption }) => {
const {


@@ -6,10 +6,9 @@ const {
extractEnvVariable,
} = require('librechat-data-provider');
const { Providers } = require('@librechat/agents');
const { getOpenAIConfig, createHandleLLMNewToken } = require('@librechat/api');
const { getUserKeyValues, checkUserKeyExpiry } = require('~/server/services/UserService');
const { getLLMConfig } = require('~/server/services/Endpoints/openAI/llm');
const { getCustomEndpointConfig } = require('~/server/services/Config');
const { createHandleLLMNewToken } = require('~/app/clients/generators');
const { fetchModels } = require('~/server/services/ModelService');
const OpenAIClient = require('~/app/clients/OpenAIClient');
const { isUserProvided } = require('~/server/utils');
@@ -144,7 +143,7 @@ const initializeClient = async ({ req, res, endpointOption, optionsOnly, overrid
clientOptions,
);
clientOptions.modelOptions.user = req.user.id;
const options = getLLMConfig(apiKey, clientOptions, endpoint);
const options = getOpenAIConfig(apiKey, clientOptions, endpoint);
if (!customOptions.streamRate) {
return options;
}


@@ -1,11 +1,10 @@
const {
EModelEndpoint,
mapModelToAzureConfig,
resolveHeaders,
mapModelToAzureConfig,
} = require('librechat-data-provider');
const { isEnabled, isUserProvided, getAzureCredentials } = require('@librechat/api');
const { getUserKeyValues, checkUserKeyExpiry } = require('~/server/services/UserService');
const { isEnabled, isUserProvided } = require('~/server/utils');
const { getAzureCredentials } = require('~/utils');
const { PluginsClient } = require('~/app');
const initializeClient = async ({ req, res, endpointOption }) => {


@@ -4,12 +4,15 @@ const {
resolveHeaders,
mapModelToAzureConfig,
} = require('librechat-data-provider');
const {
isEnabled,
isUserProvided,
getOpenAIConfig,
getAzureCredentials,
createHandleLLMNewToken,
} = require('@librechat/api');
const { getUserKeyValues, checkUserKeyExpiry } = require('~/server/services/UserService');
const { getLLMConfig } = require('~/server/services/Endpoints/openAI/llm');
const { createHandleLLMNewToken } = require('~/app/clients/generators');
const { isEnabled, isUserProvided } = require('~/server/utils');
const OpenAIClient = require('~/app/clients/OpenAIClient');
const { getAzureCredentials } = require('~/utils');
const initializeClient = async ({
req,
@@ -140,7 +143,7 @@ const initializeClient = async ({
modelOptions.model = modelName;
clientOptions = Object.assign({ modelOptions }, clientOptions);
clientOptions.modelOptions.user = req.user.id;
const options = getLLMConfig(apiKey, clientOptions);
const options = getOpenAIConfig(apiKey, clientOptions);
const streamRate = clientOptions.streamRate;
if (!streamRate) {
return options;


@@ -1,170 +0,0 @@
const { HttpsProxyAgent } = require('https-proxy-agent');
const { KnownEndpoints } = require('librechat-data-provider');
const { sanitizeModelName, constructAzureURL } = require('~/utils');
const { isEnabled } = require('~/server/utils');
/**
* Generates configuration options for creating a language model (LLM) instance.
* @param {string} apiKey - The API key for authentication.
* @param {Object} options - Additional options for configuring the LLM.
* @param {Object} [options.modelOptions] - Model-specific options.
* @param {string} [options.modelOptions.model] - The name of the model to use.
* @param {string} [options.modelOptions.user] - The user ID
* @param {number} [options.modelOptions.temperature] - Controls randomness in output generation (0-2).
* @param {number} [options.modelOptions.top_p] - Controls diversity via nucleus sampling (0-1).
* @param {number} [options.modelOptions.frequency_penalty] - Reduces repetition of token sequences (-2 to 2).
* @param {number} [options.modelOptions.presence_penalty] - Encourages discussing new topics (-2 to 2).
* @param {number} [options.modelOptions.max_tokens] - The maximum number of tokens to generate.
* @param {string[]} [options.modelOptions.stop] - Sequences where the API will stop generating further tokens.
* @param {string} [options.reverseProxyUrl] - URL for a reverse proxy, if used.
* @param {boolean} [options.useOpenRouter] - Flag to use OpenRouter API.
* @param {Object} [options.headers] - Additional headers for API requests.
* @param {string} [options.proxy] - Proxy server URL.
* @param {Object} [options.azure] - Azure-specific configurations.
* @param {boolean} [options.streaming] - Whether to use streaming mode.
* @param {Object} [options.addParams] - Additional parameters to add to the model options.
* @param {string[]} [options.dropParams] - Parameters to remove from the model options.
* @param {string|null} [endpoint=null] - The endpoint name
* @returns {Object} Configuration options for creating an LLM instance.
*/
function getLLMConfig(apiKey, options = {}, endpoint = null) {
let {
modelOptions = {},
reverseProxyUrl,
defaultQuery,
headers,
proxy,
azure,
streaming = true,
addParams,
dropParams,
} = options;
/** @type {OpenAIClientOptions} */
let llmConfig = {
streaming,
};
Object.assign(llmConfig, modelOptions);
if (addParams && typeof addParams === 'object') {
Object.assign(llmConfig, addParams);
}
/** Note: OpenAI Web Search models do not support any known parameters besides `max_tokens` */
if (modelOptions.model && /gpt-4o.*search/.test(modelOptions.model)) {
const searchExcludeParams = [
'frequency_penalty',
'presence_penalty',
'temperature',
'top_p',
'top_k',
'stop',
'logit_bias',
'seed',
'response_format',
'n',
'logprobs',
'user',
];
dropParams = dropParams || [];
dropParams = [...new Set([...dropParams, ...searchExcludeParams])];
}
if (dropParams && Array.isArray(dropParams)) {
dropParams.forEach((param) => {
if (llmConfig[param]) {
llmConfig[param] = undefined;
}
});
}
let useOpenRouter;
/** @type {OpenAIClientOptions['configuration']} */
const configOptions = {};
if (
(reverseProxyUrl && reverseProxyUrl.includes(KnownEndpoints.openrouter)) ||
(endpoint && endpoint.toLowerCase().includes(KnownEndpoints.openrouter))
) {
useOpenRouter = true;
llmConfig.include_reasoning = true;
configOptions.baseURL = reverseProxyUrl;
configOptions.defaultHeaders = Object.assign(
{
'HTTP-Referer': 'https://librechat.ai',
'X-Title': 'LibreChat',
},
headers,
);
} else if (reverseProxyUrl) {
configOptions.baseURL = reverseProxyUrl;
if (headers) {
configOptions.defaultHeaders = headers;
}
}
if (defaultQuery) {
configOptions.defaultQuery = defaultQuery;
}
if (proxy) {
const proxyAgent = new HttpsProxyAgent(proxy);
Object.assign(configOptions, {
httpAgent: proxyAgent,
httpsAgent: proxyAgent,
});
}
if (azure) {
const useModelName = isEnabled(process.env.AZURE_USE_MODEL_AS_DEPLOYMENT_NAME);
azure.azureOpenAIApiDeploymentName = useModelName
? sanitizeModelName(llmConfig.model)
: azure.azureOpenAIApiDeploymentName;
if (process.env.AZURE_OPENAI_DEFAULT_MODEL) {
llmConfig.model = process.env.AZURE_OPENAI_DEFAULT_MODEL;
}
if (configOptions.baseURL) {
const azureURL = constructAzureURL({
baseURL: configOptions.baseURL,
azureOptions: azure,
});
azure.azureOpenAIBasePath = azureURL.split(`/${azure.azureOpenAIApiDeploymentName}`)[0];
}
Object.assign(llmConfig, azure);
llmConfig.model = llmConfig.azureOpenAIApiDeploymentName;
} else {
llmConfig.apiKey = apiKey;
// Object.assign(llmConfig, {
// configuration: { apiKey },
// });
}
if (process.env.OPENAI_ORGANIZATION && this.azure) {
llmConfig.organization = process.env.OPENAI_ORGANIZATION;
}
if (useOpenRouter && llmConfig.reasoning_effort != null) {
llmConfig.reasoning = {
effort: llmConfig.reasoning_effort,
};
delete llmConfig.reasoning_effort;
}
if (llmConfig?.['max_tokens'] != null) {
/** @type {number} */
llmConfig.maxTokens = llmConfig['max_tokens'];
delete llmConfig['max_tokens'];
}
return {
/** @type {OpenAIClientOptions} */
llmConfig,
/** @type {OpenAIClientOptions['configuration']} */
configOptions,
};
}
module.exports = { getLLMConfig };
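Two behaviors of the deleted `getLLMConfig` carry over to `getOpenAIConfig` in `@librechat/api`: dropping excluded parameters (e.g. the list forced onto `gpt-4o` search models) and renaming `max_tokens` to the camelCase `maxTokens` field. A standalone sketch of just that normalization, assuming a plain options object with no Azure or proxy handling (`normalizeLlmParams` is an illustrative name):

```javascript
// Blanks dropped params and renames max_tokens to the camelCase
// field expected by downstream client options.
function normalizeLlmParams(modelOptions, dropParams = []) {
  const llmConfig = { ...modelOptions };
  for (const param of dropParams) {
    if (llmConfig[param]) {
      llmConfig[param] = undefined;
    }
  }
  if (llmConfig.max_tokens != null) {
    llmConfig.maxTokens = llmConfig.max_tokens;
    delete llmConfig.max_tokens;
  }
  return llmConfig;
}
```

Note that, as in the original, dropped params are set to `undefined` rather than deleted, so the keys survive with empty values.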


@@ -2,9 +2,9 @@ const axios = require('axios');
const fs = require('fs').promises;
const FormData = require('form-data');
const { Readable } = require('stream');
const { genAzureEndpoint } = require('@librechat/api');
const { extractEnvVariable, STTProviders } = require('librechat-data-provider');
const { getCustomConfig } = require('~/server/services/Config');
const { genAzureEndpoint } = require('~/utils');
const { logger } = require('~/config');
/**


@@ -1,8 +1,8 @@
const axios = require('axios');
const { genAzureEndpoint } = require('@librechat/api');
const { extractEnvVariable, TTSProviders } = require('librechat-data-provider');
const { getRandomVoiceId, createChunkProcessor, splitTextIntoChunks } = require('./streamAudio');
const { getCustomConfig } = require('~/server/services/Config');
const { genAzureEndpoint } = require('~/utils');
const { logger } = require('~/config');
/**


@@ -1,6 +1,6 @@
const { z } = require('zod');
const { tool } = require('@langchain/core/tools');
const { normalizeServerName } = require('librechat-mcp');
const { normalizeServerName } = require('@librechat/api');
const { Constants: AgentConstants, Providers } = require('@librechat/agents');
const {
Constants,


@@ -2,6 +2,7 @@ const {
SystemRoles,
Permissions,
PermissionTypes,
isMemoryEnabled,
removeNullishValues,
} = require('librechat-data-provider');
const { updateAccessPermissions } = require('~/models/Role');
@@ -20,6 +21,14 @@ async function loadDefaultInterface(config, configDefaults, roleName = SystemRol
const hasModelSpecs = config?.modelSpecs?.list?.length > 0;
const includesAddedEndpoints = config?.modelSpecs?.addedEndpoints?.length > 0;
const memoryConfig = config?.memory;
const memoryEnabled = isMemoryEnabled(memoryConfig);
/** Only disable memories if memory config is present but disabled/invalid */
const shouldDisableMemories = memoryConfig && !memoryEnabled;
/** Check if personalization is enabled (defaults to true if memory is configured and enabled) */
const isPersonalizationEnabled =
memoryConfig && memoryEnabled && memoryConfig.personalize !== false;
/** @type {TCustomConfig['interface']} */
const loadedInterface = removeNullishValues({
endpointsMenu:
@@ -33,6 +42,7 @@ async function loadDefaultInterface(config, configDefaults, roleName = SystemRol
privacyPolicy: interfaceConfig?.privacyPolicy ?? defaults.privacyPolicy,
termsOfService: interfaceConfig?.termsOfService ?? defaults.termsOfService,
bookmarks: interfaceConfig?.bookmarks ?? defaults.bookmarks,
memories: shouldDisableMemories ? false : (interfaceConfig?.memories ?? defaults.memories),
prompts: interfaceConfig?.prompts ?? defaults.prompts,
multiConvo: interfaceConfig?.multiConvo ?? defaults.multiConvo,
agents: interfaceConfig?.agents ?? defaults.agents,
@@ -45,6 +55,10 @@ async function loadDefaultInterface(config, configDefaults, roleName = SystemRol
await updateAccessPermissions(roleName, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: loadedInterface.prompts },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: loadedInterface.bookmarks },
[PermissionTypes.MEMORIES]: {
[Permissions.USE]: loadedInterface.memories,
[Permissions.OPT_OUT]: isPersonalizationEnabled,
},
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: loadedInterface.multiConvo },
[PermissionTypes.AGENTS]: { [Permissions.USE]: loadedInterface.agents },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: loadedInterface.temporaryChat },
@@ -54,6 +68,10 @@ async function loadDefaultInterface(config, configDefaults, roleName = SystemRol
await updateAccessPermissions(SystemRoles.ADMIN, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: loadedInterface.prompts },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: loadedInterface.bookmarks },
[PermissionTypes.MEMORIES]: {
[Permissions.USE]: loadedInterface.memories,
[Permissions.OPT_OUT]: isPersonalizationEnabled,
},
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: loadedInterface.multiConvo },
[PermissionTypes.AGENTS]: { [Permissions.USE]: loadedInterface.agents },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: loadedInterface.temporaryChat },
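The memory gating above reduces to two derived flags: memories are force-disabled only when a memory config exists but is disabled or invalid, and personalization defaults on only when memory is configured, enabled, and not explicitly opted out via `personalize: false`. A sketch of that truth table, with `isMemoryEnabled` injected as a stand-in for the real `librechat-data-provider` helper:

```javascript
// Derives the two interface flags that feed the MEMORIES
// permission update; `isMemoryEnabled` is injected for the sketch.
function deriveMemoryFlags(memoryConfig, isMemoryEnabled) {
  const memoryEnabled = isMemoryEnabled(memoryConfig);
  return {
    // Only disable memories if a config is present but disabled/invalid.
    shouldDisableMemories: Boolean(memoryConfig && !memoryEnabled),
    // Personalization defaults to true when memory is configured and enabled.
    isPersonalizationEnabled: Boolean(
      memoryConfig && memoryEnabled && memoryConfig.personalize !== false,
    ),
  };
}
```

An absent memory config therefore leaves `memories` alone (falling back to interface defaults) while keeping personalization off.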


@@ -12,6 +12,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: true,
bookmarks: true,
memories: true,
multiConvo: true,
agents: true,
temporaryChat: true,
@@ -26,6 +27,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: true },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: true },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: true },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: true },
[PermissionTypes.AGENTS]: { [Permissions.USE]: true },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: true },
@@ -39,6 +41,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: false,
bookmarks: false,
memories: false,
multiConvo: false,
agents: false,
temporaryChat: false,
@@ -53,6 +56,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: false },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: false },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: false },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: false },
[PermissionTypes.AGENTS]: { [Permissions.USE]: false },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: false },
@@ -70,6 +74,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: undefined },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: undefined },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: undefined },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: undefined },
[PermissionTypes.AGENTS]: { [Permissions.USE]: undefined },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -83,6 +88,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: undefined,
bookmarks: undefined,
memories: undefined,
multiConvo: undefined,
agents: undefined,
temporaryChat: undefined,
@@ -97,6 +103,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: undefined },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: undefined },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: undefined },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: undefined },
[PermissionTypes.AGENTS]: { [Permissions.USE]: undefined },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -110,6 +117,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: true,
bookmarks: false,
memories: true,
multiConvo: undefined,
agents: true,
temporaryChat: undefined,
@@ -124,6 +132,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: true },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: false },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: true },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: undefined },
[PermissionTypes.AGENTS]: { [Permissions.USE]: true },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -138,6 +147,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: true,
bookmarks: true,
memories: true,
multiConvo: true,
agents: true,
temporaryChat: true,
@@ -151,6 +161,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: true },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: true },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: true },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: true },
[PermissionTypes.AGENTS]: { [Permissions.USE]: true },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: true },
@@ -168,6 +179,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: undefined },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: undefined },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: undefined },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: true },
[PermissionTypes.AGENTS]: { [Permissions.USE]: undefined },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -185,6 +197,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: undefined },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: undefined },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: undefined },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: false },
[PermissionTypes.AGENTS]: { [Permissions.USE]: undefined },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -202,6 +215,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: undefined },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: undefined },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: undefined },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: undefined },
[PermissionTypes.AGENTS]: { [Permissions.USE]: undefined },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -215,6 +229,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: true,
bookmarks: false,
memories: true,
multiConvo: true,
agents: false,
temporaryChat: true,
@@ -228,6 +243,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: true },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: false },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: true },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: true },
[PermissionTypes.AGENTS]: { [Permissions.USE]: false },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: true },
@@ -242,6 +258,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: true,
bookmarks: true,
memories: false,
multiConvo: false,
agents: undefined,
temporaryChat: undefined,
@@ -255,6 +272,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: true },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: true },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: false },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: false },
[PermissionTypes.AGENTS]: { [Permissions.USE]: undefined },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: undefined },
@@ -268,6 +286,7 @@ describe('loadDefaultInterface', () => {
interface: {
prompts: true,
bookmarks: false,
memories: true,
multiConvo: true,
agents: false,
temporaryChat: true,
@@ -281,6 +300,7 @@ describe('loadDefaultInterface', () => {
expect(updateAccessPermissions).toHaveBeenCalledWith(SystemRoles.USER, {
[PermissionTypes.PROMPTS]: { [Permissions.USE]: true },
[PermissionTypes.BOOKMARKS]: { [Permissions.USE]: false },
[PermissionTypes.MEMORIES]: { [Permissions.USE]: true },
[PermissionTypes.MULTI_CONVO]: { [Permissions.USE]: true },
[PermissionTypes.AGENTS]: { [Permissions.USE]: false },
[PermissionTypes.TEMPORARY_CHAT]: { [Permissions.USE]: true },


@@ -1,5 +1,3 @@
const path = require('path');
const crypto = require('crypto');
const {
Capabilities,
EModelEndpoint,
@@ -218,38 +216,6 @@ function normalizeEndpointName(name = '') {
return name.toLowerCase() === Providers.OLLAMA ? Providers.OLLAMA : name;
}
/**
* Sanitize a filename by removing any directory components, replacing non-alphanumeric characters
* @param {string} inputName
* @returns {string}
*/
function sanitizeFilename(inputName) {
// Remove any directory components
let name = path.basename(inputName);
// Replace any non-alphanumeric characters except for '.' and '-'
name = name.replace(/[^a-zA-Z0-9.-]/g, '_');
// Ensure the name doesn't start with a dot (hidden file in Unix-like systems)
if (name.startsWith('.') || name === '') {
name = '_' + name;
}
// Limit the length of the filename
const MAX_LENGTH = 255;
if (name.length > MAX_LENGTH) {
const ext = path.extname(name);
const nameWithoutExt = path.basename(name, ext);
name =
nameWithoutExt.slice(0, MAX_LENGTH - ext.length - 7) +
'-' +
crypto.randomBytes(3).toString('hex') +
ext;
}
return name;
}
module.exports = {
isEnabled,
handleText,
@@ -260,6 +226,5 @@ module.exports = {
generateConfig,
addSpaceIfNeeded,
createOnProgress,
sanitizeFilename,
normalizeEndpointName,
};


@@ -1,103 +0,0 @@
const { isEnabled, sanitizeFilename } = require('./handleText');
describe('isEnabled', () => {
test('should return true when input is "true"', () => {
expect(isEnabled('true')).toBe(true);
});
test('should return true when input is "TRUE"', () => {
expect(isEnabled('TRUE')).toBe(true);
});
test('should return true when input is true', () => {
expect(isEnabled(true)).toBe(true);
});
test('should return false when input is "false"', () => {
expect(isEnabled('false')).toBe(false);
});
test('should return false when input is false', () => {
expect(isEnabled(false)).toBe(false);
});
test('should return false when input is null', () => {
expect(isEnabled(null)).toBe(false);
});
test('should return false when input is undefined', () => {
expect(isEnabled()).toBe(false);
});
test('should return false when input is an empty string', () => {
expect(isEnabled('')).toBe(false);
});
test('should return false when input is a whitespace string', () => {
expect(isEnabled(' ')).toBe(false);
});
test('should return false when input is a number', () => {
expect(isEnabled(123)).toBe(false);
});
test('should return false when input is an object', () => {
expect(isEnabled({})).toBe(false);
});
test('should return false when input is an array', () => {
expect(isEnabled([])).toBe(false);
});
});
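Consistent with the cases above, `isEnabled` only accepts the boolean `true` or a case-insensitive `'true'` string; everything else (numbers, objects, arrays, nullish values) is disabled. A sketch matching those tests, not necessarily the shipped implementation:

```javascript
// Sketch of isEnabled consistent with the test cases above.
function isEnabled(value) {
  if (value === true) {
    return true;
  }
  if (typeof value === 'string') {
    return value.trim().toLowerCase() === 'true';
  }
  // null, undefined, numbers, objects, and arrays are all disabled
  return false;
}
```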
jest.mock('crypto', () => {
const actualModule = jest.requireActual('crypto');
return {
...actualModule,
randomBytes: jest.fn().mockReturnValue(Buffer.from('abc123', 'hex')),
};
});
describe('sanitizeFilename', () => {
test('removes directory components (1/2)', () => {
expect(sanitizeFilename('/path/to/file.txt')).toBe('file.txt');
});
test('removes directory components (2/2)', () => {
expect(sanitizeFilename('../../../../file.txt')).toBe('file.txt');
});
test('replaces non-alphanumeric characters', () => {
expect(sanitizeFilename('file name@#$.txt')).toBe('file_name___.txt');
});
test('preserves dots and hyphens', () => {
expect(sanitizeFilename('file-name.with.dots.txt')).toBe('file-name.with.dots.txt');
});
test('prepends underscore to filenames starting with a dot', () => {
expect(sanitizeFilename('.hiddenfile')).toBe('_.hiddenfile');
});
test('truncates long filenames', () => {
const longName = 'a'.repeat(300) + '.txt';
const result = sanitizeFilename(longName);
expect(result.length).toBe(255);
expect(result).toMatch(/^a+-abc123\.txt$/);
});
test('handles filenames with no extension', () => {
const longName = 'a'.repeat(300);
const result = sanitizeFilename(longName);
expect(result.length).toBe(255);
expect(result).toMatch(/^a+-abc123$/);
});
test('handles empty input', () => {
expect(sanitizeFilename('')).toBe('_');
});
test('handles input with only special characters', () => {
expect(sanitizeFilename('@#$%^&*')).toBe('_______');
});
});
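The truncation tests pin the result at exactly 255 characters, which follows from the arithmetic in `sanitizeFilename`: the slice keeps `MAX_LENGTH - ext.length - 7` characters, and the appended hyphen (1) plus the 3 random bytes rendered as hex (6) plus the extension restore the total. A worked check:

```javascript
// Why the truncated name is exactly 255 characters:
// the slice keeps (255 - ext.length - 7) chars, then '-' (1 char) and the
// 3 random bytes as hex (6 chars) and the extension are appended.
const MAX_LENGTH = 255;
const ext = '.txt'; // 4 chars, as in the test's long filename
const kept = MAX_LENGTH - ext.length - 7; // characters kept from the base name
const total = kept + 1 /* '-' */ + 6 /* hex suffix */ + ext.length;
```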

View file

@@ -1073,7 +1073,7 @@
/**
* @exports MCPServers
* @typedef {import('librechat-mcp').MCPServers} MCPServers
* @typedef {import('@librechat/api').MCPServers} MCPServers
* @memberof typedefs
*/
@@ -1085,31 +1085,31 @@
/**
* @exports MCPManager
* @typedef {import('librechat-mcp').MCPManager} MCPManager
* @typedef {import('@librechat/api').MCPManager} MCPManager
* @memberof typedefs
*/
/**
* @exports FlowStateManager
* @typedef {import('librechat-mcp').FlowStateManager} FlowStateManager
* @typedef {import('@librechat/api').FlowStateManager} FlowStateManager
* @memberof typedefs
*/
/**
* @exports LCAvailableTools
* @typedef {import('librechat-mcp').LCAvailableTools} LCAvailableTools
* @typedef {import('@librechat/api').LCAvailableTools} LCAvailableTools
* @memberof typedefs
*/
/**
* @exports LCTool
* @typedef {import('librechat-mcp').LCTool} LCTool
* @typedef {import('@librechat/api').LCTool} LCTool
* @memberof typedefs
*/
/**
* @exports FormattedContent
* @typedef {import('librechat-mcp').FormattedContent} FormattedContent
* @typedef {import('@librechat/api').FormattedContent} FormattedContent
* @memberof typedefs
*/
@@ -1232,7 +1232,7 @@
* @typedef {Object} AgentClientOptions
* @property {Agent} agent - The agent configuration object
* @property {string} endpoint - The endpoint identifier for the agent
* @property {Object} req - The request object
* @property {ServerRequest} req - The request object
* @property {string} [name] - The username
* @property {string} [modelLabel] - The label for the model being used
* @property {number} [maxContextTokens] - Maximum number of tokens allowed in context

View file

@@ -1,7 +1,6 @@
const loadYaml = require('./loadYaml');
const axiosHelpers = require('./axios');
const tokenHelpers = require('./tokens');
const azureUtils = require('./azureUtils');
const deriveBaseURL = require('./deriveBaseURL');
const extractBaseURL = require('./extractBaseURL');
const findMessageContent = require('./findMessageContent');
@@ -10,7 +9,6 @@ module.exports = {
loadYaml,
deriveBaseURL,
extractBaseURL,
...azureUtils,
...axiosHelpers,
...tokenHelpers,
findMessageContent,

View file

@@ -9,6 +9,7 @@ import type {
} from 'librechat-data-provider';
import { ThinkingButton } from '~/components/Artifacts/Thinking';
import { MessageContext, SearchContext } from '~/Providers';
import MemoryArtifacts from './MemoryArtifacts';
import Sources from '~/components/Web/Sources';
import useLocalize from '~/hooks/useLocalize';
import { mapAttachments } from '~/utils/map';
@@ -72,6 +73,7 @@ const ContentParts = memo(
return hasThinkPart && allThinkPartsHaveContent;
}, [content]);
if (!content) {
return null;
}
@@ -103,6 +105,7 @@ const ContentParts = memo(
return (
<>
<SearchContext.Provider value={{ searchResults }}>
<MemoryArtifacts attachments={attachments} />
<Sources />
{hasReasoningParts && (
<div className="mb-5">

View file

@@ -0,0 +1,143 @@
import { Tools } from 'librechat-data-provider';
import { useState, useRef, useMemo, useLayoutEffect, useEffect } from 'react';
import type { MemoryArtifact, TAttachment } from 'librechat-data-provider';
import MemoryInfo from './MemoryInfo';
import { useLocalize } from '~/hooks';
import { cn } from '~/utils';
export default function MemoryArtifacts({ attachments }: { attachments?: TAttachment[] }) {
const localize = useLocalize();
const [showInfo, setShowInfo] = useState(false);
const contentRef = useRef<HTMLDivElement>(null);
const [contentHeight, setContentHeight] = useState<number | undefined>(0);
const [isAnimating, setIsAnimating] = useState(false);
const prevShowInfoRef = useRef<boolean>(showInfo);
const memoryArtifacts = useMemo(() => {
const result: MemoryArtifact[] = [];
for (const attachment of attachments ?? []) {
if (attachment?.[Tools.memory] != null) {
result.push(attachment[Tools.memory]);
}
}
return result;
}, [attachments]);
useLayoutEffect(() => {
if (showInfo !== prevShowInfoRef.current) {
prevShowInfoRef.current = showInfo;
setIsAnimating(true);
if (showInfo && contentRef.current) {
requestAnimationFrame(() => {
if (contentRef.current) {
const height = contentRef.current.scrollHeight;
setContentHeight(height + 4);
}
});
} else {
setContentHeight(0);
}
const timer = setTimeout(() => {
setIsAnimating(false);
}, 400);
return () => clearTimeout(timer);
}
}, [showInfo]);
useEffect(() => {
if (!contentRef.current) {
return;
}
const resizeObserver = new ResizeObserver((entries) => {
if (showInfo && !isAnimating) {
for (const entry of entries) {
if (entry.target === contentRef.current) {
setContentHeight(entry.contentRect.height + 4);
}
}
}
});
resizeObserver.observe(contentRef.current);
return () => {
resizeObserver.disconnect();
};
}, [showInfo, isAnimating]);
if (!memoryArtifacts || memoryArtifacts.length === 0) {
return null;
}
return (
<>
<div className="flex items-center">
<div className="inline-block">
<button
className="outline-hidden my-1 flex items-center gap-1 text-sm font-semibold text-text-secondary-alt transition-colors hover:text-text-primary"
type="button"
onClick={() => setShowInfo((prev) => !prev)}
aria-expanded={showInfo}
aria-label={localize('com_ui_memory_updated')}
>
<svg
width="18"
height="18"
viewBox="0 0 18 18"
fill="none"
xmlns="http://www.w3.org/2000/svg"
className="mb-[-1px]"
>
<path
d="M6 3C4.89543 3 4 3.89543 4 5V13C4 14.1046 4.89543 15 6 15L6 3Z"
fill="currentColor"
/>
<path
d="M7 3V15H8.18037L8.4899 13.4523C8.54798 13.1619 8.69071 12.8952 8.90012 12.6858L12.2931 9.29289C12.7644 8.82153 13.3822 8.58583 14 8.58578V3.5C14 3.22386 13.7761 3 13.5 3H7Z"
fill="currentColor"
/>
<path
d="M11.3512 15.5297L9.73505 15.8529C9.38519 15.9229 9.07673 15.6144 9.14671 15.2646L9.46993 13.6484C9.48929 13.5517 9.53687 13.4628 9.60667 13.393L12.9996 10C13.5519 9.44771 14.4473 9.44771 14.9996 10C15.5519 10.5523 15.5519 11.4477 14.9996 12L11.6067 15.393C11.5369 15.4628 11.448 15.5103 11.3512 15.5297Z"
fill="currentColor"
/>
</svg>
{localize('com_ui_memory_updated')}
</button>
</div>
</div>
<div
className="relative"
style={{
height: showInfo ? contentHeight : 0,
overflow: 'hidden',
transition:
'height 0.4s cubic-bezier(0.16, 1, 0.3, 1), opacity 0.4s cubic-bezier(0.16, 1, 0.3, 1)',
opacity: showInfo ? 1 : 0,
transformOrigin: 'top',
willChange: 'height, opacity',
perspective: '1000px',
backfaceVisibility: 'hidden',
WebkitFontSmoothing: 'subpixel-antialiased',
}}
>
<div
className={cn(
'overflow-hidden rounded-xl border border-border-light bg-surface-primary-alt shadow-md',
showInfo && 'shadow-lg',
)}
style={{
transform: showInfo ? 'translateY(0) scale(1)' : 'translateY(-8px) scale(0.98)',
opacity: showInfo ? 1 : 0,
transition:
'transform 0.4s cubic-bezier(0.16, 1, 0.3, 1), opacity 0.4s cubic-bezier(0.16, 1, 0.3, 1)',
}}
>
<div ref={contentRef}>
{showInfo && <MemoryInfo key="memory-info" memoryArtifacts={memoryArtifacts} />}
</div>
</div>
</div>
</>
);
}
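The `memoryArtifacts` memo in this component is a plain scan over the attachments array. Isolated as a helper (assuming, for the sketch only, that `Tools.memory` from `librechat-data-provider` resolves to the string key `'memory'`):

```javascript
// Sketch of the attachment scan in MemoryArtifacts. The literal 'memory' key
// stands in for Tools.memory (an assumption made for this standalone sketch).
function collectMemoryArtifacts(attachments) {
  const result = [];
  for (const attachment of attachments ?? []) {
    if (attachment?.memory != null) {
      result.push(attachment.memory);
    }
  }
  return result;
}
```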

View file

@@ -0,0 +1,61 @@
import type { MemoryArtifact } from 'librechat-data-provider';
import { useLocalize } from '~/hooks';
export default function MemoryInfo({ memoryArtifacts }: { memoryArtifacts: MemoryArtifact[] }) {
const localize = useLocalize();
if (memoryArtifacts.length === 0) {
return null;
}
// Group artifacts by type
const updatedMemories = memoryArtifacts.filter((artifact) => artifact.type === 'update');
const deletedMemories = memoryArtifacts.filter((artifact) => artifact.type === 'delete');
if (updatedMemories.length === 0 && deletedMemories.length === 0) {
return null;
}
return (
<div className="space-y-4 p-4">
{updatedMemories.length > 0 && (
<div>
<h4 className="mb-2 text-sm font-semibold text-text-primary">
{localize('com_ui_memory_updated_items')}
</h4>
<div className="space-y-2">
{updatedMemories.map((artifact, index) => (
<div key={`update-${index}`} className="rounded-lg p-3">
<div className="mb-1 text-xs font-medium uppercase tracking-wide text-text-secondary">
{artifact.key}
</div>
<div className="whitespace-pre-wrap text-sm text-text-primary">
{artifact.value}
</div>
</div>
))}
</div>
</div>
)}
{deletedMemories.length > 0 && (
<div>
<h4 className="mb-2 text-sm font-semibold text-text-primary">
{localize('com_ui_memory_deleted_items')}
</h4>
<div className="space-y-2">
{deletedMemories.map((artifact, index) => (
<div key={`delete-${index}`} className="rounded-lg p-3 opacity-60">
<div className="mb-1 text-xs font-medium uppercase tracking-wide text-text-secondary">
{artifact.key}
</div>
<div className="text-sm italic text-text-secondary">
{localize('com_ui_memory_deleted')}
</div>
</div>
))}
</div>
</div>
)}
</div>
);
}
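`MemoryInfo` renders two groups driven by the artifact `type` field: `'update'` entries show their value, `'delete'` entries show a removed placeholder. The same grouping as a pure helper:

```javascript
// Grouping used by MemoryInfo: partition artifacts into updated and deleted
// buckets by their type field.
function partitionMemoryArtifacts(memoryArtifacts) {
  return {
    updated: memoryArtifacts.filter((artifact) => artifact.type === 'update'),
    deleted: memoryArtifacts.filter((artifact) => artifact.type === 'delete'),
  };
}
```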

View file

@@ -5,9 +5,27 @@ import { SettingsTabValues } from 'librechat-data-provider';
import { useGetStartupConfig } from '~/data-provider';
import type { TDialogProps } from '~/common';
import { Dialog, DialogPanel, DialogTitle, Transition, TransitionChild } from '@headlessui/react';
import { GearIcon, DataIcon, SpeechIcon, UserIcon, ExperimentIcon } from '~/components/svg';
import { General, Chat, Speech, Beta, Commands, Data, Account, Balance } from './SettingsTabs';
import {
GearIcon,
DataIcon,
SpeechIcon,
UserIcon,
ExperimentIcon,
PersonalizationIcon,
} from '~/components/svg';
import {
General,
Chat,
Speech,
Beta,
Commands,
Data,
Account,
Balance,
Personalization,
} from './SettingsTabs';
import { useMediaQuery, useLocalize, TranslationKeys } from '~/hooks';
import usePersonalizationAccess from '~/hooks/usePersonalizationAccess';
import { cn } from '~/utils';
export default function Settings({ open, onOpenChange }: TDialogProps) {
@@ -16,6 +34,7 @@ export default function Settings({ open, onOpenChange }: TDialogProps) {
const localize = useLocalize();
const [activeTab, setActiveTab] = useState(SettingsTabValues.GENERAL);
const tabRefs = useRef({});
const { hasAnyPersonalizationFeature, hasMemoryOptOut } = usePersonalizationAccess();
const handleKeyDown = (event: React.KeyboardEvent) => {
const tabs: SettingsTabValues[] = [
@@ -24,6 +43,7 @@ export default function Settings({ open, onOpenChange }: TDialogProps) {
SettingsTabValues.BETA,
SettingsTabValues.COMMANDS,
SettingsTabValues.SPEECH,
...(hasAnyPersonalizationFeature ? [SettingsTabValues.PERSONALIZATION] : []),
SettingsTabValues.DATA,
...(startupConfig?.balance?.enabled ? [SettingsTabValues.BALANCE] : []),
SettingsTabValues.ACCOUNT,
@@ -80,6 +100,15 @@ export default function Settings({ open, onOpenChange }: TDialogProps) {
icon: <SpeechIcon className="icon-sm" />,
label: 'com_nav_setting_speech',
},
...(hasAnyPersonalizationFeature
? [
{
value: SettingsTabValues.PERSONALIZATION,
icon: <PersonalizationIcon />,
label: 'com_nav_setting_personalization' as TranslationKeys,
},
]
: []),
{
value: SettingsTabValues.DATA,
icon: <DataIcon />,
@@ -213,6 +242,14 @@ export default function Settings({ open, onOpenChange }: TDialogProps) {
<Tabs.Content value={SettingsTabValues.SPEECH}>
<Speech />
</Tabs.Content>
{hasAnyPersonalizationFeature && (
<Tabs.Content value={SettingsTabValues.PERSONALIZATION}>
<Personalization
hasMemoryOptOut={hasMemoryOptOut}
hasAnyPersonalizationFeature={hasAnyPersonalizationFeature}
/>
</Tabs.Content>
)}
<Tabs.Content value={SettingsTabValues.DATA}>
<Data />
</Tabs.Content>
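The Settings dialog splices the Personalization (and Balance) tabs into the tab order with conditional array spreads, so keyboard navigation and rendering stay in sync with feature availability. The pattern in isolation (tab names abbreviated for the sketch):

```javascript
// Conditional-spread pattern used to splice optional tabs into the tab order.
function buildTabOrder(hasPersonalization, balanceEnabled) {
  return [
    'GENERAL',
    'CHAT',
    'BETA',
    'COMMANDS',
    'SPEECH',
    ...(hasPersonalization ? ['PERSONALIZATION'] : []),
    'DATA',
    ...(balanceEnabled ? ['BALANCE'] : []),
    'ACCOUNT',
  ];
}
```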

View file

@@ -0,0 +1,87 @@
import { useState, useEffect } from 'react';
import { useGetUserQuery, useUpdateMemoryPreferencesMutation } from '~/data-provider';
import { useToastContext } from '~/Providers';
import { Switch } from '~/components/ui';
import { useLocalize } from '~/hooks';
interface PersonalizationProps {
hasMemoryOptOut: boolean;
hasAnyPersonalizationFeature: boolean;
}
export default function Personalization({
hasMemoryOptOut,
hasAnyPersonalizationFeature,
}: PersonalizationProps) {
const localize = useLocalize();
const { showToast } = useToastContext();
const { data: user } = useGetUserQuery();
const [referenceSavedMemories, setReferenceSavedMemories] = useState(true);
const updateMemoryPreferencesMutation = useUpdateMemoryPreferencesMutation({
onSuccess: () => {
showToast({
message: localize('com_ui_preferences_updated'),
status: 'success',
});
},
onError: () => {
showToast({
message: localize('com_ui_error_updating_preferences'),
status: 'error',
});
// Revert the toggle on error
setReferenceSavedMemories((prev) => !prev);
},
});
// Initialize state from user data
useEffect(() => {
if (user?.personalization?.memories !== undefined) {
setReferenceSavedMemories(user.personalization.memories);
}
}, [user?.personalization?.memories]);
const handleMemoryToggle = (checked: boolean) => {
setReferenceSavedMemories(checked);
updateMemoryPreferencesMutation.mutate({ memories: checked });
};
if (!hasAnyPersonalizationFeature) {
return (
<div className="flex flex-col gap-3 text-sm text-text-primary">
<div className="text-text-secondary">{localize('com_ui_no_personalization_available')}</div>
</div>
);
}
return (
<div className="flex flex-col gap-3 text-sm text-text-primary">
{/* Memory Settings Section */}
{hasMemoryOptOut && (
<>
<div className="border-b border-border-medium pb-3">
<div className="text-base font-semibold">{localize('com_ui_memory')}</div>
</div>
<div className="flex items-center justify-between">
<div>
<div className="flex items-center gap-2">
{localize('com_ui_reference_saved_memories')}
</div>
<div className="mt-1 text-xs text-text-secondary">
{localize('com_ui_reference_saved_memories_description')}
</div>
</div>
<Switch
checked={referenceSavedMemories}
onCheckedChange={handleMemoryToggle}
disabled={updateMemoryPreferencesMutation.isLoading}
aria-label={localize('com_ui_reference_saved_memories')}
/>
</div>
</>
)}
</div>
);
}
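The memory switch in this component toggles optimistically and reverts in the mutation's `onError` handler. A framework-free sketch of that revert-on-error shape (synchronous for testability; the real code uses a React Query mutation):

```javascript
// Sketch of the optimistic-toggle-with-revert pattern from Personalization:
// flip local state immediately, then flip it back if the mutation fails.
function createToggle(initial, mutateFn) {
  let state = initial;
  return {
    get: () => state,
    set: (checked) => {
      state = checked;
      try {
        mutateFn(checked);
      } catch {
        state = !state; // revert on error, as the onError handler does
      }
    },
  };
}
```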

View file

@@ -7,3 +7,4 @@ export { RevokeKeysButton } from './Data/RevokeKeysButton';
export { default as Account } from './Account/Account';
export { default as Balance } from './Balance/Balance';
export { default as Speech } from './Speech/Speech';
export { default as Personalization } from './Personalization';

View file

@@ -1,6 +1,7 @@
import { useMemo } from 'react';
import { useLocation } from 'react-router-dom';
import PanelNavigation from '~/components/Prompts/Groups/PanelNavigation';
import ManagePrompts from '~/components/Prompts/ManagePrompts';
import { useMediaQuery, usePromptGroupsNav } from '~/hooks';
import List from '~/components/Prompts/Groups/List';
import { cn } from '~/utils';
@@ -38,6 +39,8 @@ export default function GroupSidePanel({
<div className="flex-grow overflow-y-auto">
<List groups={promptGroups} isChatRoute={isChatRoute} isLoading={!!groupsQuery.isLoading} />
</div>
<div className="flex items-center justify-between">
{isChatRoute && <ManagePrompts className="select-none" />}
<PanelNavigation
nextPage={nextPage}
prevPage={prevPage}
@@ -47,5 +50,6 @@
hasPreviousPage={hasPreviousPage}
/>
</div>
</div>
);
}

View file

@@ -19,11 +19,13 @@ function PanelNavigation({
}) {
const localize = useLocalize();
return (
<div className="my-1 flex justify-between">
<div className="mb-2 flex gap-2">
{!isChatRoute && <ThemeSelector returnThemeOnly={true} />}
</div>
<div className="mb-2 flex gap-2">
<>
<div className="flex gap-2">{!isChatRoute && <ThemeSelector returnThemeOnly={true} />}</div>
<div
className="flex items-center justify-between gap-2"
role="navigation"
aria-label="Pagination"
>
<Button variant="outline" size="sm" onClick={() => prevPage()} disabled={!hasPreviousPage}>
{localize('com_ui_prev')}
</Button>
@@ -36,7 +38,7 @@ function PanelNavigation({
{localize('com_ui_next')}
</Button>
</div>
</div>
</>
);
}

View file

@@ -1,19 +1,17 @@
import PromptSidePanel from '~/components/Prompts/Groups/GroupSidePanel';
import AutoSendPrompt from '~/components/Prompts/Groups/AutoSendPrompt';
import FilterPrompts from '~/components/Prompts/Groups/FilterPrompts';
import ManagePrompts from '~/components/Prompts/ManagePrompts';
import { usePromptGroupsNav } from '~/hooks';
export default function PromptsAccordion() {
const groupsNav = usePromptGroupsNav();
return (
<div className="flex h-full w-full flex-col">
<PromptSidePanel className="lg:w-full xl:w-full" {...groupsNav}>
<div className="flex w-full flex-row items-center justify-between pt-2">
<ManagePrompts className="select-none" />
<PromptSidePanel className="mt-2 space-y-2 lg:w-full xl:w-full" {...groupsNav}>
<FilterPrompts setName={groupsNav.setName} className="items-center justify-center" />
<div className="flex w-full flex-row items-center justify-end">
<AutoSendPrompt className="text-xs dark:text-white" />
</div>
<FilterPrompts setName={groupsNav.setName} className="items-center justify-center" />
</PromptSidePanel>
</div>
);

View file

@@ -80,13 +80,13 @@ const BookmarkTable = () => {
<TableHeader>
<TableRow className="border-b border-border-light">
<TableHead className="w-[70%] bg-surface-secondary py-3 text-left text-sm font-medium text-text-secondary">
<div className="px-4">{localize('com_ui_bookmarks_title')}</div>
<div>{localize('com_ui_bookmarks_title')}</div>
</TableHead>
<TableHead className="w-[30%] bg-surface-secondary py-3 text-left text-sm font-medium text-text-secondary">
<div className="px-4">{localize('com_ui_bookmarks_count')}</div>
<div>{localize('com_ui_bookmarks_count')}</div>
</TableHead>
<TableHead className="w-[40%] bg-surface-secondary py-3 text-left text-sm font-medium text-text-secondary">
<div className="px-4">{localize('com_assistants_actions')}</div>
<div>{localize('com_assistants_actions')}</div>
</TableHead>
</TableRow>
</TableHeader>

View file

@@ -0,0 +1,212 @@
import * as Ariakit from '@ariakit/react';
import { useMemo, useEffect, useState } from 'react';
import { ShieldEllipsis } from 'lucide-react';
import { useForm, Controller } from 'react-hook-form';
import { Permissions, SystemRoles, roleDefaults, PermissionTypes } from 'librechat-data-provider';
import type { Control, UseFormSetValue, UseFormGetValues } from 'react-hook-form';
import { OGDialog, OGDialogTitle, OGDialogContent, OGDialogTrigger } from '~/components/ui';
import { useUpdateMemoryPermissionsMutation } from '~/data-provider';
import { Button, Switch, DropdownPopup } from '~/components/ui';
import { useLocalize, useAuthContext } from '~/hooks';
import { useToastContext } from '~/Providers';
type FormValues = Record<Permissions, boolean>;
type LabelControllerProps = {
label: string;
memoryPerm: Permissions;
control: Control<FormValues, unknown, FormValues>;
setValue: UseFormSetValue<FormValues>;
getValues: UseFormGetValues<FormValues>;
};
const LabelController: React.FC<LabelControllerProps> = ({ control, memoryPerm, label }) => (
<div className="mb-4 flex items-center justify-between gap-2">
{label}
<Controller
name={memoryPerm}
control={control}
render={({ field }) => (
<Switch
{...field}
checked={field.value}
onCheckedChange={field.onChange}
value={field.value.toString()}
/>
)}
/>
</div>
);
const AdminSettings = () => {
const localize = useLocalize();
const { user, roles } = useAuthContext();
const { showToast } = useToastContext();
const { mutate, isLoading } = useUpdateMemoryPermissionsMutation({
onSuccess: () => {
showToast({ status: 'success', message: localize('com_ui_saved') });
},
onError: () => {
showToast({ status: 'error', message: localize('com_ui_error_save_admin_settings') });
},
});
const [isRoleMenuOpen, setIsRoleMenuOpen] = useState(false);
const [selectedRole, setSelectedRole] = useState<SystemRoles>(SystemRoles.USER);
const defaultValues = useMemo(() => {
if (roles?.[selectedRole]?.permissions) {
return roles?.[selectedRole]?.permissions?.[PermissionTypes.MEMORIES];
}
return roleDefaults[selectedRole].permissions[PermissionTypes.MEMORIES];
}, [roles, selectedRole]);
const {
reset,
control,
setValue,
getValues,
handleSubmit,
formState: { isSubmitting },
} = useForm<FormValues>({
mode: 'onChange',
defaultValues,
});
useEffect(() => {
if (roles?.[selectedRole]?.permissions?.[PermissionTypes.MEMORIES]) {
reset(roles?.[selectedRole]?.permissions?.[PermissionTypes.MEMORIES]);
} else {
reset(roleDefaults[selectedRole].permissions[PermissionTypes.MEMORIES]);
}
}, [roles, selectedRole, reset]);
if (user?.role !== SystemRoles.ADMIN) {
return null;
}
const labelControllerData = [
{
memoryPerm: Permissions.USE,
label: localize('com_ui_memories_allow_use'),
},
{
memoryPerm: Permissions.CREATE,
label: localize('com_ui_memories_allow_create'),
},
{
memoryPerm: Permissions.UPDATE,
label: localize('com_ui_memories_allow_update'),
},
{
memoryPerm: Permissions.READ,
label: localize('com_ui_memories_allow_read'),
},
{
memoryPerm: Permissions.OPT_OUT,
label: localize('com_ui_memories_allow_opt_out'),
},
];
const onSubmit = (data: FormValues) => {
mutate({ roleName: selectedRole, updates: data });
};
const roleDropdownItems = [
{
label: SystemRoles.USER,
onClick: () => {
setSelectedRole(SystemRoles.USER);
},
},
{
label: SystemRoles.ADMIN,
onClick: () => {
setSelectedRole(SystemRoles.ADMIN);
},
},
];
return (
<OGDialog>
<OGDialogTrigger asChild>
<Button
size={'sm'}
variant={'outline'}
className="btn btn-neutral border-token-border-light relative h-9 w-full gap-1 rounded-lg font-medium"
>
<ShieldEllipsis className="cursor-pointer" aria-hidden="true" />
{localize('com_ui_admin_settings')}
</Button>
</OGDialogTrigger>
<OGDialogContent className="w-1/4 border-border-light bg-surface-primary text-text-primary">
<OGDialogTitle>{`${localize('com_ui_admin_settings')} - ${localize(
'com_ui_memories',
)}`}</OGDialogTitle>
<div className="p-2">
{/* Role selection dropdown */}
<div className="flex items-center gap-2">
<span className="font-medium">{localize('com_ui_role_select')}:</span>
<DropdownPopup
unmountOnHide={true}
menuId="memory-role-dropdown"
isOpen={isRoleMenuOpen}
setIsOpen={setIsRoleMenuOpen}
trigger={
<Ariakit.MenuButton className="inline-flex w-1/4 items-center justify-center rounded-lg border border-border-light bg-transparent px-2 py-1 text-text-primary transition-all ease-in-out hover:bg-surface-tertiary">
{selectedRole}
</Ariakit.MenuButton>
}
items={roleDropdownItems}
itemClassName="items-center justify-center"
sameWidth={true}
/>
</div>
{/* Permissions form */}
<form onSubmit={handleSubmit(onSubmit)}>
<div className="py-5">
{labelControllerData.map(({ memoryPerm, label }) => (
<div key={memoryPerm}>
<LabelController
control={control}
memoryPerm={memoryPerm}
label={label}
getValues={getValues}
setValue={setValue}
/>
{selectedRole === SystemRoles.ADMIN && memoryPerm === Permissions.USE && (
<>
<div className="mb-2 max-w-full whitespace-normal break-words text-sm text-red-600">
<span>{localize('com_ui_admin_access_warning')}</span>
{'\n'}
<a
href="https://www.librechat.ai/docs/configuration/librechat_yaml/object_structure/interface"
target="_blank"
rel="noreferrer"
className="text-blue-500 underline"
>
{localize('com_ui_more_info')}
</a>
</div>
</>
)}
</div>
))}
</div>
<div className="flex justify-end">
<button
type="submit"
disabled={isSubmitting || isLoading}
className="btn rounded bg-green-500 font-bold text-white transition-all hover:bg-green-600"
>
{localize('com_ui_save')}
</button>
</div>
</form>
</div>
</OGDialogContent>
</OGDialog>
);
};
export default AdminSettings;
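`AdminSettings` resolves form defaults by preferring the fetched role's stored `MEMORIES` permissions and falling back to the bundled `roleDefaults`. Isolated as a helper (assuming, for the sketch, that `PermissionTypes.MEMORIES` is the string key `'MEMORIES'`):

```javascript
// Default-resolution from AdminSettings: prefer the role's stored MEMORIES
// permissions, fall back to roleDefaults. The 'MEMORIES' key stands in for
// PermissionTypes.MEMORIES (an assumption for this standalone sketch).
function getMemoryDefaults(roles, selectedRole, roleDefaults) {
  return (
    roles?.[selectedRole]?.permissions?.MEMORIES ??
    roleDefaults[selectedRole].permissions.MEMORIES
  );
}
```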

View file

@@ -0,0 +1,147 @@
import React, { useState } from 'react';
import { PermissionTypes, Permissions } from 'librechat-data-provider';
import { OGDialog, OGDialogTemplate, Button, Label, Input } from '~/components/ui';
import { useCreateMemoryMutation } from '~/data-provider';
import { useLocalize, useHasAccess } from '~/hooks';
import { useToastContext } from '~/Providers';
import { Spinner } from '~/components/svg';
interface MemoryCreateDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
children: React.ReactNode;
triggerRef?: React.MutableRefObject<HTMLButtonElement | null>;
}
export default function MemoryCreateDialog({
open,
onOpenChange,
children,
triggerRef,
}: MemoryCreateDialogProps) {
const localize = useLocalize();
const { showToast } = useToastContext();
const hasCreateAccess = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.CREATE,
});
const { mutate: createMemory, isLoading } = useCreateMemoryMutation({
onSuccess: () => {
showToast({
message: localize('com_ui_memory_created'),
status: 'success',
});
onOpenChange(false);
setKey('');
setValue('');
setTimeout(() => {
triggerRef?.current?.focus();
}, 0);
},
onError: (error: Error) => {
let errorMessage = localize('com_ui_error');
if (error && typeof error === 'object' && 'response' in error) {
const axiosError = error as any;
if (axiosError.response?.data?.error) {
errorMessage = axiosError.response.data.error;
// Check for duplicate key error
if (axiosError.response?.status === 409 || errorMessage.includes('already exists')) {
errorMessage = localize('com_ui_memory_key_exists');
}
}
} else if (error.message) {
errorMessage = error.message;
}
showToast({
message: errorMessage,
status: 'error',
});
},
});
const [key, setKey] = useState('');
const [value, setValue] = useState('');
const handleSave = () => {
if (!hasCreateAccess) {
return;
}
if (!key.trim() || !value.trim()) {
showToast({
message: localize('com_ui_field_required'),
status: 'error',
});
return;
}
createMemory({
key: key.trim(),
value: value.trim(),
});
};
const handleKeyPress = (e: React.KeyboardEvent) => {
if (e.key === 'Enter' && e.ctrlKey && hasCreateAccess) {
handleSave();
}
};
return (
<OGDialog open={open} onOpenChange={onOpenChange} triggerRef={triggerRef}>
{children}
<OGDialogTemplate
title={localize('com_ui_create_memory')}
showCloseButton={false}
className="w-11/12 md:max-w-lg"
main={
<div className="space-y-4">
<div className="space-y-2">
<Label htmlFor="memory-key" className="text-sm font-medium">
{localize('com_ui_key')}
</Label>
<Input
id="memory-key"
value={key}
onChange={(e) => setKey(e.target.value)}
onKeyDown={handleKeyPress}
placeholder={localize('com_ui_enter_key')}
className="w-full"
/>
</div>
<div className="space-y-2">
<Label htmlFor="memory-value" className="text-sm font-medium">
{localize('com_ui_value')}
</Label>
<textarea
id="memory-value"
value={value}
onChange={(e) => setValue(e.target.value)}
onKeyDown={handleKeyPress}
placeholder={localize('com_ui_enter_value')}
className="flex min-h-[80px] w-full rounded-md border border-input bg-background px-3 py-2 text-sm ring-offset-background placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
rows={3}
/>
</div>
</div>
}
buttons={
<Button
type="button"
variant="submit"
onClick={handleSave}
disabled={isLoading || !key.trim() || !value.trim()}
className="text-white"
>
{isLoading ? <Spinner className="size-4" /> : localize('com_ui_create')}
</Button>
}
/>
</OGDialog>
);
}
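The `onError` handler above maps an Axios-style error with a 409 status (or an "already exists" message) to the duplicate-key string, otherwise falling back to the server message or a generic error. The same resolution as a pure function (the `strings` parameter stands in for the localized messages):

```javascript
// Error-message resolution from MemoryCreateDialog: a 409 response or an
// "already exists" server message becomes the duplicate-key string.
function resolveCreateError(error, strings) {
  let message = strings.genericError;
  if (error && typeof error === 'object' && 'response' in error) {
    const data = error.response?.data;
    if (data?.error) {
      message = data.error;
      if (error.response?.status === 409 || message.includes('already exists')) {
        message = strings.keyExists;
      }
    }
  } else if (error?.message) {
    message = error.message;
  }
  return message;
}
```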

View file

@@ -0,0 +1,179 @@
import React, { useState, useEffect } from 'react';
import { PermissionTypes, Permissions } from 'librechat-data-provider';
import type { TUserMemory } from 'librechat-data-provider';
import { OGDialog, OGDialogTemplate, Button, Label, Input } from '~/components/ui';
import { useUpdateMemoryMutation, useMemoriesQuery } from '~/data-provider';
import { useLocalize, useHasAccess } from '~/hooks';
import { useToastContext } from '~/Providers';
import { Spinner } from '~/components/svg';
interface MemoryEditDialogProps {
memory: TUserMemory | null;
open: boolean;
onOpenChange: (open: boolean) => void;
children: React.ReactNode;
triggerRef?: React.MutableRefObject<HTMLButtonElement | null>;
}
export default function MemoryEditDialog({
memory,
open,
onOpenChange,
children,
triggerRef,
}: MemoryEditDialogProps) {
const localize = useLocalize();
const { showToast } = useToastContext();
const { data: memData } = useMemoriesQuery();
const hasUpdateAccess = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.UPDATE,
});
const { mutate: updateMemory, isLoading } = useUpdateMemoryMutation({
onMutate: () => {
onOpenChange(false);
setTimeout(() => {
triggerRef?.current?.focus();
}, 0);
},
onSuccess: () => {
showToast({
message: localize('com_ui_saved'),
status: 'success',
});
},
onError: () => {
showToast({
message: localize('com_ui_error'),
status: 'error',
});
},
});
const [key, setKey] = useState('');
const [value, setValue] = useState('');
const [originalKey, setOriginalKey] = useState('');
useEffect(() => {
if (memory) {
setKey(memory.key);
setValue(memory.value);
setOriginalKey(memory.key);
}
}, [memory]);
const handleSave = () => {
if (!hasUpdateAccess || !memory) {
return;
}
if (!key.trim() || !value.trim()) {
showToast({
message: localize('com_ui_field_required'),
status: 'error',
});
return;
}
updateMemory({
key: key.trim(),
value: value.trim(),
...(originalKey !== key.trim() && { originalKey }),
});
};
const handleKeyPress = (e: React.KeyboardEvent) => {
if (e.key === 'Enter' && e.ctrlKey && hasUpdateAccess) {
handleSave();
}
};
return (
<OGDialog open={open} onOpenChange={onOpenChange} triggerRef={triggerRef}>
{children}
<OGDialogTemplate
title={hasUpdateAccess ? localize('com_ui_edit_memory') : localize('com_ui_view_memory')}
showCloseButton={false}
className="w-11/12 md:max-w-lg"
main={
<div className="space-y-4">
{memory && (
<div className="space-y-2">
<div className="flex items-center justify-between text-xs text-text-secondary">
<div>
{localize('com_ui_date')}:{' '}
{new Date(memory.updated_at).toLocaleDateString(undefined, {
month: 'short',
day: 'numeric',
year: 'numeric',
hour: '2-digit',
minute: '2-digit',
})}
</div>
{/* Token Information */}
{memory.tokenCount !== undefined && (
<div>
{memory.tokenCount.toLocaleString()}
{memData?.tokenLimit && ` / ${memData.tokenLimit.toLocaleString()}`}{' '}
{localize('com_ui_tokens')}
</div>
)}
</div>
{/* Overall Memory Usage */}
{memData?.tokenLimit && memData?.usagePercentage !== null && (
<div className="text-xs text-text-secondary">
{localize('com_ui_usage')}: {memData.usagePercentage}%{' '}
</div>
)}
</div>
)}
<div className="space-y-2">
<Label htmlFor="memory-key" className="text-sm font-medium">
{localize('com_ui_key')}
</Label>
<Input
id="memory-key"
value={key}
onChange={(e) => hasUpdateAccess && setKey(e.target.value)}
onKeyDown={handleKeyPress}
placeholder={localize('com_ui_enter_key')}
className="w-full"
disabled={!hasUpdateAccess}
/>
</div>
<div className="space-y-2">
<Label htmlFor="memory-value" className="text-sm font-medium">
{localize('com_ui_value')}
</Label>
<textarea
id="memory-value"
value={value}
onChange={(e) => hasUpdateAccess && setValue(e.target.value)}
onKeyDown={handleKeyPress}
placeholder={localize('com_ui_enter_value')}
className="flex min-h-[80px] w-full rounded-md border border-input bg-background px-3 py-2 text-sm ring-offset-background placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
rows={3}
disabled={!hasUpdateAccess}
/>
</div>
</div>
}
buttons={
hasUpdateAccess ? (
<Button
type="button"
variant="submit"
onClick={handleSave}
disabled={isLoading || !key.trim() || !value.trim()}
className="text-white"
>
{isLoading ? <Spinner className="size-4" /> : localize('com_ui_save')}
</Button>
) : null
}
/>
</OGDialog>
);
}
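The conditional spread in `handleSave` includes `originalKey` only when the trimmed key differs from the stored one, so the API can distinguish a rename from an in-place update. A minimal sketch of that payload logic (the `buildUpdatePayload` helper is illustrative, not part of this PR):

```typescript
// Hypothetical helper mirroring the payload built in handleSave:
// originalKey is attached only when the key was actually renamed.
type UpdatePayload = { key: string; value: string; originalKey?: string };

function buildUpdatePayload(key: string, value: string, originalKey: string): UpdatePayload {
  const trimmedKey = key.trim();
  return {
    key: trimmedKey,
    value: value.trim(),
    // Same pattern as the component: spreading `false` adds no properties.
    ...(originalKey !== trimmedKey && { originalKey }),
  };
}
```

Spreading the short-circuited `false` is a no-op, which keeps the payload free of an `originalKey: undefined` entry that some backends would treat as a present-but-empty field.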


@ -0,0 +1,428 @@
/* Memories */
import { useMemo, useState, useRef, useEffect } from 'react';
import { Plus } from 'lucide-react';
import { matchSorter } from 'match-sorter';
import { SystemRoles, PermissionTypes, Permissions } from 'librechat-data-provider';
import type { TUserMemory } from 'librechat-data-provider';
import {
Table,
Input,
Label,
Button,
Switch,
TableRow,
OGDialog,
TableHead,
TableBody,
TableCell,
TableHeader,
TooltipAnchor,
OGDialogTrigger,
} from '~/components/ui';
import {
useGetUserQuery,
useMemoriesQuery,
useDeleteMemoryMutation,
useUpdateMemoryPreferencesMutation,
} from '~/data-provider';
import { useLocalize, useAuthContext, useHasAccess } from '~/hooks';
import OGDialogTemplate from '~/components/ui/OGDialogTemplate';
import { EditIcon, TrashIcon } from '~/components/svg';
import MemoryCreateDialog from './MemoryCreateDialog';
import MemoryEditDialog from './MemoryEditDialog';
import Spinner from '~/components/svg/Spinner';
import { useToastContext } from '~/Providers';
import AdminSettings from './AdminSettings';
export default function MemoryViewer() {
const localize = useLocalize();
const { user } = useAuthContext();
const { data: userData } = useGetUserQuery();
const { data: memData, isLoading } = useMemoriesQuery();
const { mutate: deleteMemory } = useDeleteMemoryMutation();
const { showToast } = useToastContext();
const [pageIndex, setPageIndex] = useState(0);
const [searchQuery, setSearchQuery] = useState('');
const pageSize = 10;
const [createDialogOpen, setCreateDialogOpen] = useState(false);
const [deletingKey, setDeletingKey] = useState<string | null>(null);
const [referenceSavedMemories, setReferenceSavedMemories] = useState(true);
const updateMemoryPreferencesMutation = useUpdateMemoryPreferencesMutation({
onSuccess: () => {
showToast({
message: localize('com_ui_preferences_updated'),
status: 'success',
});
},
onError: () => {
showToast({
message: localize('com_ui_error_updating_preferences'),
status: 'error',
});
setReferenceSavedMemories((prev) => !prev);
},
});
useEffect(() => {
if (userData?.personalization?.memories !== undefined) {
setReferenceSavedMemories(userData.personalization.memories);
}
}, [userData?.personalization?.memories]);
const handleMemoryToggle = (checked: boolean) => {
setReferenceSavedMemories(checked);
updateMemoryPreferencesMutation.mutate({ memories: checked });
};
const hasReadAccess = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.READ,
});
const hasUpdateAccess = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.UPDATE,
});
const hasCreateAccess = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.CREATE,
});
const hasOptOutAccess = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.OPT_OUT,
});
const memories: TUserMemory[] = useMemo(() => memData?.memories ?? [], [memData]);
const filteredMemories = useMemo(() => {
return matchSorter(memories, searchQuery, {
keys: ['key', 'value'],
});
}, [memories, searchQuery]);
const currentRows = useMemo(() => {
return filteredMemories.slice(pageIndex * pageSize, (pageIndex + 1) * pageSize);
}, [filteredMemories, pageIndex]);
const getProgressBarColor = (percentage: number): string => {
if (percentage > 90) {
return 'stroke-red-500';
}
if (percentage > 75) {
return 'stroke-yellow-500';
}
return 'stroke-green-500';
};
const EditMemoryButton = ({ memory }: { memory: TUserMemory }) => {
const [open, setOpen] = useState(false);
const triggerRef = useRef<HTMLDivElement>(null);
const handleKeyDown = (event: React.KeyboardEvent<HTMLDivElement>) => {
if (event.key === 'Enter' || event.key === ' ') {
event.preventDefault();
setOpen(!open);
}
};
// Only show edit button if user has UPDATE permission
if (!hasUpdateAccess) {
return null;
}
return (
<MemoryEditDialog
open={open}
memory={memory}
onOpenChange={setOpen}
triggerRef={triggerRef as React.MutableRefObject<HTMLButtonElement | null>}
>
<OGDialogTrigger asChild>
<TooltipAnchor
ref={triggerRef}
role="button"
aria-label={localize('com_ui_edit')}
description={localize('com_ui_edit')}
tabIndex={0}
onClick={() => setOpen(!open)}
className="flex size-7 items-center justify-center rounded-lg transition-colors duration-200 hover:bg-surface-hover"
onKeyDown={handleKeyDown}
>
<EditIcon />
</TooltipAnchor>
</OGDialogTrigger>
</MemoryEditDialog>
);
};
const DeleteMemoryButton = ({ memory }: { memory: TUserMemory }) => {
const [open, setOpen] = useState(false);
const handleKeyDown = (event: React.KeyboardEvent<HTMLDivElement>) => {
if (event.key === 'Enter' || event.key === ' ') {
event.preventDefault();
event.stopPropagation();
setOpen(!open);
}
};
if (!hasUpdateAccess) {
return null;
}
const confirmDelete = async () => {
setDeletingKey(memory.key);
deleteMemory(memory.key, {
onSuccess: () => {
showToast({
message: localize('com_ui_deleted'),
status: 'success',
});
setOpen(false);
},
onError: () =>
showToast({
message: localize('com_ui_error'),
status: 'error',
}),
onSettled: () => setDeletingKey(null),
});
};
return (
<OGDialog open={open} onOpenChange={setOpen}>
<OGDialogTrigger asChild>
<TooltipAnchor
role="button"
aria-label={localize('com_ui_delete')}
description={localize('com_ui_delete')}
className="flex size-7 items-center justify-center rounded-lg transition-colors duration-200 hover:bg-surface-hover"
tabIndex={0}
onClick={() => setOpen(!open)}
onKeyDown={handleKeyDown}
>
{deletingKey === memory.key ? (
<Spinner className="size-4 animate-spin" />
) : (
<TrashIcon className="size-4" />
)}
</TooltipAnchor>
</OGDialogTrigger>
<OGDialogTemplate
showCloseButton={false}
title={localize('com_ui_delete_memory')}
className="w-11/12 max-w-lg"
main={
<Label className="text-left text-sm font-medium">
{localize('com_ui_delete_confirm')} &quot;{memory.key}&quot;?
</Label>
}
selection={{
selectHandler: confirmDelete,
selectClasses:
'bg-red-700 dark:bg-red-600 hover:bg-red-800 dark:hover:bg-red-800 text-white',
selectText: localize('com_ui_delete'),
}}
/>
</OGDialog>
);
};
if (isLoading) {
return (
<div className="flex h-full w-full items-center justify-center p-4">
<Spinner />
</div>
);
}
if (!hasReadAccess) {
return (
<div className="flex h-full w-full items-center justify-center p-4">
<div className="text-center">
<p className="text-sm text-text-secondary">{localize('com_ui_no_read_access')}</p>
</div>
</div>
);
}
return (
<div className="flex h-full w-full flex-col overflow-hidden">
<div role="region" aria-label={localize('com_ui_memories')} className="mt-2 space-y-2">
<div className="flex items-center gap-4">
<Input
placeholder={localize('com_ui_memories_filter')}
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
aria-label={localize('com_ui_memories_filter')}
/>
</div>
{/* Memory Usage and Toggle Display */}
{(memData?.tokenLimit || hasOptOutAccess) && (
<div className="flex items-center justify-between rounded-lg">
{/* Usage Display */}
{memData?.tokenLimit && (
<div className="flex items-center gap-2">
<div className="relative size-10">
<svg className="size-10 -rotate-90 transform">
<circle
cx="20"
cy="20"
r="16"
stroke="currentColor"
strokeWidth="3"
fill="none"
className="text-gray-200 dark:text-gray-700"
/>
<circle
cx="20"
cy="20"
r="16"
strokeWidth="3"
fill="none"
strokeDasharray={`${2 * Math.PI * 16}`}
strokeDashoffset={`${2 * Math.PI * 16 * (1 - (memData.usagePercentage ?? 0) / 100)}`}
className={`transition-all ${getProgressBarColor(memData.usagePercentage ?? 0)}`}
strokeLinecap="round"
/>
</svg>
<div className="absolute inset-0 flex items-center justify-center">
<span className="text-xs font-medium">{memData.usagePercentage}%</span>
</div>
</div>
<div className="text-sm text-text-secondary">{localize('com_ui_usage')}</div>
</div>
)}
{/* Memory Toggle */}
{hasOptOutAccess && (
<div className="flex items-center gap-2 text-xs">
<span>{localize('com_ui_use_memory')}</span>
<Switch
checked={referenceSavedMemories}
onCheckedChange={handleMemoryToggle}
aria-label={localize('com_ui_reference_saved_memories')}
disabled={updateMemoryPreferencesMutation.isLoading}
/>
</div>
)}
</div>
)}
{/* Create Memory Button */}
{hasCreateAccess && (
<div className="flex w-full justify-end">
<MemoryCreateDialog open={createDialogOpen} onOpenChange={setCreateDialogOpen}>
<OGDialogTrigger asChild>
<Button variant="outline" className="w-full bg-transparent">
<Plus className="size-4" aria-hidden />
{localize('com_ui_create_memory')}
</Button>
</OGDialogTrigger>
</MemoryCreateDialog>
</div>
)}
<div className="rounded-lg border border-border-light bg-transparent shadow-sm transition-colors">
<Table className="w-full table-fixed">
<TableHeader>
<TableRow className="border-b border-border-light hover:bg-surface-secondary">
<TableHead
className={`${
hasUpdateAccess ? 'w-[75%]' : 'w-[100%]'
} bg-surface-secondary py-3 text-left text-sm font-medium text-text-secondary`}
>
<div>{localize('com_ui_memory')}</div>
</TableHead>
{hasUpdateAccess && (
<TableHead className="w-[25%] bg-surface-secondary py-3 text-center text-sm font-medium text-text-secondary">
<div>{localize('com_assistants_actions')}</div>
</TableHead>
)}
</TableRow>
</TableHeader>
<TableBody>
{currentRows.length ? (
currentRows.map((memory: TUserMemory, idx: number) => (
<TableRow
key={idx}
className="border-b border-border-light hover:bg-surface-secondary"
>
<TableCell className={`${hasUpdateAccess ? 'w-[75%]' : 'w-[100%]'} px-4 py-4`}>
<div
className="overflow-hidden text-ellipsis whitespace-nowrap text-sm text-text-primary"
title={memory.value}
>
{memory.value}
</div>
</TableCell>
{hasUpdateAccess && (
<TableCell className="w-[25%] px-4 py-4">
<div className="flex justify-center gap-2">
<EditMemoryButton memory={memory} />
<DeleteMemoryButton memory={memory} />
</div>
</TableCell>
)}
</TableRow>
))
) : (
<TableRow>
<TableCell
colSpan={hasUpdateAccess ? 2 : 1}
className="h-24 text-center text-sm text-text-secondary"
>
{localize('com_ui_no_data')}
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</div>
{/* Pagination controls */}
{filteredMemories.length > pageSize && (
<div
className="flex items-center justify-end gap-2"
role="navigation"
aria-label="Pagination"
>
<Button
variant="outline"
size="sm"
onClick={() => setPageIndex((prev) => Math.max(prev - 1, 0))}
disabled={pageIndex === 0}
aria-label={localize('com_ui_prev')}
>
{localize('com_ui_prev')}
</Button>
<div className="text-sm" aria-live="polite">
{`${pageIndex + 1} / ${Math.ceil(filteredMemories.length / pageSize)}`}
</div>
<Button
variant="outline"
size="sm"
onClick={() =>
setPageIndex((prev) =>
(prev + 1) * pageSize < filteredMemories.length ? prev + 1 : prev,
)
}
disabled={(pageIndex + 1) * pageSize >= filteredMemories.length}
aria-label={localize('com_ui_next')}
>
{localize('com_ui_next')}
</Button>
</div>
)}
{/* Admin Settings */}
{user?.role === SystemRoles.ADMIN && (
<div className="mt-4">
<AdminSettings />
</div>
)}
</div>
</div>
);
}


@ -0,0 +1,2 @@
export { default as MemoryViewer } from './MemoryViewer';
export { default as MemoryEditDialog } from './MemoryEditDialog';


@ -0,0 +1,19 @@
export default function PersonalizationIcon({ className = '' }: { className?: string }) {
return (
<svg
width="24"
height="24"
viewBox="0 0 24 24"
fill="none"
xmlns="http://www.w3.org/2000/svg"
className={`icon-sm ${className}`}
>
<path
fillRule="evenodd"
clipRule="evenodd"
d="M12 4C10.3431 4 9 5.34315 9 7C9 8.65685 10.3431 10 12 10C13.6569 10 15 8.65685 15 7C15 5.34315 13.6569 4 12 4ZM7 7C7 4.23858 9.23858 2 12 2C14.7614 2 17 4.23858 17 7C17 9.76142 14.7614 12 12 12C9.23858 12 7 9.76142 7 7ZM19.0277 15.6255C18.6859 15.5646 18.1941 15.6534 17.682 16.1829C17.4936 16.3777 17.2342 16.4877 16.9632 16.4877C16.6922 16.4877 16.4328 16.3777 16.2444 16.1829C15.7322 15.6534 15.2405 15.5646 14.8987 15.6255C14.5381 15.6897 14.2179 15.9384 14.0623 16.3275C13.8048 16.9713 13.9014 18.662 16.9632 20.4617C20.0249 18.662 20.1216 16.9713 19.864 16.3275C19.7084 15.9384 19.3882 15.6897 19.0277 15.6255ZM21.721 15.5847C22.5748 17.7191 21.2654 20.429 17.437 22.4892C17.1412 22.6484 16.7852 22.6484 16.4893 22.4892C12.6609 20.4291 11.3516 17.7191 12.2053 15.5847C12.6117 14.5689 13.4917 13.8446 14.5481 13.6565C15.3567 13.5125 16.2032 13.6915 16.9632 14.1924C17.7232 13.6915 18.5697 13.5125 19.3783 13.6565C20.4347 13.8446 21.3147 14.5689 21.721 15.5847ZM9.92597 14.2049C10.1345 14.7163 9.889 15.2999 9.3776 15.5084C7.06131 16.453 5.5 18.5813 5.5 20.9999C5.5 21.5522 5.05228 21.9999 4.5 21.9999C3.94772 21.9999 3.5 21.5522 3.5 20.9999C3.5 17.6777 5.641 14.8723 8.6224 13.6565C9.1338 13.448 9.71743 13.6935 9.92597 14.2049Z"
fill="currentColor"
/>
</svg>
);
}


@ -61,3 +61,4 @@ export { default as BedrockIcon } from './BedrockIcon';
export { default as ThumbUpIcon } from './ThumbUpIcon';
export { default as ThumbDownIcon } from './ThumbDownIcon';
export { default as XAIcon } from './XAIcon';
export { default as PersonalizationIcon } from './PersonalizationIcon';


@ -4,7 +4,7 @@ import { X } from 'lucide-react';
import { cn } from '~/utils';
interface OGDialogProps extends DialogPrimitive.DialogProps {
triggerRef?: React.RefObject<HTMLButtonElement | HTMLInputElement>;
triggerRef?: React.RefObject<HTMLButtonElement | HTMLInputElement | null>;
}
const Dialog = React.forwardRef<HTMLDivElement, OGDialogProps>(


@ -32,7 +32,7 @@ const TableFooter = React.forwardRef<
>(({ className, ...props }, ref) => (
<tfoot
ref={ref}
className={cn('bg-muted/50 border-t font-medium [&>tr]:last:border-b-0', className)}
className={cn('border-t bg-muted/50 font-medium [&>tr]:last:border-b-0', className)}
{...props}
/>
));
@ -43,7 +43,7 @@ const TableRow = React.forwardRef<HTMLTableRowElement, React.HTMLAttributes<HTML
<tr
ref={ref}
className={cn(
'hover:bg-muted/50 data-[state=selected]:bg-muted border-b border-border-light transition-colors',
'border-b border-border-light transition-colors hover:bg-muted/50 data-[state=selected]:bg-muted',
className,
)}
{...props}
@ -59,7 +59,7 @@ const TableHead = React.forwardRef<
<th
ref={ref}
className={cn(
'text-muted-foreground h-12 px-4 text-left align-middle font-medium [&:has([role=checkbox])]:pr-0',
'h-12 px-4 text-left align-middle font-medium text-muted-foreground [&:has([role=checkbox])]:pr-0',
className,
)}
{...props}
@ -83,7 +83,7 @@ const TableCaption = React.forwardRef<
HTMLTableCaptionElement,
React.HTMLAttributes<HTMLTableCaptionElement>
>(({ className, ...props }, ref) => (
<caption ref={ref} className={cn('text-muted-foreground mt-4 text-sm', className)} {...props} />
<caption ref={ref} className={cn('mt-4 text-sm text-muted-foreground', className)} {...props} />
));
TableCaption.displayName = 'TableCaption';


@ -0,0 +1,2 @@
/* Memories */
export * from './queries';


@ -0,0 +1,116 @@
/* Memories */
import { QueryKeys, MutationKeys, dataService } from 'librechat-data-provider';
import { useQuery, useQueryClient, useMutation } from '@tanstack/react-query';
import type {
UseQueryOptions,
UseMutationOptions,
QueryObserverResult,
} from '@tanstack/react-query';
import type { TUserMemory, MemoriesResponse } from 'librechat-data-provider';
export const useMemoriesQuery = (
config?: UseQueryOptions<MemoriesResponse>,
): QueryObserverResult<MemoriesResponse> => {
return useQuery<MemoriesResponse>([QueryKeys.memories], () => dataService.getMemories(), {
refetchOnWindowFocus: false,
refetchOnReconnect: false,
refetchOnMount: false,
...config,
});
};
export const useDeleteMemoryMutation = () => {
const queryClient = useQueryClient();
return useMutation((key: string) => dataService.deleteMemory(key), {
onSuccess: () => {
queryClient.invalidateQueries([QueryKeys.memories]);
},
});
};
export type UpdateMemoryParams = { key: string; value: string; originalKey?: string };
export const useUpdateMemoryMutation = (
options?: UseMutationOptions<TUserMemory, Error, UpdateMemoryParams>,
) => {
const queryClient = useQueryClient();
return useMutation(
({ key, value, originalKey }: UpdateMemoryParams) =>
dataService.updateMemory(key, value, originalKey),
{
...options,
onSuccess: (...params) => {
queryClient.invalidateQueries([QueryKeys.memories]);
options?.onSuccess?.(...params);
},
},
);
};
export type UpdateMemoryPreferencesParams = { memories: boolean };
export type UpdateMemoryPreferencesResponse = {
updated: boolean;
preferences: { memories: boolean };
};
export const useUpdateMemoryPreferencesMutation = (
options?: UseMutationOptions<
UpdateMemoryPreferencesResponse,
Error,
UpdateMemoryPreferencesParams
>,
) => {
const queryClient = useQueryClient();
return useMutation<UpdateMemoryPreferencesResponse, Error, UpdateMemoryPreferencesParams>(
[MutationKeys.updateMemoryPreferences],
(preferences: UpdateMemoryPreferencesParams) =>
dataService.updateMemoryPreferences(preferences),
{
...options,
onSuccess: (...params) => {
queryClient.invalidateQueries([QueryKeys.user]);
options?.onSuccess?.(...params);
},
},
);
};
export type CreateMemoryParams = { key: string; value: string };
export type CreateMemoryResponse = { created: boolean; memory: TUserMemory };
export const useCreateMemoryMutation = (
options?: UseMutationOptions<CreateMemoryResponse, Error, CreateMemoryParams>,
) => {
const queryClient = useQueryClient();
return useMutation<CreateMemoryResponse, Error, CreateMemoryParams>(
({ key, value }: CreateMemoryParams) => dataService.createMemory({ key, value }),
{
...options,
onSuccess: (data, variables, context) => {
queryClient.setQueryData<MemoriesResponse>([QueryKeys.memories], (oldData) => {
if (!oldData) return oldData;
const newMemories = [...oldData.memories, data.memory];
const totalTokens = newMemories.reduce(
(sum, memory) => sum + (memory.tokenCount || 0),
0,
);
const tokenLimit = oldData.tokenLimit;
let usagePercentage = oldData.usagePercentage;
if (tokenLimit && tokenLimit > 0) {
usagePercentage = Math.min(100, Math.round((totalTokens / tokenLimit) * 100));
}
return {
...oldData,
memories: newMemories,
totalTokens,
usagePercentage,
};
});
options?.onSuccess?.(data, variables, context);
},
},
);
};
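Rather than refetching after a create, the `onSuccess` handler of `useCreateMemoryMutation` patches the cached `MemoriesResponse` directly. The recomputation it performs can be sketched as a pure function (type and function names here are illustrative, not exported by `librechat-data-provider`):

```typescript
type Memory = { key: string; value: string; tokenCount?: number };
type Cache = {
  memories: Memory[];
  totalTokens: number;
  tokenLimit?: number;
  usagePercentage: number | null;
};

// Append the new memory, then re-derive totals from scratch so the cached
// aggregate can never drift from the sum of its parts.
function appendMemory(cache: Cache, memory: Memory): Cache {
  const memories = [...cache.memories, memory];
  const totalTokens = memories.reduce((sum, m) => sum + (m.tokenCount || 0), 0);
  const usagePercentage =
    cache.tokenLimit && cache.tokenLimit > 0
      ? Math.min(100, Math.round((totalTokens / cache.tokenLimit) * 100))
      : cache.usagePercentage;
  return { ...cache, memories, totalTokens, usagePercentage };
}
```

Recomputing `totalTokens` from the full list, instead of adding the delta, trades a little work for immunity to a stale running total already in the cache.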


@ -0,0 +1,19 @@
import { dataService as _dataService } from 'librechat-data-provider';
import axios from 'axios';
jest.mock('axios');
const mockedAxios = axios as jest.Mocked<typeof axios>;
describe('getMemories', () => {
it('should fetch memories from /api/memories', async () => {
const mockData = [{ key: 'foo', value: 'bar', updated_at: '2024-05-01T00:00:00Z' }];
mockedAxios.get.mockResolvedValueOnce({ data: mockData } as any);
const result = await (_dataService as any).getMemories();
expect(mockedAxios.get).toHaveBeenCalledWith('/api/memories', expect.any(Object));
expect(result).toEqual(mockData);
});
});


@ -2,6 +2,8 @@ export * from './Auth';
export * from './Agents';
export * from './Endpoints';
export * from './Files';
/* Memories */
export * from './Memories';
export * from './Messages';
export * from './Misc';
export * from './Tools';


@ -1,10 +1,15 @@
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
import {
QueryKeys,
dataService,
promptPermissionsSchema,
memoryPermissionsSchema,
} from 'librechat-data-provider';
import type {
UseQueryOptions,
UseMutationResult,
QueryObserverResult,
UseQueryOptions,
} from '@tanstack/react-query';
import { QueryKeys, dataService, promptPermissionsSchema } from 'librechat-data-provider';
import type * as t from 'librechat-data-provider';
export const useGetRole = (
@ -91,3 +96,39 @@ export const useUpdateAgentPermissionsMutation = (
},
);
};
export const useUpdateMemoryPermissionsMutation = (
options?: t.UpdateMemoryPermOptions,
): UseMutationResult<
t.UpdatePermResponse,
t.TError | undefined,
t.UpdateMemoryPermVars,
unknown
> => {
const queryClient = useQueryClient();
const { onMutate, onSuccess, onError } = options ?? {};
return useMutation(
(variables) => {
memoryPermissionsSchema.partial().parse(variables.updates);
return dataService.updateMemoryPermissions(variables);
},
{
onSuccess: (data, variables, context) => {
queryClient.invalidateQueries([QueryKeys.roles, variables.roleName]);
if (onSuccess) {
onSuccess(data, variables, context);
}
},
onError: (...args) => {
const error = args[0];
if (error != null) {
console.error('Failed to update memory permissions:', error);
}
if (onError) {
onError(...args);
}
},
onMutate,
},
);
};


@ -1,5 +1,5 @@
import { useMemo } from 'react';
import { MessageSquareQuote, ArrowRightToLine, Settings2, Bookmark } from 'lucide-react';
import { MessageSquareQuote, ArrowRightToLine, Settings2, Database, Bookmark } from 'lucide-react';
import {
isAssistantsEndpoint,
isAgentsEndpoint,
@ -12,6 +12,7 @@ import type { TInterfaceConfig, TEndpointsConfig } from 'librechat-data-provider
import type { NavLink } from '~/common';
import AgentPanelSwitch from '~/components/SidePanel/Agents/AgentPanelSwitch';
import BookmarkPanel from '~/components/SidePanel/Bookmarks/BookmarkPanel';
import MemoryViewer from '~/components/SidePanel/Memories/MemoryViewer';
import PanelSwitch from '~/components/SidePanel/Builder/PanelSwitch';
import PromptsAccordion from '~/components/Prompts/PromptsAccordion';
import Parameters from '~/components/SidePanel/Parameters/Panel';
@ -42,6 +43,14 @@ export default function useSideNavLinks({
permissionType: PermissionTypes.BOOKMARKS,
permission: Permissions.USE,
});
const hasAccessToMemories = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.USE,
});
const hasAccessToReadMemories = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.READ,
});
const hasAccessToAgents = useHasAccess({
permissionType: PermissionTypes.AGENTS,
permission: Permissions.USE,
@ -97,6 +106,16 @@ export default function useSideNavLinks({
});
}
if (hasAccessToMemories && hasAccessToReadMemories) {
links.push({
title: 'com_ui_memories',
label: '',
icon: Database,
id: 'memories',
Component: MemoryViewer,
});
}
if (
interfaceConfig.parameters === true &&
isParamEndpoint(endpoint ?? '', endpointType ?? '') === true &&
@ -147,6 +166,8 @@ export default function useSideNavLinks({
endpoint,
hasAccessToAgents,
hasAccessToPrompts,
hasAccessToMemories,
hasAccessToReadMemories,
hasAccessToBookmarks,
hasAccessToCreateAgents,
hidePanel,


@ -1,7 +1,8 @@
import { useSetRecoilState } from 'recoil';
import { QueryKeys } from 'librechat-data-provider';
import type { QueryClient } from '@tanstack/react-query';
import type { TAttachment, EventSubmission } from 'librechat-data-provider';
import { QueryKeys, Tools } from 'librechat-data-provider';
import type { TAttachment, EventSubmission, MemoriesResponse } from 'librechat-data-provider';
import { handleMemoryArtifact } from '~/utils/memory';
import store from '~/store';
export default function useAttachmentHandler(queryClient?: QueryClient) {
@ -16,6 +17,18 @@ export default function useAttachmentHandler(queryClient?: QueryClient) {
});
}
if (queryClient && data.type === Tools.memory && data[Tools.memory]) {
const memoryArtifact = data[Tools.memory];
queryClient.setQueryData([QueryKeys.memories], (oldData: MemoriesResponse | undefined) => {
if (!oldData) {
return oldData;
}
return handleMemoryArtifact({ memoryArtifact, currentData: oldData }) || oldData;
});
}
setAttachmentsMap((prevMap) => {
const messageAttachments =
(prevMap as Record<string, TAttachment[] | undefined>)[messageId] || [];


@ -0,0 +1,16 @@
import { PermissionTypes, Permissions } from 'librechat-data-provider';
import useHasAccess from './Roles/useHasAccess';
export default function usePersonalizationAccess() {
const hasMemoryOptOut = useHasAccess({
permissionType: PermissionTypes.MEMORIES,
permission: Permissions.OPT_OUT,
});
const hasAnyPersonalizationFeature = hasMemoryOptOut;
return {
hasMemoryOptOut,
hasAnyPersonalizationFeature,
};
}


@ -446,6 +446,7 @@
"com_nav_setting_data": "Data controls",
"com_nav_setting_general": "General",
"com_nav_setting_speech": "Speech",
"com_nav_setting_personalization": "Personalization",
"com_nav_settings": "Settings",
"com_nav_shared_links": "Shared links",
"com_nav_show_code": "Always show code when using code interpreter",
@ -659,10 +660,12 @@
"com_ui_delete_confirm": "This will delete",
"com_ui_delete_confirm_prompt_version_var": "This will delete the selected version for \"{{0}}.\" If no other versions exist, the prompt will be deleted.",
"com_ui_delete_conversation": "Delete chat?",
"com_ui_delete_memory": "Delete Memory",
"com_ui_delete_prompt": "Delete Prompt?",
"com_ui_delete_shared_link": "Delete shared link?",
"com_ui_delete_tool": "Delete Tool",
"com_ui_delete_tool_confirm": "Are you sure you want to delete this tool?",
"com_ui_deleted": "Deleted",
"com_ui_descending": "Desc",
"com_ui_description": "Description",
"com_ui_description_placeholder": "Optional: Enter a description to display for the prompt",
@ -770,6 +773,7 @@
"com_ui_include_shadcnui_agent": "Include shadcn/ui instructions",
"com_ui_input": "Input",
"com_ui_instructions": "Instructions",
"com_ui_key": "Key",
"com_ui_late_night": "Happy late night",
"com_ui_latest_footer": "Every AI for Everyone.",
"com_ui_latest_production_version": "Latest production version",
@ -783,6 +787,17 @@
"com_ui_manage": "Manage",
"com_ui_max_tags": "Maximum number allowed is {{0}}, using latest values.",
"com_ui_mcp_servers": "MCP Servers",
"com_ui_memories": "Memories",
"com_ui_memories_filter": "Filter memories...",
"com_ui_memories_allow_use": "Allow using Memories",
"com_ui_memories_allow_create": "Allow creating Memories",
"com_ui_memories_allow_update": "Allow updating Memories",
"com_ui_memories_allow_read": "Allow reading Memories",
"com_ui_memories_allow_opt_out": "Allow users to opt out of Memories",
"com_ui_memory": "Memory",
"com_ui_usage": "Usage",
"com_ui_current": "Current",
"com_ui_tokens": "tokens",
"com_ui_mention": "Mention an endpoint, assistant, or preset to quickly switch to it",
"com_ui_min_tags": "Cannot remove more values, a minimum of {{0}} are required.",
"com_ui_misc": "Misc.",
@ -800,7 +815,7 @@
"com_ui_no_bookmarks": "it seems like you have no bookmarks yet. Click on a chat and add a new one",
"com_ui_no_category": "No category",
"com_ui_no_changes": "No changes to update",
"com_ui_no_data": "something needs to go here. was empty",
"com_ui_no_data": "No data available",
"com_ui_no_terms_content": "No terms and conditions content to display",
"com_ui_no_valid_items": "something needs to go here. was empty",
"com_ui_none": "None",
@ -944,6 +959,8 @@
"com_ui_version_var": "Version {{0}}",
"com_ui_versions": "Versions",
"com_ui_view_source": "View source chat",
"com_ui_view_memory": "View Memory",
"com_ui_no_read_access": "You don't have permission to view memories",
"com_ui_web_search": "Web Search",
"com_ui_web_search_api_subtitle": "Search the web for up-to-date information",
"com_ui_web_search_cohere_key": "Enter Cohere API Key",
@ -970,5 +987,23 @@
"com_ui_yes": "Yes",
"com_ui_zoom": "Zoom",
"com_user_message": "You",
"com_warning_resubmit_unsupported": "Resubmitting the AI message is not supported for this endpoint."
"com_warning_resubmit_unsupported": "Resubmitting the AI message is not supported for this endpoint.",
"com_ui_value": "Value",
"com_ui_edit_memory": "Edit Memory",
"com_ui_enter_key": "Enter key",
"com_ui_enter_value": "Enter value",
"com_ui_memory_updated": "Updated saved memory",
"com_ui_memory_updated_items": "Updated Memories",
"com_ui_memory_deleted_items": "Deleted Memories",
"com_ui_memory_deleted": "Memory deleted",
"com_ui_reference_saved_memories": "Reference saved memories",
"com_ui_reference_saved_memories_description": "Allow the assistant to reference and use your saved memories when responding",
"com_ui_no_personalization_available": "No personalization options are currently available",
"com_ui_preferences_updated": "Preferences updated successfully",
"com_ui_error_updating_preferences": "Error updating preferences",
"com_ui_use_memory": "Use memory",
"com_ui_create_memory": "Create Memory",
"com_ui_memory_created": "Memory created successfully",
"com_ui_memory_key_exists": "A memory with this key already exists. Please use a different key."
}


@ -0,0 +1,90 @@
import type { MemoriesResponse, TUserMemory, MemoryArtifact } from 'librechat-data-provider';
type HandleMemoryArtifactParams = {
memoryArtifact: MemoryArtifact;
currentData: MemoriesResponse;
};
/**
* Pure function to handle memory artifact updates
* @param params - Object containing memoryArtifact and currentData
* @returns Updated MemoriesResponse or undefined if no update needed
*/
export function handleMemoryArtifact({
memoryArtifact,
currentData,
}: HandleMemoryArtifactParams): MemoriesResponse | undefined {
const { type, key, value, tokenCount = 0 } = memoryArtifact;
if (type === 'update' && !value) {
return undefined;
}
const memories = currentData.memories;
const existingIndex = memories.findIndex((m) => m.key === key);
if (type === 'delete') {
if (existingIndex === -1) {
return undefined;
}
const deletedMemory = memories[existingIndex];
const newMemories = [...memories];
newMemories.splice(existingIndex, 1);
const totalTokens = currentData.totalTokens - (deletedMemory.tokenCount || 0);
const usagePercentage = currentData.tokenLimit
? Math.min(100, Math.round((totalTokens / currentData.tokenLimit) * 100))
: null;
return {
...currentData,
memories: newMemories,
totalTokens,
usagePercentage,
};
}
if (type === 'update') {
const timestamp = new Date().toISOString();
let totalTokens = currentData.totalTokens;
let newMemories: TUserMemory[];
if (existingIndex >= 0) {
const oldTokenCount = memories[existingIndex].tokenCount || 0;
totalTokens = totalTokens - oldTokenCount + tokenCount;
newMemories = [...memories];
newMemories[existingIndex] = {
key,
value: value!,
tokenCount,
updated_at: timestamp,
};
} else {
totalTokens = totalTokens + tokenCount;
newMemories = [
...memories,
{
key,
value: value!,
tokenCount,
updated_at: timestamp,
},
];
}
const usagePercentage = currentData.tokenLimit
? Math.min(100, Math.round((totalTokens / currentData.tokenLimit) * 100))
: null;
return {
...currentData,
memories: newMemories,
totalTokens,
usagePercentage,
};
}
return undefined;
}
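The `delete` branch of `handleMemoryArtifact` above pairs two invariants: the removed memory's `tokenCount` is subtracted from the running total, and the percentage is re-derived, capped at 100 and `null` when no limit is configured. A self-contained sketch of that accounting (simplified types, illustrative `removeMemory` name):

```typescript
type Mem = { key: string; tokenCount?: number };

// Illustrative extraction of the 'delete' branch: drop the matching memory,
// subtract its tokens, and recompute the capped usage percentage.
function removeMemory(memories: Mem[], totalTokens: number, key: string, tokenLimit?: number) {
  const idx = memories.findIndex((m) => m.key === key);
  if (idx === -1) {
    return undefined; // mirrors the early return for unknown keys
  }
  const next = memories.filter((_, i) => i !== idx);
  const tokens = totalTokens - (memories[idx].tokenCount || 0);
  const usage = tokenLimit ? Math.min(100, Math.round((tokens / tokenLimit) * 100)) : null;
  return { memories: next, totalTokens: tokens, usagePercentage: usage };
}
```

Returning `undefined` for an unknown key lets the caller (here, `setQueryData`) fall back to the existing cache entry untouched, the same contract the real function documents.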


@ -1,3 +1,5 @@
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-nocheck
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');
@ -10,7 +12,7 @@ const directories = [
rootDir,
path.resolve(rootDir, 'packages', 'data-provider'),
path.resolve(rootDir, 'packages', 'data-schemas'),
path.resolve(rootDir, 'packages', 'mcp'),
path.resolve(rootDir, 'packages', 'api'),
path.resolve(rootDir, 'client'),
path.resolve(rootDir, 'api'),
];

View file

@ -1,3 +1,5 @@
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-nocheck
const path = require('path');
const { execSync } = require('child_process');
const { askQuestion, isDockerRunning, deleteNodeModules, silentExit } = require('./helpers');
@ -17,7 +19,7 @@ const directories = [
rootDir,
path.resolve(rootDir, 'packages', 'data-provider'),
path.resolve(rootDir, 'packages', 'data-schemas'),
path.resolve(rootDir, 'packages', 'mcp'),
path.resolve(rootDir, 'packages', 'api'),
path.resolve(rootDir, 'client'),
path.resolve(rootDir, 'api'),
];
@ -128,7 +130,7 @@ async function validateDockerRunning() {
console.green('Your LibreChat app is now up to date! Start the app with the following command:');
console.purple(startCommand);
console.orange(
'Note: it\'s also recommended to clear your browser cookies and localStorage for LibreChat to assure a fully clean installation.',
"Note: it's also recommended to clear your browser cookies and localStorage for LibreChat to assure a fully clean installation.",
);
console.orange('Also: Don\'t worry, your data is safe :)');
console.orange("Also: Don't worry, your data is safe :)");
})();

View file

@ -31,9 +31,9 @@ export default [
'client/public/**/*',
'client/coverage/**/*',
'e2e/playwright-report/**/*',
'packages/mcp/types/**/*',
'packages/mcp/dist/**/*',
'packages/mcp/test_bundle/**/*',
'packages/api/types/**/*',
'packages/api/dist/**/*',
'packages/api/test_bundle/**/*',
'api/demo/**/*',
'packages/data-provider/types/**/*',
'packages/data-provider/dist/**/*',
@ -317,7 +317,7 @@ export default [
files: ['./api/demo/**/*.ts'],
},
{
files: ['./packages/mcp/**/*.ts'],
files: ['./packages/api/**/*.ts'],
},
{
files: ['./config/translations/**/*.ts'],
@ -351,12 +351,12 @@ export default [
},
},
{
files: ['./packages/mcp/specs/**/*.ts'],
files: ['./packages/api/specs/**/*.ts'],
languageOptions: {
ecmaVersion: 5,
sourceType: 'script',
parserOptions: {
project: './packages/mcp/tsconfig.spec.json',
project: './packages/api/tsconfig.spec.json',
},
},
},

View file

@ -299,3 +299,25 @@ endpoints:
# px: 1024
# # See the Custom Configuration Guide for more information on Assistants Config:
# # https://www.librechat.ai/docs/configuration/librechat_yaml/object_structure/assistants_endpoint
# Memory configuration for user memories
# memory:
# # (optional) Disable memory functionality
# disabled: false
# # (optional) Restrict memory keys to specific values to limit memory storage and improve consistency
# validKeys: ["preferences", "work_info", "personal_info", "skills", "interests", "context"]
# # (optional) Maximum token limit for memory storage (not yet implemented for token counting)
# tokenLimit: 10000
# # (optional) Enable personalization features (defaults to true if memory is configured)
# # When false, users will not see the Personalization tab in settings
# personalize: true
# # Memory agent configuration - either use an existing agent by ID or define inline
# agent:
# # Option 1: Use existing agent by ID
# id: "your-memory-agent-id"
# # Option 2: Define agent inline
# # provider: "openai"
# # model: "gpt-4o-mini"
# # instructions: "You are a memory management assistant. Store and manage user information accurately."
# # model_parameters:
# # temperature: 0.1

381
package-lock.json generated
View file

@ -65,6 +65,7 @@
"@langchain/google-vertexai": "^0.2.9",
"@langchain/textsplitters": "^0.1.0",
"@librechat/agents": "^2.4.38",
"@librechat/api": "*",
"@librechat/data-schemas": "*",
"@node-saml/passport-saml": "^5.0.0",
"@waylaidwanderer/fetch-event-source": "^3.0.1",
@ -97,7 +98,6 @@
"keyv-file": "^5.1.2",
"klona": "^2.0.6",
"librechat-data-provider": "*",
"librechat-mcp": "*",
"lodash": "^4.17.21",
"meilisearch": "^0.38.0",
"memorystore": "^1.6.7",
@ -106,6 +106,7 @@
"mongoose": "^8.12.1",
"multer": "^2.0.0",
"nanoid": "^3.3.7",
"node-fetch": "^2.7.0",
"nodemailer": "^6.9.15",
"ollama": "^0.5.0",
"openai": "^4.96.2",
@ -126,7 +127,7 @@
"traverse": "^0.6.7",
"ua-parser-js": "^1.0.36",
"winston": "^3.11.0",
"winston-daily-rotate-file": "^4.7.1",
"winston-daily-rotate-file": "^5.0.0",
"youtube-transcript": "^1.2.1",
"zod": "^3.22.4"
},
@ -2372,23 +2373,13 @@
"node": ">= 10.16.0"
}
},
"api/node_modules/node-fetch": {
"version": "2.6.7",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.7.tgz",
"integrity": "sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ==",
"dependencies": {
"whatwg-url": "^5.0.0"
},
"api/node_modules/object-hash": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/object-hash/-/object-hash-3.0.0.tgz",
"integrity": "sha512-RSn9F68PjH9HqtltsSnqYC1XXoWe9Bju5+213R98cNGttag9q9yAOTzdbsqvIa7aNm5WffBZFpWYr2aWrklWAw==",
"license": "MIT",
"engines": {
"node": "4.x || >=6.0.0"
},
"peerDependencies": {
"encoding": "^0.1.0"
},
"peerDependenciesMeta": {
"encoding": {
"optional": true
}
"node": ">= 6"
}
},
"api/node_modules/openid-client": {
@ -2442,11 +2433,6 @@
"@img/sharp-win32-x64": "0.33.5"
}
},
"api/node_modules/tr46": {
"version": "0.0.3",
"resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
},
"api/node_modules/uuid": {
"version": "10.0.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-10.0.0.tgz",
@ -2459,18 +2445,22 @@
"uuid": "dist/bin/uuid"
}
},
"api/node_modules/webidl-conversions": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
"integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
},
"api/node_modules/whatwg-url": {
"api/node_modules/winston-daily-rotate-file": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
"integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
"resolved": "https://registry.npmjs.org/winston-daily-rotate-file/-/winston-daily-rotate-file-5.0.0.tgz",
"integrity": "sha512-JDjiXXkM5qvwY06733vf09I2wnMXpZEhxEVOSPenZMii+g7pcDcTBt2MRugnoi8BwVSuCT2jfRXBUy+n1Zz/Yw==",
"license": "MIT",
"dependencies": {
"tr46": "~0.0.3",
"webidl-conversions": "^3.0.0"
"file-stream-rotator": "^0.6.1",
"object-hash": "^3.0.0",
"triple-beam": "^1.4.1",
"winston-transport": "^4.7.0"
},
"engines": {
"node": ">=8"
},
"peerDependencies": {
"winston": "^3"
}
},
"client": {
@ -19850,6 +19840,10 @@
"uuid": "dist/bin/uuid"
}
},
"node_modules/@librechat/api": {
"resolved": "packages/api",
"link": true
},
"node_modules/@librechat/backend": {
"resolved": "api",
"link": true
@ -20107,6 +20101,7 @@
"resolved": "https://registry.npmjs.org/@modelcontextprotocol/sdk/-/sdk-1.11.2.tgz",
"integrity": "sha512-H9vwztj5OAqHg9GockCQC06k1natgcxWQSRpQcPJf6i5+MWBzfKkRtxGbjQf0X2ihii0ffLZCRGbYV2f2bjNCQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"content-type": "^1.0.5",
"cors": "^2.8.5",
@ -20128,6 +20123,7 @@
"resolved": "https://registry.npmjs.org/accepts/-/accepts-2.0.0.tgz",
"integrity": "sha512-5cvg6CtKwfgdmVqY1WIiXKc3Q1bkRqGLi+2W/6ao+6Y7gu/RCwRuAhGEzh5B4KlszSuTLgZYuqFqo5bImjNKng==",
"license": "MIT",
"peer": true,
"dependencies": {
"mime-types": "^3.0.0",
"negotiator": "^1.0.0"
@ -20141,6 +20137,7 @@
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-2.2.0.tgz",
"integrity": "sha512-02qvAaxv8tp7fBa/mw1ga98OGm+eCbqzJOKoRt70sLmfEEi+jyBYVTDGfCL/k06/4EMk/z01gCe7HoCH/f2LTg==",
"license": "MIT",
"peer": true,
"dependencies": {
"bytes": "^3.1.2",
"content-type": "^1.0.5",
@ -20161,6 +20158,7 @@
"resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-1.0.0.tgz",
"integrity": "sha512-Au9nRL8VNUut/XSzbQA38+M78dzP4D+eqg3gfJHMIHHYa3bg067xj1KxMUWj+VULbiZMowKngFFbKczUrNJ1mg==",
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "5.2.1"
},
@ -20173,6 +20171,7 @@
"resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.2.2.tgz",
"integrity": "sha512-D76uU73ulSXrD1UXF4KE2TMxVVwhsnCgfAyTg9k8P6KGZjlXKrOLe4dJQKI3Bxi5wjesZoFXJWElNWBjPZMbhg==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=6.6.0"
}
@ -20182,6 +20181,7 @@
"resolved": "https://registry.npmjs.org/express/-/express-5.1.0.tgz",
"integrity": "sha512-DT9ck5YIRU+8GYzzU5kT3eHGA5iL+1Zd0EutOmTE9Dtk+Tvuzd23VBU+ec7HPNSTxXYO55gPV/hq4pSBJDjFpA==",
"license": "MIT",
"peer": true,
"dependencies": {
"accepts": "^2.0.0",
"body-parser": "^2.2.0",
@ -20224,6 +20224,7 @@
"resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-2.1.0.tgz",
"integrity": "sha512-/t88Ty3d5JWQbWYgaOGCCYfXRwV1+be02WqYYlL6h0lEiUAMPM8o8qKGO01YIkOHzka2up08wvgYD0mDiI+q3Q==",
"license": "MIT",
"peer": true,
"dependencies": {
"debug": "^4.4.0",
"encodeurl": "^2.0.0",
@ -20241,6 +20242,7 @@
"resolved": "https://registry.npmjs.org/fresh/-/fresh-2.0.0.tgz",
"integrity": "sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 0.8"
}
@ -20250,6 +20252,7 @@
"resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.6.3.tgz",
"integrity": "sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw==",
"license": "MIT",
"peer": true,
"dependencies": {
"safer-buffer": ">= 2.1.2 < 3.0.0"
},
@ -20262,6 +20265,7 @@
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-1.1.0.tgz",
"integrity": "sha512-aisnrDP4GNe06UcKFnV5bfMNPBUw4jsLGaWwWfnH3v02GnBuXX2MCVn5RbrWo0j3pczUilYblq7fQ7Nw2t5XKw==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 0.8"
}
@ -20271,6 +20275,7 @@
"resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-2.0.0.tgz",
"integrity": "sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=18"
},
@ -20283,6 +20288,7 @@
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.54.0.tgz",
"integrity": "sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 0.6"
}
@ -20292,6 +20298,7 @@
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-3.0.1.tgz",
"integrity": "sha512-xRc4oEhT6eaBpU1XF7AjpOFD+xQmXNB5OVKwp4tqCuBpHLS/ZbBDrc07mYTDqVMg6PfxUjjNp85O6Cd2Z/5HWA==",
"license": "MIT",
"peer": true,
"dependencies": {
"mime-db": "^1.54.0"
},
@ -20304,6 +20311,7 @@
"resolved": "https://registry.npmjs.org/negotiator/-/negotiator-1.0.0.tgz",
"integrity": "sha512-8Ofs/AUQh8MaEcrlq5xOX0CQ9ypTF5dl78mjlMNfOK08fzpgTHQRQPBxcPlEtIw0yRpws+Zo/3r+5WRby7u3Gg==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 0.6"
}
@ -20313,6 +20321,7 @@
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.0.tgz",
"integrity": "sha512-YWWTjgABSKcvs/nWBi9PycY/JiPJqOD4JA6o9Sej2AtvSGarXxKC3OQSk4pAarbdQlKAh5D4FCQkJNkW+GAn3w==",
"license": "BSD-3-Clause",
"peer": true,
"dependencies": {
"side-channel": "^1.1.0"
},
@ -20328,6 +20337,7 @@
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-3.0.0.tgz",
"integrity": "sha512-RmkhL8CAyCRPXCE28MMH0z2PNWQBNk2Q09ZdxM9IOOXwxwZbN+qbWaatPkdkWIKL2ZVDImrN/pK5HTRz2PcS4g==",
"license": "MIT",
"peer": true,
"dependencies": {
"bytes": "3.1.2",
"http-errors": "2.0.0",
@ -20343,6 +20353,7 @@
"resolved": "https://registry.npmjs.org/send/-/send-1.2.0.tgz",
"integrity": "sha512-uaW0WwXKpL9blXE2o0bRhoL2EGXIrZxQ2ZQ4mgcfoBxdFmQold+qWsD2jLrfZ0trjKL6vOw0j//eAwcALFjKSw==",
"license": "MIT",
"peer": true,
"dependencies": {
"debug": "^4.3.5",
"encodeurl": "^2.0.0",
@ -20365,6 +20376,7 @@
"resolved": "https://registry.npmjs.org/serve-static/-/serve-static-2.2.0.tgz",
"integrity": "sha512-61g9pCh0Vnh7IutZjtLGGpTA355+OPn2TyDv/6ivP2h/AdAVX9azsoxmg2/M6nZeQZNYBEwIcsne1mJd9oQItQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"encodeurl": "^2.0.0",
"escape-html": "^1.0.3",
@ -20380,6 +20392,7 @@
"resolved": "https://registry.npmjs.org/type-is/-/type-is-2.0.1.tgz",
"integrity": "sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==",
"license": "MIT",
"peer": true,
"dependencies": {
"content-type": "^1.0.5",
"media-typer": "^1.1.0",
@ -25236,6 +25249,16 @@
"@types/node": "*"
}
},
"node_modules/@types/bun": {
"version": "1.2.15",
"resolved": "https://registry.npmjs.org/@types/bun/-/bun-1.2.15.tgz",
"integrity": "sha512-U1ljPdBEphF0nw1MIk0hI7kPg7dFdPyM7EenHsp6W5loNHl7zqy6JQf/RKCgnUn2KDzUpkBwHPnEJEjII594bA==",
"dev": true,
"license": "MIT",
"dependencies": {
"bun-types": "1.2.15"
}
},
"node_modules/@types/connect": {
"version": "3.4.38",
"resolved": "https://registry.npmjs.org/@types/connect/-/connect-3.4.38.tgz",
@ -27280,6 +27303,16 @@
"integrity": "sha512-HpGFw18DgFWlncDfjTa2rcQ4W88O1mC8e8yZ2AvQY5KDaktSTwo+KRf6nHK6FRI5FyRyb/5T6+TSxfP7QyGsmQ==",
"dev": true
},
"node_modules/bun-types": {
"version": "1.2.15",
"resolved": "https://registry.npmjs.org/bun-types/-/bun-types-1.2.15.tgz",
"integrity": "sha512-NarRIaS+iOaQU1JPfyKhZm4AsUOrwUOqRNHY0XxI8GI8jYxiLXLcdjYMG9UKS+fwWasc1uw1htV9AX24dD+p4w==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/bundle-name": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/bundle-name/-/bundle-name-4.1.0.tgz",
@ -28881,6 +28914,7 @@
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/diff/-/diff-7.0.0.tgz",
"integrity": "sha512-PJWHUb1RFevKCwaFA9RlG5tCd+FO5iRh9A8HEtkmBH2Li03iJriB6m6JIN4rGz3K3JLawI7/veA1xzRKP6ISBw==",
"peer": true,
"engines": {
"node": ">=0.3.1"
}
@ -30431,6 +30465,7 @@
"version": "7.5.0",
"resolved": "https://registry.npmjs.org/express-rate-limit/-/express-rate-limit-7.5.0.tgz",
"integrity": "sha512-eB5zbQh5h+VenMPM3fh+nw1YExi5nMr6HUCR62ELSP11huvxm/Uir1H1QEyTkk5QX6A58pX6NmaTMceKZ0Eodg==",
"peer": true,
"engines": {
"node": ">= 16"
},
@ -33004,7 +33039,8 @@
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/is-promise/-/is-promise-4.0.0.tgz",
"integrity": "sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ==",
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/is-reference": {
"version": "1.2.1",
@ -34656,10 +34692,6 @@
"resolved": "packages/data-provider",
"link": true
},
"node_modules/librechat-mcp": {
"resolved": "packages/mcp",
"link": true
},
"node_modules/lilconfig": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/lilconfig/-/lilconfig-2.1.0.tgz",
@ -37152,6 +37184,7 @@
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.7.0.tgz",
"integrity": "sha512-c4FRfUm/dbcWZ7U+1Wq0AwCyFL+3nt2bEw05wfxSz+DWpWsitgmSgYmy2dQdWyKC1694ELPqMs/YzUSNozLt8A==",
"license": "MIT",
"dependencies": {
"whatwg-url": "^5.0.0"
},
@ -37170,17 +37203,20 @@
"node_modules/node-fetch/node_modules/tr46": {
"version": "0.0.3",
"resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw==",
"license": "MIT"
},
"node_modules/node-fetch/node_modules/webidl-conversions": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
"integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
"integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ==",
"license": "BSD-2-Clause"
},
"node_modules/node-fetch/node_modules/whatwg-url": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
"integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
"license": "MIT",
"dependencies": {
"tr46": "~0.0.3",
"webidl-conversions": "^3.0.0"
@ -37453,14 +37489,6 @@
"node": ">=0.10.0"
}
},
"node_modules/object-hash": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/object-hash/-/object-hash-2.2.0.tgz",
"integrity": "sha512-gScRMn0bS5fH+IuwyIFgnh9zBdo4DV+6GhygmWM9HyNJSgS0hScp1f5vjtm7oIIOiT9trXrShAkLFSc2IqKNgw==",
"engines": {
"node": ">= 6"
}
},
"node_modules/object-inspect": {
"version": "1.13.4",
"resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.4.tgz",
@ -38173,6 +38201,7 @@
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-8.2.0.tgz",
"integrity": "sha512-TdrF7fW9Rphjq4RjrW0Kp2AW0Ahwu9sRGTkS6bvDi0SCwZlEZYmcfDbEsTz8RVk0EHIS/Vd1bv3JhG+1xZuAyQ==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=16"
}
@ -38267,6 +38296,7 @@
"resolved": "https://registry.npmjs.org/pkce-challenge/-/pkce-challenge-5.0.0.tgz",
"integrity": "sha512-ueGLflrrnvwB3xuo/uGob5pd5FN7l0MsLf0Z87o/UQmRtwjvfylfc9MurIxRAWywCYTgrvpXBcqjV4OfCYGCIQ==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=16.20.0"
}
@ -41716,6 +41746,7 @@
"resolved": "https://registry.npmjs.org/router/-/router-2.2.0.tgz",
"integrity": "sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"debug": "^4.4.0",
"depd": "^2.0.0",
@ -45120,23 +45151,6 @@
"node": ">= 12.0.0"
}
},
"node_modules/winston-daily-rotate-file": {
"version": "4.7.1",
"resolved": "https://registry.npmjs.org/winston-daily-rotate-file/-/winston-daily-rotate-file-4.7.1.tgz",
"integrity": "sha512-7LGPiYGBPNyGHLn9z33i96zx/bd71pjBn9tqQzO3I4Tayv94WPmBNwKC7CO1wPHdP9uvu+Md/1nr6VSH9h0iaA==",
"dependencies": {
"file-stream-rotator": "^0.6.1",
"object-hash": "^2.0.1",
"triple-beam": "^1.3.0",
"winston-transport": "^4.4.0"
},
"engines": {
"node": ">=8"
},
"peerDependencies": {
"winston": "^3"
}
},
"node_modules/winston-transport": {
"version": "4.7.0",
"resolved": "https://registry.npmjs.org/winston-transport/-/winston-transport-4.7.0.tgz",
@ -46004,6 +46018,126 @@
"url": "https://github.com/sponsors/wooorm"
}
},
"packages/api": {
"name": "@librechat/api",
"version": "1.2.2",
"license": "ISC",
"devDependencies": {
"@babel/preset-env": "^7.21.5",
"@babel/preset-react": "^7.18.6",
"@babel/preset-typescript": "^7.21.0",
"@rollup/plugin-alias": "^5.1.0",
"@rollup/plugin-commonjs": "^25.0.2",
"@rollup/plugin-json": "^6.1.0",
"@rollup/plugin-node-resolve": "^15.1.0",
"@rollup/plugin-replace": "^5.0.5",
"@rollup/plugin-terser": "^0.4.4",
"@rollup/plugin-typescript": "^12.1.2",
"@types/bun": "^1.2.15",
"@types/diff": "^6.0.0",
"@types/express": "^5.0.0",
"@types/jest": "^29.5.2",
"@types/node": "^20.3.0",
"@types/react": "^18.2.18",
"@types/winston": "^2.4.4",
"jest": "^29.5.0",
"jest-junit": "^16.0.0",
"librechat-data-provider": "*",
"rimraf": "^5.0.1",
"rollup": "^4.22.4",
"rollup-plugin-generate-package-json": "^3.2.0",
"rollup-plugin-peer-deps-external": "^2.2.4",
"ts-node": "^10.9.2",
"typescript": "^5.0.4"
},
"peerDependencies": {
"@librechat/agents": "^2.4.37",
"@librechat/data-schemas": "*",
"@modelcontextprotocol/sdk": "^1.11.2",
"diff": "^7.0.0",
"eventsource": "^3.0.2",
"express": "^4.21.2",
"keyv": "^5.3.2",
"librechat-data-provider": "*",
"node-fetch": "2.7.0",
"tiktoken": "^1.0.15",
"zod": "^3.22.4"
}
},
"packages/api/node_modules/brace-expansion": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.1.tgz",
"integrity": "sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==",
"dev": true,
"dependencies": {
"balanced-match": "^1.0.0"
}
},
"packages/api/node_modules/glob": {
"version": "10.4.5",
"resolved": "https://registry.npmjs.org/glob/-/glob-10.4.5.tgz",
"integrity": "sha512-7Bv8RF0k6xjo7d4A/PxYLbUCfb6c+Vpd2/mB2yRDlew7Jb5hEXiCD9ibfO7wpk8i4sevK6DFny9h7EYbM3/sHg==",
"dev": true,
"dependencies": {
"foreground-child": "^3.1.0",
"jackspeak": "^3.1.2",
"minimatch": "^9.0.4",
"minipass": "^7.1.2",
"package-json-from-dist": "^1.0.0",
"path-scurry": "^1.11.1"
},
"bin": {
"glob": "dist/esm/bin.mjs"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"packages/api/node_modules/jackspeak": {
"version": "3.4.3",
"resolved": "https://registry.npmjs.org/jackspeak/-/jackspeak-3.4.3.tgz",
"integrity": "sha512-OGlZQpz2yfahA/Rd1Y8Cd9SIEsqvXkLVoSw/cgwhnhFMDbsQFeZYoJJ7bIZBS9BcamUW96asq/npPWugM+RQBw==",
"dev": true,
"dependencies": {
"@isaacs/cliui": "^8.0.2"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
},
"optionalDependencies": {
"@pkgjs/parseargs": "^0.11.0"
}
},
"packages/api/node_modules/minimatch": {
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.5.tgz",
"integrity": "sha512-G6T0ZX48xgozx7587koeX9Ys2NYy6Gmv//P89sEte9V9whIapMNF4idKxnW2QtCcLiTWlb/wfCabAtAFWhhBow==",
"dev": true,
"dependencies": {
"brace-expansion": "^2.0.1"
},
"engines": {
"node": ">=16 || 14 >=14.17"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"packages/api/node_modules/rimraf": {
"version": "5.0.10",
"resolved": "https://registry.npmjs.org/rimraf/-/rimraf-5.0.10.tgz",
"integrity": "sha512-l0OE8wL34P4nJH/H2ffoaniAokM2qSmrtXHmlpvYr5AVVX8msAyW0l8NVJFDxlSK4u3Uh/f41cQheDVdnYijwQ==",
"dev": true,
"dependencies": {
"glob": "^10.3.7"
},
"bin": {
"rimraf": "dist/esm/bin.mjs"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"packages/auth": {
"name": "@librechat/auth",
"version": "0.0.1",
@ -46046,7 +46180,7 @@
},
"packages/data-provider": {
"name": "librechat-data-provider",
"version": "0.7.86",
"version": "0.7.87",
"license": "ISC",
"dependencies": {
"axios": "^1.8.2",
@ -46481,121 +46615,6 @@
"engines": {
"node": ">= 12.0.0"
}
},
"packages/mcp": {
"name": "librechat-mcp",
"version": "1.2.2",
"license": "ISC",
"dependencies": {
"@modelcontextprotocol/sdk": "^1.11.2",
"diff": "^7.0.0",
"eventsource": "^3.0.2",
"express": "^4.21.2"
},
"devDependencies": {
"@babel/preset-env": "^7.21.5",
"@babel/preset-react": "^7.18.6",
"@babel/preset-typescript": "^7.21.0",
"@rollup/plugin-alias": "^5.1.0",
"@rollup/plugin-commonjs": "^25.0.2",
"@rollup/plugin-json": "^6.1.0",
"@rollup/plugin-node-resolve": "^15.1.0",
"@rollup/plugin-replace": "^5.0.5",
"@rollup/plugin-terser": "^0.4.4",
"@rollup/plugin-typescript": "^12.1.2",
"@types/diff": "^6.0.0",
"@types/express": "^5.0.0",
"@types/jest": "^29.5.2",
"@types/node": "^20.3.0",
"@types/react": "^18.2.18",
"@types/winston": "^2.4.4",
"jest": "^29.5.0",
"jest-junit": "^16.0.0",
"librechat-data-provider": "*",
"rimraf": "^5.0.1",
"rollup": "^4.22.4",
"rollup-plugin-generate-package-json": "^3.2.0",
"rollup-plugin-peer-deps-external": "^2.2.4",
"ts-node": "^10.9.2",
"typescript": "^5.0.4"
},
"peerDependencies": {
"keyv": "^5.3.2"
}
},
"packages/mcp/node_modules/brace-expansion": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.1.tgz",
"integrity": "sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==",
"dev": true,
"dependencies": {
"balanced-match": "^1.0.0"
}
},
"packages/mcp/node_modules/glob": {
"version": "10.4.5",
"resolved": "https://registry.npmjs.org/glob/-/glob-10.4.5.tgz",
"integrity": "sha512-7Bv8RF0k6xjo7d4A/PxYLbUCfb6c+Vpd2/mB2yRDlew7Jb5hEXiCD9ibfO7wpk8i4sevK6DFny9h7EYbM3/sHg==",
"dev": true,
"dependencies": {
"foreground-child": "^3.1.0",
"jackspeak": "^3.1.2",
"minimatch": "^9.0.4",
"minipass": "^7.1.2",
"package-json-from-dist": "^1.0.0",
"path-scurry": "^1.11.1"
},
"bin": {
"glob": "dist/esm/bin.mjs"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"packages/mcp/node_modules/jackspeak": {
"version": "3.4.3",
"resolved": "https://registry.npmjs.org/jackspeak/-/jackspeak-3.4.3.tgz",
"integrity": "sha512-OGlZQpz2yfahA/Rd1Y8Cd9SIEsqvXkLVoSw/cgwhnhFMDbsQFeZYoJJ7bIZBS9BcamUW96asq/npPWugM+RQBw==",
"dev": true,
"dependencies": {
"@isaacs/cliui": "^8.0.2"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
},
"optionalDependencies": {
"@pkgjs/parseargs": "^0.11.0"
}
},
"packages/mcp/node_modules/minimatch": {
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.5.tgz",
"integrity": "sha512-G6T0ZX48xgozx7587koeX9Ys2NYy6Gmv//P89sEte9V9whIapMNF4idKxnW2QtCcLiTWlb/wfCabAtAFWhhBow==",
"dev": true,
"dependencies": {
"brace-expansion": "^2.0.1"
},
"engines": {
"node": ">=16 || 14 >=14.17"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"packages/mcp/node_modules/rimraf": {
"version": "5.0.10",
"resolved": "https://registry.npmjs.org/rimraf/-/rimraf-5.0.10.tgz",
"integrity": "sha512-l0OE8wL34P4nJH/H2ffoaniAokM2qSmrtXHmlpvYr5AVVX8msAyW0l8NVJFDxlSK4u3Uh/f41cQheDVdnYijwQ==",
"dev": true,
"dependencies": {
"glob": "^10.3.7"
},
"bin": {
"rimraf": "dist/esm/bin.mjs"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
}
}
}

View file

@ -38,9 +38,9 @@
"backend:dev": "cross-env NODE_ENV=development npx nodemon api/server/index.js",
"backend:stop": "node config/stop-backend.js",
"build:data-provider": "cd packages/data-provider && npm run build",
"build:mcp": "cd packages/mcp && npm run build",
"build:api": "cd packages/api && npm run build",
"build:data-schemas": "cd packages/data-schemas && npm run build",
"frontend": "npm run build:data-provider && npm run build:mcp && npm run build:data-schemas && cd client && npm run build",
"frontend": "npm run build:data-provider && npm run build:data-schemas && npm run build:api && cd client && npm run build",
"frontend:ci": "npm run build:data-provider && cd client && npm run build:ci",
"frontend:dev": "cd client && npm run dev",
"e2e": "playwright test --config=e2e/playwright.config.local.ts",
@ -62,7 +62,7 @@
"b:api-inspect": "NODE_ENV=production bun --inspect run api/server/index.js",
"b:api:dev": "NODE_ENV=production bun run --watch api/server/index.js",
"b:data": "cd packages/data-provider && bun run b:build",
"b:mcp": "cd packages/mcp && bun run b:build",
"b:mcp": "cd packages/api && bun run b:build",
"b:data-schemas": "cd packages/data-schemas && bun run b:build",
"b:client": "bun --bun run b:data && bun --bun run b:mcp && bun --bun run b:data-schemas && cd client && bun --bun run b:build",
"b:client:dev": "cd client && bun run b:dev",

View file

@ -1,5 +1,5 @@
{
"name": "librechat-mcp",
"name": "@librechat/api",
"version": "1.2.2",
"type": "commonjs",
"description": "MCP services for LibreChat",
@ -47,6 +47,7 @@
"@rollup/plugin-replace": "^5.0.5",
"@rollup/plugin-terser": "^0.4.4",
"@rollup/plugin-typescript": "^12.1.2",
"@types/bun": "^1.2.15",
"@types/diff": "^6.0.0",
"@types/express": "^5.0.0",
"@types/jest": "^29.5.2",
@ -66,13 +67,17 @@
"publishConfig": {
"registry": "https://registry.npmjs.org/"
},
"dependencies": {
"peerDependencies": {
"@librechat/agents": "^2.4.37",
"@librechat/data-schemas": "*",
"librechat-data-provider": "*",
"@modelcontextprotocol/sdk": "^1.11.2",
"diff": "^7.0.0",
"eventsource": "^3.0.2",
"express": "^4.21.2"
},
"peerDependencies": {
"keyv": "^5.3.2"
"express": "^4.21.2",
"node-fetch": "2.7.0",
"keyv": "^5.3.2",
"zod": "^3.22.4",
"tiktoken": "^1.0.15"
}
}

View file

@ -0,0 +1,3 @@
export * from './memory';
export * from './resources';
export * from './run';

View file

@ -0,0 +1,468 @@
/** Memories */
import { z } from 'zod';
import { tool } from '@langchain/core/tools';
import { Tools } from 'librechat-data-provider';
import { logger } from '@librechat/data-schemas';
import { Run, Providers, GraphEvents } from '@librechat/agents';
import type {
  StreamEventData,
  ToolEndCallback,
  EventHandler,
  ToolEndData,
  LLMConfig,
} from '@librechat/agents';
import type { TAttachment, MemoryArtifact } from 'librechat-data-provider';
import type { ObjectId, MemoryMethods } from '@librechat/data-schemas';
import type { BaseMessage } from '@langchain/core/messages';
import type { Response as ServerResponse } from 'express';
import { Tokenizer } from '~/utils';

type RequiredMemoryMethods = Pick<
  MemoryMethods,
  'setMemory' | 'deleteMemory' | 'getFormattedMemories'
>;

type ToolEndMetadata = Record<string, unknown> & {
  run_id?: string;
  thread_id?: string;
};

export interface MemoryConfig {
  validKeys?: string[];
  instructions?: string;
  llmConfig?: Partial<LLMConfig>;
  tokenLimit?: number;
}

export const memoryInstructions =
  'The system automatically stores important user information and can update or delete memories based on user requests, enabling dynamic memory management.';

const getDefaultInstructions = (
  validKeys?: string[],
  tokenLimit?: number,
) => `Use the \`set_memory\` tool to save important information about the user, but ONLY when the user has explicitly provided this information. If there is nothing to note about the user specifically, END THE TURN IMMEDIATELY.
The \`delete_memory\` tool should only be used in two scenarios:
1. When the user explicitly asks to forget or remove specific information
2. When updating existing memories, use the \`set_memory\` tool instead of deleting and re-adding the memory.
${
validKeys && validKeys.length > 0
? `CRITICAL INSTRUCTION: Only the following keys are valid for storing memories:
${validKeys.map((key) => `- ${key}`).join('\n ')}`
: 'You can use any appropriate key to store memories about the user.'
}
${
tokenLimit
? `⚠️ TOKEN LIMIT: Each memory value must not exceed ${tokenLimit} tokens. Be concise and store only essential information.`
: ''
}
WARNING
DO NOT STORE ANY INFORMATION UNLESS THE USER HAS EXPLICITLY PROVIDED IT.
ONLY store information the user has EXPLICITLY shared.
NEVER guess or assume user information.
ALL memory values must be factual statements about THIS specific user.
If nothing needs to be stored, DO NOT CALL any memory tools.
If you're unsure whether to store something, DO NOT store it.
If nothing needs to be stored, END THE TURN IMMEDIATELY.`;
/**
 * Creates a memory tool instance with user context
 */
const createMemoryTool = ({
  userId,
  setMemory,
  validKeys,
  tokenLimit,
  totalTokens = 0,
}: {
  userId: string | ObjectId;
  setMemory: MemoryMethods['setMemory'];
  validKeys?: string[];
  tokenLimit?: number;
  totalTokens?: number;
}) => {
  return tool(
    async ({ key, value }) => {
      try {
        if (validKeys && validKeys.length > 0 && !validKeys.includes(key)) {
          logger.warn(
            `Memory Agent failed to set memory: Invalid key "${key}". Must be one of: ${validKeys.join(
              ', ',
            )}`,
          );
          return `Invalid key "${key}". Must be one of: ${validKeys.join(', ')}`;
        }

        const tokenCount = Tokenizer.getTokenCount(value, 'o200k_base');
        if (tokenLimit && tokenCount > tokenLimit) {
          logger.warn(
            `Memory Agent failed to set memory: Value exceeds token limit. Value has ${tokenCount} tokens, but limit is ${tokenLimit}`,
          );
          return `Memory value too large: ${tokenCount} tokens exceeds limit of ${tokenLimit}`;
        }

        if (tokenLimit && totalTokens + tokenCount > tokenLimit) {
          const remainingCapacity = tokenLimit - totalTokens;
          logger.warn(
            `Memory Agent failed to set memory: Would exceed total token limit. Current usage: ${totalTokens}, new memory: ${tokenCount} tokens, limit: ${tokenLimit}`,
          );
          return `Cannot add memory: would exceed token limit. Current usage: ${totalTokens}/${tokenLimit} tokens. This memory requires ${tokenCount} tokens, but only ${remainingCapacity} tokens available.`;
        }

        const artifact: Record<Tools.memory, MemoryArtifact> = {
          [Tools.memory]: {
            key,
            value,
            tokenCount,
            type: 'update',
          },
        };

        const result = await setMemory({ userId, key, value, tokenCount });
        if (result.ok) {
          logger.debug(`Memory set for key "${key}" (${tokenCount} tokens) for user "${userId}"`);
          return [`Memory set for key "${key}" (${tokenCount} tokens)`, artifact];
        }

        logger.warn(`Failed to set memory for key "${key}" for user "${userId}"`);
        return [`Failed to set memory for key "${key}"`, undefined];
      } catch (error) {
        logger.error('Memory Agent failed to set memory', error);
        return [`Error setting memory for key "${key}"`, undefined];
      }
    },
    {
      name: 'set_memory',
      description: 'Saves important information about the user into memory.',
      responseFormat: 'content_and_artifact',
      schema: z.object({
        key: z
          .string()
          .describe(
            validKeys && validKeys.length > 0
              ? `The key of the memory value. Must be one of: ${validKeys.join(', ')}`
              : 'The key identifier for this memory',
          ),
        value: z
          .string()
          .describe(
            'Value MUST be a complete sentence that fully describes relevant user information.',
          ),
      }),
    },
  );
};

/**
 * Creates a delete memory tool instance with user context
 */
const createDeleteMemoryTool = ({
  userId,
  deleteMemory,
  validKeys,
}: {
  userId: string | ObjectId;
  deleteMemory: MemoryMethods['deleteMemory'];
  validKeys?: string[];
}) => {
  return tool(
    async ({ key }) => {
      try {
        if (validKeys && validKeys.length > 0 && !validKeys.includes(key)) {
          logger.warn(
            `Memory Agent failed to delete memory: Invalid key "${key}". Must be one of: ${validKeys.join(
              ', ',
            )}`,
          );
          return `Invalid key "${key}". Must be one of: ${validKeys.join(', ')}`;
        }

        const artifact: Record<Tools.memory, MemoryArtifact> = {
          [Tools.memory]: {
            key,
            type: 'delete',
          },
        };

        const result = await deleteMemory({ userId, key });
        if (result.ok) {
          logger.debug(`Memory deleted for key "${key}" for user "${userId}"`);
          return [`Memory deleted for key "${key}"`, artifact];
        }

        logger.warn(`Failed to delete memory for key "${key}" for user "${userId}"`);
        return [`Failed to delete memory for key "${key}"`, undefined];
      } catch (error) {
        logger.error('Memory Agent failed to delete memory', error);
        return [`Error deleting memory for key "${key}"`, undefined];
      }
    },
    {
      name: 'delete_memory',
      description:
        'Deletes specific memory data about the user using the provided key. For updating existing memories, use the `set_memory` tool instead',
      responseFormat: 'content_and_artifact',
schema: z.object({
key: z
.string()
.describe(
validKeys && validKeys.length > 0
? `The key of the memory to delete. Must be one of: ${validKeys.join(', ')}`
: 'The key identifier of the memory to delete',
),
}),
},
);
};
export class BasicToolEndHandler implements EventHandler {
private callback?: ToolEndCallback;
constructor(callback?: ToolEndCallback) {
this.callback = callback;
}
handle(
event: string,
data: StreamEventData | undefined,
metadata?: Record<string, unknown>,
): void {
if (!metadata) {
console.warn(`Graph or metadata not found in ${event} event`);
return;
}
const toolEndData = data as ToolEndData | undefined;
if (!toolEndData?.output) {
console.warn('No output found in tool_end event');
return;
}
this.callback?.(toolEndData, metadata);
}
}
export async function processMemory({
res,
userId,
setMemory,
deleteMemory,
messages,
memory,
messageId,
conversationId,
validKeys,
instructions,
llmConfig,
tokenLimit,
totalTokens = 0,
}: {
res: ServerResponse;
setMemory: MemoryMethods['setMemory'];
deleteMemory: MemoryMethods['deleteMemory'];
userId: string | ObjectId;
memory: string;
messageId: string;
conversationId: string;
messages: BaseMessage[];
validKeys?: string[];
instructions: string;
tokenLimit?: number;
totalTokens?: number;
llmConfig?: Partial<LLMConfig>;
}): Promise<(TAttachment | null)[] | undefined> {
try {
const memoryTool = createMemoryTool({ userId, tokenLimit, setMemory, validKeys, totalTokens });
const deleteMemoryTool = createDeleteMemoryTool({
userId,
validKeys,
deleteMemory,
});
const currentMemoryTokens = totalTokens;
let memoryStatus = `# Existing memory:\n${memory ?? 'No existing memories'}`;
if (tokenLimit) {
const remainingTokens = tokenLimit - currentMemoryTokens;
memoryStatus = `# Memory Status:
Current memory usage: ${currentMemoryTokens} tokens
Token limit: ${tokenLimit} tokens
Remaining capacity: ${remainingTokens} tokens
# Existing memory:
${memory ?? 'No existing memories'}`;
}
const defaultLLMConfig: LLMConfig = {
provider: Providers.OPENAI,
model: 'gpt-4.1-mini',
temperature: 0.4,
streaming: false,
disableStreaming: true,
};
const finalLLMConfig = {
...defaultLLMConfig,
...llmConfig,
/**
* Ensure streaming is always disabled for memory processing
*/
streaming: false,
disableStreaming: true,
};
const artifactPromises: Promise<TAttachment | null>[] = [];
const memoryCallback = createMemoryCallback({ res, artifactPromises });
const customHandlers = {
[GraphEvents.TOOL_END]: new BasicToolEndHandler(memoryCallback),
};
const run = await Run.create({
runId: messageId,
graphConfig: {
type: 'standard',
llmConfig: finalLLMConfig,
tools: [memoryTool, deleteMemoryTool],
instructions,
additional_instructions: memoryStatus,
toolEnd: true,
},
customHandlers,
returnContent: true,
});
const config = {
configurable: {
provider: llmConfig?.provider,
thread_id: `memory-run-${conversationId}`,
},
streamMode: 'values',
version: 'v2',
} as const;
const inputs = {
messages,
};
const content = await run.processStream(inputs, config);
if (content) {
logger.debug('Memory Agent processed memory successfully', content);
} else {
logger.warn('Memory Agent processed memory but returned no content');
}
return await Promise.all(artifactPromises);
} catch (error) {
logger.error('Memory Agent failed to process memory', error);
}
}
export async function createMemoryProcessor({
res,
userId,
messageId,
memoryMethods,
conversationId,
config = {},
}: {
res: ServerResponse;
messageId: string;
conversationId: string;
userId: string | ObjectId;
memoryMethods: RequiredMemoryMethods;
config?: MemoryConfig;
}): Promise<[string, (messages: BaseMessage[]) => Promise<(TAttachment | null)[] | undefined>]> {
const { validKeys, instructions, llmConfig, tokenLimit } = config;
const finalInstructions = instructions || getDefaultInstructions(validKeys, tokenLimit);
const { withKeys, withoutKeys, totalTokens } = await memoryMethods.getFormattedMemories({
userId,
});
return [
withoutKeys,
async function (messages: BaseMessage[]): Promise<(TAttachment | null)[] | undefined> {
try {
return await processMemory({
res,
userId,
messages,
validKeys,
llmConfig,
messageId,
tokenLimit,
conversationId,
memory: withKeys,
totalTokens: totalTokens || 0,
instructions: finalInstructions,
setMemory: memoryMethods.setMemory,
deleteMemory: memoryMethods.deleteMemory,
});
} catch (error) {
logger.error('Memory Agent failed to process memory', error);
}
},
];
}
async function handleMemoryArtifact({
res,
data,
metadata,
}: {
res: ServerResponse;
data: ToolEndData;
metadata?: ToolEndMetadata;
}) {
const output = data?.output;
if (!output) {
return null;
}
if (!output.artifact) {
return null;
}
const memoryArtifact = output.artifact[Tools.memory] as MemoryArtifact | undefined;
if (!memoryArtifact) {
return null;
}
const attachment: Partial<TAttachment> = {
type: Tools.memory,
toolCallId: output.tool_call_id,
messageId: metadata?.run_id ?? '',
conversationId: metadata?.thread_id ?? '',
[Tools.memory]: memoryArtifact,
};
if (!res.headersSent) {
return attachment;
}
res.write(`event: attachment\ndata: ${JSON.stringify(attachment)}\n\n`);
return attachment;
}
/**
* Creates a memory callback for handling memory artifacts
* @param params - The parameters object
* @param params.res - The server response object
* @param params.artifactPromises - Array to collect artifact promises
* @returns The memory callback function
*/
export function createMemoryCallback({
res,
artifactPromises,
}: {
res: ServerResponse;
artifactPromises: Promise<Partial<TAttachment> | null>[];
}): ToolEndCallback {
return async (data: ToolEndData, metadata?: Record<string, unknown>) => {
const output = data?.output;
const memoryArtifact = output?.artifact?.[Tools.memory] as MemoryArtifact;
if (memoryArtifact == null) {
return;
}
artifactPromises.push(
handleMemoryArtifact({ res, data, metadata }).catch((error) => {
logger.error('Error processing memory artifact content:', error);
return null;
}),
);
};
}
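The guard sequence in `set_memory` above (valid-key check, per-value limit, shared budget) can be distilled into a pure helper. This is an illustrative sketch, not part of the LibreChat API; `checkMemoryWrite` and its result shape are hypothetical names:

```typescript
// Hypothetical helper mirroring the set_memory guards above.
type MemoryCheck = { ok: true } | { ok: false; reason: string };

function checkMemoryWrite(
  key: string,
  tokenCount: number,
  opts: { validKeys?: string[]; tokenLimit?: number; totalTokens?: number },
): MemoryCheck {
  const { validKeys, tokenLimit, totalTokens = 0 } = opts;
  // 1. When valid keys are configured, the key must be one of them.
  if (validKeys && validKeys.length > 0 && !validKeys.includes(key)) {
    return { ok: false, reason: `Invalid key "${key}"` };
  }
  // 2. A single value may not exceed the token limit on its own.
  if (tokenLimit && tokenCount > tokenLimit) {
    return { ok: false, reason: `Value too large: ${tokenCount} > ${tokenLimit}` };
  }
  // 3. The write may not push total usage over the shared budget.
  if (tokenLimit && totalTokens + tokenCount > tokenLimit) {
    return { ok: false, reason: `Only ${tokenLimit - totalTokens} tokens remaining` };
  }
  return { ok: true };
}
```

Failures are returned as values rather than thrown, matching the tool's behavior of reporting errors back to the model instead of aborting the run.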


@@ -0,0 +1,543 @@
import { primeResources } from './resources';
import { logger } from '@librechat/data-schemas';
import { EModelEndpoint, EToolResources, AgentCapabilities } from 'librechat-data-provider';
import type { Request as ServerRequest } from 'express';
import type { TFile } from 'librechat-data-provider';
import type { TGetFiles } from './resources';
// Mock logger
jest.mock('@librechat/data-schemas', () => ({
logger: {
error: jest.fn(),
},
}));
describe('primeResources', () => {
let mockReq: ServerRequest;
let mockGetFiles: jest.MockedFunction<TGetFiles>;
let requestFileSet: Set<string>;
beforeEach(() => {
// Reset mocks
jest.clearAllMocks();
// Setup mock request
mockReq = {
app: {
locals: {
[EModelEndpoint.agents]: {
capabilities: [AgentCapabilities.ocr],
},
},
},
} as unknown as ServerRequest;
// Setup mock getFiles function
mockGetFiles = jest.fn();
// Setup request file set
requestFileSet = new Set(['file1', 'file2', 'file3']);
});
describe('when OCR is enabled and tool_resources has OCR file_ids', () => {
it('should fetch OCR files and include them in attachments', async () => {
const mockOcrFiles: TFile[] = [
{
user: 'user1',
file_id: 'ocr-file-1',
filename: 'document.pdf',
filepath: '/uploads/document.pdf',
object: 'file',
type: 'application/pdf',
bytes: 1024,
embedded: false,
usage: 0,
},
];
mockGetFiles.mockResolvedValue(mockOcrFiles);
const tool_resources = {
[EToolResources.ocr]: {
file_ids: ['ocr-file-1'],
},
};
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments: undefined,
tool_resources,
});
expect(mockGetFiles).toHaveBeenCalledWith({ file_id: { $in: ['ocr-file-1'] } }, {}, {});
expect(result.attachments).toEqual(mockOcrFiles);
expect(result.tool_resources).toEqual(tool_resources);
});
});
describe('when OCR is disabled', () => {
it('should not fetch OCR files even if tool_resources has OCR file_ids', async () => {
(mockReq.app as ServerRequest['app']).locals[EModelEndpoint.agents].capabilities = [];
const tool_resources = {
[EToolResources.ocr]: {
file_ids: ['ocr-file-1'],
},
};
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments: undefined,
tool_resources,
});
expect(mockGetFiles).not.toHaveBeenCalled();
expect(result.attachments).toBeUndefined();
expect(result.tool_resources).toEqual(tool_resources);
});
});
describe('when attachments are provided', () => {
it('should process files with fileIdentifier as execute_code resources', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'script.py',
filepath: '/uploads/script.py',
object: 'file',
type: 'text/x-python',
bytes: 512,
embedded: false,
usage: 0,
metadata: {
fileIdentifier: 'python-script',
},
},
];
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources?.[EToolResources.execute_code]?.files).toEqual(mockFiles);
});
it('should process embedded files as file_search resources', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file2',
filename: 'document.txt',
filepath: '/uploads/document.txt',
object: 'file',
type: 'text/plain',
bytes: 256,
embedded: true,
usage: 0,
},
];
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources?.[EToolResources.file_search]?.files).toEqual(mockFiles);
});
it('should process image files in requestFileSet as image_edit resources', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'image.png',
filepath: '/uploads/image.png',
object: 'file',
type: 'image/png',
bytes: 2048,
embedded: false,
usage: 0,
height: 800,
width: 600,
},
];
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources?.[EToolResources.image_edit]?.files).toEqual(mockFiles);
});
it('should not process image files not in requestFileSet', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file-not-in-set',
filename: 'image.png',
filepath: '/uploads/image.png',
object: 'file',
type: 'image/png',
bytes: 2048,
embedded: false,
usage: 0,
height: 800,
width: 600,
},
];
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources?.[EToolResources.image_edit]).toBeUndefined();
});
it('should not process image files without height and width', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'image.png',
filepath: '/uploads/image.png',
object: 'file',
type: 'image/png',
bytes: 2048,
embedded: false,
usage: 0,
// Missing height and width
},
];
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources?.[EToolResources.image_edit]).toBeUndefined();
});
it('should filter out null files from attachments', async () => {
const mockFiles: Array<TFile | null> = [
{
user: 'user1',
file_id: 'file1',
filename: 'valid.txt',
filepath: '/uploads/valid.txt',
object: 'file',
type: 'text/plain',
bytes: 256,
embedded: false,
usage: 0,
},
null,
{
user: 'user1',
file_id: 'file2',
filename: 'valid2.txt',
filepath: '/uploads/valid2.txt',
object: 'file',
type: 'text/plain',
bytes: 128,
embedded: false,
usage: 0,
},
];
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toHaveLength(2);
expect(result.attachments?.[0]?.file_id).toBe('file1');
expect(result.attachments?.[1]?.file_id).toBe('file2');
});
it('should merge existing tool_resources with new files', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'script.py',
filepath: '/uploads/script.py',
object: 'file',
type: 'text/x-python',
bytes: 512,
embedded: false,
usage: 0,
metadata: {
fileIdentifier: 'python-script',
},
},
];
const existingToolResources = {
[EToolResources.execute_code]: {
files: [
{
user: 'user1',
file_id: 'existing-file',
filename: 'existing.py',
filepath: '/uploads/existing.py',
object: 'file' as const,
type: 'text/x-python',
bytes: 256,
embedded: false,
usage: 0,
},
],
},
};
const attachments = Promise.resolve(mockFiles);
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: existingToolResources,
});
expect(result.tool_resources?.[EToolResources.execute_code]?.files).toHaveLength(2);
expect(result.tool_resources?.[EToolResources.execute_code]?.files?.[0]?.file_id).toBe(
'existing-file',
);
expect(result.tool_resources?.[EToolResources.execute_code]?.files?.[1]?.file_id).toBe(
'file1',
);
});
});
describe('when both OCR and attachments are provided', () => {
it('should include both OCR files and attachment files', async () => {
const mockOcrFiles: TFile[] = [
{
user: 'user1',
file_id: 'ocr-file-1',
filename: 'document.pdf',
filepath: '/uploads/document.pdf',
object: 'file',
type: 'application/pdf',
bytes: 1024,
embedded: false,
usage: 0,
},
];
const mockAttachmentFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'attachment.txt',
filepath: '/uploads/attachment.txt',
object: 'file',
type: 'text/plain',
bytes: 256,
embedded: false,
usage: 0,
},
];
mockGetFiles.mockResolvedValue(mockOcrFiles);
const attachments = Promise.resolve(mockAttachmentFiles);
const tool_resources = {
[EToolResources.ocr]: {
file_ids: ['ocr-file-1'],
},
};
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources,
});
expect(result.attachments).toHaveLength(2);
expect(result.attachments?.[0]?.file_id).toBe('ocr-file-1');
expect(result.attachments?.[1]?.file_id).toBe('file1');
});
});
describe('error handling', () => {
it('should handle errors gracefully and log them', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'test.txt',
filepath: '/uploads/test.txt',
object: 'file',
type: 'text/plain',
bytes: 256,
embedded: false,
usage: 0,
},
];
const attachments = Promise.resolve(mockFiles);
const error = new Error('Test error');
// Mock getFiles to throw an error when called for OCR
mockGetFiles.mockRejectedValue(error);
const tool_resources = {
[EToolResources.ocr]: {
file_ids: ['ocr-file-1'],
},
};
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources,
});
expect(logger.error).toHaveBeenCalledWith('Error priming resources', error);
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources).toEqual(tool_resources);
});
it('should handle promise rejection in attachments', async () => {
const error = new Error('Attachment error');
const attachments = Promise.reject(error);
// The function should now handle rejected attachment promises gracefully
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments,
tool_resources: {},
});
// Should log both the main error and the attachment error
expect(logger.error).toHaveBeenCalledWith('Error priming resources', error);
expect(logger.error).toHaveBeenCalledWith(
'Error resolving attachments in catch block',
error,
);
// Should return empty array when attachments promise is rejected
expect(result.attachments).toEqual([]);
expect(result.tool_resources).toEqual({});
});
});
describe('edge cases', () => {
it('should handle missing app.locals gracefully', async () => {
const reqWithoutLocals = {} as ServerRequest;
const result = await primeResources({
req: reqWithoutLocals,
getFiles: mockGetFiles,
requestFileSet,
attachments: undefined,
tool_resources: {
[EToolResources.ocr]: {
file_ids: ['ocr-file-1'],
},
},
});
expect(mockGetFiles).not.toHaveBeenCalled();
// When app.locals is missing and there's an error accessing properties,
// the function falls back to the catch block which returns an empty array
expect(result.attachments).toEqual([]);
});
it('should handle undefined tool_resources', async () => {
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet,
attachments: undefined,
tool_resources: undefined,
});
expect(result.tool_resources).toEqual({});
expect(result.attachments).toBeUndefined();
});
it('should handle empty requestFileSet', async () => {
const mockFiles: TFile[] = [
{
user: 'user1',
file_id: 'file1',
filename: 'image.png',
filepath: '/uploads/image.png',
object: 'file',
type: 'image/png',
bytes: 2048,
embedded: false,
usage: 0,
height: 800,
width: 600,
},
];
const attachments = Promise.resolve(mockFiles);
const emptyRequestFileSet = new Set<string>();
const result = await primeResources({
req: mockReq,
getFiles: mockGetFiles,
requestFileSet: emptyRequestFileSet,
attachments,
tool_resources: {},
});
expect(result.attachments).toEqual(mockFiles);
expect(result.tool_resources?.[EToolResources.image_edit]).toBeUndefined();
});
});
});


@@ -0,0 +1,114 @@
import { logger } from '@librechat/data-schemas';
import { EModelEndpoint, EToolResources, AgentCapabilities } from 'librechat-data-provider';
import type { FilterQuery, QueryOptions, ProjectionType } from 'mongoose';
import type { AgentToolResources, TFile } from 'librechat-data-provider';
import type { IMongoFile } from '@librechat/data-schemas';
import type { Request as ServerRequest } from 'express';
export type TGetFiles = (
filter: FilterQuery<IMongoFile>,
_sortOptions: ProjectionType<IMongoFile> | null | undefined,
selectFields: QueryOptions<IMongoFile> | null | undefined,
) => Promise<Array<TFile>>;
/**
 * @param params
 * @param params.req - The server request
 * @param params.getFiles - Function to fetch files by filter
 * @param params.attachments - Promise resolving to request file attachments, if any
 * @param params.requestFileSet - Set of file IDs attached to the current request
 * @param params.tool_resources - Existing tool resources to merge new files into
 */
export const primeResources = async ({
req,
getFiles,
requestFileSet,
attachments: _attachments,
tool_resources: _tool_resources,
}: {
req: ServerRequest;
requestFileSet: Set<string>;
attachments: Promise<Array<TFile | null>> | undefined;
tool_resources: AgentToolResources | undefined;
getFiles: TGetFiles;
}): Promise<{
attachments: Array<TFile | undefined> | undefined;
tool_resources: AgentToolResources | undefined;
}> => {
try {
let attachments: Array<TFile | undefined> | undefined;
const tool_resources = _tool_resources ?? {};
const isOCREnabled = (req.app.locals?.[EModelEndpoint.agents]?.capabilities ?? []).includes(
AgentCapabilities.ocr,
);
if (tool_resources[EToolResources.ocr]?.file_ids && isOCREnabled) {
const context = await getFiles(
{
file_id: { $in: tool_resources.ocr.file_ids },
},
{},
{},
);
attachments = (attachments ?? []).concat(context);
}
if (!_attachments) {
return { attachments, tool_resources };
}
const files = await _attachments;
if (!attachments) {
attachments = [];
}
for (const file of files) {
if (!file) {
continue;
}
if (file.metadata?.fileIdentifier) {
const execute_code = tool_resources[EToolResources.execute_code] ?? {};
if (!execute_code.files) {
tool_resources[EToolResources.execute_code] = { ...execute_code, files: [] };
}
tool_resources[EToolResources.execute_code]?.files?.push(file);
} else if (file.embedded === true) {
const file_search = tool_resources[EToolResources.file_search] ?? {};
if (!file_search.files) {
tool_resources[EToolResources.file_search] = { ...file_search, files: [] };
}
tool_resources[EToolResources.file_search]?.files?.push(file);
} else if (
requestFileSet.has(file.file_id) &&
file.type.startsWith('image') &&
file.height &&
file.width
) {
const image_edit = tool_resources[EToolResources.image_edit] ?? {};
if (!image_edit.files) {
tool_resources[EToolResources.image_edit] = { ...image_edit, files: [] };
}
tool_resources[EToolResources.image_edit]?.files?.push(file);
}
attachments.push(file);
}
return { attachments, tool_resources };
} catch (error) {
logger.error('Error priming resources', error);
// Safely try to get attachments without rethrowing
let safeAttachments: Array<TFile | undefined> = [];
if (_attachments) {
try {
const attachmentFiles = await _attachments;
safeAttachments = (attachmentFiles?.filter((file) => !!file) ?? []) as Array<TFile>;
} catch (attachmentError) {
// If attachments promise is also rejected, just use empty array
logger.error('Error resolving attachments in catch block', attachmentError);
safeAttachments = [];
}
}
return {
attachments: safeAttachments,
tool_resources: _tool_resources,
};
}
};
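The per-file branching in `primeResources` can be summarized as a categorization function. A minimal sketch; `categorize` and `SketchFile` are illustrative names, and the plain-string results stand in for the `EToolResources` enum values:

```typescript
// Sketch of the attachment-routing logic in primeResources above.
type SketchFile = {
  file_id: string;
  type: string;
  embedded?: boolean;
  height?: number;
  width?: number;
  metadata?: { fileIdentifier?: string };
};

function categorize(file: SketchFile, requestFileSet: Set<string>): string | null {
  // Files with a fileIdentifier are code-execution resources.
  if (file.metadata?.fileIdentifier) return 'execute_code';
  // Embedded files are file-search (RAG) resources.
  if (file.embedded === true) return 'file_search';
  // Images from the current request, with known dimensions, are editable.
  if (
    requestFileSet.has(file.file_id) &&
    file.type.startsWith('image') &&
    file.height != null &&
    file.width != null
  ) {
    return 'image_edit';
  }
  return null; // still forwarded as a plain attachment
}
```

Note that every file, categorized or not, is still appended to `attachments`; the category only decides which `tool_resources` bucket additionally receives the file.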


@@ -0,0 +1,90 @@
import { Run, Providers } from '@librechat/agents';
import { providerEndpointMap, KnownEndpoints } from 'librechat-data-provider';
import type { StandardGraphConfig, EventHandler, GraphEvents, IState } from '@librechat/agents';
import type { Agent } from 'librechat-data-provider';
import type * as t from '~/types';
const customProviders = new Set([
Providers.XAI,
Providers.OLLAMA,
Providers.DEEPSEEK,
Providers.OPENROUTER,
]);
/**
* Creates a new Run instance with custom handlers and configuration.
*
* @param options - The options for creating the Run instance.
* @param options.agent - The agent for this run.
* @param options.signal - The signal for this run.
* @param options.runId - Optional run ID; otherwise, a new run ID will be generated.
* @param options.customHandlers - Custom event handlers.
* @param options.streaming - Whether to use streaming.
* @param options.streamUsage - Whether to stream usage information.
* @returns {Promise<Run<IState>>} A promise that resolves to a new Run instance.
*/
export async function createRun({
runId,
agent,
signal,
customHandlers,
streaming = true,
streamUsage = true,
}: {
agent: Agent;
signal: AbortSignal;
runId?: string;
streaming?: boolean;
streamUsage?: boolean;
customHandlers?: Record<GraphEvents, EventHandler>;
}): Promise<Run<IState>> {
const provider =
providerEndpointMap[agent.provider as keyof typeof providerEndpointMap] ?? agent.provider;
const llmConfig: t.RunLLMConfig = Object.assign(
{
provider,
streaming,
streamUsage,
},
agent.model_parameters,
);
/** Resolves issues with new OpenAI usage field */
if (
customProviders.has(agent.provider) ||
(agent.provider === Providers.OPENAI && agent.endpoint !== agent.provider)
) {
llmConfig.streamUsage = false;
llmConfig.usage = true;
}
let reasoningKey: 'reasoning_content' | 'reasoning' | undefined;
if (
llmConfig.configuration?.baseURL?.includes(KnownEndpoints.openrouter) ||
(agent.endpoint && agent.endpoint.toLowerCase().includes(KnownEndpoints.openrouter))
) {
reasoningKey = 'reasoning';
}
const graphConfig: StandardGraphConfig = {
signal,
llmConfig,
reasoningKey,
tools: agent.tools,
instructions: agent.instructions,
additional_instructions: agent.additional_instructions,
// toolEnd: agent.end_after_tools,
};
// TEMPORARY FOR TESTING
if (agent.provider === Providers.ANTHROPIC || agent.provider === Providers.BEDROCK) {
graphConfig.streamBuffer = 2000;
}
return Run.create({
runId,
graphConfig,
customHandlers,
});
}
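The usage-field workaround in `createRun` reduces to a small decision on provider and endpoint. A sketch with plain-string provider names standing in for the `Providers` enum; `resolveUsageFlags` is a hypothetical name:

```typescript
// Providers that need the OpenAI usage-field workaround.
const customProviders = new Set(['xai', 'ollama', 'deepseek', 'openrouter']);

function resolveUsageFlags(
  provider: string,
  endpoint: string,
): { streamUsage: boolean; usage?: boolean } {
  // OpenAI-compatible endpoints that are not the official OpenAI endpoint
  // also need the workaround: disable streamed usage, request it explicitly.
  const needsWorkaround =
    customProviders.has(provider) || (provider === 'openai' && endpoint !== provider);
  return needsWorkaround ? { streamUsage: false, usage: true } : { streamUsage: true };
}
```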


@@ -0,0 +1 @@
export * from './openai';


@@ -0,0 +1,2 @@
export * from './llm';
export * from './initialize';


@@ -0,0 +1,176 @@
import {
ErrorTypes,
EModelEndpoint,
resolveHeaders,
mapModelToAzureConfig,
} from 'librechat-data-provider';
import type {
LLMConfigOptions,
UserKeyValues,
InitializeOpenAIOptionsParams,
OpenAIOptionsResult,
} from '~/types';
import { createHandleLLMNewToken } from '~/utils/generators';
import { getAzureCredentials } from '~/utils/azure';
import { isUserProvided } from '~/utils/common';
import { getOpenAIConfig } from './llm';
/**
* Initializes OpenAI options for agent usage. This function always returns configuration
* options and never creates a client instance (equivalent to optionsOnly=true behavior).
*
* @param params - Configuration parameters
* @returns Promise resolving to OpenAI configuration options
* @throws Error if API key is missing or user key has expired
*/
export const initializeOpenAI = async ({
req,
overrideModel,
endpointOption,
overrideEndpoint,
getUserKeyValues,
checkUserKeyExpiry,
}: InitializeOpenAIOptionsParams): Promise<OpenAIOptionsResult> => {
const { PROXY, OPENAI_API_KEY, AZURE_API_KEY, OPENAI_REVERSE_PROXY, AZURE_OPENAI_BASEURL } =
process.env;
const { key: expiresAt } = req.body;
const modelName = overrideModel ?? req.body.model;
const endpoint = overrideEndpoint ?? req.body.endpoint;
if (!endpoint) {
throw new Error('Endpoint is required');
}
const credentials = {
[EModelEndpoint.openAI]: OPENAI_API_KEY,
[EModelEndpoint.azureOpenAI]: AZURE_API_KEY,
};
const baseURLOptions = {
[EModelEndpoint.openAI]: OPENAI_REVERSE_PROXY,
[EModelEndpoint.azureOpenAI]: AZURE_OPENAI_BASEURL,
};
const userProvidesKey = isUserProvided(credentials[endpoint as keyof typeof credentials]);
const userProvidesURL = isUserProvided(baseURLOptions[endpoint as keyof typeof baseURLOptions]);
let userValues: UserKeyValues | null = null;
if (expiresAt && (userProvidesKey || userProvidesURL)) {
checkUserKeyExpiry(expiresAt, endpoint);
userValues = await getUserKeyValues({ userId: req.user.id, name: endpoint });
}
let apiKey = userProvidesKey
? userValues?.apiKey
: credentials[endpoint as keyof typeof credentials];
const baseURL = userProvidesURL
? userValues?.baseURL
: baseURLOptions[endpoint as keyof typeof baseURLOptions];
const clientOptions: LLMConfigOptions = {
proxy: PROXY ?? undefined,
reverseProxyUrl: baseURL || undefined,
streaming: true,
};
const isAzureOpenAI = endpoint === EModelEndpoint.azureOpenAI;
const azureConfig = isAzureOpenAI && req.app.locals[EModelEndpoint.azureOpenAI];
if (isAzureOpenAI && azureConfig) {
const { modelGroupMap, groupMap } = azureConfig;
const {
azureOptions,
baseURL: configBaseURL,
headers = {},
serverless,
} = mapModelToAzureConfig({
modelName: modelName || '',
modelGroupMap,
groupMap,
});
clientOptions.reverseProxyUrl = configBaseURL ?? clientOptions.reverseProxyUrl;
clientOptions.headers = resolveHeaders({ ...headers, ...(clientOptions.headers ?? {}) });
const groupName = modelGroupMap[modelName || '']?.group;
if (groupName && groupMap[groupName]) {
clientOptions.addParams = groupMap[groupName]?.addParams;
clientOptions.dropParams = groupMap[groupName]?.dropParams;
}
apiKey = azureOptions.azureOpenAIApiKey;
clientOptions.azure = !serverless ? azureOptions : undefined;
if (serverless === true) {
clientOptions.defaultQuery = azureOptions.azureOpenAIApiVersion
? { 'api-version': azureOptions.azureOpenAIApiVersion }
: undefined;
if (!clientOptions.headers) {
clientOptions.headers = {};
}
clientOptions.headers['api-key'] = apiKey;
}
} else if (isAzureOpenAI) {
clientOptions.azure =
userProvidesKey && userValues?.apiKey ? JSON.parse(userValues.apiKey) : getAzureCredentials();
apiKey = clientOptions.azure?.azureOpenAIApiKey;
}
if (userProvidesKey && !apiKey) {
throw new Error(
JSON.stringify({
type: ErrorTypes.NO_USER_KEY,
}),
);
}
if (!apiKey) {
throw new Error(`${endpoint} API Key not provided.`);
}
const modelOptions = {
...endpointOption.model_parameters,
model: modelName,
user: req.user.id,
};
const finalClientOptions: LLMConfigOptions = {
...clientOptions,
modelOptions,
};
const options = getOpenAIConfig(apiKey, finalClientOptions, endpoint);
const openAIConfig = req.app.locals[EModelEndpoint.openAI];
const allConfig = req.app.locals.all;
const azureRate = modelName?.includes('gpt-4') ? 30 : 17;
let streamRate: number | undefined;
if (isAzureOpenAI && azureConfig) {
streamRate = azureConfig.streamRate ?? azureRate;
} else if (!isAzureOpenAI && openAIConfig) {
streamRate = openAIConfig.streamRate;
}
if (allConfig?.streamRate) {
streamRate = allConfig.streamRate;
}
if (streamRate) {
options.llmConfig.callbacks = [
{
handleLLMNewToken: createHandleLLMNewToken(streamRate),
},
];
}
const result: OpenAIOptionsResult = {
...options,
streamRate,
};
return result;
};
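The stream-rate precedence at the end of `initializeOpenAI` is: the global `all` config wins over the endpoint config, which wins over the Azure model-based default (30 for `gpt-4` models, 17 otherwise). A simplified sketch assuming the relevant configs are present; `resolveStreamRate` is a hypothetical name:

```typescript
// Sketch of the streamRate precedence in initializeOpenAI above.
function resolveStreamRate({
  modelName,
  isAzure,
  azureStreamRate,
  openAIStreamRate,
  allStreamRate,
}: {
  modelName?: string;
  isAzure: boolean;
  azureStreamRate?: number;
  openAIStreamRate?: number;
  allStreamRate?: number;
}): number | undefined {
  // Azure falls back to a model-based default when no rate is configured.
  const azureDefault = modelName?.includes('gpt-4') ? 30 : 17;
  let streamRate = isAzure ? (azureStreamRate ?? azureDefault) : openAIStreamRate;
  // The global `all` config overrides everything.
  if (allStreamRate) {
    streamRate = allStreamRate;
  }
  return streamRate;
}
```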


@@ -0,0 +1,156 @@
import { HttpsProxyAgent } from 'https-proxy-agent';
import { KnownEndpoints } from 'librechat-data-provider';
import type * as t from '~/types';
import { sanitizeModelName, constructAzureURL } from '~/utils/azure';
import { isEnabled } from '~/utils/common';

/**
 * Generates configuration options for creating a language model (LLM) instance.
 * @param apiKey - The API key for authentication.
 * @param options - Additional options for configuring the LLM.
 * @param endpoint - The endpoint name
 * @returns Configuration options for creating an LLM instance.
 */
export function getOpenAIConfig(
  apiKey: string,
  options: t.LLMConfigOptions = {},
  endpoint?: string | null,
): t.LLMConfigResult {
  const {
    modelOptions = {},
    reverseProxyUrl,
    defaultQuery,
    headers,
    proxy,
    azure,
    streaming = true,
    addParams,
    dropParams,
  } = options;
  const llmConfig: Partial<t.ClientOptions> & Partial<t.OpenAIParameters> = Object.assign(
    {
      streaming,
      model: modelOptions.model ?? '',
    },
    modelOptions,
  );
  if (addParams && typeof addParams === 'object') {
    Object.assign(llmConfig, addParams);
  }
  // Note: OpenAI Web Search models do not support any known parameters besides `max_tokens`
  if (modelOptions.model && /gpt-4o.*search/.test(modelOptions.model)) {
    const searchExcludeParams = [
      'frequency_penalty',
      'presence_penalty',
      'temperature',
      'top_p',
      'top_k',
      'stop',
      'logit_bias',
      'seed',
      'response_format',
      'n',
      'logprobs',
      'user',
    ];
    const updatedDropParams = dropParams || [];
    const combinedDropParams = [...new Set([...updatedDropParams, ...searchExcludeParams])];
    combinedDropParams.forEach((param) => {
      if (param in llmConfig) {
        delete llmConfig[param as keyof t.ClientOptions];
      }
    });
  } else if (dropParams && Array.isArray(dropParams)) {
    dropParams.forEach((param) => {
      if (param in llmConfig) {
        delete llmConfig[param as keyof t.ClientOptions];
      }
    });
  }

  let useOpenRouter = false;
  const configOptions: t.OpenAIConfiguration = {};
  if (
    (reverseProxyUrl && reverseProxyUrl.includes(KnownEndpoints.openrouter)) ||
    (endpoint && endpoint.toLowerCase().includes(KnownEndpoints.openrouter))
  ) {
    useOpenRouter = true;
    llmConfig.include_reasoning = true;
    configOptions.baseURL = reverseProxyUrl;
    configOptions.defaultHeaders = Object.assign(
      {
        'HTTP-Referer': 'https://librechat.ai',
        'X-Title': 'LibreChat',
      },
      headers,
    );
  } else if (reverseProxyUrl) {
    configOptions.baseURL = reverseProxyUrl;
    if (headers) {
      configOptions.defaultHeaders = headers;
    }
  }

  if (defaultQuery) {
    configOptions.defaultQuery = defaultQuery;
  }

  if (proxy) {
    const proxyAgent = new HttpsProxyAgent(proxy);
    configOptions.httpAgent = proxyAgent;
  }

  if (azure) {
    const useModelName = isEnabled(process.env.AZURE_USE_MODEL_AS_DEPLOYMENT_NAME);
    const updatedAzure = { ...azure };
    updatedAzure.azureOpenAIApiDeploymentName = useModelName
      ? sanitizeModelName(llmConfig.model || '')
      : azure.azureOpenAIApiDeploymentName;
    if (process.env.AZURE_OPENAI_DEFAULT_MODEL) {
      llmConfig.model = process.env.AZURE_OPENAI_DEFAULT_MODEL;
    }
    if (configOptions.baseURL) {
      const azureURL = constructAzureURL({
        baseURL: configOptions.baseURL,
        azureOptions: updatedAzure,
      });
      updatedAzure.azureOpenAIBasePath = azureURL.split(
        `/${updatedAzure.azureOpenAIApiDeploymentName}`,
      )[0];
    }
    Object.assign(llmConfig, updatedAzure);
    llmConfig.model = updatedAzure.azureOpenAIApiDeploymentName;
  } else {
    llmConfig.apiKey = apiKey;
  }

  if (process.env.OPENAI_ORGANIZATION && azure) {
    configOptions.organization = process.env.OPENAI_ORGANIZATION;
  }

  if (useOpenRouter && llmConfig.reasoning_effort != null) {
    llmConfig.reasoning = {
      effort: llmConfig.reasoning_effort,
    };
    delete llmConfig.reasoning_effort;
  }

  if (llmConfig.max_tokens != null) {
    llmConfig.maxTokens = llmConfig.max_tokens;
    delete llmConfig.max_tokens;
  }

  return {
    llmConfig,
    configOptions,
  };
}
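The parameter-pruning behavior above (web-search models drop everything except `max_tokens`, merged with any user-supplied `dropParams`) can be illustrated in isolation. This is a simplified sketch operating on a plain object rather than the project's `ClientOptions` type; `pruneParams` is a hypothetical helper, not part of the codebase:

```typescript
// Parameters that OpenAI web-search models reject (same list as above).
const searchExcludeParams = [
  'frequency_penalty',
  'presence_penalty',
  'temperature',
  'top_p',
  'top_k',
  'stop',
  'logit_bias',
  'seed',
  'response_format',
  'n',
  'logprobs',
  'user',
];

// Hypothetical standalone version of the pruning logic in getOpenAIConfig:
// search models drop the fixed exclusion set merged with user dropParams;
// other models drop only what the user listed.
function pruneParams(
  config: Record<string, unknown>,
  model: string,
  dropParams?: string[],
): Record<string, unknown> {
  const pruned = { ...config };
  const isSearchModel = /gpt-4o.*search/.test(model);
  const toDrop = isSearchModel
    ? [...new Set([...(dropParams ?? []), ...searchExcludeParams])]
    : dropParams ?? [];
  for (const param of toDrop) {
    delete pruned[param];
  }
  return pruned;
}
```

For instance, `pruneParams({ temperature: 0.7, max_tokens: 512 }, 'gpt-4o-search-preview')` removes `temperature` but keeps `max_tokens`, while the same config against `'gpt-4o'` passes through untouched unless `dropParams` names a key.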
