refactor: Improve UX for Parallel Streams (Multi-Convo) (#11096)

* 🌊 feat: Implement multi-conversation feature with added conversation context and payload adjustments

* refactor: Replace isSubmittingFamily with isSubmitting across message components for consistency

* feat: Add loadAddedAgent and processAddedConvo for multi-conversation agent execution

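A minimal sketch of how the added conversation travels with a submission, based on the `buildOptions`/`TOptions` changes later in this diff (the payload type name here is an assumption, not an actual export):

```ts
import type { TConversation } from 'librechat-data-provider';

/** Hypothetical payload shape for illustration; the server reads req.body?.addedConvo in buildOptions */
type AskPayloadSketch = {
  text: string;
  conversationId?: string | null;
  /** Added conversation for multi-convo, forwarded to processAddedConvo for parallel execution */
  addedConvo?: TConversation;
};

const payload: AskPayloadSketch = {
  text: 'Compare your answers to this question.',
  conversationId: null,
  addedConvo: { endpoint: 'openai', model: 'gpt-4', title: '' } as TConversation,
};
```
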
* refactor: Update ContentRender usage to conditionally render PlaceholderRow based on isLast and isSubmitting

* WIP: first pass, sibling index

* feat: Enhance multi-conversation support with agent tracking and display improvements

* refactor: Introduce isEphemeralAgentId utility and update related logic for agent handling

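For context, a minimal sketch of the check this utility performs (not the actual implementation in `librechat-data-provider`): persisted agent IDs always start with `agent_`, so anything else, including IDs encoded from endpoint/model/sender via `encodeEphemeralAgentId`, is treated as ephemeral.

```ts
function isEphemeralAgentIdSketch(agentId?: string | null): boolean {
  if (agentId == null || agentId === '') {
    return true;
  }
  // Real agent IDs always start with "agent_"; encoded ephemeral IDs look like 'openai__gpt-4'
  return !agentId.startsWith('agent_');
}

isEphemeralAgentIdSketch('agent_abc123'); // false
isEphemeralAgentIdSketch('openai__gpt-4'); // true (see the updated Agent model tests below)
```
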
* refactor: Implement createDualMessageContent utility for sibling message display and enhance useStepHandler for added conversations

* refactor: duplicate tools for added agent if ephemeral and primary agent is also ephemeral

* chore: remove deprecated multimessage rendering

* refactor: enhance dual message content creation and agent handling for parallel rendering

* refactor: streamline message rendering and submission handling by removing unused state and optimizing conditional logic

* refactor: adjust content handling in parallel mode to utilize existing content for improved agent display

* refactor: update @librechat/agents dependency to version 3.0.53

* refactor: update @langchain/core and @librechat/agents dependencies to latest versions

* refactor: remove deprecated @langchain/core dependency from package.json

* chore: remove unused SearchToolConfig and GetSourcesParams types from web.ts

* refactor: remove unused message properties from Message component

* refactor: enhance parallel content handling with groupId support in ContentParts and useStepHandler

* refactor: implement parallel content styling in Message, MessageRender, and ContentRender components; use explicit model name

* refactor: improve agent ID handling in createDualMessageContent for dual message display

* refactor: simplify title generation in AddedConvo by removing unused sender and preset logic

* refactor: replace string interpolation with cn utility for className in HoverButtons component

* refactor: enhance agent ID handling by adding suffix management for parallel agents and updating related components

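A sketch of the suffix scheme, assuming the helper simply appends the marker (the real `appendAgentIdSuffix` lives in `librechat-data-provider`): the added agent's ID carries a `____N` suffix while the primary agent's ID stays unsuffixed, which is also how the client orders parallel columns.

```ts
// Matches the AGENT_SUFFIX_PATTERN used in the agent client below
const AGENT_SUFFIX_PATTERN = /____(\d+)$/;

function appendAgentIdSuffixSketch(agentId: string, index: number): string {
  return `${agentId}____${index}`;
}

appendAgentIdSuffixSketch('agent_abc123', 1); // 'agent_abc123____1' → added/parallel agent
AGENT_SUFFIX_PATTERN.test('agent_abc123'); // false → primary agent, sorted first
```
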
* refactor: enhance column ordering in ContentParts by sorting agents with suffix management

* refactor: update @librechat/agents dependency to version 3.0.55

* feat: implement parallel content rendering with metadata support

- Added `ParallelContentRenderer` and `ParallelColumns` components for rendering messages in parallel based on groupId and agentId.
- Introduced `contentMetadataMap` to store metadata for each content part, allowing efficient parallel content detection.
- Updated `Message` and `ContentRender` components to utilize the new metadata structure for rendering.
- Modified `useStepHandler` to manage content indices and metadata during message processing.
- Enhanced `IJobStore` interface and its implementations to support storing and retrieving content metadata.
- Updated data schemas to include `contentMetadataMap` for messages, enabling multi-agent and parallel execution scenarios.

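A rough sketch of the intermediate metadata-map shape described above (the type layout is an assumption; the exact schema lives in the data-schemas changes): one entry per content-part index carrying the attribution used for parallel rendering. Later commits in this PR drop the map in favor of `agentId`/`groupId` fields on the content parts themselves.

```ts
type ContentMetadataMapSketch = Record<
  number, // index of the content part within message.content
  { agentId?: string; groupId?: number }
>;

const exampleMap: ContentMetadataMapSketch = {
  0: { agentId: 'openai__gpt-4', groupId: 0 },
  1: { agentId: 'openai__gpt-4-mini____1', groupId: 0 },
};
```
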
* refactor: update @librechat/agents dependency to version 3.0.56

* refactor: remove unused EPHEMERAL_AGENT_ID constant and simplify agent ID check

* refactor: enhance multi-agent message processing and primary agent determination

* refactor: implement branch message functionality for parallel responses

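An illustrative client call for the new endpoint (the route itself is added further down in this diff; the fetch wrapper here is only a sketch): branching takes the source `messageId` plus the `agentId` whose parts should become a standalone message.

```ts
async function branchFromAgent(messageId: string, agentId: string) {
  const res = await fetch('/api/messages/branch', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messageId, agentId }),
  });
  if (!res.ok) {
    throw new Error(`Branch request failed: ${res.status}`);
  }
  // 201 with the saved message: only the parts attributed to `agentId`,
  // with agentId/groupId metadata stripped, sharing the same parentMessageId.
  return res.json();
}
```
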
* refactor: integrate added conversation retrieval into message editing and regeneration processes

* refactor: remove unused isCard and isMultiMessage props from MessageRender and ContentRender components

* refactor: update @librechat/agents dependency to version 3.0.60

* refactor: replace usage of EPHEMERAL_AGENT_ID constant with isEphemeralAgentId function for improved clarity and consistency

* refactor: standardize agent ID format in tests for consistency

* chore: move addedConvo property to the correct position in payload construction

* refactor: rename agent_id values in loadAgent tests for clarity

* chore: reorder props in ContentParts component for improved readability

* refactor: rename variable 'content' to 'result' for clarity in RedisJobStore tests

* refactor: streamline useMessageActions by removing duplicate handleFeedback assignment

* chore: revert placeholder rendering logic in MessageRender and ContentRender components to original

* refactor: implement useContentMetadata hook for optimized content metadata handling

* refactor: remove contentMetadataMap and related logic from the codebase and revert to agentId/groupId in content parts

- Eliminated contentMetadataMap from various components and services, simplifying the handling of message content.
- Updated functions to directly access agentId and groupId from content parts instead of relying on a separate metadata map.
- Adjusted related hooks and components to reflect the removal of contentMetadataMap, ensuring consistent handling of message content.
- Updated tests and documentation to align with the new structure of message content handling.

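A minimal sketch of the resulting data model (mirroring `groupParallelContent` further down in this diff): content parts may carry `agentId` and `groupId` directly, and the client groups parts sharing a `groupId` into per-agent columns, leaving un-grouped parts to render sequentially.

```ts
type PartSketch = { type?: string; text?: string; agentId?: string; groupId?: number };

function groupByAgent(parts: PartSketch[]): Map<number, Map<string, PartSketch[]>> {
  const groups = new Map<number, Map<string, PartSketch[]>>();
  for (const part of parts) {
    if (part.groupId == null) {
      continue; // no groupId → sequential content, rendered in document order
    }
    const columns = groups.get(part.groupId) ?? new Map<string, PartSketch[]>();
    const agentId = part.agentId ?? 'unknown';
    columns.set(agentId, [...(columns.get(agentId) ?? []), part]);
    groups.set(part.groupId, columns);
  }
  return groups;
}
```
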
* refactor: remove logging from groupParallelContent function to clean up output

* refactor: remove model parameter from TBranchMessageRequest type for simplification

* refactor: enhance branch message creation by stripping metadata for standalone content

* chore: streamline branch message creation by simplifying content filtering and removing unnecessary metadata checks

* refactor: include attachments in branch message creation for improved content handling

* refactor: streamline agent content processing by consolidating primary agent identification and filtering logic

* refactor: simplify multi-agent message processing by creating a dedicated mapping method and enhancing content filtering

* refactor: remove unused parameter from loadEphemeralAgent function for cleaner code

* refactor: update groupId handling in metadata to only set when provided by the server

Danny Avila 2025-12-25 01:43:54 -05:00 committed by GitHub
parent 9b6e7cabc9
commit 439bc98682
74 changed files with 2174 additions and 957 deletions


@ -18,6 +18,7 @@ const {
EModelEndpoint,
isParamEndpoint,
isAgentsEndpoint,
isEphemeralAgentId,
supportsBalanceCheck,
} = require('librechat-data-provider');
const {
@ -714,7 +715,7 @@ class BaseClient {
iconURL: this.options.iconURL,
endpoint: this.options.endpoint,
...(this.metadata ?? {}),
metadata,
metadata: Object.keys(metadata ?? {}).length > 0 ? metadata : undefined,
};
if (typeof completion === 'string') {
@ -969,7 +970,7 @@ class BaseClient {
const hasNonEphemeralAgent =
isAgentsEndpoint(this.options.endpoint) &&
endpointOptions?.agent_id &&
endpointOptions.agent_id !== Constants.EPHEMERAL_AGENT_ID;
!isEphemeralAgentId(endpointOptions.agent_id);
if (hasNonEphemeralAgent) {
exceptions.add('model');
}


@ -1,8 +1,18 @@
const mongoose = require('mongoose');
const crypto = require('node:crypto');
const { logger } = require('@librechat/data-schemas');
const { ResourceType, SystemRoles, Tools, actionDelimiter } = require('librechat-data-provider');
const { GLOBAL_PROJECT_NAME, EPHEMERAL_AGENT_ID, mcp_all, mcp_delimiter } =
const { getCustomEndpointConfig } = require('@librechat/api');
const {
Tools,
SystemRoles,
ResourceType,
actionDelimiter,
isAgentsEndpoint,
getResponseSender,
isEphemeralAgentId,
encodeEphemeralAgentId,
} = require('librechat-data-provider');
const { GLOBAL_PROJECT_NAME, mcp_all, mcp_delimiter } =
require('librechat-data-provider').Constants;
const {
removeAgentFromAllProjects,
@ -92,7 +102,7 @@ const getAgents = async (searchParameter) => await Agent.find(searchParameter).l
* @param {import('@librechat/agents').ClientOptions} [params.model_parameters]
* @returns {Promise<Agent|null>} The agent document as a plain object, or null if not found.
*/
const loadEphemeralAgent = async ({ req, spec, agent_id, endpoint, model_parameters: _m }) => {
const loadEphemeralAgent = async ({ req, spec, endpoint, model_parameters: _m }) => {
const { model, ...model_parameters } = _m;
const modelSpecs = req.config?.modelSpecs?.list;
/** @type {TModelSpec | null} */
@ -139,8 +149,28 @@ const loadEphemeralAgent = async ({ req, spec, agent_id, endpoint, model_paramet
}
const instructions = req.body.promptPrefix;
// Compute display name using getResponseSender (same logic used for addedConvo agents)
const appConfig = req.config;
let endpointConfig = appConfig?.endpoints?.[endpoint];
if (!isAgentsEndpoint(endpoint) && !endpointConfig) {
try {
endpointConfig = getCustomEndpointConfig({ endpoint, appConfig });
} catch (err) {
logger.error('[loadEphemeralAgent] Error getting custom endpoint config', err);
}
}
const sender = getResponseSender({
modelLabel: model_parameters?.modelLabel,
modelDisplayLabel: endpointConfig?.modelDisplayLabel,
});
// Encode ephemeral agent ID with endpoint, model, and computed sender for display
const ephemeralId = encodeEphemeralAgentId({ endpoint, model, sender });
const result = {
id: agent_id,
id: ephemeralId,
instructions,
provider: endpoint,
model_parameters,
@ -169,8 +199,8 @@ const loadAgent = async ({ req, spec, agent_id, endpoint, model_parameters }) =>
if (!agent_id) {
return null;
}
if (agent_id === EPHEMERAL_AGENT_ID) {
return await loadEphemeralAgent({ req, spec, agent_id, endpoint, model_parameters });
if (isEphemeralAgentId(agent_id)) {
return await loadEphemeralAgent({ req, spec, endpoint, model_parameters });
}
const agent = await getAgent({
id: agent_id,


@ -1960,7 +1960,8 @@ describe('models/Agent', () => {
});
if (result) {
expect(result.id).toBe(EPHEMERAL_AGENT_ID);
// Ephemeral agent ID is encoded with endpoint and model
expect(result.id).toBe('openai__gpt-4');
expect(result.instructions).toBe('Test instructions');
expect(result.provider).toBe('openai');
expect(result.model).toBe('gpt-4');
@ -1978,7 +1979,7 @@ describe('models/Agent', () => {
const mockReq = { user: { id: 'user123' } };
const result = await loadAgent({
req: mockReq,
agent_id: 'non_existent_agent',
agent_id: 'agent_non_existent',
endpoint: 'openai',
model_parameters: { model: 'gpt-4' },
});
@ -2105,7 +2106,7 @@ describe('models/Agent', () => {
test('should handle loadAgent with malformed req object', async () => {
const result = await loadAgent({
req: null,
agent_id: 'test',
agent_id: 'agent_test',
endpoint: 'openai',
model_parameters: { model: 'gpt-4' },
});


@ -0,0 +1,218 @@
const { logger } = require('@librechat/data-schemas');
const { getCustomEndpointConfig } = require('@librechat/api');
const {
Tools,
Constants,
isAgentsEndpoint,
getResponseSender,
isEphemeralAgentId,
appendAgentIdSuffix,
encodeEphemeralAgentId,
} = require('librechat-data-provider');
const { getMCPServerTools } = require('~/server/services/Config');
const { mcp_all, mcp_delimiter } = Constants;
/**
* Constant for added conversation agent ID
*/
const ADDED_AGENT_ID = 'added_agent';
/**
* Get an agent document based on the provided ID.
* @param {Object} searchParameter - The search parameters to find the agent.
* @param {string} searchParameter.id - The ID of the agent.
* @returns {Promise<import('librechat-data-provider').Agent|null>}
*/
let getAgent;
/**
* Set the getAgent function (dependency injection to avoid circular imports)
* @param {Function} fn
*/
const setGetAgent = (fn) => {
getAgent = fn;
};
/**
* Load an agent from an added conversation (TConversation).
* Used for multi-convo parallel agent execution.
*
* @param {Object} params
* @param {import('express').Request} params.req
* @param {import('librechat-data-provider').TConversation} params.conversation - The added conversation
* @param {import('librechat-data-provider').Agent} [params.primaryAgent] - The primary agent (used to duplicate tools when both are ephemeral)
* @returns {Promise<import('librechat-data-provider').Agent|null>} The agent config as a plain object, or null if invalid.
*/
const loadAddedAgent = async ({ req, conversation, primaryAgent }) => {
if (!conversation) {
return null;
}
// If there's an agent_id, load the existing agent
if (conversation.agent_id && !isEphemeralAgentId(conversation.agent_id)) {
if (!getAgent) {
throw new Error('getAgent not initialized - call setGetAgent first');
}
const agent = await getAgent({
id: conversation.agent_id,
});
if (!agent) {
logger.warn(`[loadAddedAgent] Agent ${conversation.agent_id} not found`);
return null;
}
agent.version = agent.versions ? agent.versions.length : 0;
// Append suffix to distinguish from primary agent (matches ephemeral format)
// This is needed when both agents have the same ID or for consistent parallel content attribution
agent.id = appendAgentIdSuffix(agent.id, 1);
return agent;
}
// Otherwise, create an ephemeral agent config from the conversation
const { model, endpoint, promptPrefix, spec, ...rest } = conversation;
if (!endpoint || !model) {
logger.warn('[loadAddedAgent] Missing required endpoint or model for ephemeral agent');
return null;
}
// If both primary and added agents are ephemeral, duplicate tools from primary agent
const primaryIsEphemeral = primaryAgent && isEphemeralAgentId(primaryAgent.id);
if (primaryIsEphemeral && Array.isArray(primaryAgent.tools)) {
// Get display name using getResponseSender
const appConfig = req.config;
let endpointConfig = appConfig?.endpoints?.[endpoint];
if (!isAgentsEndpoint(endpoint) && !endpointConfig) {
try {
endpointConfig = getCustomEndpointConfig({ endpoint, appConfig });
} catch (err) {
logger.error('[loadAddedAgent] Error getting custom endpoint config', err);
}
}
const sender = getResponseSender({
modelLabel: rest.modelLabel,
modelDisplayLabel: endpointConfig?.modelDisplayLabel,
});
const ephemeralId = encodeEphemeralAgentId({ endpoint, model, sender, index: 1 });
return {
id: ephemeralId,
instructions: promptPrefix || '',
provider: endpoint,
model_parameters: {},
model,
tools: [...primaryAgent.tools],
};
}
// Extract ephemeral agent options from conversation if present
const ephemeralAgent = rest.ephemeralAgent;
const mcpServers = new Set(ephemeralAgent?.mcp);
const userId = req.user?.id;
// Check model spec for MCP servers
const modelSpecs = req.config?.modelSpecs?.list;
let modelSpec = null;
if (spec != null && spec !== '') {
modelSpec = modelSpecs?.find((s) => s.name === spec) || null;
}
if (modelSpec?.mcpServers) {
for (const mcpServer of modelSpec.mcpServers) {
mcpServers.add(mcpServer);
}
}
/** @type {string[]} */
const tools = [];
if (ephemeralAgent?.execute_code === true || modelSpec?.executeCode === true) {
tools.push(Tools.execute_code);
}
if (ephemeralAgent?.file_search === true || modelSpec?.fileSearch === true) {
tools.push(Tools.file_search);
}
if (ephemeralAgent?.web_search === true || modelSpec?.webSearch === true) {
tools.push(Tools.web_search);
}
const addedServers = new Set();
if (mcpServers.size > 0) {
for (const mcpServer of mcpServers) {
if (addedServers.has(mcpServer)) {
continue;
}
const serverTools = await getMCPServerTools(userId, mcpServer);
if (!serverTools) {
tools.push(`${mcp_all}${mcp_delimiter}${mcpServer}`);
addedServers.add(mcpServer);
continue;
}
tools.push(...Object.keys(serverTools));
addedServers.add(mcpServer);
}
}
// Build model_parameters from conversation fields
const model_parameters = {};
const paramKeys = [
'temperature',
'top_p',
'topP',
'topK',
'presence_penalty',
'frequency_penalty',
'maxOutputTokens',
'maxTokens',
'max_tokens',
];
for (const key of paramKeys) {
if (rest[key] != null) {
model_parameters[key] = rest[key];
}
}
// Get endpoint config for modelDisplayLabel (same pattern as initialize.js)
const appConfig = req.config;
let endpointConfig = appConfig?.endpoints?.[endpoint];
if (!isAgentsEndpoint(endpoint) && !endpointConfig) {
try {
endpointConfig = getCustomEndpointConfig({ endpoint, appConfig });
} catch (err) {
logger.error('[loadAddedAgent] Error getting custom endpoint config', err);
}
}
// Compute display name using getResponseSender (same logic used for main agent)
const sender = getResponseSender({
modelLabel: rest.modelLabel,
modelDisplayLabel: endpointConfig?.modelDisplayLabel,
});
/** Encoded ephemeral agent ID with endpoint, model, sender, and index=1 to distinguish from primary */
const ephemeralId = encodeEphemeralAgentId({ endpoint, model, sender, index: 1 });
const result = {
id: ephemeralId,
instructions: promptPrefix || '',
provider: endpoint,
model_parameters,
model,
tools,
};
if (ephemeralAgent?.artifacts != null && ephemeralAgent.artifacts) {
result.artifacts = ephemeralAgent.artifacts;
}
return result;
};
module.exports = {
ADDED_AGENT_ID,
loadAddedAgent,
setGetAgent,
};


@ -42,8 +42,8 @@
"@azure/storage-blob": "^12.27.0",
"@googleapis/youtube": "^20.0.0",
"@keyv/redis": "^4.3.3",
"@langchain/core": "^0.3.79",
"@librechat/agents": "^3.0.52",
"@langchain/core": "^0.3.80",
"@librechat/agents": "^3.0.61",
"@librechat/api": "*",
"@librechat/data-schemas": "*",
"@microsoft/microsoft-graph-client": "^3.0.7",


@ -350,9 +350,6 @@ function disposeClient(client) {
if (client.agentConfigs) {
client.agentConfigs = null;
}
if (client.agentIdMap) {
client.agentIdMap = null;
}
if (client.artifactPromises) {
client.artifactPromises = null;
}


@ -37,14 +37,13 @@ const {
EModelEndpoint,
PermissionTypes,
isAgentsEndpoint,
AgentCapabilities,
isEphemeralAgentId,
bedrockInputSchema,
removeNullishValues,
} = require('librechat-data-provider');
const { spendTokens, spendStructuredTokens } = require('~/models/spendTokens');
const { encodeAndFormat } = require('~/server/services/Files/images/encode');
const { createContextHandlers } = require('~/app/clients/prompts');
const { checkCapability } = require('~/server/services/Config');
const { getConvoFiles } = require('~/models/Conversation');
const BaseClient = require('~/app/clients/BaseClient');
const { getRoleByName } = require('~/models/Role');
@ -96,59 +95,101 @@ function logToolError(graph, error, toolId) {
});
}
/** Regex pattern to match agent ID suffix (____N) */
const AGENT_SUFFIX_PATTERN = /____(\d+)$/;
/**
* Applies agent labeling to conversation history when multi-agent patterns are detected.
* Labels content parts by their originating agent to prevent identity confusion.
* Creates a mapMethod for getMessagesForConversation that processes agent content.
* - Strips agentId/groupId metadata from all content
* - For multi-agent: filters to primary agent content only (no suffix or lowest suffix)
* - For multi-agent: applies agent labels to content
*
* @param {TMessage[]} orderedMessages - The ordered conversation messages
* @param {Agent} primaryAgent - The primary agent configuration
* @param {Map<string, Agent>} agentConfigs - Map of additional agent configurations
* @returns {TMessage[]} Messages with agent labels applied where appropriate
* @param {Agent} primaryAgent - Primary agent configuration
* @param {Map<string, Agent>} [agentConfigs] - Additional agent configurations
* @returns {(message: TMessage) => TMessage} Map method for processing messages
*/
function applyAgentLabelsToHistory(orderedMessages, primaryAgent, agentConfigs) {
const shouldLabelByAgent = (primaryAgent.edges?.length ?? 0) > 0 || (agentConfigs?.size ?? 0) > 0;
if (!shouldLabelByAgent) {
return orderedMessages;
}
const processedMessages = [];
for (let i = 0; i < orderedMessages.length; i++) {
const message = orderedMessages[i];
/** @type {Record<string, string>} */
const agentNames = { [primaryAgent.id]: primaryAgent.name || 'Assistant' };
function createMultiAgentMapper(primaryAgent, agentConfigs) {
const hasMultipleAgents = (primaryAgent.edges?.length ?? 0) > 0 || (agentConfigs?.size ?? 0) > 0;
/** @type {Record<string, string> | null} */
let agentNames = null;
if (hasMultipleAgents) {
agentNames = { [primaryAgent.id]: primaryAgent.name || 'Assistant' };
if (agentConfigs) {
for (const [agentId, agentConfig] of agentConfigs.entries()) {
agentNames[agentId] = agentConfig.name || agentConfig.id;
}
}
if (
!message.isCreatedByUser &&
message.metadata?.agentIdMap &&
Array.isArray(message.content)
) {
try {
const labeledContent = labelContentByAgent(
message.content,
message.metadata.agentIdMap,
agentNames,
);
processedMessages.push({ ...message, content: labeledContent });
} catch (error) {
logger.error('[AgentClient] Error applying agent labels to message:', error);
processedMessages.push(message);
}
} else {
processedMessages.push(message);
}
}
return processedMessages;
return (message) => {
if (message.isCreatedByUser || !Array.isArray(message.content)) {
return message;
}
// Find primary agent ID (no suffix, or lowest suffix number) - only needed for multi-agent
let primaryAgentId = null;
let hasAgentMetadata = false;
if (hasMultipleAgents) {
let lowestSuffixIndex = Infinity;
for (const part of message.content) {
const agentId = part?.agentId;
if (!agentId) {
continue;
}
hasAgentMetadata = true;
const suffixMatch = agentId.match(AGENT_SUFFIX_PATTERN);
if (!suffixMatch) {
primaryAgentId = agentId;
break;
}
const suffixIndex = parseInt(suffixMatch[1], 10);
if (suffixIndex < lowestSuffixIndex) {
lowestSuffixIndex = suffixIndex;
primaryAgentId = agentId;
}
}
} else {
// Single agent: just check if any metadata exists
hasAgentMetadata = message.content.some((part) => part?.agentId || part?.groupId);
}
if (!hasAgentMetadata) {
return message;
}
try {
/** @type {Array<TMessageContentParts>} */
const filteredContent = [];
/** @type {Record<number, string>} */
const agentIdMap = {};
for (const part of message.content) {
const agentId = part?.agentId;
// For single agent: include all parts; for multi-agent: filter to primary
if (!hasMultipleAgents || !agentId || agentId === primaryAgentId) {
const newIndex = filteredContent.length;
const { agentId: _a, groupId: _g, ...cleanPart } = part;
filteredContent.push(cleanPart);
if (agentId && hasMultipleAgents) {
agentIdMap[newIndex] = agentId;
}
}
}
const finalContent =
Object.keys(agentIdMap).length > 0 && agentNames
? labelContentByAgent(filteredContent, agentIdMap, agentNames)
: filteredContent;
return { ...message, content: finalContent };
} catch (error) {
logger.error('[AgentClient] Error processing multi-agent message:', error);
return message;
}
};
}
class AgentClient extends BaseClient {
@ -200,8 +241,6 @@ class AgentClient extends BaseClient {
this.indexTokenCountMap = {};
/** @type {(messages: BaseMessage[]) => Promise<void>} */
this.processMemory;
/** @type {Record<number, string> | null} */
this.agentIdMap = null;
}
/**
@ -288,18 +327,13 @@ class AgentClient extends BaseClient {
{ instructions = null, additional_instructions = null },
opts,
) {
let orderedMessages = this.constructor.getMessagesForConversation({
const orderedMessages = this.constructor.getMessagesForConversation({
messages,
parentMessageId,
summary: this.shouldSummarize,
mapMethod: createMultiAgentMapper(this.options.agent, this.agentConfigs),
});
orderedMessages = applyAgentLabelsToHistory(
orderedMessages,
this.options.agent,
this.agentConfigs,
);
let payload;
/** @type {number | undefined} */
let promptTokens;
@ -551,10 +585,9 @@ class AgentClient extends BaseClient {
agent: prelimAgent,
allowedProviders,
endpointOption: {
endpoint:
prelimAgent.id !== Constants.EPHEMERAL_AGENT_ID
? EModelEndpoint.agents
: memoryConfig.agent?.provider,
endpoint: !isEphemeralAgentId(prelimAgent.id)
? EModelEndpoint.agents
: memoryConfig.agent?.provider,
},
},
{
@ -693,9 +726,7 @@ class AgentClient extends BaseClient {
});
const completion = filterMalformedContentParts(this.contentParts);
const metadata = this.agentIdMap ? { agentIdMap: this.agentIdMap } : undefined;
return { completion, metadata };
return { completion };
}
/**
@ -891,12 +922,10 @@ class AgentClient extends BaseClient {
*/
const runAgents = async (messages) => {
const agents = [this.options.agent];
if (
this.agentConfigs &&
this.agentConfigs.size > 0 &&
((this.options.agent.edges?.length ?? 0) > 0 ||
(await checkCapability(this.options.req, AgentCapabilities.chain)))
) {
// Include additional agents when:
// - agentConfigs has agents (from addedConvo parallel execution or agent handoffs)
// - Agents without incoming edges become start nodes and run in parallel automatically
if (this.agentConfigs && this.agentConfigs.size > 0) {
agents.push(...this.agentConfigs.values());
}
@ -992,24 +1021,6 @@ class AgentClient extends BaseClient {
);
});
}
try {
/** Capture agent ID map if we have edges or multiple agents */
const shouldStoreAgentMap =
(this.options.agent.edges?.length ?? 0) > 0 || (this.agentConfigs?.size ?? 0) > 0;
if (shouldStoreAgentMap && run?.Graph) {
const contentPartAgentMap = run.Graph.getContentPartAgentMap();
if (contentPartAgentMap && contentPartAgentMap.size > 0) {
this.agentIdMap = Object.fromEntries(contentPartAgentMap);
logger.debug('[AgentClient] Captured agent ID map:', {
totalParts: this.contentParts.length,
mappedParts: Object.keys(this.agentIdMap).length,
});
}
}
} catch (error) {
logger.error('[AgentClient] Error capturing agent ID map:', error);
}
} catch (err) {
logger.error(
'[api/server/controllers/agents/client.js #sendCompletion] Operation aborted',


@ -1,5 +1,10 @@
const { logger } = require('@librechat/data-schemas');
const { Constants, isAgentsEndpoint, ResourceType } = require('librechat-data-provider');
const {
Constants,
ResourceType,
isAgentsEndpoint,
isEphemeralAgentId,
} = require('librechat-data-provider');
const { canAccessResource } = require('./canAccessResource');
const { getAgent } = require('~/models/Agent');
@ -13,7 +18,8 @@ const { getAgent } = require('~/models/Agent');
*/
const resolveAgentIdFromBody = async (agentCustomId) => {
// Handle ephemeral agents - they don't need permission checks
if (agentCustomId === Constants.EPHEMERAL_AGENT_ID) {
// Real agent IDs always start with "agent_", so anything else is ephemeral
if (isEphemeralAgentId(agentCustomId)) {
return null; // No permission check needed for ephemeral agents
}
@ -62,7 +68,8 @@ const canAccessAgentFromBody = (options) => {
}
// Skip permission checks for ephemeral agents
if (agentId === Constants.EPHEMERAL_AGENT_ID) {
// Real agent IDs always start with "agent_", so anything else is ephemeral
if (isEphemeralAgentId(agentId)) {
return next();
}


@ -1,4 +1,5 @@
const express = require('express');
const { v4: uuidv4 } = require('uuid');
const { logger } = require('@librechat/data-schemas');
const { ContentTypes } = require('librechat-data-provider');
const { unescapeLaTeX, countTokens } = require('@librechat/api');
@ -111,6 +112,91 @@ router.get('/', async (req, res) => {
}
});
/**
* Creates a new branch message from a specific agent's content within a parallel response message.
* Filters the original message's content to only include parts attributed to the specified agentId.
* Only available for non-user messages with content attributions.
*
* @route POST /branch
* @param {string} req.body.messageId - The ID of the source message
* @param {string} req.body.agentId - The agentId to filter content by
* @returns {TMessage} The newly created branch message
*/
router.post('/branch', async (req, res) => {
try {
const { messageId, agentId } = req.body;
const userId = req.user.id;
if (!messageId || !agentId) {
return res.status(400).json({ error: 'messageId and agentId are required' });
}
const sourceMessage = await getMessage({ user: userId, messageId });
if (!sourceMessage) {
return res.status(404).json({ error: 'Source message not found' });
}
if (sourceMessage.isCreatedByUser) {
return res.status(400).json({ error: 'Cannot branch from user messages' });
}
if (!Array.isArray(sourceMessage.content)) {
return res.status(400).json({ error: 'Message does not have content' });
}
const hasAgentMetadata = sourceMessage.content.some((part) => part?.agentId);
if (!hasAgentMetadata) {
return res
.status(400)
.json({ error: 'Message does not have parallel content with attributions' });
}
/** @type {Array<import('librechat-data-provider').TMessageContentParts>} */
const filteredContent = [];
for (const part of sourceMessage.content) {
if (part?.agentId === agentId) {
const { agentId: _a, groupId: _g, ...cleanPart } = part;
filteredContent.push(cleanPart);
}
}
if (filteredContent.length === 0) {
return res.status(400).json({ error: 'No content found for the specified agentId' });
}
const newMessageId = uuidv4();
/** @type {import('librechat-data-provider').TMessage} */
const newMessage = {
messageId: newMessageId,
conversationId: sourceMessage.conversationId,
parentMessageId: sourceMessage.parentMessageId,
attachments: sourceMessage.attachments,
isCreatedByUser: false,
model: sourceMessage.model,
endpoint: sourceMessage.endpoint,
sender: sourceMessage.sender,
iconURL: sourceMessage.iconURL,
content: filteredContent,
unfinished: false,
error: false,
user: userId,
};
const savedMessage = await saveMessage(req, newMessage, {
context: 'POST /api/messages/branch',
});
if (!savedMessage) {
return res.status(500).json({ error: 'Failed to save branch message' });
}
res.status(201).json(savedMessage);
} catch (error) {
logger.error('Error creating branch message:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
router.post('/artifact/:messageId', async (req, res) => {
try {
const { messageId } = req.params;


@ -0,0 +1,136 @@
const { logger } = require('@librechat/data-schemas');
const { initializeAgent, validateAgentModel } = require('@librechat/api');
const { loadAddedAgent, setGetAgent, ADDED_AGENT_ID } = require('~/models/loadAddedAgent');
const { getConvoFiles } = require('~/models/Conversation');
const { getAgent } = require('~/models/Agent');
const db = require('~/models');
// Initialize the getAgent dependency
setGetAgent(getAgent);
/**
* Process addedConvo for parallel agent execution.
* Creates a parallel agent config from an added conversation.
*
* When an added agent has no incoming edges, it becomes a start node
* and runs in parallel with the primary agent automatically.
*
* Edge cases handled:
* - Primary agent has edges (handoffs): Added agent runs in parallel with primary,
* but doesn't participate in the primary's handoff graph
* - Primary agent has agent_ids (legacy chain): Added agent runs in parallel with primary,
* but doesn't participate in the chain
* - Primary agent has both: Added agent is independent, runs parallel from start
*
* @param {Object} params
* @param {import('express').Request} params.req
* @param {import('express').Response} params.res
* @param {Object} params.endpointOption - The endpoint option containing addedConvo
* @param {Object} params.modelsConfig - The models configuration
* @param {Function} params.logViolation - Function to log violations
* @param {Function} params.loadTools - Function to load agent tools
* @param {Array} params.requestFiles - Request files
* @param {string} params.conversationId - The conversation ID
* @param {Set} params.allowedProviders - Set of allowed providers
* @param {Map} params.agentConfigs - Map of agent configs to add to
* @param {string} params.primaryAgentId - The primary agent ID
* @param {Object|undefined} params.userMCPAuthMap - User MCP auth map to merge into
* @returns {Promise<{userMCPAuthMap: Object|undefined}>} The updated userMCPAuthMap
*/
const processAddedConvo = async ({
req,
res,
endpointOption,
modelsConfig,
logViolation,
loadTools,
requestFiles,
conversationId,
allowedProviders,
agentConfigs,
primaryAgentId,
primaryAgent,
userMCPAuthMap,
}) => {
const addedConvo = endpointOption.addedConvo;
logger.debug('[processAddedConvo] Called with addedConvo:', {
hasAddedConvo: addedConvo != null,
addedConvoEndpoint: addedConvo?.endpoint,
addedConvoModel: addedConvo?.model,
addedConvoAgentId: addedConvo?.agent_id,
});
if (addedConvo == null) {
return { userMCPAuthMap };
}
try {
const addedAgent = await loadAddedAgent({ req, conversation: addedConvo, primaryAgent });
if (!addedAgent) {
return { userMCPAuthMap };
}
const addedValidation = await validateAgentModel({
req,
res,
modelsConfig,
logViolation,
agent: addedAgent,
});
if (!addedValidation.isValid) {
logger.warn(
`[processAddedConvo] Added agent validation failed: ${addedValidation.error?.message}`,
);
return { userMCPAuthMap };
}
const addedConfig = await initializeAgent(
{
req,
res,
loadTools,
requestFiles,
conversationId,
agent: addedAgent,
endpointOption,
allowedProviders,
},
{
getConvoFiles,
getFiles: db.getFiles,
getUserKey: db.getUserKey,
updateFilesUsage: db.updateFilesUsage,
getUserKeyValues: db.getUserKeyValues,
getToolFilesByIds: db.getToolFilesByIds,
},
);
if (userMCPAuthMap != null) {
Object.assign(userMCPAuthMap, addedConfig.userMCPAuthMap ?? {});
} else {
userMCPAuthMap = addedConfig.userMCPAuthMap;
}
const addedAgentId = addedConfig.id || ADDED_AGENT_ID;
agentConfigs.set(addedAgentId, addedConfig);
// No edges needed - agent without incoming edges becomes a start node
// and runs in parallel with the primary agent automatically.
// This is independent of any edges/agent_ids the primary agent has.
logger.debug(
`[processAddedConvo] Added parallel agent: ${addedAgentId} (primary: ${primaryAgentId}, ` +
`primary has edges: ${!!endpointOption.edges}, primary has agent_ids: ${!!endpointOption.agent_ids})`,
);
return { userMCPAuthMap };
} catch (err) {
logger.error('[processAddedConvo] Error processing addedConvo for parallel agent', err);
return { userMCPAuthMap };
}
};
module.exports = {
processAddedConvo,
ADDED_AGENT_ID,
};


@ -15,6 +15,9 @@ const buildOptions = (req, endpoint, parsedBody, endpointType) => {
return undefined;
});
/** @type {import('librechat-data-provider').TConversation | undefined} */
const addedConvo = req.body?.addedConvo;
return removeNullishValues({
spec,
iconURL,
@ -23,6 +26,7 @@ const buildOptions = (req, endpoint, parsedBody, endpointType) => {
endpointType,
model_parameters,
agent: agentPromise,
addedConvo,
});
};


@ -7,10 +7,10 @@ const {
createSequentialChainEdges,
} = require('@librechat/api');
const {
Constants,
EModelEndpoint,
isAgentsEndpoint,
getResponseSender,
isEphemeralAgentId,
} = require('librechat-data-provider');
const {
createToolEndCallback,
@ -20,6 +20,7 @@ const { getModelsConfig } = require('~/server/controllers/ModelController');
const { loadAgentTools } = require('~/server/services/ToolService');
const AgentClient = require('~/server/controllers/agents/client');
const { getConvoFiles } = require('~/models/Conversation');
const { processAddedConvo } = require('./addedConvo');
const { getAgent } = require('~/models/Agent');
const { logViolation } = require('~/cache');
const db = require('~/models');
@ -233,6 +234,33 @@ const initializeClient = async ({ req, res, signal, endpointOption }) => {
edges = edges ? edges.concat(chain) : chain;
}
/** Multi-Convo: Process addedConvo for parallel agent execution */
const { userMCPAuthMap: updatedMCPAuthMap } = await processAddedConvo({
req,
res,
endpointOption,
modelsConfig,
logViolation,
loadTools,
requestFiles,
conversationId,
allowedProviders,
agentConfigs,
primaryAgentId: primaryConfig.id,
primaryAgent,
userMCPAuthMap,
});
if (updatedMCPAuthMap) {
userMCPAuthMap = updatedMCPAuthMap;
}
// Ensure edges is an array when we have multiple agents (multi-agent mode)
// MultiAgentGraph.categorizeEdges requires edges to be iterable
if (agentConfigs.size > 0 && !edges) {
edges = [];
}
primaryConfig.edges = edges;
let endpointConfig = appConfig.endpoints?.[primaryConfig.endpoint];
@ -276,10 +304,7 @@ const initializeClient = async ({ req, res, signal, endpointOption }) => {
endpointType: endpointOption.endpointType,
resendFiles: primaryConfig.resendFiles ?? true,
maxContextTokens: primaryConfig.maxContextTokens,
endpoint:
primaryConfig.id === Constants.EPHEMERAL_AGENT_ID
? primaryConfig.endpoint
: EModelEndpoint.agents,
endpoint: isEphemeralAgentId(primaryConfig.id) ? primaryConfig.endpoint : EModelEndpoint.agents,
});
return { client, userMCPAuthMap };


@ -9,7 +9,6 @@ const {
} = require('@librechat/api');
const {
Tools,
Constants,
ErrorTypes,
ContentTypes,
imageGenTools,
@ -18,6 +17,7 @@ const {
ImageVisionTool,
openapiToFunction,
AgentCapabilities,
isEphemeralAgentId,
validateActionDomain,
defaultAgentCapabilities,
validateAndParseOpenAPISpec,
@ -393,7 +393,7 @@ async function loadAgentTools({
const endpointsConfig = await getEndpointsConfig(req);
let enabledCapabilities = new Set(endpointsConfig?.[EModelEndpoint.agents]?.capabilities ?? []);
/** Edge case: use defined/fallback capabilities when the "agents" endpoint is not enabled */
if (enabledCapabilities.size === 0 && agent.id === Constants.EPHEMERAL_AGENT_ID) {
if (enabledCapabilities.size === 0 && isEphemeralAgentId(agent.id)) {
enabledCapabilities = new Set(
appConfig.endpoints?.[EModelEndpoint.agents]?.capabilities ?? defaultAgentCapabilities,
);


@ -1,6 +1,13 @@
import { createContext, useContext } from 'react';
import useAddedResponse from '~/hooks/Chat/useAddedResponse';
type TAddedChatContext = ReturnType<typeof useAddedResponse>;
import type { TConversation } from 'librechat-data-provider';
import type { SetterOrUpdater } from 'recoil';
import type { ConvoGenerator } from '~/common';
type TAddedChatContext = {
conversation: TConversation | null;
setConversation: SetterOrUpdater<TConversation | null>;
generateConversation: ConvoGenerator;
};
export const AddedChatContext = createContext<TAddedChatContext>({} as TAddedChatContext);
export const useAddedChatContext = () => useContext(AddedChatContext);


@ -1,5 +1,4 @@
import React, { createContext, useContext, useMemo } from 'react';
import { useAddedChatContext } from './AddedChatContext';
import { useChatContext } from './ChatContext';
interface MessagesViewContextValue {
@ -9,7 +8,6 @@ interface MessagesViewContextValue {
/** Submission and control states */
isSubmitting: ReturnType<typeof useChatContext>['isSubmitting'];
isSubmittingFamily: boolean;
abortScroll: ReturnType<typeof useChatContext>['abortScroll'];
setAbortScroll: ReturnType<typeof useChatContext>['setAbortScroll'];
@ -34,13 +32,12 @@ export type { MessagesViewContextValue };
export function MessagesViewProvider({ children }: { children: React.ReactNode }) {
const chatContext = useChatContext();
const addedChatContext = useAddedChatContext();
const {
ask,
index,
regenerate,
isSubmitting: isSubmittingRoot,
isSubmitting,
conversation,
latestMessage,
setAbortScroll,
@ -51,8 +48,6 @@ export function MessagesViewProvider({ children }: { children: React.ReactNode }
setMessages,
} = chatContext;
const { isSubmitting: isSubmittingAdditional } = addedChatContext;
/** Memoize conversation-related values */
const conversationValues = useMemo(
() => ({
@ -65,12 +60,11 @@ export function MessagesViewProvider({ children }: { children: React.ReactNode }
/** Memoize submission states */
const submissionStates = useMemo(
() => ({
isSubmitting: isSubmittingRoot,
isSubmittingFamily: isSubmittingRoot || isSubmittingAdditional,
abortScroll,
isSubmitting,
setAbortScroll,
}),
[isSubmittingRoot, isSubmittingAdditional, abortScroll, setAbortScroll],
[isSubmitting, abortScroll, setAbortScroll],
);
/** Memoize message operations (these are typically stable references) */
@ -127,11 +121,10 @@ export function useMessagesConversation() {
/** Hook for components that only need submission states */
export function useMessagesSubmission() {
const { isSubmitting, isSubmittingFamily, abortScroll, setAbortScroll } =
useMessagesViewContext();
const { isSubmitting, abortScroll, setAbortScroll } = useMessagesViewContext();
return useMemo(
() => ({ isSubmitting, isSubmittingFamily, abortScroll, setAbortScroll }),
[isSubmitting, isSubmittingFamily, abortScroll, setAbortScroll],
() => ({ isSubmitting, abortScroll, setAbortScroll }),
[isSubmitting, abortScroll, setAbortScroll],
);
}


@ -1,5 +1,5 @@
import { RefObject } from 'react';
import { Constants, FileSources, EModelEndpoint } from 'librechat-data-provider';
import { FileSources, EModelEndpoint, isEphemeralAgentId } from 'librechat-data-provider';
import type { UseMutationResult } from '@tanstack/react-query';
import type * as InputNumberPrimitive from 'rc-input-number';
import type { SetterOrUpdater, RecoilState } from 'recoil';
@ -10,7 +10,7 @@ import type { TranslationKeys } from '~/hooks';
import { MCPServerDefinition } from '~/hooks/MCP/useMCPServerManager';
export function isEphemeralAgent(agentId: string | null | undefined): boolean {
return agentId == null || agentId === '' || agentId === Constants.EPHEMERAL_AGENT_ID;
return isEphemeralAgentId(agentId);
}
export interface ConfigFieldDetail {
@ -356,6 +356,8 @@ export type TOptions = {
isResubmission?: boolean;
/** Currently only utilized when `isResubmission === true`, uses that message's currently attached files */
overrideFiles?: t.TMessage['files'];
/** Added conversation for multi-convo feature - sent to server as part of submission payload */
addedConvo?: t.TConversation;
};
export type TAskFunction = (props: TAskProps, options?: TOptions) => void;


@ -16,7 +16,7 @@ function AddMultiConvo() {
setAddedConvo({
...convo,
title: '',
});
} as TConversation);
const textarea = document.getElementById(mainTextareaId);
if (textarea) {
@ -34,13 +34,12 @@ function AddMultiConvo() {
return (
<TooltipAnchor
id="add-multi-conversation-button"
aria-label={localize('com_ui_add_multi_conversation')}
description={localize('com_ui_add_multi_conversation')}
tabIndex={0}
role="button"
tabIndex={0}
aria-label={localize('com_ui_add_multi_conversation')}
onClick={clickHandler}
data-testid="parameters-button"
data-testid="add-multi-convo-button"
className="inline-flex size-10 flex-shrink-0 items-center justify-center rounded-xl border border-border-light bg-transparent text-text-primary transition-all ease-in-out hover:bg-surface-tertiary disabled:pointer-events-none disabled:opacity-50 radix-state-open:bg-surface-tertiary"
>
<PlusCircle size={16} aria-hidden="true" />


@ -54,7 +54,7 @@ function ChatView({ index = 0 }: { index?: number }) {
});
const chatHelpers = useChatHelpers(index, conversationId);
const addedChatHelpers = useAddedResponse({ rootIndex: index });
const addedChatHelpers = useAddedResponse();
useResumableStreamToggle(
chatHelpers.conversation?.endpoint,


@ -10,7 +10,7 @@ import { useGetStartupConfig } from '~/data-provider';
import ExportAndShareMenu from './ExportAndShareMenu';
import BookmarkMenu from './Menus/BookmarkMenu';
import { TemporaryChat } from './TemporaryChat';
// import AddMultiConvo from './AddMultiConvo';
import AddMultiConvo from './AddMultiConvo';
import { useHasAccess } from '~/hooks';
import { cn } from '~/utils';
@ -30,10 +30,10 @@ export default function Header() {
permission: Permissions.USE,
});
// const hasAccessToMultiConvo = useHasAccess({
// permissionType: PermissionTypes.MULTI_CONVO,
// permission: Permissions.USE,
// });
const hasAccessToMultiConvo = useHasAccess({
permissionType: PermissionTypes.MULTI_CONVO,
permission: Permissions.USE,
});
const isSmallScreen = useMediaQuery('(max-width: 768px)');
@ -67,7 +67,7 @@ export default function Header() {
<ModelSelector startupConfig={startupConfig} />
{interfaceConfig.presets === true && interfaceConfig.modelSelect && <PresetsMenu />}
{hasAccessToBookmarks === true && <BookmarkMenu />}
{/* {hasAccessToMultiConvo === true && <AddMultiConvo />} */}
{hasAccessToMultiConvo === true && <AddMultiConvo />}
{isSmallScreen && (
<>
<ExportAndShareMenu


@ -1,10 +1,10 @@
import { useMemo } from 'react';
import type { TConversation, TEndpointOption, TPreset } from 'librechat-data-provider';
import { isAgentsEndpoint } from 'librechat-data-provider';
import type { TConversation } from 'librechat-data-provider';
import type { SetterOrUpdater } from 'recoil';
import useGetSender from '~/hooks/Conversations/useGetSender';
import { useGetEndpointsQuery } from '~/data-provider';
import { EndpointIcon } from '~/components/Endpoints';
import { getPresetTitle } from '~/utils';
import { useAgentsMapContext } from '~/Providers';
export default function AddedConvo({
addedConvo,
@ -13,13 +13,23 @@ export default function AddedConvo({
addedConvo: TConversation | null;
setAddedConvo: SetterOrUpdater<TConversation | null>;
}) {
const getSender = useGetSender();
const agentsMap = useAgentsMapContext();
const { data: endpointsConfig } = useGetEndpointsQuery();
const title = useMemo(() => {
const sender = getSender(addedConvo as TEndpointOption);
const title = getPresetTitle(addedConvo as TPreset);
return `+ ${sender}: ${title}`;
}, [addedConvo, getSender]);
// Priority: agent name > modelDisplayLabel > modelLabel > model
if (isAgentsEndpoint(addedConvo?.endpoint) && addedConvo?.agent_id) {
const agent = agentsMap?.[addedConvo.agent_id];
if (agent?.name) {
return `+ ${agent.name}`;
}
}
const endpointConfig = endpointsConfig?.[addedConvo?.endpoint ?? ''];
const displayLabel =
endpointConfig?.modelDisplayLabel || addedConvo?.modelLabel || addedConvo?.model || 'AI';
return `+ ${displayLabel}`;
}, [addedConvo, agentsMap, endpointsConfig]);
if (!addedConvo) {
return null;


@ -75,14 +75,11 @@ const ChatForm = memo(({ index = 0 }: { index?: number }) => {
handleStopGenerating,
} = useChatContext();
const {
addedIndex,
generateConversation,
conversation: addedConvo,
setConversation: setAddedConvo,
isSubmitting: isSubmittingAdded,
} = useAddedChatContext();
const assistantMap = useAssistantsMapContext();
const showStopAdded = useRecoilValue(store.showStopButtonByIndex(addedIndex));
const endpoint = useMemo(
() => conversation?.endpointType ?? conversation?.endpoint,
@ -128,7 +125,7 @@ const ChatForm = memo(({ index = 0 }: { index?: number }) => {
setFiles,
textAreaRef,
conversationId,
isSubmitting: isSubmitting || isSubmittingAdded,
isSubmitting,
});
const { submitMessage, submitPrompt } = useSubmitMessage();
@ -324,7 +321,7 @@ const ChatForm = memo(({ index = 0 }: { index?: number }) => {
</div>
<BadgeRow
showEphemeralBadges={!isAgentsEndpoint(endpoint) && !isAssistantsEndpoint(endpoint)}
isSubmitting={isSubmitting || isSubmittingAdded}
isSubmitting={isSubmitting}
conversationId={conversationId}
onChange={setBadges}
isInChat={
@ -342,7 +339,7 @@ const ChatForm = memo(({ index = 0 }: { index?: number }) => {
/>
)}
<div className={`${isRTL ? 'ml-2' : 'mr-2'}`}>
{(isSubmitting || isSubmittingAdded) && (showStopButton || showStopAdded) ? (
{isSubmitting && showStopButton ? (
<StopButton stop={handleStopGenerating} setShowStopButton={setShowStopButton} />
) : (
endpoint && (


@ -1,4 +1,4 @@
import { memo, useMemo } from 'react';
import { memo, useMemo, useCallback } from 'react';
import { ContentTypes } from 'librechat-data-provider';
import type {
TMessageContentParts,
@ -7,10 +7,11 @@ import type {
Agents,
} from 'librechat-data-provider';
import { MessageContext, SearchContext } from '~/Providers';
import { ParallelContentRenderer, type PartWithIndex } from './ParallelContent';
import { mapAttachments } from '~/utils';
import { EditTextPart, EmptyText } from './Parts';
import MemoryArtifacts from './MemoryArtifacts';
import Sources from '~/components/Web/Sources';
import { mapAttachments } from '~/utils/map';
import Container from './Container';
import Part from './Part';
@ -33,120 +34,159 @@ type ContentPartsProps = {
| undefined;
};
const ContentParts = memo(
({
content,
messageId,
conversationId,
attachments,
searchResults,
isCreatedByUser,
isLast,
isSubmitting,
isLatestMessage,
edit,
enterEdit,
siblingIdx,
setSiblingIdx,
}: ContentPartsProps) => {
const attachmentMap = useMemo(() => mapAttachments(attachments ?? []), [attachments]);
/**
* ContentParts renders message content parts, handling both sequential and parallel layouts.
*
* For 90% of messages (single-agent, no parallel execution), this renders sequentially.
* For multi-agent parallel execution, it uses ParallelContentRenderer to show columns.
*/
const ContentParts = memo(function ContentParts({
edit,
isLast,
content,
messageId,
enterEdit,
siblingIdx,
attachments,
isSubmitting,
setSiblingIdx,
searchResults,
conversationId,
isCreatedByUser,
isLatestMessage,
}: ContentPartsProps) {
const attachmentMap = useMemo(() => mapAttachments(attachments ?? []), [attachments]);
const effectiveIsSubmitting = isLatestMessage ? isSubmitting : false;
const effectiveIsSubmitting = isLatestMessage ? isSubmitting : false;
/**
* Render a single content part with proper context.
*/
const renderPart = useCallback(
(part: TMessageContentParts, idx: number, isLastPart: boolean) => {
const toolCallId = (part?.[ContentTypes.TOOL_CALL] as Agents.ToolCall | undefined)?.id ?? '';
const partAttachments = attachmentMap[toolCallId];
if (!content) {
return null;
}
if (edit === true && enterEdit && setSiblingIdx) {
return (
<>
{content.map((part, idx) => {
if (!part) {
return null;
}
const isTextPart =
part?.type === ContentTypes.TEXT ||
typeof (part as unknown as Agents.MessageContentText)?.text !== 'string';
const isThinkPart =
part?.type === ContentTypes.THINK ||
typeof (part as unknown as Agents.ReasoningDeltaUpdate)?.think !== 'string';
if (!isTextPart && !isThinkPart) {
return null;
}
const isToolCall =
part.type === ContentTypes.TOOL_CALL || part['tool_call_ids'] != null;
if (isToolCall) {
return null;
}
return (
<EditTextPart
index={idx}
part={part as Agents.MessageContentText | Agents.ReasoningDeltaUpdate}
messageId={messageId}
isSubmitting={isSubmitting}
enterEdit={enterEdit}
siblingIdx={siblingIdx ?? null}
setSiblingIdx={setSiblingIdx}
key={`edit-${messageId}-${idx}`}
/>
);
})}
</>
<MessageContext.Provider
key={`provider-${messageId}-${idx}`}
value={{
messageId,
isExpanded: true,
conversationId,
partIndex: idx,
nextType: content?.[idx + 1]?.type,
isSubmitting: effectiveIsSubmitting,
isLatestMessage,
}}
>
<Part
part={part}
attachments={partAttachments}
isSubmitting={effectiveIsSubmitting}
key={`part-${messageId}-${idx}`}
isCreatedByUser={isCreatedByUser}
isLast={isLastPart}
showCursor={isLastPart && isLast}
/>
</MessageContext.Provider>
);
}
},
[
attachmentMap,
content,
conversationId,
effectiveIsSubmitting,
isCreatedByUser,
isLast,
isLatestMessage,
messageId,
],
);
/** Show cursor placeholder when content is empty but actively submitting */
const showEmptyCursor = content.length === 0 && effectiveIsSubmitting;
// Early return: no content
if (!content) {
return null;
}
// Edit mode: render editable text parts
if (edit === true && enterEdit && setSiblingIdx) {
return (
<>
<SearchContext.Provider value={{ searchResults }}>
<MemoryArtifacts attachments={attachments} />
<Sources messageId={messageId} conversationId={conversationId || undefined} />
{showEmptyCursor && (
<Container>
<EmptyText />
</Container>
)}
{content.map((part, idx) => {
if (!part) {
return null;
}
{content.map((part, idx) => {
if (!part) {
return null;
}
const isTextPart =
part?.type === ContentTypes.TEXT ||
typeof (part as unknown as Agents.MessageContentText)?.text !== 'string';
const isThinkPart =
part?.type === ContentTypes.THINK ||
typeof (part as unknown as Agents.ReasoningDeltaUpdate)?.think !== 'string';
if (!isTextPart && !isThinkPart) {
return null;
}
const toolCallId =
(part?.[ContentTypes.TOOL_CALL] as Agents.ToolCall | undefined)?.id ?? '';
const partAttachments = attachmentMap[toolCallId];
const isToolCall = part.type === ContentTypes.TOOL_CALL || part['tool_call_ids'] != null;
if (isToolCall) {
return null;
}
return (
<MessageContext.Provider
key={`provider-${messageId}-${idx}`}
value={{
messageId,
isExpanded: true,
conversationId,
partIndex: idx,
nextType: content[idx + 1]?.type,
isSubmitting: effectiveIsSubmitting,
isLatestMessage,
}}
>
<Part
part={part}
attachments={partAttachments}
isSubmitting={effectiveIsSubmitting}
key={`part-${messageId}-${idx}`}
isCreatedByUser={isCreatedByUser}
isLast={idx === content.length - 1}
showCursor={idx === content.length - 1 && isLast}
/>
</MessageContext.Provider>
);
})}
</SearchContext.Provider>
return (
<EditTextPart
index={idx}
part={part as Agents.MessageContentText | Agents.ReasoningDeltaUpdate}
messageId={messageId}
isSubmitting={isSubmitting}
enterEdit={enterEdit}
siblingIdx={siblingIdx ?? null}
setSiblingIdx={setSiblingIdx}
key={`edit-${messageId}-${idx}`}
/>
);
})}
</>
);
},
);
}
const showEmptyCursor = content.length === 0 && effectiveIsSubmitting;
const lastContentIdx = content.length - 1;
// Parallel content: use dedicated renderer with columns (TMessageContentParts includes ContentMetadata)
const hasParallelContent = content.some((part) => part?.groupId != null);
if (hasParallelContent) {
return (
<ParallelContentRenderer
content={content}
messageId={messageId}
conversationId={conversationId}
attachments={attachments}
searchResults={searchResults}
isSubmitting={effectiveIsSubmitting}
renderPart={renderPart}
/>
);
}
// Sequential content: render parts in order (90% of cases)
const sequentialParts: PartWithIndex[] = [];
content.forEach((part, idx) => {
if (part) {
sequentialParts.push({ part, idx });
}
});
return (
<SearchContext.Provider value={{ searchResults }}>
<MemoryArtifacts attachments={attachments} />
<Sources messageId={messageId} conversationId={conversationId || undefined} />
{showEmptyCursor && (
<Container>
<EmptyText />
</Container>
)}
{sequentialParts.map(({ part, idx }) => renderPart(part, idx, idx === lastContentIdx))}
</SearchContext.Provider>
);
});
export default ContentParts;


@ -1,10 +1,11 @@
import { useRef, useEffect, useCallback } from 'react';
import { useForm } from 'react-hook-form';
import { useRecoilState, useRecoilValue } from 'recoil';
import { useRecoilValue } from 'recoil';
import { TextareaAutosize, TooltipAnchor } from '@librechat/client';
import { useUpdateMessageMutation } from 'librechat-data-provider/react-query';
import type { TEditProps } from '~/common';
import { useMessagesOperations, useMessagesConversation, useAddedChatContext } from '~/Providers';
import { useMessagesOperations, useMessagesConversation } from '~/Providers';
import { useGetAddedConvo } from '~/hooks/Chat';
import { cn, removeFocusRings } from '~/utils';
import { useLocalize } from '~/hooks';
import Container from './Container';
@ -19,14 +20,10 @@ const EditMessage = ({
siblingIdx,
setSiblingIdx,
}: TEditProps) => {
const { addedIndex } = useAddedChatContext();
const saveButtonRef = useRef<HTMLButtonElement | null>(null);
const submitButtonRef = useRef<HTMLButtonElement | null>(null);
const { conversation } = useMessagesConversation();
const { getMessages, setMessages } = useMessagesOperations();
const [latestMultiMessage, setLatestMultiMessage] = useRecoilState(
store.latestMessageFamily(addedIndex),
);
const textAreaRef = useRef<HTMLTextAreaElement | null>(null);
@ -37,6 +34,8 @@ const EditMessage = ({
const chatDirection = useRecoilValue(store.chatDirection).toLowerCase();
const isRTL = chatDirection === 'rtl';
const getAddedConvo = useGetAddedConvo();
const { register, handleSubmit, setValue } = useForm({
defaultValues: {
text: text ?? '',
@ -62,6 +61,7 @@ const EditMessage = ({
},
{
overrideFiles: message.files,
addedConvo: getAddedConvo() || undefined,
},
);
@ -80,6 +80,7 @@ const EditMessage = ({
editedMessageId: messageId,
isRegenerate: true,
isEdited: true,
addedConvo: getAddedConvo() || undefined,
},
);
@ -101,10 +102,6 @@ const EditMessage = ({
messageId,
});
if (message.messageId === latestMultiMessage?.messageId) {
setLatestMultiMessage({ ...latestMultiMessage, text: data.text });
}
const isInMessages = messages.some((message) => message.messageId === messageId);
if (!isInMessages) {
message.text = data.text;


@ -0,0 +1,269 @@
import { memo, useMemo } from 'react';
import type { TMessageContentParts, SearchResultData, TAttachment } from 'librechat-data-provider';
import { SearchContext } from '~/Providers';
import MemoryArtifacts from './MemoryArtifacts';
import Sources from '~/components/Web/Sources';
import { EmptyText } from './Parts';
import SiblingHeader from './SiblingHeader';
import Container from './Container';
import { cn } from '~/utils';
export type PartWithIndex = { part: TMessageContentParts; idx: number };
export type ParallelColumn = {
agentId: string;
parts: PartWithIndex[];
};
export type ParallelSection = {
groupId: number;
columns: ParallelColumn[];
};
/**
* Groups content parts by groupId for parallel rendering.
* Parts with same groupId are displayed in columns, grouped by agentId.
*
* @param content - Array of content parts
* @returns Object containing parallel sections and sequential parts
*/
export function groupParallelContent(
content: Array<TMessageContentParts | undefined> | undefined,
): { parallelSections: ParallelSection[]; sequentialParts: PartWithIndex[] } {
if (!content) {
return { parallelSections: [], sequentialParts: [] };
}
const groupMap = new Map<number, PartWithIndex[]>();
// Track placeholder agentIds per groupId (parts with empty type that establish columns)
const placeholderAgents = new Map<number, Set<string>>();
const noGroup: PartWithIndex[] = [];
content.forEach((part, idx) => {
if (!part) {
return;
}
// Read metadata directly from content part (TMessageContentParts includes ContentMetadata)
const { groupId } = part;
// Check for placeholder (empty type) before narrowing - access agentId via casting
const partAgentId = (part as { agentId?: string }).agentId;
if (groupId != null) {
// Track placeholder parts (empty type) to establish columns for pending agents
if (!part.type && partAgentId) {
if (!placeholderAgents.has(groupId)) {
placeholderAgents.set(groupId, new Set());
}
placeholderAgents.get(groupId)!.add(partAgentId);
return; // Don't add to groupMap - we'll handle these separately
}
if (!groupMap.has(groupId)) {
groupMap.set(groupId, []);
}
groupMap.get(groupId)!.push({ part, idx });
} else {
noGroup.push({ part, idx });
}
});
// Collect all groupIds (from both real content and placeholders)
const allGroupIds = new Set([...groupMap.keys(), ...placeholderAgents.keys()]);
// Build parallel sections with columns grouped by agentId
const sections: ParallelSection[] = [];
for (const groupId of allGroupIds) {
const columnMap = new Map<string, PartWithIndex[]>();
const parts = groupMap.get(groupId) ?? [];
for (const { part, idx } of parts) {
// Read agentId directly from content part (TMessageContentParts includes ContentMetadata)
const agentId = part.agentId ?? 'unknown';
if (!columnMap.has(agentId)) {
columnMap.set(agentId, []);
}
columnMap.get(agentId)!.push({ part, idx });
}
// Add empty columns for placeholder agents that don't have real content yet
const groupPlaceholders = placeholderAgents.get(groupId);
if (groupPlaceholders) {
for (const placeholderAgentId of groupPlaceholders) {
if (!columnMap.has(placeholderAgentId)) {
// Empty array signals this column should show loading state
columnMap.set(placeholderAgentId, []);
}
}
}
// Sort columns: primary agent (no ____N suffix) first, added agents (with suffix) second
// This ensures consistent column ordering regardless of which agent responds first
const sortedAgentIds = Array.from(columnMap.keys()).sort((a, b) => {
const aHasSuffix = a.includes('____');
const bHasSuffix = b.includes('____');
if (aHasSuffix && !bHasSuffix) {
return 1;
}
if (!aHasSuffix && bHasSuffix) {
return -1;
}
return 0;
});
const columns = sortedAgentIds.map((agentId) => ({
agentId,
parts: columnMap.get(agentId)!,
}));
sections.push({ groupId, columns });
}
// Sort sections by the minimum index in each section (sections with only placeholders go last)
sections.sort((a, b) => {
const aParts = a.columns.flatMap((c) => c.parts.map((p) => p.idx));
const bParts = b.columns.flatMap((c) => c.parts.map((p) => p.idx));
const aMin = aParts.length > 0 ? Math.min(...aParts) : Infinity;
const bMin = bParts.length > 0 ? Math.min(...bParts) : Infinity;
return aMin - bMin;
});
return { parallelSections: sections, sequentialParts: noGroup };
}
type ParallelColumnsProps = {
columns: ParallelColumn[];
groupId: number;
messageId: string;
isSubmitting: boolean;
lastContentIdx: number;
conversationId?: string | null;
renderPart: (part: TMessageContentParts, idx: number, isLastPart: boolean) => React.ReactNode;
};
/**
* Renders parallel content columns for a single groupId.
*/
export const ParallelColumns = memo(function ParallelColumns({
columns,
groupId,
messageId,
conversationId,
isSubmitting,
lastContentIdx,
renderPart,
}: ParallelColumnsProps) {
return (
<div className={cn('flex w-full flex-col gap-3 md:flex-row', 'sibling-content-group')}>
{columns.map(({ agentId, parts: columnParts }, colIdx) => {
// Show loading cursor if column has no content parts yet (empty array from placeholder)
const showLoadingCursor = isSubmitting && columnParts.length === 0;
return (
<div
key={`column-${messageId}-${groupId}-${agentId || colIdx}`}
className="min-w-0 flex-1 rounded-lg border border-border-light p-3"
>
<SiblingHeader
agentId={agentId}
messageId={messageId}
isSubmitting={isSubmitting}
conversationId={conversationId}
/>
{showLoadingCursor ? (
<Container>
<EmptyText />
</Container>
) : (
columnParts.map(({ part, idx }) => {
const isLastInColumn = idx === columnParts[columnParts.length - 1]?.idx;
const isLastContent = idx === lastContentIdx;
return renderPart(part, idx, isLastInColumn && isLastContent);
})
)}
</div>
);
})}
</div>
);
});
type ParallelContentRendererProps = {
content: Array<TMessageContentParts | undefined>;
messageId: string;
conversationId?: string | null;
attachments?: TAttachment[];
searchResults?: { [key: string]: SearchResultData };
isSubmitting: boolean;
renderPart: (part: TMessageContentParts, idx: number, isLastPart: boolean) => React.ReactNode;
};
/**
* Renders content with parallel sections (columns) and sequential parts.
* Handles the layout of before/parallel/after content sections.
*/
export const ParallelContentRenderer = memo(function ParallelContentRenderer({
content,
messageId,
conversationId,
attachments,
searchResults,
isSubmitting,
renderPart,
}: ParallelContentRendererProps) {
const { parallelSections, sequentialParts } = useMemo(
() => groupParallelContent(content),
[content],
);
const lastContentIdx = content.length - 1;
// Split sequential parts into before/after parallel sections
const { before, after } = useMemo(() => {
if (parallelSections.length === 0) {
return { before: sequentialParts, after: [] };
}
const allParallelIndices = parallelSections.flatMap((s) =>
s.columns.flatMap((c) => c.parts.map((p) => p.idx)),
);
const minParallelIdx = Math.min(...allParallelIndices);
const maxParallelIdx = Math.max(...allParallelIndices);
return {
before: sequentialParts.filter(({ idx }) => idx < minParallelIdx),
after: sequentialParts.filter(({ idx }) => idx > maxParallelIdx),
};
}, [parallelSections, sequentialParts]);
return (
<SearchContext.Provider value={{ searchResults }}>
<MemoryArtifacts attachments={attachments} />
<Sources messageId={messageId} conversationId={conversationId || undefined} />
{/* Sequential content BEFORE parallel sections */}
{before.map(({ part, idx }) => renderPart(part, idx, false))}
{/* Parallel sections - each group renders as columns */}
{parallelSections.map(({ groupId, columns }) => (
<ParallelColumns
key={`parallel-group-${messageId}-${groupId}`}
columns={columns}
groupId={groupId}
messageId={messageId}
renderPart={renderPart}
isSubmitting={isSubmitting}
conversationId={conversationId}
lastContentIdx={lastContentIdx}
/>
))}
{/* Sequential content AFTER parallel sections */}
{after.map(({ part, idx }) => renderPart(part, idx, idx === lastContentIdx))}
</SearchContext.Provider>
);
});
export default ParallelContentRenderer;
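
Below is a minimal usage sketch of `groupParallelContent` with hypothetical content parts; the text values, the `agent_abc` IDs, and the `____1` suffix are illustrative (not taken from the codebase), and the array is cast loosely since only the metadata fields matter here.

```tsx
import type { TMessageContentParts } from 'librechat-data-provider';
import { groupParallelContent } from './ParallelContentRenderer';

// Hypothetical content: one sequential part followed by two parallel parts
// that share groupId 0 but come from different agents.
const content = [
  { type: 'text', text: 'intro' },
  { type: 'text', text: 'primary agent reply', groupId: 0, agentId: 'agent_abc' },
  { type: 'text', text: 'added agent reply', groupId: 0, agentId: 'agent_abc____1' },
] as unknown as Array<TMessageContentParts | undefined>;

const { parallelSections, sequentialParts } = groupParallelContent(content);
// sequentialParts -> [{ part: intro, idx: 0 }]
// parallelSections -> one section for groupId 0 with two columns,
// ordered so the primary agent (no ____N suffix) renders first.
```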

View file

@ -1,14 +1,15 @@
import { useRef, useEffect, useCallback, useMemo } from 'react';
import { useRecoilValue } from 'recoil';
import { useForm } from 'react-hook-form';
import { TextareaAutosize } from '@librechat/client';
import { ContentTypes } from 'librechat-data-provider';
import { useRecoilState, useRecoilValue } from 'recoil';
import { Lightbulb, MessageSquare } from 'lucide-react';
import { useUpdateMessageContentMutation } from 'librechat-data-provider/react-query';
import type { Agents } from 'librechat-data-provider';
import type { TEditProps } from '~/common';
import { useMessagesOperations, useMessagesConversation, useAddedChatContext } from '~/Providers';
import { useMessagesOperations, useMessagesConversation } from '~/Providers';
import Container from '~/components/Chat/Messages/Content/Container';
import { useGetAddedConvo } from '~/hooks/Chat';
import { cn, removeFocusRings } from '~/utils';
import { useLocalize } from '~/hooks';
import store from '~/store';
@ -25,12 +26,8 @@ const EditTextPart = ({
part: Agents.MessageContentText | Agents.ReasoningDeltaUpdate;
}) => {
const localize = useLocalize();
const { addedIndex } = useAddedChatContext();
const { conversation } = useMessagesConversation();
const { ask, getMessages, setMessages } = useMessagesOperations();
const [latestMultiMessage, setLatestMultiMessage] = useRecoilState(
store.latestMessageFamily(addedIndex),
);
const { conversationId = '' } = conversation ?? {};
const message = useMemo(
@ -40,6 +37,8 @@ const EditTextPart = ({
const chatDirection = useRecoilValue(store.chatDirection);
const getAddedConvo = useGetAddedConvo();
const textAreaRef = useRef<HTMLTextAreaElement | null>(null);
const updateMessageContentMutation = useUpdateMessageContentMutation(conversationId ?? '');
@ -87,6 +86,7 @@ const EditTextPart = ({
editedMessageId: messageId,
isRegenerate: true,
isEdited: true,
addedConvo: getAddedConvo() || undefined,
},
);
@ -105,10 +105,6 @@ const EditTextPart = ({
messageId,
});
if (messageId === latestMultiMessage?.messageId) {
setLatestMultiMessage({ ...latestMultiMessage, text: data.text });
}
const isInMessages = messages.some((msg) => msg.messageId === messageId);
if (!isInMessages) {
return enterEdit(true);

View file

@ -0,0 +1,140 @@
import { useMemo } from 'react';
import { GitBranchPlus } from 'lucide-react';
import { useToastContext } from '@librechat/client';
import { EModelEndpoint, parseEphemeralAgentId, stripAgentIdSuffix } from 'librechat-data-provider';
import type { TMessage, Agent } from 'librechat-data-provider';
import { useBranchMessageMutation } from '~/data-provider/Messages';
import MessageIcon from '~/components/Share/MessageIcon';
import { useAgentsMapContext } from '~/Providers';
import { useLocalize } from '~/hooks';
import { cn } from '~/utils';
type SiblingHeaderProps = {
/** The agentId from the content part (either a real agent ID or an ephemeral endpoint__model___sender ID) */
agentId?: string;
/** The messageId of the parent message */
messageId?: string;
/** The conversationId */
conversationId?: string | null;
/** Whether a submission is in progress */
isSubmitting?: boolean;
};
/**
* Header component for sibling content parts in parallel agent responses.
* Displays the agent/model icon and name for each parallel response.
*/
export default function SiblingHeader({
agentId,
messageId,
conversationId,
isSubmitting,
}: SiblingHeaderProps) {
const agentsMap = useAgentsMapContext();
const localize = useLocalize();
const { showToast } = useToastContext();
const branchMessage = useBranchMessageMutation(conversationId ?? null, {
onSuccess: () => {
showToast({
message: localize('com_ui_branch_created'),
status: 'success',
});
},
onError: () => {
showToast({
message: localize('com_ui_branch_error'),
status: 'error',
});
},
});
const handleBranch = () => {
if (!messageId || !agentId || isSubmitting || branchMessage.isLoading) {
return;
}
branchMessage.mutate({ messageId, agentId });
};
const { displayName, displayEndpoint, displayModel, agent } = useMemo(() => {
// First, try to look up as a real agent
if (agentId) {
// Strip the ____N suffix if present (used to distinguish parallel agents with the same ID)
const baseAgentId = stripAgentIdSuffix(agentId);
const foundAgent = agentsMap?.[baseAgentId] as Agent | undefined;
if (foundAgent) {
return {
displayName: foundAgent.name,
displayEndpoint: EModelEndpoint.agents,
displayModel: foundAgent.model,
agent: foundAgent,
};
}
// Try to parse as ephemeral agent ID (endpoint__model___sender format)
const parsed = parseEphemeralAgentId(agentId);
if (parsed) {
return {
displayName: parsed.sender || parsed.model || 'AI',
displayEndpoint: parsed.endpoint,
displayModel: parsed.model,
agent: undefined,
};
}
// agentId exists but couldn't be parsed as ephemeral - use it as-is for display
return {
displayName: baseAgentId,
displayEndpoint: EModelEndpoint.agents,
displayModel: undefined,
agent: undefined,
};
}
// Fallback when no agentId is available: show a generic agent label
return {
displayName: 'Agent',
displayEndpoint: EModelEndpoint.agents,
displayModel: undefined,
agent: undefined,
};
}, [agentId, agentsMap]);
return (
<div className="mb-2 flex items-center justify-between gap-2 border-b border-border-light pb-2">
<div className="flex min-w-0 items-center gap-2">
<div className="flex h-5 w-5 flex-shrink-0 items-center justify-center overflow-hidden rounded-full">
<MessageIcon
message={
{
endpoint: displayEndpoint,
model: displayModel,
isCreatedByUser: false,
} as TMessage
}
agent={agent || undefined}
/>
</div>
<span className="truncate text-sm font-medium text-text-primary">{displayName}</span>
</div>
{messageId && agentId && !isSubmitting && (
<button
type="button"
onClick={handleBranch}
disabled={isSubmitting || branchMessage.isLoading}
className={cn(
'flex h-6 w-6 flex-shrink-0 items-center justify-center rounded-md',
'text-text-secondary transition-colors hover:bg-surface-hover hover:text-text-primary',
'focus:outline-none focus:ring-2 focus:ring-border-medium focus:ring-offset-1',
'disabled:cursor-not-allowed disabled:opacity-50',
)}
aria-label={localize('com_ui_branch_message')}
title={localize('com_ui_branch_message')}
>
<GitBranchPlus className="h-4 w-4" aria-hidden="true" />
</button>
)}
</div>
);
}
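
For reference, a sketch of the ID shapes this header resolves, using hypothetical IDs; the ephemeral format (`endpoint__model___sender`) and the `____N` suffix behavior are inferred from the comments above, not verified against the library internals.

```tsx
import { parseEphemeralAgentId, stripAgentIdSuffix } from 'librechat-data-provider';

// 1) Parallel suffix: '____N' distinguishes duplicate agents running side by side.
const baseId = stripAgentIdSuffix('agent_abc____1'); // assumed to yield 'agent_abc'

// 2) Ephemeral ID: assumed to encode endpoint, model, and sender.
const parsed = parseEphemeralAgentId('openAI__gpt-4o___GPT-4o');
if (parsed) {
  // parsed.endpoint / parsed.model / parsed.sender drive the icon and display name.
}

// 3) Anything else falls through and is displayed as-is (see the last branch above).
console.log(baseId);
```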

View file

@ -213,7 +213,10 @@ const HoverButtons = ({
}
icon={isCopied ? <CheckMark className="h-[18px] w-[18px]" /> : <Clipboard size="19" />}
isLast={isLast}
className={`ml-0 flex items-center gap-1.5 text-xs ${isSubmitting && isCreatedByUser ? 'md:opacity-0 md:group-hover:opacity-100' : ''}`}
className={cn(
'ml-0 flex items-center gap-1.5 text-xs',
isSubmitting && isCreatedByUser ? 'md:opacity-0 md:group-hover:opacity-100' : '',
)}
/>
{/* Edit Button */}

View file

@ -1,12 +1,8 @@
import React from 'react';
import { useRecoilValue } from 'recoil';
import { useMessageProcess } from '~/hooks';
import type { TMessageProps } from '~/common';
import MessageRender from './ui/MessageRender';
import MultiMessage from './MultiMessage';
import { cn } from '~/utils';
import store from '~/store';
const MessageContainer = React.memo(
({
@ -29,16 +25,10 @@ const MessageContainer = React.memo(
);
export default function Message(props: TMessageProps) {
const {
showSibling,
conversation,
handleScroll,
siblingMessage,
latestMultiMessage,
isSubmittingFamily,
} = useMessageProcess({ message: props.message });
const { conversation, handleScroll } = useMessageProcess({
message: props.message,
});
const { message, currentEditId, setCurrentEditId } = props;
const maximizeChatSpace = useRecoilValue(store.maximizeChatSpace);
if (!message || typeof message !== 'object') {
return null;
@ -49,34 +39,9 @@ export default function Message(props: TMessageProps) {
return (
<>
<MessageContainer handleScroll={handleScroll}>
{showSibling ? (
<div className="m-auto my-2 flex justify-center p-4 py-2 md:gap-6">
<div
className={cn(
'flex w-full flex-row flex-wrap justify-between gap-1 md:flex-nowrap md:gap-2',
maximizeChatSpace ? 'w-full max-w-full' : 'md:max-w-5xl xl:max-w-6xl',
)}
>
<MessageRender
{...props}
message={message}
isSubmittingFamily={isSubmittingFamily}
isCard
/>
<MessageRender
{...props}
isMultiMessage
isCard
message={siblingMessage ?? latestMultiMessage ?? undefined}
isSubmittingFamily={isSubmittingFamily}
/>
</div>
</div>
) : (
<div className="m-auto justify-center p-4 py-2 md:gap-6">
<MessageRender {...props} />
</div>
)}
<div className="m-auto justify-center p-4 py-2 md:gap-6">
<MessageRender {...props} />
</div>
</MessageContainer>
<MultiMessage
key={messageId}

View file

@ -3,7 +3,7 @@ import { useAtomValue } from 'jotai';
import { useRecoilValue } from 'recoil';
import type { TMessageContentParts } from 'librechat-data-provider';
import type { TMessageProps, TMessageIcon } from '~/common';
import { useMessageHelpers, useLocalize, useAttachments } from '~/hooks';
import { useMessageHelpers, useLocalize, useAttachments, useContentMetadata } from '~/hooks';
import MessageIcon from '~/components/Chat/Messages/MessageIcon';
import ContentParts from './Content/ContentParts';
import { fontSizeAtom } from '~/store/fontSize';
@ -75,15 +75,25 @@ export default function Message(props: TMessageProps) {
],
);
const { hasParallelContent } = useContentMetadata(message);
if (!message) {
return null;
}
const getChatWidthClass = () => {
if (maximizeChatSpace) {
return 'w-full max-w-full md:px-5 lg:px-1 xl:px-5';
}
if (hasParallelContent) {
return 'md:max-w-[58rem] xl:max-w-[70rem]';
}
return 'md:max-w-[47rem] xl:max-w-[55rem]';
};
const baseClasses = {
common: 'group mx-auto flex flex-1 gap-3 transition-all duration-300 transform-gpu',
chat: maximizeChatSpace
? 'w-full max-w-full md:px-5 lg:px-1 xl:px-5'
: 'md:max-w-[47rem] xl:max-w-[55rem]',
chat: getChatWidthClass(),
};
return (
@ -99,20 +109,25 @@ export default function Message(props: TMessageProps) {
aria-label={getMessageAriaLabel(message, localize)}
className={cn(baseClasses.common, baseClasses.chat, 'message-render')}
>
<div className="relative flex flex-shrink-0 flex-col items-center">
<div className="flex h-6 w-6 items-center justify-center overflow-hidden rounded-full pt-0.5">
<MessageIcon iconData={iconData} assistant={assistant} agent={agent} />
{!hasParallelContent && (
<div className="relative flex flex-shrink-0 flex-col items-center">
<div className="flex h-6 w-6 items-center justify-center overflow-hidden rounded-full pt-0.5">
<MessageIcon iconData={iconData} assistant={assistant} agent={agent} />
</div>
</div>
</div>
)}
<div
className={cn(
'relative flex w-11/12 flex-col',
'relative flex flex-col',
hasParallelContent ? 'w-full' : 'w-11/12',
isCreatedByUser ? 'user-turn' : 'agent-turn',
)}
>
<h2 className={cn('select-none font-semibold text-text-primary', fontSize)}>
{name}
</h2>
{!hasParallelContent && (
<h2 className={cn('select-none font-semibold text-text-primary', fontSize)}>
{name}
</h2>
)}
<div className="flex flex-col gap-1">
<div className="flex max-w-full flex-grow flex-col gap-0">
<ContentParts

View file

@ -8,18 +8,16 @@ import PlaceholderRow from '~/components/Chat/Messages/ui/PlaceholderRow';
import SiblingSwitch from '~/components/Chat/Messages/SiblingSwitch';
import HoverButtons from '~/components/Chat/Messages/HoverButtons';
import MessageIcon from '~/components/Chat/Messages/MessageIcon';
import { useLocalize, useMessageActions, useContentMetadata } from '~/hooks';
import SubRow from '~/components/Chat/Messages/SubRow';
import { cn, getMessageAriaLabel } from '~/utils';
import { fontSizeAtom } from '~/store/fontSize';
import { MessageContext } from '~/Providers';
import { useLocalize, useMessageActions } from '~/hooks';
import { cn, getMessageAriaLabel, logger } from '~/utils';
import store from '~/store';
type MessageRenderProps = {
message?: TMessage;
isCard?: boolean;
isMultiMessage?: boolean;
isSubmittingFamily?: boolean;
isSubmitting?: boolean;
} & Pick<
TMessageProps,
'currentEditId' | 'setCurrentEditId' | 'siblingIdx' | 'setSiblingIdx' | 'siblingCount'
@ -28,14 +26,12 @@ type MessageRenderProps = {
const MessageRender = memo(
({
message: msg,
isCard = false,
siblingIdx,
siblingCount,
setSiblingIdx,
currentEditId,
isMultiMessage = false,
setCurrentEditId,
isSubmittingFamily = false,
isSubmitting = false,
}: MessageRenderProps) => {
const localize = useLocalize();
const {
@ -47,17 +43,14 @@ const MessageRender = memo(
enterEdit,
conversation,
messageLabel,
isSubmitting,
latestMessage,
handleFeedback,
handleContinue,
copyToClipboard,
setLatestMessage,
regenerateMessage,
handleFeedback,
} = useMessageActions({
message: msg,
currentEditId,
isMultiMessage,
setCurrentEditId,
});
const fontSize = useAtomValue(fontSizeAtom);
@ -70,9 +63,6 @@ const MessageRender = memo(
[hasNoChildren, msg?.depth, latestMessage?.depth],
);
const isLatestMessage = msg?.messageId === latestMessage?.messageId;
const showCardRender = isLast && !isSubmittingFamily && isCard;
const isLatestCard = isCard && !isSubmittingFamily && isLatestMessage;
/** Only pass isSubmitting to the latest message to prevent unnecessary re-renders */
const effectiveIsSubmitting = isLatestMessage ? isSubmitting : false;
@ -95,36 +85,28 @@ const MessageRender = memo(
],
);
const clickHandler = useMemo(
() =>
showCardRender && !isLatestMessage
? () => {
logger.log(
'latest_message',
`Message Card click: Setting ${msg?.messageId} as latest message`,
);
logger.dir(msg);
setLatestMessage(msg!);
}
: undefined,
[showCardRender, isLatestMessage, msg, setLatestMessage],
);
const { hasParallelContent } = useContentMetadata(msg);
if (!msg) {
return null;
}
const getChatWidthClass = () => {
if (maximizeChatSpace) {
return 'w-full max-w-full md:px-5 lg:px-1 xl:px-5';
}
if (hasParallelContent) {
return 'md:max-w-[58rem] xl:max-w-[70rem]';
}
return 'md:max-w-[47rem] xl:max-w-[55rem]';
};
const baseClasses = {
common: 'group mx-auto flex flex-1 gap-3 transition-all duration-300 transform-gpu ',
card: 'relative w-full gap-1 rounded-lg border border-border-medium bg-surface-primary-alt p-2 md:w-1/2 md:gap-3 md:p-4',
chat: maximizeChatSpace
? 'w-full max-w-full md:px-5 lg:px-1 xl:px-5'
: 'md:max-w-[47rem] xl:max-w-[55rem]',
chat: getChatWidthClass(),
};
const conditionalClasses = {
latestCard: isLatestCard ? 'bg-surface-secondary' : '',
cardRender: showCardRender ? 'cursor-pointer transition-colors duration-300' : '',
focus: 'focus:outline-none focus:ring-2 focus:ring-border-xheavy',
};
@ -134,38 +116,29 @@ const MessageRender = memo(
aria-label={getMessageAriaLabel(msg, localize)}
className={cn(
baseClasses.common,
isCard ? baseClasses.card : baseClasses.chat,
conditionalClasses.latestCard,
conditionalClasses.cardRender,
baseClasses.chat,
conditionalClasses.focus,
'message-render',
)}
onClick={clickHandler}
onKeyDown={(e) => {
if ((e.key === 'Enter' || e.key === ' ') && clickHandler) {
clickHandler();
}
}}
role={showCardRender ? 'button' : undefined}
tabIndex={showCardRender ? 0 : undefined}
>
{isLatestCard && (
<div className="absolute right-0 top-0 m-2 h-3 w-3 rounded-full bg-text-primary" />
)}
<div className="relative flex flex-shrink-0 flex-col items-center">
<div className="flex h-6 w-6 items-center justify-center overflow-hidden rounded-full">
<MessageIcon iconData={iconData} assistant={assistant} agent={agent} />
{!hasParallelContent && (
<div className="relative flex flex-shrink-0 flex-col items-center">
<div className="flex h-6 w-6 items-center justify-center overflow-hidden rounded-full">
<MessageIcon iconData={iconData} assistant={assistant} agent={agent} />
</div>
</div>
</div>
)}
<div
className={cn(
'relative flex w-11/12 flex-col',
'relative flex flex-col',
hasParallelContent ? 'w-full' : 'w-11/12',
msg.isCreatedByUser ? 'user-turn' : 'agent-turn',
)}
>
<h2 className={cn('select-none font-semibold', fontSize)}>{messageLabel}</h2>
{!hasParallelContent && (
<h2 className={cn('select-none font-semibold', fontSize)}>{messageLabel}</h2>
)}
<div className="flex flex-col gap-1">
<div className="flex max-w-full flex-grow flex-col gap-0">
@ -194,9 +167,8 @@ const MessageRender = memo(
/>
</MessageContext.Provider>
</div>
{hasNoChildren && (isSubmittingFamily === true || effectiveIsSubmitting) ? (
<PlaceholderRow isCard={isCard} />
{hasNoChildren && effectiveIsSubmitting ? (
<PlaceholderRow />
) : (
<SubRow classes="text-xs">
<SiblingSwitch

View file

@ -1,9 +1,6 @@
import { memo } from 'react';
const PlaceholderRow = memo(({ isCard }: { isCard?: boolean }) => {
if (!isCard) {
return null;
}
const PlaceholderRow = memo(() => {
return <div className="mt-1 h-[27px] bg-transparent" />;
});

View file

@ -3,22 +3,20 @@ import { useAtomValue } from 'jotai';
import { useRecoilValue } from 'recoil';
import type { TMessage, TMessageContentParts } from 'librechat-data-provider';
import type { TMessageProps, TMessageIcon } from '~/common';
import { useAttachments, useLocalize, useMessageActions, useContentMetadata } from '~/hooks';
import ContentParts from '~/components/Chat/Messages/Content/ContentParts';
import PlaceholderRow from '~/components/Chat/Messages/ui/PlaceholderRow';
import SiblingSwitch from '~/components/Chat/Messages/SiblingSwitch';
import HoverButtons from '~/components/Chat/Messages/HoverButtons';
import MessageIcon from '~/components/Chat/Messages/MessageIcon';
import { useAttachments, useLocalize, useMessageActions } from '~/hooks';
import SubRow from '~/components/Chat/Messages/SubRow';
import { cn, getMessageAriaLabel } from '~/utils';
import { fontSizeAtom } from '~/store/fontSize';
import { cn, getMessageAriaLabel, logger } from '~/utils';
import store from '~/store';
type ContentRenderProps = {
message?: TMessage;
isCard?: boolean;
isMultiMessage?: boolean;
isSubmittingFamily?: boolean;
isSubmitting?: boolean;
} & Pick<
TMessageProps,
'currentEditId' | 'setCurrentEditId' | 'siblingIdx' | 'setSiblingIdx' | 'siblingCount'
@ -27,14 +25,12 @@ type ContentRenderProps = {
const ContentRender = memo(
({
message: msg,
isCard = false,
siblingIdx,
siblingCount,
setSiblingIdx,
currentEditId,
isMultiMessage = false,
setCurrentEditId,
isSubmittingFamily = false,
isSubmitting = false,
}: ContentRenderProps) => {
const localize = useLocalize();
const { attachments, searchResults } = useAttachments({
@ -49,18 +45,15 @@ const ContentRender = memo(
enterEdit,
conversation,
messageLabel,
isSubmitting,
latestMessage,
handleContinue,
copyToClipboard,
setLatestMessage,
regenerateMessage,
handleFeedback,
copyToClipboard,
regenerateMessage,
} = useMessageActions({
message: msg,
searchResults,
currentEditId,
isMultiMessage,
setCurrentEditId,
});
const fontSize = useAtomValue(fontSizeAtom);
@ -72,9 +65,10 @@ const ContentRender = memo(
!(msg?.children?.length ?? 0) && (msg?.depth === latestMessage?.depth || msg?.depth === -1),
[msg?.children, msg?.depth, latestMessage?.depth],
);
const hasNoChildren = !(msg?.children?.length ?? 0);
const isLatestMessage = msg?.messageId === latestMessage?.messageId;
const showCardRender = isLast && !isSubmittingFamily && isCard;
const isLatestCard = isCard && !isSubmittingFamily && isLatestMessage;
/** Only pass isSubmitting to the latest message to prevent unnecessary re-renders */
const effectiveIsSubmitting = isLatestMessage ? isSubmitting : false;
const iconData: TMessageIcon = useMemo(
() => ({
@ -95,36 +89,28 @@ const ContentRender = memo(
],
);
const clickHandler = useMemo(
() =>
showCardRender && !isLatestMessage
? () => {
logger.log(
'latest_message',
`Message Card click: Setting ${msg?.messageId} as latest message`,
);
logger.dir(msg);
setLatestMessage(msg!);
}
: undefined,
[showCardRender, isLatestMessage, msg, setLatestMessage],
);
const { hasParallelContent } = useContentMetadata(msg);
if (!msg) {
return null;
}
const getChatWidthClass = () => {
if (maximizeChatSpace) {
return 'w-full max-w-full md:px-5 lg:px-1 xl:px-5';
}
if (hasParallelContent) {
return 'md:max-w-[58rem] xl:max-w-[70rem]';
}
return 'md:max-w-[47rem] xl:max-w-[55rem]';
};
const baseClasses = {
common: 'group mx-auto flex flex-1 gap-3 transition-all duration-300 transform-gpu ',
card: 'relative w-full gap-1 rounded-lg border border-border-medium bg-surface-primary-alt p-2 md:w-1/2 md:gap-3 md:p-4',
chat: maximizeChatSpace
? 'w-full max-w-full md:px-5 lg:px-1 xl:px-5'
: 'md:max-w-[47rem] xl:max-w-[55rem]',
chat: getChatWidthClass(),
};
const conditionalClasses = {
latestCard: isLatestCard ? 'bg-surface-secondary' : '',
cardRender: showCardRender ? 'cursor-pointer transition-colors duration-300' : '',
focus: 'focus:outline-none focus:ring-2 focus:ring-border-xheavy',
};
@ -134,38 +120,29 @@ const ContentRender = memo(
aria-label={getMessageAriaLabel(msg, localize)}
className={cn(
baseClasses.common,
isCard ? baseClasses.card : baseClasses.chat,
conditionalClasses.latestCard,
conditionalClasses.cardRender,
baseClasses.chat,
conditionalClasses.focus,
'message-render',
)}
onClick={clickHandler}
onKeyDown={(e) => {
if ((e.key === 'Enter' || e.key === ' ') && clickHandler) {
clickHandler();
}
}}
role={showCardRender ? 'button' : undefined}
tabIndex={showCardRender ? 0 : undefined}
>
{isLatestCard && (
<div className="absolute right-0 top-0 m-2 h-3 w-3 rounded-full bg-text-primary" />
)}
<div className="relative flex flex-shrink-0 flex-col items-center">
<div className="flex h-6 w-6 items-center justify-center overflow-hidden rounded-full">
<MessageIcon iconData={iconData} assistant={assistant} agent={agent} />
{!hasParallelContent && (
<div className="relative flex flex-shrink-0 flex-col items-center">
<div className="flex h-6 w-6 items-center justify-center overflow-hidden rounded-full">
<MessageIcon iconData={iconData} assistant={assistant} agent={agent} />
</div>
</div>
</div>
)}
<div
className={cn(
'relative flex w-11/12 flex-col',
'relative flex flex-col',
hasParallelContent ? 'w-full' : 'w-11/12',
msg.isCreatedByUser ? 'user-turn' : 'agent-turn',
)}
>
<h2 className={cn('select-none font-semibold', fontSize)}>{messageLabel}</h2>
{!hasParallelContent && (
<h2 className={cn('select-none font-semibold', fontSize)}>{messageLabel}</h2>
)}
<div className="flex flex-col gap-1">
<div className="flex max-w-full flex-grow flex-col gap-0">
@ -176,18 +153,17 @@ const ContentRender = memo(
siblingIdx={siblingIdx}
messageId={msg.messageId}
attachments={attachments}
isSubmitting={isSubmitting}
searchResults={searchResults}
setSiblingIdx={setSiblingIdx}
isLatestMessage={isLatestMessage}
isSubmitting={effectiveIsSubmitting}
isCreatedByUser={msg.isCreatedByUser}
conversationId={conversation?.conversationId}
content={msg.content as Array<TMessageContentParts | undefined>}
/>
</div>
{(isSubmittingFamily || isSubmitting) && !(msg.children?.length ?? 0) ? (
<PlaceholderRow isCard={isCard} />
{hasNoChildren && effectiveIsSubmitting ? (
<PlaceholderRow />
) : (
<SubRow classes="text-xs">
<SiblingSwitch
@ -197,8 +173,8 @@ const ContentRender = memo(
/>
<HoverButtons
index={index}
isEditing={edit}
message={msg}
isEditing={edit}
enterEdit={enterEdit}
isSubmitting={isSubmitting}
conversation={conversation ?? null}

View file

@ -26,14 +26,9 @@ const MessageContainer = React.memo(
);
export default function MessageContent(props: TMessageProps) {
const {
showSibling,
conversation,
handleScroll,
siblingMessage,
latestMultiMessage,
isSubmittingFamily,
} = useMessageProcess({ message: props.message });
const { conversation, handleScroll, isSubmitting } = useMessageProcess({
message: props.message,
});
const { message, currentEditId, setCurrentEditId } = props;
if (!message || typeof message !== 'object') {
@ -45,29 +40,9 @@ export default function MessageContent(props: TMessageProps) {
return (
<>
<MessageContainer handleScroll={handleScroll}>
{showSibling ? (
<div className="m-auto my-2 flex justify-center p-4 py-2 md:gap-6">
<div className="flex w-full flex-row flex-wrap justify-between gap-1 md:max-w-5xl md:flex-nowrap md:gap-2 lg:max-w-5xl xl:max-w-6xl">
<ContentRender
{...props}
message={message}
isSubmittingFamily={isSubmittingFamily}
isCard
/>
<ContentRender
{...props}
isMultiMessage
isCard
message={siblingMessage ?? latestMultiMessage ?? undefined}
isSubmittingFamily={isSubmittingFamily}
/>
</div>
</div>
) : (
<div className="m-auto justify-center p-4 py-2 md:gap-6">
<ContentRender {...props} />
</div>
)}
<div className="m-auto justify-center p-4 py-2 md:gap-6">
<ContentRender {...props} isSubmitting={isSubmitting} />
</div>
</MessageContainer>
<MultiMessage
key={messageId}

View file

@ -27,7 +27,6 @@ export function ShareMessagesProvider({ messages, children }: ShareMessagesProvi
handleContinue: () => {},
latestMessage: messages[messages.length - 1] ?? null,
isSubmitting: false,
isSubmittingFamily: false,
abortScroll: false,
setAbortScroll: () => {},
index: 0,

View file

@ -97,3 +97,78 @@ export const useEditArtifact = (
return useMutation(mutationOptions);
};
type BranchMessageContext = {
previousMessages: t.TMessage[] | undefined;
conversationId: string | null;
};
export const useBranchMessageMutation = (
conversationId: string | null,
_options?: t.BranchMessageOptions,
): UseMutationResult<
t.TBranchMessageResponse,
Error,
t.TBranchMessageRequest,
BranchMessageContext
> => {
const queryClient = useQueryClient();
const { onSuccess, onError, onMutate: userOnMutate, ...options } = _options ?? {};
const mutationOptions: UseMutationOptions<
t.TBranchMessageResponse,
Error,
t.TBranchMessageRequest,
BranchMessageContext
> = {
mutationFn: (variables: t.TBranchMessageRequest) => dataService.branchMessage(variables),
onMutate: async (vars) => {
// Call user's onMutate if provided
if (userOnMutate) {
await userOnMutate(vars);
}
// Cancel any outgoing queries for messages
if (conversationId) {
await queryClient.cancelQueries([QueryKeys.messages, conversationId]);
}
// Get the previous messages for rollback
const previousMessages = conversationId
? queryClient.getQueryData<t.TMessage[]>([QueryKeys.messages, conversationId])
: undefined;
return { previousMessages, conversationId };
},
onError: (error, vars, context) => {
// Rollback to previous messages on error
if (context?.conversationId && context?.previousMessages) {
queryClient.setQueryData(
[QueryKeys.messages, context.conversationId],
context.previousMessages,
);
}
onError?.(error, vars, context);
},
onSuccess: (data, vars, context) => {
// Add the new message to the cache
const targetConversationId = data.conversationId || context?.conversationId;
if (targetConversationId) {
queryClient.setQueryData<t.TMessage[]>(
[QueryKeys.messages, targetConversationId],
(prev) => {
if (!prev) {
return [data];
}
return [...prev, data];
},
);
}
onSuccess?.(data, vars, context);
},
...options,
};
return useMutation(mutationOptions);
};
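
A minimal consumer sketch for this mutation; the `messageId`/`agentId` values are placeholders, and the toast handling used by `SiblingHeader` is reduced to console logging.

```tsx
import { useBranchMessageMutation } from '~/data-provider/Messages';

function BranchButton({ conversationId }: { conversationId: string | null }) {
  const branchMessage = useBranchMessageMutation(conversationId, {
    onSuccess: () => console.log('branch created'),
    onError: () => console.log('branch failed; message cache rolled back'),
  });

  return (
    <button
      type="button"
      disabled={branchMessage.isLoading}
      // Placeholder IDs for illustration only.
      onClick={() => branchMessage.mutate({ messageId: 'msg_123', agentId: 'agent_abc' })}
    >
      Branch
    </button>
  );
}
```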

View file

@ -153,7 +153,7 @@ describe('useAgentToolPermissions', () => {
});
it('should not affect regular agents when ephemeralAgent is provided', () => {
const agentId = 'regular-agent';
const agentId = 'agent_regular';
const mockAgent = {
id: agentId,
tools: [Tools.file_search],
@ -179,7 +179,7 @@ describe('useAgentToolPermissions', () => {
describe('Regular Agent with Tools', () => {
it('should allow file_search when agent has the tool', () => {
const agentId = 'agent-123';
const agentId = 'agent_123';
const mockAgent = {
id: agentId,
tools: [Tools.file_search, 'other_tool'],
@ -198,7 +198,7 @@ describe('useAgentToolPermissions', () => {
});
it('should allow execute_code when agent has the tool', () => {
const agentId = 'agent-456';
const agentId = 'agent_456';
const mockAgent = {
id: agentId,
tools: [Tools.execute_code, 'another_tool'],
@ -217,7 +217,7 @@ describe('useAgentToolPermissions', () => {
});
it('should allow both tools when agent has both', () => {
const agentId = 'agent-789';
const agentId = 'agent_789';
const mockAgent = {
id: agentId,
tools: [Tools.file_search, Tools.execute_code, 'custom_tool'],
@ -236,7 +236,7 @@ describe('useAgentToolPermissions', () => {
});
it('should disallow both tools when agent has neither', () => {
const agentId = 'agent-no-tools';
const agentId = 'agent_no_tools';
const mockAgent = {
id: agentId,
tools: ['custom_tool1', 'custom_tool2'],
@ -255,7 +255,7 @@ describe('useAgentToolPermissions', () => {
});
it('should handle agent with empty tools array', () => {
const agentId = 'agent-empty-tools';
const agentId = 'agent_empty_tools';
const mockAgent = {
id: agentId,
tools: [],
@ -274,7 +274,7 @@ describe('useAgentToolPermissions', () => {
});
it('should handle agent with undefined tools', () => {
const agentId = 'agent-undefined-tools';
const agentId = 'agent_undefined_tools';
const mockAgent = {
id: agentId,
tools: undefined,
@ -295,7 +295,7 @@ describe('useAgentToolPermissions', () => {
describe('Agent Data from Query', () => {
it('should prioritize agentData tools over selectedAgent tools', () => {
const agentId = 'agent-with-query-data';
const agentId = 'agent_with_query_data';
const mockAgent = {
id: agentId,
tools: ['old_tool'],
@ -318,7 +318,7 @@ describe('useAgentToolPermissions', () => {
});
it('should fallback to selectedAgent tools when agentData has no tools', () => {
const agentId = 'agent-fallback';
const agentId = 'agent_fallback';
const mockAgent = {
id: agentId,
tools: [Tools.file_search],
@ -343,7 +343,7 @@ describe('useAgentToolPermissions', () => {
describe('Agent Not Found Scenarios', () => {
it('should disallow all tools when agent is not found in map', () => {
const agentId = 'non-existent-agent';
const agentId = 'agent_nonexistent';
(useAgentsMapContext as jest.Mock).mockReturnValue({});
(useGetAgentByIdQuery as jest.Mock).mockReturnValue({ data: undefined });
@ -356,7 +356,7 @@ describe('useAgentToolPermissions', () => {
});
it('should disallow all tools when agentsMap is null', () => {
const agentId = 'agent-with-null-map';
const agentId = 'agent_with_null_map';
(useAgentsMapContext as jest.Mock).mockReturnValue(null);
(useGetAgentByIdQuery as jest.Mock).mockReturnValue({ data: undefined });
@ -369,7 +369,7 @@ describe('useAgentToolPermissions', () => {
});
it('should disallow all tools when agentsMap is undefined', () => {
const agentId = 'agent-with-undefined-map';
const agentId = 'agent_with_undefined_map';
(useAgentsMapContext as jest.Mock).mockReturnValue(undefined);
(useGetAgentByIdQuery as jest.Mock).mockReturnValue({ data: undefined });
@ -384,7 +384,7 @@ describe('useAgentToolPermissions', () => {
describe('Memoization and Performance', () => {
it('should memoize results when inputs do not change', () => {
const agentId = 'memoized-agent';
const agentId = 'agent_memoized';
const mockAgent = {
id: agentId,
tools: [Tools.file_search],
@ -417,8 +417,8 @@ describe('useAgentToolPermissions', () => {
});
it('should recompute when agentId changes', () => {
const agentId1 = 'agent-1';
const agentId2 = 'agent-2';
const agentId1 = 'agent_1';
const agentId2 = 'agent_2';
const mockAgents = {
[agentId1]: { id: agentId1, tools: [Tools.file_search] },
[agentId2]: { id: agentId2, tools: [Tools.execute_code] },
@ -442,7 +442,7 @@ describe('useAgentToolPermissions', () => {
});
it('should handle switching between ephemeral and regular agents', () => {
const regularAgentId = 'regular-agent';
const regularAgentId = 'agent_regular';
const mockAgent = {
id: regularAgentId,
tools: [],
@ -486,7 +486,7 @@ describe('useAgentToolPermissions', () => {
describe('Edge Cases', () => {
it('should handle agents with null tools gracefully', () => {
const agentId = 'agent-null-tools';
const agentId = 'agent_null_tools';
const mockAgent = {
id: agentId,
tools: null as any,
@ -520,7 +520,7 @@ describe('useAgentToolPermissions', () => {
});
it('should handle query loading state', () => {
const agentId = 'loading-agent';
const agentId = 'agent_loading';
(useAgentsMapContext as jest.Mock).mockReturnValue({});
(useGetAgentByIdQuery as jest.Mock).mockReturnValue({
@ -538,7 +538,7 @@ describe('useAgentToolPermissions', () => {
});
it('should handle query error state', () => {
const agentId = 'error-agent';
const agentId = 'agent_error';
(useAgentsMapContext as jest.Mock).mockReturnValue({});
(useGetAgentByIdQuery as jest.Mock).mockReturnValue({

View file

@ -59,7 +59,7 @@ describe('useAgentToolPermissions', () => {
mockUseAgentsMapContext.mockReturnValue({});
mockUseGetAgentByIdQuery.mockReturnValue({ data: undefined });
const { result } = renderHook(() => useAgentToolPermissions('non-existent-agent'));
const { result } = renderHook(() => useAgentToolPermissions('agent_nonexistent'));
expect(result.current.fileSearchAllowedByAgent).toBe(false);
expect(result.current.codeAllowedByAgent).toBe(false);
@ -69,7 +69,7 @@ describe('useAgentToolPermissions', () => {
describe('when agent is found with tools', () => {
it('should allow tools that are included in the agent tools array', () => {
const agentId = 'test-agent';
const agentId = 'agent_test';
const agent = {
id: agentId,
tools: [Tools.file_search],
@ -86,7 +86,7 @@ describe('useAgentToolPermissions', () => {
});
it('should allow both tools when both are included', () => {
const agentId = 'test-agent';
const agentId = 'agent_test';
const agent = {
id: agentId,
tools: [Tools.file_search, Tools.execute_code],
@ -103,7 +103,7 @@ describe('useAgentToolPermissions', () => {
});
it('should use data from API query when available', () => {
const agentId = 'test-agent';
const agentId = 'agent_test';
const agentMapData = {
id: agentId,
tools: [Tools.file_search],
@ -125,7 +125,7 @@ describe('useAgentToolPermissions', () => {
});
it('should fallback to agent map data when API data is not available', () => {
const agentId = 'test-agent';
const agentId = 'agent_test';
const agentMapData = {
id: agentId,
tools: [Tools.execute_code],
@ -144,7 +144,7 @@ describe('useAgentToolPermissions', () => {
describe('when agent has no tools', () => {
it('should disallow all tools with empty array', () => {
const agentId = 'test-agent';
const agentId = 'agent_test';
const agent = {
id: agentId,
tools: [],
@ -161,7 +161,7 @@ describe('useAgentToolPermissions', () => {
});
it('should disallow all tools with undefined tools', () => {
const agentId = 'test-agent';
const agentId = 'agent_test';
const agent = {
id: agentId,
tools: undefined,
@ -226,7 +226,7 @@ describe('useAgentToolPermissions', () => {
});
it('should not affect regular agents when ephemeralAgent is provided', () => {
const agentId = 'regular-agent';
const agentId = 'agent_regular';
const agent = {
id: agentId,
tools: [Tools.file_search],

View file

@ -1,6 +1,6 @@
export { default as useChatHelpers } from './useChatHelpers';
export { default as useAddedHelpers } from './useAddedHelpers';
export { default as useAddedResponse } from './useAddedResponse';
export { default as useChatFunctions } from './useChatFunctions';
export { default as useGetAddedConvo } from './useGetAddedConvo';
export { default as useIdChangeEffect } from './useIdChangeEffect';
export { default as useFocusChatEffect } from './useFocusChatEffect';

View file

@ -1,128 +0,0 @@
import { useCallback } from 'react';
import { useQueryClient } from '@tanstack/react-query';
import { QueryKeys } from 'librechat-data-provider';
import { useRecoilState, useRecoilValue, useSetRecoilState } from 'recoil';
import type { TMessage } from 'librechat-data-provider';
import useChatFunctions from '~/hooks/Chat/useChatFunctions';
import store from '~/store';
// this to be set somewhere else
export default function useAddedHelpers({
rootIndex = 0,
currentIndex,
paramId,
}: {
rootIndex?: number;
currentIndex: number;
paramId?: string;
}) {
const queryClient = useQueryClient();
const clearAllSubmissions = store.useClearSubmissionState();
const [files, setFiles] = useRecoilState(store.filesByIndex(rootIndex));
const latestMessage = useRecoilValue(store.latestMessageFamily(rootIndex));
const setLatestMultiMessage = useSetRecoilState(store.latestMessageFamily(currentIndex));
const { useCreateConversationAtom } = store;
const { conversation, setConversation } = useCreateConversationAtom(currentIndex);
const [isSubmitting, setIsSubmitting] = useRecoilState(store.isSubmittingFamily(currentIndex));
const setSiblingIdx = useSetRecoilState(
store.messagesSiblingIdxFamily(latestMessage?.parentMessageId ?? null),
);
const queryParam = paramId === 'new' ? paramId : (conversation?.conversationId ?? paramId ?? '');
const setMessages = useCallback(
(messages: TMessage[]) => {
queryClient.setQueryData<TMessage[]>(
[QueryKeys.messages, queryParam, currentIndex],
messages,
);
const latestMultiMessage = messages[messages.length - 1];
if (latestMultiMessage) {
setLatestMultiMessage({ ...latestMultiMessage, depth: -1 });
}
},
[queryParam, queryClient, currentIndex, setLatestMultiMessage],
);
const getMessages = useCallback(() => {
return queryClient.getQueryData<TMessage[]>([QueryKeys.messages, queryParam, currentIndex]);
}, [queryParam, queryClient, currentIndex]);
const setSubmission = useSetRecoilState(store.submissionByIndex(currentIndex));
const { ask, regenerate } = useChatFunctions({
index: currentIndex,
files,
setFiles,
getMessages,
setMessages,
isSubmitting,
conversation,
setSubmission,
latestMessage,
});
const continueGeneration = () => {
if (!latestMessage) {
console.error('Failed to regenerate the message: latestMessage not found.');
return;
}
const messages = getMessages();
const parentMessage = messages?.find(
(element) => element.messageId == latestMessage.parentMessageId,
);
if (parentMessage && parentMessage.isCreatedByUser) {
ask({ ...parentMessage }, { isContinued: true, isRegenerate: true, isEdited: true });
} else {
console.error(
'Failed to regenerate the message: parentMessage not found, or not created by user.',
);
}
};
const stopGenerating = () => clearAllSubmissions();
const handleStopGenerating = (e: React.MouseEvent<HTMLButtonElement>) => {
e.preventDefault();
stopGenerating();
};
const handleRegenerate = (e: React.MouseEvent<HTMLButtonElement>) => {
e.preventDefault();
const parentMessageId = latestMessage?.parentMessageId;
if (!parentMessageId) {
console.error('Failed to regenerate the message: parentMessageId not found.');
return;
}
regenerate({ parentMessageId });
};
const handleContinue = (e: React.MouseEvent<HTMLButtonElement>) => {
e.preventDefault();
continueGeneration();
setSiblingIdx(0);
};
return {
ask,
regenerate,
getMessages,
setMessages,
conversation,
isSubmitting,
setSiblingIdx,
latestMessage,
stopGenerating,
handleContinue,
setConversation,
setIsSubmitting,
handleRegenerate,
handleStopGenerating,
};
}

View file

@ -1,39 +1,123 @@
import { useMemo } from 'react';
import useGenerateConvo from '~/hooks/Conversations/useGenerateConvo';
import useAddedHelpers from '~/hooks/Chat/useAddedHelpers';
import { useCallback } from 'react';
import { useRecoilValue } from 'recoil';
import { useGetModelsQuery } from 'librechat-data-provider/react-query';
import { getEndpointField, LocalStorageKeys, isAssistantsEndpoint } from 'librechat-data-provider';
import type { TEndpointsConfig, EModelEndpoint, TConversation } from 'librechat-data-provider';
import type { AssistantListItem, NewConversationParams } from '~/common';
import useAssistantListMap from '~/hooks/Assistants/useAssistantListMap';
import { buildDefaultConvo, getDefaultEndpoint } from '~/utils';
import { useGetEndpointsQuery } from '~/data-provider';
import { mainTextareaId } from '~/common';
import store from '~/store';
export default function useAddedResponse({ rootIndex }: { rootIndex: number }) {
const currentIndex = useMemo(() => rootIndex + 1, [rootIndex]);
const {
ask,
regenerate,
setMessages,
getMessages,
conversation,
isSubmitting,
setConversation,
setIsSubmitting,
} = useAddedHelpers({
rootIndex,
currentIndex,
});
const ADDED_INDEX = 1;
const { generateConversation } = useGenerateConvo({
index: currentIndex,
rootIndex,
setConversation,
});
/**
* Simplified hook for added conversation state.
* Provides just the conversation state and a function to generate a new conversation,
* mirroring the pattern from useNewConvo.
*/
export default function useAddedResponse() {
const modelsQuery = useGetModelsQuery();
const assistantsListMap = useAssistantListMap();
const rootConvo = useRecoilValue(store.conversationByKeySelector(0));
const { data: endpointsConfig = {} as TEndpointsConfig } = useGetEndpointsQuery();
const { conversation, setConversation } = store.useCreateConversationAtom(ADDED_INDEX);
/**
* Generate a new conversation based on template and preset.
* Mirrors the logic from useNewConvo's switchToConversation.
*/
const generateConversation = useCallback(
({ template = {}, preset, modelsData }: NewConversationParams = {}) => {
let newConversation: TConversation = {
conversationId: rootConvo?.conversationId ?? 'new',
title: '',
endpoint: null,
...template,
createdAt: '',
updatedAt: '',
} as TConversation;
const modelsConfig = modelsData ?? modelsQuery.data;
const activePreset = preset ?? newConversation;
const defaultEndpoint = getDefaultEndpoint({
convoSetup: activePreset,
endpointsConfig,
});
const endpointType = getEndpointField(endpointsConfig, defaultEndpoint, 'type');
if (!newConversation.endpointType && endpointType) {
newConversation.endpointType = endpointType;
} else if (newConversation.endpointType && !endpointType) {
newConversation.endpointType = undefined;
}
const isAssistantEndpoint = isAssistantsEndpoint(defaultEndpoint);
const assistants: AssistantListItem[] = assistantsListMap[defaultEndpoint ?? ''] ?? [];
if (
newConversation.assistant_id &&
!assistantsListMap[defaultEndpoint ?? '']?.[newConversation.assistant_id]
) {
newConversation.assistant_id = undefined;
}
if (!newConversation.assistant_id && isAssistantEndpoint) {
newConversation.assistant_id =
localStorage.getItem(`${LocalStorageKeys.ASST_ID_PREFIX}0${defaultEndpoint}`) ??
assistants[0]?.id;
}
if (
newConversation.assistant_id != null &&
isAssistantEndpoint &&
newConversation.conversationId === 'new'
) {
const assistant = assistants.find((asst) => asst.id === newConversation.assistant_id);
newConversation.model = assistant?.model;
}
if (newConversation.assistant_id != null && !isAssistantEndpoint) {
newConversation.assistant_id = undefined;
}
const models = modelsConfig?.[defaultEndpoint ?? ''] ?? [];
newConversation = buildDefaultConvo({
conversation: newConversation,
lastConversationSetup: preset as TConversation,
endpoint: defaultEndpoint ?? ('' as EModelEndpoint),
models,
});
if (preset?.title != null && preset.title !== '') {
newConversation.title = preset.title;
}
setConversation(newConversation);
setTimeout(() => {
const textarea = document.getElementById(mainTextareaId);
if (textarea) {
textarea.focus();
}
}, 150);
return newConversation;
},
[
endpointsConfig,
setConversation,
modelsQuery.data,
assistantsListMap,
rootConvo?.conversationId,
],
);
return {
ask,
regenerate,
getMessages,
setMessages,
conversation,
isSubmitting,
setConversation,
setIsSubmitting,
generateConversation,
addedIndex: currentIndex,
};
}
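
A rough sketch of a consumer for this hook; the preset fields and the import via `~/hooks/Chat` are assumptions for illustration, and the preset is cast loosely because its full type is not shown here.

```tsx
import type { TConversation } from 'librechat-data-provider';
import { useAddedResponse } from '~/hooks/Chat';

function AddedConvoPicker() {
  const { conversation, generateConversation } = useAddedResponse();

  // Seed the added (index 1) conversation from a preset; field values are illustrative.
  const handlePick = () => {
    generateConversation({
      preset: {
        endpoint: 'openAI',
        model: 'gpt-4o',
        title: 'Second opinion',
      } as unknown as TConversation,
    });
  };

  return (
    <button type="button" onClick={handlePick}>
      {conversation?.title || 'Add parallel conversation'}
    </button>
  );
}
```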

View file

@ -1,6 +1,8 @@
import { v4 } from 'uuid';
import { cloneDeep } from 'lodash';
import { useNavigate } from 'react-router-dom';
import { useQueryClient } from '@tanstack/react-query';
import { useSetRecoilState, useResetRecoilState, useRecoilValue } from 'recoil';
import {
Constants,
QueryKeys,
@ -12,7 +14,6 @@ import {
replaceSpecialVars,
isAssistantsEndpoint,
} from 'librechat-data-provider';
import { useSetRecoilState, useResetRecoilState, useRecoilValue } from 'recoil';
import type {
TMessage,
TSubmission,
@ -25,11 +26,10 @@ import type { SetterOrUpdater } from 'recoil';
import type { TAskFunction, ExtendedFile } from '~/common';
import useSetFilesToDelete from '~/hooks/Files/useSetFilesToDelete';
import useGetSender from '~/hooks/Conversations/useGetSender';
import { logger, createDualMessageContent } from '~/utils';
import store, { useGetEphemeralAgent } from '~/store';
import useUserKey from '~/hooks/Input/useUserKey';
import { useNavigate } from 'react-router-dom';
import { useAuthContext } from '~/hooks';
import { logger } from '~/utils';
const logChatRequest = (request: Record<string, unknown>) => {
logger.log('=====================================\nAsk function called with:');
@ -69,6 +69,7 @@ export default function useChatFunctions({
const getEphemeralAgent = useGetEphemeralAgent();
const isTemporary = useRecoilValue(store.isTemporary);
const { getExpiry } = useUserKey(immutableConversation?.endpoint ?? '');
const setIsSubmitting = useSetRecoilState(store.isSubmittingFamily(index));
const setShowStopButton = useSetRecoilState(store.showStopButtonByIndex(index));
const resetLatestMultiMessage = useResetRecoilState(store.latestMessageFamily(index + 1));
@ -89,6 +90,7 @@ export default function useChatFunctions({
isEdited = false,
overrideMessages,
overrideFiles,
addedConvo,
} = {},
) => {
setShowStopButton(false);
@ -282,9 +284,18 @@ export default function useChatFunctions({
contentPart[ContentTypes.TEXT] = part[ContentTypes.TEXT];
}
}
} else if (addedConvo && conversation) {
// Pre-populate placeholders for smooth UI - these will be overridden/extended
// as SSE events arrive with actual content, preserving the agent-based agentId
initialResponse.content = createDualMessageContent(
conversation,
addedConvo,
endpointsConfig,
);
} else {
initialResponse.content = [];
}
setIsSubmitting(true);
setShowStopButton(true);
}
@ -312,6 +323,7 @@ export default function useChatFunctions({
isTemporary,
ephemeralAgent,
editedContent,
addedConvo,
};
if (isRegenerate) {
@ -327,12 +339,15 @@ export default function useChatFunctions({
logger.dir('message_stream', submission, { depth: null });
};
const regenerate = ({ parentMessageId }) => {
const regenerate = ({ parentMessageId }, options?: { addedConvo?: TConversation | null }) => {
const messages = getMessages();
const parentMessage = messages?.find((element) => element.messageId == parentMessageId);
if (parentMessage && parentMessage.isCreatedByUser) {
ask({ ...parentMessage }, { isRegenerate: true });
ask(
{ ...parentMessage },
{ isRegenerate: true, addedConvo: options?.addedConvo ?? undefined },
);
} else {
console.error(
'Failed to regenerate the message: parentMessage not found or not created by user.',

View file

@ -0,0 +1,15 @@
import { useRecoilCallback } from 'recoil';
import store from '~/store';
/**
* Hook that provides lazy access to addedConvo without subscribing to changes.
* Use this to avoid unnecessary re-renders when addedConvo changes.
*/
export default function useGetAddedConvo() {
return useRecoilCallback(
({ snapshot }) =>
() =>
snapshot.getLoadable(store.conversationByKeySelector(1)).getValue(),
[],
);
}
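
A small sketch of the intended read-at-call-time pattern, mirroring how `useMessageActions` and `useMessageHelpers` below pass the added conversation into `regenerate`; the wrapper hook here is hypothetical.

```tsx
import { useCallback } from 'react';
import type { TConversation, TMessage } from 'librechat-data-provider';
import { useGetAddedConvo } from '~/hooks/Chat';

// Hypothetical wrapper: the added convo is read lazily at call time,
// so this hook never re-renders when the added conversation changes.
function useRegenerateWithAddedConvo(
  regenerate: (message: TMessage, options?: { addedConvo?: TConversation | null }) => void,
) {
  const getAddedConvo = useGetAddedConvo();
  return useCallback(
    (message: TMessage) => regenerate(message, { addedConvo: getAddedConvo() }),
    [regenerate, getAddedConvo],
  );
}
```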

View file

@ -1,5 +1,10 @@
import React, { useMemo, useEffect, useRef } from 'react';
import { isAgentsEndpoint, isAssistantsEndpoint, LocalStorageKeys } from 'librechat-data-provider';
import {
isAgentsEndpoint,
LocalStorageKeys,
isEphemeralAgentId,
isAssistantsEndpoint,
} from 'librechat-data-provider';
import type * as t from 'librechat-data-provider';
import type { SelectedValues } from '~/common';
import useSetIndexOptions from '~/hooks/Conversations/useSetIndexOptions';
@ -39,7 +44,7 @@ export default function useSelectorEffects({
}
if (selectedAgentId == null && agents.length > 0) {
let agent_id = localStorage.getItem(`${LocalStorageKeys.AGENT_ID_PREFIX}${index}`);
if (agent_id == null) {
if (agent_id == null || isEphemeralAgentId(agent_id)) {
agent_id = agents[0]?.id;
}
const agent = agentsMap?.[agent_id];

View file

@ -6,3 +6,5 @@ export { default as useMessageProcess } from './useMessageProcess';
export { default as useMessageHelpers } from './useMessageHelpers';
export { default as useCopyToClipboard } from './useCopyToClipboard';
export { default as useMessageScrolling } from './useMessageScrolling';
export { default as useContentMetadata } from './useContentMetadata';
export type { ContentMetadataResult } from './useContentMetadata';

View file

@ -0,0 +1,30 @@
import { useMemo } from 'react';
import type { TMessage } from 'librechat-data-provider';
export type ContentMetadataResult = {
/** Whether the message has parallel content (content with groupId) */
hasParallelContent: boolean;
};
/**
* Hook to check if a message has parallel content.
* Returns whether any content part has a groupId.
*
* @param message - The message to check
* @returns ContentMetadataResult with hasParallelContent boolean
*/
export default function useContentMetadata(
message: TMessage | null | undefined,
): ContentMetadataResult {
return useMemo(() => {
const content = message?.content;
if (!content || !Array.isArray(content)) {
return { hasParallelContent: false };
}
// Check if any content part has a groupId (TMessageContentParts now includes ContentMetadata)
const hasParallelContent = content.some((part) => part?.groupId != null);
return { hasParallelContent };
}, [message?.content]);
}
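
A short sketch of how this flag feeds the width logic repeated in `Message`, `MessageRender`, and `ContentRender` above; the helper hook itself is illustrative.

```tsx
import type { TMessage } from 'librechat-data-provider';
import { useContentMetadata } from '~/hooks';

// Illustrative helper: parallel (multi-column) content gets a wider max width.
function useChatWidthClass(message: TMessage | undefined, maximizeChatSpace: boolean) {
  const { hasParallelContent } = useContentMetadata(message);
  if (maximizeChatSpace) {
    return 'w-full max-w-full md:px-5 lg:px-1 xl:px-5';
  }
  if (hasParallelContent) {
    return 'md:max-w-[58rem] xl:max-w-[70rem]';
  }
  return 'md:max-w-[47rem] xl:max-w-[55rem]';
}
```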

View file

@ -1,24 +1,20 @@
import { useRecoilValue } from 'recoil';
import { useCallback, useMemo, useState } from 'react';
import { useRecoilValue } from 'recoil';
import { useUpdateFeedbackMutation } from 'librechat-data-provider/react-query';
import {
isAssistantsEndpoint,
isAgentsEndpoint,
TUpdateFeedbackRequest,
getTagByKey,
TFeedback,
toMinimalFeedback,
getTagByKey,
isAgentsEndpoint,
SearchResultData,
toMinimalFeedback,
isAssistantsEndpoint,
TUpdateFeedbackRequest,
} from 'librechat-data-provider';
import type { TMessageProps } from '~/common';
import {
useChatContext,
useAddedChatContext,
useAssistantsMapContext,
useAgentsMapContext,
} from '~/Providers';
import { useChatContext, useAssistantsMapContext, useAgentsMapContext } from '~/Providers';
import useCopyToClipboard from './useCopyToClipboard';
import { useAuthContext } from '~/hooks/AuthContext';
import { useGetAddedConvo } from '~/hooks/Chat';
import { useLocalize } from '~/hooks';
import store from '~/store';
@ -26,7 +22,6 @@ export type TMessageActions = Pick<
TMessageProps,
'message' | 'currentEditId' | 'setCurrentEditId'
> & {
isMultiMessage?: boolean;
searchResults?: { [key: string]: SearchResultData };
};
@ -34,23 +29,12 @@ export default function useMessageActions(props: TMessageActions) {
const localize = useLocalize();
const { user } = useAuthContext();
const UsernameDisplay = useRecoilValue<boolean>(store.UsernameDisplay);
const { message, currentEditId, setCurrentEditId, isMultiMessage, searchResults } = props;
const { message, currentEditId, setCurrentEditId, searchResults } = props;
const {
ask,
index,
regenerate,
latestMessage,
handleContinue,
setLatestMessage,
conversation: rootConvo,
isSubmitting: isSubmittingRoot,
} = useChatContext();
const { conversation: addedConvo, isSubmitting: isSubmittingAdditional } = useAddedChatContext();
const conversation = useMemo(
() => (isMultiMessage === true ? addedConvo : rootConvo),
[isMultiMessage, addedConvo, rootConvo],
);
const { ask, index, regenerate, isSubmitting, conversation, latestMessage, handleContinue } =
useChatContext();
const getAddedConvo = useGetAddedConvo();
const agentsMap = useAgentsMapContext();
const assistantMap = useAssistantsMapContext();
@ -106,18 +90,13 @@ export default function useMessageActions(props: TMessageActions) {
}
}, [agentsMap, conversation?.agent_id, conversation?.endpoint, message?.model]);
const isSubmitting = useMemo(
() => (isMultiMessage === true ? isSubmittingAdditional : isSubmittingRoot),
[isMultiMessage, isSubmittingAdditional, isSubmittingRoot],
);
const regenerateMessage = useCallback(() => {
if ((isSubmitting && isCreatedByUser === true) || !message) {
return;
}
regenerate(message);
}, [isSubmitting, isCreatedByUser, message, regenerate]);
regenerate(message, { addedConvo: getAddedConvo() });
}, [isSubmitting, isCreatedByUser, message, regenerate, getAddedConvo]);
const copyToClipboard = useCopyToClipboard({ text, content, searchResults });
@ -170,17 +149,15 @@ export default function useMessageActions(props: TMessageActions) {
edit,
index,
agent,
feedback,
assistant,
enterEdit,
conversation,
messageLabel,
isSubmitting,
latestMessage,
handleFeedback,
handleContinue,
copyToClipboard,
setLatestMessage,
regenerateMessage,
handleFeedback,
feedback,
};
}

View file

@ -1,10 +1,11 @@
import throttle from 'lodash/throttle';
import { useEffect, useRef, useCallback, useMemo } from 'react';
import throttle from 'lodash/throttle';
import { Constants, isAssistantsEndpoint, isAgentsEndpoint } from 'librechat-data-provider';
import type { TMessageProps } from '~/common';
import { useMessagesViewContext, useAssistantsMapContext, useAgentsMapContext } from '~/Providers';
import { getTextKey, TEXT_KEY_DIVIDER, logger } from '~/utils';
import useCopyToClipboard from './useCopyToClipboard';
import { useGetAddedConvo } from '~/hooks/Chat';
export default function useMessageHelpers(props: TMessageProps) {
const latestText = useRef<string | number>('');
@ -24,6 +25,8 @@ export default function useMessageHelpers(props: TMessageProps) {
const agentsMap = useAgentsMapContext();
const assistantMap = useAssistantsMapContext();
const getAddedConvo = useGetAddedConvo();
const { text, content, children, messageId = null, isCreatedByUser } = message ?? {};
const edit = messageId === currentEditId;
const isLast = children?.length === 0 || children?.length === undefined;
@ -122,7 +125,7 @@ export default function useMessageHelpers(props: TMessageProps) {
return;
}
regenerate(message);
regenerate(message, { addedConvo: getAddedConvo() });
};
const copyToClipboard = useCopyToClipboard({ text, content });

View file

@ -1,26 +1,15 @@
import throttle from 'lodash/throttle';
import { useRecoilValue } from 'recoil';
import { Constants } from 'librechat-data-provider';
import { useEffect, useRef, useCallback, useMemo, useState } from 'react';
import { useEffect, useRef, useCallback, useMemo } from 'react';
import type { TMessage } from 'librechat-data-provider';
import { getTextKey, TEXT_KEY_DIVIDER, logger } from '~/utils';
import { useMessagesViewContext } from '~/Providers';
import store from '~/store';
export default function useMessageProcess({ message }: { message?: TMessage | null }) {
const latestText = useRef<string | number>('');
const [siblingMessage, setSiblingMessage] = useState<TMessage | null>(null);
const hasNoChildren = useMemo(() => (message?.children?.length ?? 0) === 0, [message]);
const {
index,
conversation,
latestMessage,
setAbortScroll,
setLatestMessage,
isSubmittingFamily,
} = useMessagesViewContext();
const latestMultiMessage = useRecoilValue(store.latestMessageFamily(index + 1));
const { conversation, setAbortScroll, setLatestMessage, isSubmitting } = useMessagesViewContext();
useEffect(() => {
const convoId = conversation?.conversationId;
@ -72,47 +61,22 @@ export default function useMessageProcess({ message }: { message?: TMessage | nu
throttle(() => {
logger.log(
'message_scrolling',
`useMessageProcess: setting abort scroll to ${isSubmittingFamily}, handleScroll event`,
`useMessageProcess: setting abort scroll to ${isSubmitting}, handleScroll event`,
event,
);
if (isSubmittingFamily) {
if (isSubmitting) {
setAbortScroll(true);
} else {
setAbortScroll(false);
}
}, 500)();
},
[isSubmittingFamily, setAbortScroll],
[isSubmitting, setAbortScroll],
);
const showSibling = useMemo(
() =>
(hasNoChildren && latestMultiMessage && (latestMultiMessage.children?.length ?? 0) === 0) ||
!!siblingMessage,
[hasNoChildren, latestMultiMessage, siblingMessage],
);
useEffect(() => {
if (
hasNoChildren &&
latestMultiMessage &&
latestMultiMessage.conversationId === message?.conversationId
) {
const newSibling = Object.assign({}, latestMultiMessage, {
parentMessageId: message.parentMessageId,
depth: message.depth,
});
setSiblingMessage(newSibling);
}
}, [hasNoChildren, latestMultiMessage, message, setSiblingMessage, latestMessage]);
return {
showSibling,
handleScroll,
isSubmitting,
conversation,
siblingMessage,
setSiblingMessage,
isSubmittingFamily,
latestMultiMessage,
};
}

View file

@ -1,26 +1,17 @@
import { v4 } from 'uuid';
import { useCallback } from 'react';
import { useRecoilValue, useSetRecoilState } from 'recoil';
import { Constants, replaceSpecialVars } from 'librechat-data-provider';
import { replaceSpecialVars } from 'librechat-data-provider';
import { useChatContext, useChatFormContext, useAddedChatContext } from '~/Providers';
import { useAuthContext } from '~/hooks/AuthContext';
import store from '~/store';
const appendIndex = (index: number, value?: string) => {
if (!value) {
return value;
}
return `${value}${Constants.COMMON_DIVIDER}${index}`;
};
export default function useSubmitMessage() {
const { user } = useAuthContext();
const methods = useChatFormContext();
const { conversation: addedConvo } = useAddedChatContext();
const { ask, index, getMessages, setMessages, latestMessage } = useChatContext();
const { addedIndex, ask: askAdditional, conversation: addedConvo } = useAddedChatContext();
const autoSendPrompts = useRecoilValue(store.autoSendPrompts);
const activeConvos = useRecoilValue(store.allConversationsSelector);
const setActivePrompt = useSetRecoilState(store.activePromptByIndex(index));
const submitMessage = useCallback(
@ -36,47 +27,17 @@ export default function useSubmitMessage() {
setMessages([...(rootMessages || []), latestMessage]);
}
const hasAdded = addedIndex && activeConvos[addedIndex] && addedConvo;
const isNewMultiConvo =
hasAdded &&
activeConvos.every((convoId) => convoId === Constants.NEW_CONVO) &&
!rootMessages?.length;
const overrideConvoId = isNewMultiConvo ? v4() : undefined;
const overrideUserMessageId = hasAdded ? v4() : undefined;
const rootIndex = addedIndex - 1;
const clientTimestamp = new Date().toISOString();
ask({
text: data.text,
overrideConvoId: appendIndex(rootIndex, overrideConvoId),
overrideUserMessageId: appendIndex(rootIndex, overrideUserMessageId),
clientTimestamp,
});
if (hasAdded) {
askAdditional(
{
text: data.text,
overrideConvoId: appendIndex(addedIndex, overrideConvoId),
overrideUserMessageId: appendIndex(addedIndex, overrideUserMessageId),
clientTimestamp,
},
{ overrideMessages: rootMessages },
);
}
ask(
{
text: data.text,
},
{
addedConvo: addedConvo ?? undefined,
},
);
methods.reset();
},
[
ask,
methods,
addedIndex,
addedConvo,
setMessages,
getMessages,
activeConvos,
askAdditional,
latestMessage,
],
[ask, methods, addedConvo, setMessages, getMessages, latestMessage],
);
const submitPrompt = useCallback(

View file

@ -304,6 +304,7 @@ export default function useResumableSSE(
}
}
setIsSubmitting(true);
setShowStopButton(true);
return;
}

View file

@ -10,6 +10,7 @@ import type {
Agents,
TMessage,
PartMetadata,
ContentMetadata,
EventSubmission,
TMessageContentParts,
} from 'librechat-data-provider';
@ -61,31 +62,41 @@ export default function useStepHandler({
const messageMap = useRef(new Map<string, TMessage>());
const stepMap = useRef(new Map<string, Agents.RunStep>());
const calculateContentIndex = (
baseIndex: number,
initialContent: TMessageContentParts[],
incomingContentType: string,
existingContent?: TMessageContentParts[],
): number => {
/** Only apply -1 adjustment for TEXT or THINK types when they match existing content */
if (
initialContent.length > 0 &&
(incomingContentType === ContentTypes.TEXT || incomingContentType === ContentTypes.THINK)
) {
const targetIndex = baseIndex + initialContent.length - 1;
const existingType = existingContent?.[targetIndex]?.type;
if (existingType === incomingContentType) {
return targetIndex;
/**
* Calculate content index for a run step.
* For edited content scenarios, offset by initialContent length.
*/
const calculateContentIndex = useCallback(
(
serverIndex: number,
initialContent: TMessageContentParts[],
incomingContentType: string,
existingContent?: TMessageContentParts[],
): number => {
/** Only apply -1 adjustment for TEXT or THINK types when they match existing content */
if (
initialContent.length > 0 &&
(incomingContentType === ContentTypes.TEXT || incomingContentType === ContentTypes.THINK)
) {
const targetIndex = serverIndex + initialContent.length - 1;
const existingType = existingContent?.[targetIndex]?.type;
if (existingType === incomingContentType) {
return targetIndex;
}
}
}
return baseIndex + initialContent.length;
};
return serverIndex + initialContent.length;
},
[],
);
/** Metadata to propagate onto content parts for parallel rendering - uses ContentMetadata from data-provider */
const updateContent = (
message: TMessage,
index: number,
contentPart: Agents.MessageContentComplex,
finalUpdate = false,
metadata?: ContentMetadata,
) => {
const contentType = contentPart.type ?? '';
if (!contentType) {
@ -99,6 +110,7 @@ export default function useStepHandler({
if (!updatedContent[index]) {
updatedContent[index] = { type: contentPart.type as AllContentTypes };
}
/** Prevent overwriting an existing content part with a different type */
const existingType = (updatedContent[index]?.type as string | undefined) ?? '';
if (
@ -196,9 +208,36 @@ export default function useStepHandler({
};
}
// Apply metadata to the content part for parallel rendering
// This must happen AFTER all content updates to avoid being overwritten
if (metadata?.agentId != null || metadata?.groupId != null) {
const part = updatedContent[index] as TMessageContentParts & ContentMetadata;
if (metadata.agentId != null) {
part.agentId = metadata.agentId;
}
if (metadata.groupId != null) {
part.groupId = metadata.groupId;
}
}
return { ...message, content: updatedContent as TMessageContentParts[] };
};
/** Extract metadata from runStep for parallel content rendering */
const getStepMetadata = (runStep: Agents.RunStep | undefined): ContentMetadata | undefined => {
if (!runStep?.agentId && runStep?.groupId == null) {
return undefined;
}
const metadata = {
agentId: runStep.agentId,
// Only set groupId when explicitly provided by the server
// Sequential handoffs have agentId but no groupId
// Parallel execution has both agentId AND groupId
groupId: runStep.groupId,
};
return metadata;
};
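For illustration only (not part of this change), the distinction the comments above describe, with assumed values: sequential handoffs tag parts with an agentId but no groupId, parallel execution tags both, and column rendering keys off groupId.

```ts
import type { ContentMetadata } from 'librechat-data-provider';

// Illustrative metadata shapes (values are assumptions)
const handoffMeta: ContentMetadata = { agentId: 'agent_abc123' };
const parallelMeta: ContentMetadata = { agentId: 'agent_abc123____1', groupId: 1 };

// Parallel rendering only applies when groupId is present
const isParallel = (meta?: ContentMetadata) => meta?.groupId != null;

console.log(isParallel(handoffMeta)); // false
console.log(isParallel(parallelMeta)); // true
```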
const stepHandler = useCallback(
({ event, data }: TStepEvent, submission: EventSubmission) => {
const messages = getMessages() || [];
@ -212,6 +251,7 @@ export default function useStepHandler({
}
let initialContent: TMessageContentParts[] = [];
// For editedContent scenarios, use the initial response content for index offsetting
if (submission?.editedContent != null) {
initialContent = submission?.initialResponse?.content ?? initialContent;
}
@ -229,6 +269,10 @@ export default function useStepHandler({
}
stepMap.current.set(runStep.id, runStep);
// Calculate content index - use server index, offset by initialContent for edit scenarios
const contentIndex = runStep.index + initialContent.length;
let response = messageMap.current.get(responseMessageId);
if (!response) {
@ -242,7 +286,8 @@ export default function useStepHandler({
// For edit scenarios, initialContent IS the complete starting content (not to be merged)
// For resume scenarios (no editedContent), initialContent is empty and we use existingContent
const existingContent = responseMessage?.content ?? [];
const mergedContent = initialContent.length > 0 ? initialContent : existingContent;
const mergedContent: TMessageContentParts[] =
initialContent.length > 0 ? initialContent : existingContent;
response = {
...responseMessage,
@ -288,9 +333,14 @@ export default function useStepHandler({
},
};
/** Tool calls don't need index adjustment */
const currentIndex = runStep.index + initialContent.length;
updatedResponse = updateContent(updatedResponse, currentIndex, contentPart);
// Use the pre-calculated contentIndex which handles parallel agent indexing
updatedResponse = updateContent(
updatedResponse,
contentIndex,
contentPart,
false,
getStepMetadata(runStep),
);
});
messageMap.current.set(responseMessageId, updatedResponse);
@ -316,7 +366,17 @@ export default function useStepHandler({
if (response) {
// Agent updates don't need index adjustment
const currentIndex = agent_update.index + initialContent.length;
const updatedResponse = updateContent(response, currentIndex, data);
// Agent updates carry their own agentId - use default groupId if agentId is present
const agentUpdateMeta: ContentMetadata | undefined = agent_update.agentId
? { agentId: agent_update.agentId, groupId: 1 }
: undefined;
const updatedResponse = updateContent(
response,
currentIndex,
data,
false,
agentUpdateMeta,
);
messageMap.current.set(responseMessageId, updatedResponse);
const currentMessages = getMessages() || [];
setMessages([...currentMessages.slice(0, -1), updatedResponse]);
@ -351,8 +411,13 @@ export default function useStepHandler({
contentPart.type || '',
response.content,
);
const updatedResponse = updateContent(response, currentIndex, contentPart);
const updatedResponse = updateContent(
response,
currentIndex,
contentPart,
false,
getStepMetadata(runStep),
);
messageMap.current.set(responseMessageId, updatedResponse);
const currentMessages = getMessages() || [];
setMessages([...currentMessages.slice(0, -1), updatedResponse]);
@ -387,8 +452,13 @@ export default function useStepHandler({
contentPart.type || '',
response.content,
);
const updatedResponse = updateContent(response, currentIndex, contentPart);
const updatedResponse = updateContent(
response,
currentIndex,
contentPart,
false,
getStepMetadata(runStep),
);
messageMap.current.set(responseMessageId, updatedResponse);
const currentMessages = getMessages() || [];
setMessages([...currentMessages.slice(0, -1), updatedResponse]);
@ -432,9 +502,15 @@ export default function useStepHandler({
contentPart.tool_call.expires_at = runStepDelta.delta.expires_at;
}
/** Tool calls don't need index adjustment */
// Use server's index, offset by initialContent for edit scenarios
const currentIndex = runStep.index + initialContent.length;
updatedResponse = updateContent(updatedResponse, currentIndex, contentPart);
updatedResponse = updateContent(
updatedResponse,
currentIndex,
contentPart,
false,
getStepMetadata(runStep),
);
});
messageMap.current.set(responseMessageId, updatedResponse);
@ -470,9 +546,15 @@ export default function useStepHandler({
tool_call: result.tool_call,
};
/** Tool calls don't need index adjustment */
// Use server's index, offset by initialContent for edit scenarios
const currentIndex = runStep.index + initialContent.length;
updatedResponse = updateContent(updatedResponse, currentIndex, contentPart, true);
updatedResponse = updateContent(
updatedResponse,
currentIndex,
contentPart,
true,
getStepMetadata(runStep),
);
messageMap.current.set(responseMessageId, updatedResponse);
const updatedMessages = messages.map((msg) =>
@ -489,7 +571,7 @@ export default function useStepHandler({
stepMap.current.clear();
};
},
[getMessages, lastAnnouncementTimeRef, announcePolite, setMessages],
[getMessages, lastAnnouncementTimeRef, announcePolite, setMessages, calculateContentIndex],
);
const clearStepMaps = useCallback(() => {

View file

@ -5,14 +5,15 @@ import { useRecoilState, useRecoilValue, useSetRecoilState, useRecoilCallback }
import {
Constants,
FileSources,
Permissions,
EModelEndpoint,
isParamEndpoint,
getEndpointField,
LocalStorageKeys,
isAssistantsEndpoint,
isAgentsEndpoint,
PermissionTypes,
Permissions,
getEndpointField,
isAgentsEndpoint,
LocalStorageKeys,
isEphemeralAgentId,
isAssistantsEndpoint,
} from 'librechat-data-provider';
import type {
TPreset,
@ -120,8 +121,8 @@ const useNewConvo = (index = 0) => {
isAgentsEndpoint(lastConversationSetup?.endpoint) && lastConversationSetup?.agent_id;
const isExistingAgentConvo =
isAgentsEndpoint(defaultEndpoint) &&
((conversation.agent_id && conversation.agent_id !== Constants.EPHEMERAL_AGENT_ID) ||
(storedAgentId && storedAgentId !== Constants.EPHEMERAL_AGENT_ID));
((conversation.agent_id && !isEphemeralAgentId(conversation.agent_id)) ||
(storedAgentId && !isEphemeralAgentId(storedAgentId)));
if (
defaultEndpoint &&
isAgentsEndpoint(defaultEndpoint) &&

View file

@ -987,6 +987,9 @@
"com_ui_fork_split_target_setting": "Start fork from target message by default",
"com_ui_fork_success": "Successfully forked conversation",
"com_ui_fork_visible": "Visible messages only",
"com_ui_branch_message": "Create branch from this response",
"com_ui_branch_created": "Branch created successfully",
"com_ui_branch_error": "Failed to create branch",
"com_ui_generate_qrcode": "Generate QR Code",
"com_ui_generating": "Generating...",
"com_ui_generation_settings": "Generation Settings",

View file

@ -11,7 +11,7 @@ import {
useSetRecoilState,
useRecoilCallback,
} from 'recoil';
import { LocalStorageKeys, Constants } from 'librechat-data-provider';
import { LocalStorageKeys, isEphemeralAgentId, Constants } from 'librechat-data-provider';
import type { TMessage, TPreset, TConversation, TSubmission } from 'librechat-data-provider';
import type { TOptionSettings, ExtendedFile } from '~/common';
import {
@ -88,7 +88,7 @@ const conversationByIndex = atomFamily<TConversation | null, string | number>({
newValue.assistant_id,
);
}
if (newValue?.agent_id != null && newValue.agent_id) {
if (newValue?.agent_id != null && !isEphemeralAgentId(newValue.agent_id)) {
localStorage.setItem(`${LocalStorageKeys.AGENT_ID_PREFIX}${index}`, newValue.agent_id);
}
if (newValue?.spec != null && newValue.spec) {

View file

@ -1,9 +1,9 @@
import {
Constants,
parseConvo,
EModelEndpoint,
isAssistantsEndpoint,
isAgentsEndpoint,
isEphemeralAgentId,
isAssistantsEndpoint,
} from 'librechat-data-provider';
import type { TConversation, EndpointSchemaKey } from 'librechat-data-provider';
import { clearModelForNonEphemeralAgent } from './endpoints';
@ -71,7 +71,7 @@ const buildDefaultConvo = ({
if (
isAgentsEndpoint(endpoint) &&
agentId &&
(!defaultAgentId || defaultAgentId === Constants.EPHEMERAL_AGENT_ID)
(!defaultAgentId || isEphemeralAgentId(defaultAgentId))
) {
defaultConvo.agent_id = agentId;
}

View file

@ -6,6 +6,7 @@ import {
LocalStorageKeys,
getEndpointField,
isAgentsEndpoint,
isEphemeralAgentId,
isAssistantsEndpoint,
} from 'librechat-data-provider';
import type * as t from 'librechat-data-provider';
@ -26,7 +27,7 @@ export function clearModelForNonEphemeralAgent<
if (
isAgentsEndpoint(template.endpoint) &&
template.agent_id &&
template.agent_id !== Constants.EPHEMERAL_AGENT_ID
!isEphemeralAgentId(template.agent_id)
) {
template.model = undefined as T['model'];
}
@ -150,7 +151,7 @@ export function getConvoSwitchLogic(params: ConversationInitParams): InitiatedTe
if (
!isAgentsEndpoint(newEndpoint) &&
template.agent_id &&
template.agent_id !== Constants.EPHEMERAL_AGENT_ID
!isEphemeralAgentId(template.agent_id)
) {
template.agent_id = Constants.EPHEMERAL_AGENT_ID;
}

View file

@ -1,5 +1,18 @@
import { ContentTypes, QueryKeys, Constants } from 'librechat-data-provider';
import type { TMessage, TMessageContentParts } from 'librechat-data-provider';
import {
QueryKeys,
Constants,
ContentTypes,
getResponseSender,
isEphemeralAgentId,
appendAgentIdSuffix,
encodeEphemeralAgentId,
} from 'librechat-data-provider';
import type {
TMessage,
TConversation,
TEndpointsConfig,
TMessageContentParts,
} from 'librechat-data-provider';
import type { QueryClient } from '@tanstack/react-query';
import type { LocalizeFunction } from '~/common';
import _ from 'lodash';
@ -178,3 +191,83 @@ export const getMessageAriaLabel = (message: TMessage, localize: LocalizeFunctio
? localize('com_endpoint_message_new', { 0: message.depth + 1 })
: localize('com_endpoint_message');
};
/**
* Creates initial content parts for dual message display with agent-based grouping.
* Sets up primary and added agent content parts with agentId for column rendering.
*
* @param primaryConvo - The primary conversation configuration
* @param addedConvo - The added conversation configuration
* @param endpointsConfig - Endpoints configuration for getting model display labels
* @returns Array of content parts with agentId for side-by-side rendering
*/
export const createDualMessageContent = (
primaryConvo: TConversation,
addedConvo: TConversation,
endpointsConfig?: TEndpointsConfig,
): TMessageContentParts[] => {
// For real agents (agent_id starts with "agent_"), use agent_id directly
// Otherwise create ephemeral ID from endpoint/model
let primaryAgentId: string;
if (primaryConvo.agent_id && !isEphemeralAgentId(primaryConvo.agent_id)) {
primaryAgentId = primaryConvo.agent_id;
} else {
const primaryEndpoint = primaryConvo.endpoint;
const primaryModel = primaryConvo.model ?? '';
const primarySender = getResponseSender({
modelDisplayLabel: primaryEndpoint
? endpointsConfig?.[primaryEndpoint]?.modelDisplayLabel
: undefined,
});
primaryAgentId = encodeEphemeralAgentId({
endpoint: primaryEndpoint ?? '',
model: primaryModel,
sender: primarySender,
});
}
// Both agents run in parallel, so they share the same groupId
const parallelGroupId = 1;
// Use empty type - these are just placeholders to establish agentId/groupId
// The actual type will be set when real content arrives from the server
const primaryContent = {
type: '' as const,
agentId: primaryAgentId,
groupId: parallelGroupId,
};
// For added agent, use agent_id if it's a real agent (starts with "agent_")
// Otherwise create ephemeral ID with index suffix
// Always append index suffix for added agent to distinguish from primary (even if same agent_id)
let addedAgentId: string;
if (addedConvo.agent_id && !isEphemeralAgentId(addedConvo.agent_id)) {
// Append suffix to distinguish from primary agent (matches ephemeral format)
addedAgentId = appendAgentIdSuffix(addedConvo.agent_id, 1);
} else {
const addedEndpoint = addedConvo.endpoint;
const addedModel = addedConvo.model ?? '';
const addedSender = addedEndpoint
? getResponseSender({
modelDisplayLabel: endpointsConfig?.[addedEndpoint]?.modelDisplayLabel,
})
: '';
addedAgentId = encodeEphemeralAgentId({
endpoint: addedEndpoint ?? '',
model: addedModel,
sender: addedSender,
index: 1,
});
}
// Use empty type - placeholder to establish agentId/groupId
const addedContent = {
type: '' as const,
agentId: addedAgentId,
groupId: parallelGroupId,
};
// Cast through unknown since these are placeholder objects with empty type
// that will be replaced by real content with proper types from the server
return [primaryContent, addedContent] as unknown as TMessageContentParts[];
};
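A minimal call-site sketch (the conversation values and import path are assumptions): both placeholders share groupId 1 so they render as side-by-side columns before any streamed content arrives.

```ts
import type { TConversation } from 'librechat-data-provider';
// Import path for the client util is an assumption
import { createDualMessageContent } from '~/utils';

// Illustrative conversations; only endpoint/model matter for ephemeral IDs
const primaryConvo = { endpoint: 'openAI', model: 'gpt-4o' } as TConversation;
const addedConvo = { endpoint: 'anthropic', model: 'claude-3-5-sonnet-latest' } as TConversation;

const placeholders = createDualMessageContent(primaryConvo, addedConvo);
// => two parts with type '', each carrying an encoded agentId and groupId: 1;
//    the added part's agentId ends with the ____1 index suffix
```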

package-lock.json generated
View file

@ -56,8 +56,8 @@
"@azure/storage-blob": "^12.27.0",
"@googleapis/youtube": "^20.0.0",
"@keyv/redis": "^4.3.3",
"@langchain/core": "^0.3.79",
"@librechat/agents": "^3.0.52",
"@langchain/core": "^0.3.80",
"@librechat/agents": "^3.0.61",
"@librechat/api": "*",
"@librechat/data-schemas": "*",
"@microsoft/microsoft-graph-client": "^3.0.7",
@ -18214,9 +18214,9 @@
}
},
"node_modules/@langchain/core": {
"version": "0.3.79",
"resolved": "https://registry.npmjs.org/@langchain/core/-/core-0.3.79.tgz",
"integrity": "sha512-ZLAs5YMM5N2UXN3kExMglltJrKKoW7hs3KMZFlXUnD7a5DFKBYxPFMeXA4rT+uvTxuJRZPCYX0JKI5BhyAWx4A==",
"version": "0.3.80",
"resolved": "https://registry.npmjs.org/@langchain/core/-/core-0.3.80.tgz",
"integrity": "sha512-vcJDV2vk1AlCwSh3aBm/urQ1ZrlXFFBocv11bz/NBUfLWD5/UDNMzwPdaAd2dKvNmTWa9FM2lirLU3+JCf4cRA==",
"license": "MIT",
"dependencies": {
"@cfworker/json-schema": "^4.0.2",
@ -18807,14 +18807,14 @@
}
},
"node_modules/@librechat/agents": {
"version": "3.0.52",
"resolved": "https://registry.npmjs.org/@librechat/agents/-/agents-3.0.52.tgz",
"integrity": "sha512-6wQCTbEAFmcWtQYBsct9l6PF4wZi1ydHw2xETO6lGPQakY1gNT6DyTtNqisKokk5VmI5nJYq2pTpSg8rIk6xgQ==",
"version": "3.0.61",
"resolved": "https://registry.npmjs.org/@librechat/agents/-/agents-3.0.61.tgz",
"integrity": "sha512-fmVC17G/RuLd38XG6/2olS49qd96SPcav9Idcb30Bv7gUZx/kOCqPay4GeMnwXDWXnDxTktNRCP5Amb0pEYuOw==",
"license": "MIT",
"dependencies": {
"@langchain/anthropic": "^0.3.26",
"@langchain/aws": "^0.1.15",
"@langchain/core": "^0.3.79",
"@langchain/core": "^0.3.80",
"@langchain/deepseek": "^0.0.2",
"@langchain/google-genai": "^0.2.18",
"@langchain/google-vertexai": "^0.2.18",
@ -48961,8 +48961,8 @@
"@azure/search-documents": "^12.0.0",
"@azure/storage-blob": "^12.27.0",
"@keyv/redis": "^4.3.3",
"@langchain/core": "^0.3.79",
"@librechat/agents": "^3.0.52",
"@langchain/core": "^0.3.80",
"@librechat/agents": "^3.0.61",
"@librechat/data-schemas": "*",
"@modelcontextprotocol/sdk": "^1.24.3",
"axios": "^1.12.1",
@ -51344,7 +51344,6 @@
"@babel/preset-env": "^7.21.5",
"@babel/preset-react": "^7.18.6",
"@babel/preset-typescript": "^7.21.0",
"@langchain/core": "^0.3.62",
"@rollup/plugin-alias": "^5.1.0",
"@rollup/plugin-commonjs": "^29.0.0",
"@rollup/plugin-json": "^6.1.0",

View file

@ -84,8 +84,8 @@
"@azure/search-documents": "^12.0.0",
"@azure/storage-blob": "^12.27.0",
"@keyv/redis": "^4.3.3",
"@langchain/core": "^0.3.79",
"@librechat/agents": "^3.0.52",
"@langchain/core": "^0.3.80",
"@librechat/agents": "^3.0.61",
"@librechat/data-schemas": "*",
"@modelcontextprotocol/sdk": "^1.24.3",
"axios": "^1.12.1",

View file

@ -238,8 +238,8 @@ class GenerationJobManagerClass {
if (currentRuntime.allSubscribersLeftHandlers) {
this.jobStore
.getContentParts(streamId)
.then((content) => {
const parts = content ?? [];
.then((result) => {
const parts = result?.content ?? [];
for (const handler of currentRuntime.allSubscribersLeftHandlers ?? []) {
try {
handler(parts);
@ -426,7 +426,8 @@ class GenerationJobManagerClass {
}
// Get content before clearing state
const content = (await this.jobStore.getContentParts(streamId)) ?? [];
const result = await this.jobStore.getContentParts(streamId);
const content = result?.content ?? [];
// Detect "early abort" - aborted before any generation happened (e.g., during tool loading)
// In this case, no messages were saved to DB, so frontend shouldn't navigate to conversation
@ -765,7 +766,8 @@ class GenerationJobManagerClass {
return null;
}
const aggregatedContent = (await this.jobStore.getContentParts(streamId)) ?? [];
const result = await this.jobStore.getContentParts(streamId);
const aggregatedContent = result?.content ?? [];
const runSteps = await this.jobStore.getRunSteps(streamId);
logger.debug(`[GenerationJobManager] getResumeState:`, {
@ -872,7 +874,8 @@ class GenerationJobManagerClass {
return null;
}
const aggregatedContent = (await this.jobStore.getContentParts(streamId)) ?? [];
const result = await this.jobStore.getContentParts(streamId);
const aggregatedContent = result?.content ?? [];
return {
active: jobData.status === 'running',

View file

@ -233,11 +233,11 @@ describe('RedisJobStore Integration Tests', () => {
}
// Instance 2 reconstructs content (simulating reconnect to different instance)
const content = await instance2.getContentParts(streamId);
const result = await instance2.getContentParts(streamId);
// Should have reconstructed content
expect(content).not.toBeNull();
expect(content!.length).toBeGreaterThan(0);
expect(result).not.toBeNull();
expect(result!.content.length).toBeGreaterThan(0);
await instance1.destroy();
await instance2.destroy();
@ -325,11 +325,11 @@ describe('RedisJobStore Integration Tests', () => {
await store.appendChunk(streamId, chunk);
}
const content = await store.getContentParts(streamId);
const result = await store.getContentParts(streamId);
expect(content).not.toBeNull();
expect(result).not.toBeNull();
// Content aggregator combines text deltas
const textPart = content!.find((p) => p.type === 'text');
const textPart = result!.content.find((p) => p.type === 'text');
expect(textPart).toBeDefined();
await store.destroy();
@ -388,12 +388,12 @@ describe('RedisJobStore Integration Tests', () => {
await store.appendChunk(streamId, chunk);
}
const content = await store.getContentParts(streamId);
const result = await store.getContentParts(streamId);
expect(content).not.toBeNull();
expect(result).not.toBeNull();
// Should have both think and text parts
const thinkPart = content!.find((p) => p.type === 'think');
const textPart = content!.find((p) => p.type === 'text');
const thinkPart = result!.content.find((p) => p.type === 'think');
const textPart = result!.content.find((p) => p.type === 'text');
expect(thinkPart).toBeDefined();
expect(textPart).toBeDefined();
@ -905,8 +905,8 @@ describe('RedisJobStore Integration Tests', () => {
store.setGraph(streamId, mockGraph as unknown as StandardGraph);
// Get content - should come from local cache, not Redis
const content = await store.getContentParts(streamId);
expect(content).toEqual(mockContentParts);
const result = await store.getContentParts(streamId);
expect(result!.content).toEqual(mockContentParts);
// Get run steps - should come from local cache
const runSteps = await store.getRunSteps(streamId);
@ -959,9 +959,9 @@ describe('RedisJobStore Integration Tests', () => {
await instance2.initialize();
// Get content - should reconstruct from Redis chunks
const content = await instance2.getContentParts(streamId);
expect(content).not.toBeNull();
expect(content!.length).toBeGreaterThan(0);
const result = await instance2.getContentParts(streamId);
expect(result).not.toBeNull();
expect(result!.content.length).toBeGreaterThan(0);
// Get run steps - should fetch from Redis
const runSteps = await instance2.getRunSteps(streamId);

View file

@ -260,8 +260,16 @@ export class InMemoryJobStore implements IJobStore {
* Get content parts for a job.
* Returns live content from stored reference.
*/
async getContentParts(streamId: string): Promise<Agents.MessageContentComplex[] | null> {
return this.contentState.get(streamId)?.contentParts ?? null;
async getContentParts(streamId: string): Promise<{
content: Agents.MessageContentComplex[];
} | null> {
const state = this.contentState.get(streamId);
if (!state?.contentParts) {
return null;
}
return {
content: state.contentParts,
};
}
/**

View file

@ -225,7 +225,7 @@ export class RedisJobStore implements IJobStore {
}
async deleteJob(streamId: string): Promise<void> {
// Clear local cache
// Clear local caches
this.localGraphCache.delete(streamId);
// Note: userJobs cleanup is handled lazily via self-healing in getActiveJobIdsByUser
@ -380,7 +380,7 @@ export class RedisJobStore implements IJobStore {
clearInterval(this.cleanupInterval);
this.cleanupInterval = null;
}
// Clear local cache
// Clear local caches
this.localGraphCache.clear();
// Don't close the Redis connection - it's shared
logger.info('[RedisJobStore] Destroyed');
@ -403,10 +403,12 @@ export class RedisJobStore implements IJobStore {
}
/**
* No-op for Redis - content is built from chunks.
* No-op for Redis - content parts are reconstructed from chunks.
* Metadata (agentId, groupId) is embedded directly on content parts by the agent runtime.
*/
setContentParts(): void {
// No-op: Redis uses chunks for content reconstruction
setContentParts(_streamId: string, _contentParts: Agents.MessageContentComplex[]): void {
// Content parts are reconstructed from chunks during getContentParts
// No separate storage needed
}
/**
@ -417,9 +419,11 @@ export class RedisJobStore implements IJobStore {
* For cross-instance reconnects, we reconstruct from Redis Streams.
*
* @param streamId - The stream identifier
* @returns Content parts array, or null if not found
* @returns Object containing the content parts array, or null if not found
*/
async getContentParts(streamId: string): Promise<Agents.MessageContentComplex[] | null> {
async getContentParts(streamId: string): Promise<{
content: Agents.MessageContentComplex[];
} | null> {
// 1. Try local graph cache first (fast path for same-instance reconnect)
const graphRef = this.localGraphCache.get(streamId);
if (graphRef) {
@ -427,7 +431,9 @@ export class RedisJobStore implements IJobStore {
if (graph) {
const localParts = graph.getContentParts();
if (localParts && localParts.length > 0) {
return localParts;
return {
content: localParts,
};
}
} else {
// WeakRef was collected, remove from cache
@ -472,7 +478,10 @@ export class RedisJobStore implements IJobStore {
filtered.push(part);
}
}
return filtered;
return {
content: filtered,
};
}
/**
@ -517,7 +526,7 @@ export class RedisJobStore implements IJobStore {
* Removes both local cache and Redis data.
*/
clearContentState(streamId: string): void {
// Clear local cache immediately
// Clear local caches immediately
this.localGraphCache.delete(streamId);
// Fire and forget - async cleanup for Redis

View file

@ -167,7 +167,9 @@ export interface IJobStore {
* @param streamId - The stream identifier
* @returns Content parts or null if not available
*/
getContentParts(streamId: string): Promise<Agents.MessageContentComplex[] | null>;
getContentParts(streamId: string): Promise<{
content: Agents.MessageContentComplex[];
} | null>;
/**
* Get run steps for a job (for resume state).

View file

@ -48,7 +48,6 @@
"@babel/preset-env": "^7.21.5",
"@babel/preset-react": "^7.18.6",
"@babel/preset-typescript": "^7.21.0",
"@langchain/core": "^0.3.62",
"@rollup/plugin-alias": "^5.1.0",
"@rollup/plugin-commonjs": "^29.0.0",
"@rollup/plugin-json": "^6.1.0",

View file

@ -66,6 +66,8 @@ export const messages = (params: q.MessagesListParams) => {
export const messagesArtifacts = (messageId: string) => `${messagesRoot}/artifact/${messageId}`;
export const messagesBranch = () => `${messagesRoot}/branch`;
const shareRoot = `${BASE_URL}/api/share`;
export const shareMessages = (shareId: string) => `${shareRoot}/${shareId}`;
export const getSharedLink = (conversationId: string) => `${shareRoot}/link/${conversationId}`;

View file

@ -5,6 +5,7 @@ import * as s from './schemas';
export default function createPayload(submission: t.TSubmission) {
const {
isEdited,
addedConvo,
userMessage,
isContinued,
isTemporary,
@ -32,6 +33,7 @@ export default function createPayload(submission: t.TSubmission) {
...userMessage,
...endpointOption,
endpoint,
addedConvo,
isTemporary,
isRegenerate,
editedContent,

View file

@ -756,6 +756,12 @@ export const editArtifact = async ({
return request.post(endpoints.messagesArtifacts(messageId), params);
};
export const branchMessage = async (
payload: m.TBranchMessageRequest,
): Promise<m.TBranchMessageResponse> => {
return request.post(endpoints.messagesBranch(), payload);
};
export function getMessagesByConvoId(conversationId: string): Promise<s.TMessage[]> {
if (
conversationId === config.Constants.NEW_CONVO ||

View file

@ -197,7 +197,7 @@ const extractOmniVersion = (modelStr: string): string => {
return '';
};
export const getResponseSender = (endpointOption: t.TEndpointOption): string => {
export const getResponseSender = (endpointOption: Partial<t.TEndpointOption>): string => {
const {
model: _m,
endpoint: _e,
@ -216,10 +216,11 @@ export const getResponseSender = (endpointOption: t.TEndpointOption): string =>
if (
[EModelEndpoint.openAI, EModelEndpoint.bedrock, EModelEndpoint.azureOpenAI].includes(endpoint)
) {
if (chatGptLabel) {
return chatGptLabel;
} else if (modelLabel) {
if (modelLabel) {
return modelLabel;
} else if (chatGptLabel) {
// @deprecated - prefer modelLabel
return chatGptLabel;
} else if (model && extractOmniVersion(model)) {
return extractOmniVersion(model);
} else if (model && (model.includes('mistral') || model.includes('codestral'))) {
@ -255,6 +256,7 @@ export const getResponseSender = (endpointOption: t.TEndpointOption): string =>
if (modelLabel) {
return modelLabel;
} else if (chatGptLabel) {
// @deprecated - prefer modelLabel
return chatGptLabel;
} else if (model && extractOmniVersion(model)) {
return extractOmniVersion(model);
@ -414,3 +416,138 @@ export function replaceSpecialVars({ text, user }: { text: string; user?: t.TUse
return result;
}
/**
* Parsed ephemeral agent ID result
*/
export type ParsedEphemeralAgentId = {
endpoint: string;
model: string;
sender?: string;
index?: number;
};
/**
* Encodes an ephemeral agent ID from endpoint, model, optional sender, and optional index.
* Uses __ to replace : (reserved in graph node names) and ___ to separate sender.
*
* Format: endpoint__model___sender or endpoint__model___sender____index (if index provided)
*
* @example
* encodeEphemeralAgentId({ endpoint: 'openAI', model: 'gpt-4o', sender: 'GPT-4o' })
* // => 'openAI__gpt-4o___GPT-4o'
*
* @example
* encodeEphemeralAgentId({ endpoint: 'openAI', model: 'gpt-4o', sender: 'GPT-4o', index: 1 })
* // => 'openAI__gpt-4o___GPT-4o____1'
*/
export function encodeEphemeralAgentId({
endpoint,
model,
sender,
index,
}: {
endpoint: string;
model: string;
sender?: string;
index?: number;
}): string {
const base = `${endpoint}:${model}`.replace(/:/g, '__');
let result = base;
if (sender) {
// Use ___ as separator before sender to distinguish from __ in model names
result = `${base}___${sender.replace(/:/g, '__')}`;
}
if (index != null) {
// Use ____ (4 underscores) as separator for index
result = `${result}____${index}`;
}
return result;
}
/**
* Parses an ephemeral agent ID back into its components.
* Returns undefined if the ID doesn't match the expected format.
*
* Format: endpoint__model___sender or endpoint__model___sender____index
* - ____ (4 underscores) separates optional index suffix
* - ___ (triple underscore) separates model from optional sender
* - __ (double underscore) replaces : in endpoint/model names
*
* @example
* parseEphemeralAgentId('openAI__gpt-4o___GPT-4o')
* // => { endpoint: 'openAI', model: 'gpt-4o', sender: 'GPT-4o' }
*
* @example
* parseEphemeralAgentId('openAI__gpt-4o___GPT-4o____1')
* // => { endpoint: 'openAI', model: 'gpt-4o', sender: 'GPT-4o', index: 1 }
*/
export function parseEphemeralAgentId(agentId: string): ParsedEphemeralAgentId | undefined {
if (!agentId.includes('__')) {
return undefined;
}
// First check for index suffix (separated by ____)
let index: number | undefined;
let workingId = agentId;
if (agentId.includes('____')) {
const lastIndexSep = agentId.lastIndexOf('____');
const indexStr = agentId.slice(lastIndexSep + 4);
const parsedIndex = parseInt(indexStr, 10);
if (!isNaN(parsedIndex)) {
index = parsedIndex;
workingId = agentId.slice(0, lastIndexSep);
}
}
// Check for sender (separated by ___)
let sender: string | undefined;
let mainPart = workingId;
if (workingId.includes('___')) {
const [before, after] = workingId.split('___');
mainPart = before;
// Restore colons in sender if any
sender = after?.replace(/__/g, ':');
}
const [endpoint, ...modelParts] = mainPart.split('__');
if (!endpoint || modelParts.length === 0) {
return undefined;
}
// Restore colons in model name (model names can contain colons like claude-3:opus)
const model = modelParts.join(':');
return { endpoint, model, sender, index };
}
/**
* Checks if an agent ID represents an ephemeral (non-saved) agent.
* Real agent IDs always start with "agent_", so anything else is ephemeral.
*/
export function isEphemeralAgentId(agentId: string | null | undefined): boolean {
return !agentId?.startsWith('agent_');
}
/**
* Strips the index suffix (____N) from an agent ID if present.
* Works with both ephemeral and real agent IDs.
*
* @example
* stripAgentIdSuffix('agent_abc123____1') // => 'agent_abc123'
* stripAgentIdSuffix('openAI__gpt-4o___GPT-4o____1') // => 'openAI__gpt-4o___GPT-4o'
* stripAgentIdSuffix('agent_abc123') // => 'agent_abc123' (unchanged)
*/
export function stripAgentIdSuffix(agentId: string): string {
return agentId.replace(/____\d+$/, '');
}
/**
* Appends an index suffix (____N) to an agent ID.
* Used to distinguish parallel agents with the same base ID.
*
* @example
* appendAgentIdSuffix('agent_abc123', 1) // => 'agent_abc123____1'
* appendAgentIdSuffix('openAI__gpt-4o___GPT-4o', 1) // => 'openAI__gpt-4o___GPT-4o____1'
*/
export function appendAgentIdSuffix(agentId: string, index: number): string {
return `${agentId}____${index}`;
}
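For reference, the helpers above round-trip as in their JSDoc examples (imported from the data-provider package, as the other hunks in this diff do):

```ts
import {
  encodeEphemeralAgentId,
  parseEphemeralAgentId,
  isEphemeralAgentId,
  stripAgentIdSuffix,
  appendAgentIdSuffix,
} from 'librechat-data-provider';

const id = encodeEphemeralAgentId({ endpoint: 'openAI', model: 'gpt-4o', sender: 'GPT-4o', index: 1 });
// 'openAI__gpt-4o___GPT-4o____1'

parseEphemeralAgentId(id);
// { endpoint: 'openAI', model: 'gpt-4o', sender: 'GPT-4o', index: 1 }

isEphemeralAgentId(id); // true - no 'agent_' prefix, so it is ephemeral
isEphemeralAgentId('agent_abc123'); // false - a saved agent

// Suffix helpers work on both real and ephemeral IDs
stripAgentIdSuffix(appendAgentIdSuffix('agent_abc123', 1)); // 'agent_abc123'
```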

View file

@ -109,6 +109,8 @@ export type TPayload = Partial<TMessage> &
isTemporary: boolean;
ephemeralAgent?: TEphemeralAgent | null;
editedContent?: TEditedContent | null;
/** Added conversation for multi-convo feature */
addedConvo?: TConversation;
};
export type TEditedContent =
@ -136,6 +138,8 @@ export type TSubmission = {
clientTimestamp?: string;
ephemeralAgent?: TEphemeralAgent | null;
editedContent?: TEditedContent | null;
/** Added conversation for multi-convo feature */
addedConvo?: TConversation;
};
export type EventSubmission = Omit<TSubmission, 'initialResponse'> & { initialResponse: TMessage };

View file

@ -166,8 +166,11 @@ export namespace Agents {
type: StepTypes;
id: string; // #new
runId?: string; // #new
agentId?: string; // #new
index: number; // #new
stepIndex?: number; // #new
/** Group ID for parallel content - parts with same groupId are displayed in columns */
groupId?: number; // #new
stepDetails: StepDetails;
usage: null | object;
};

View file

@ -466,8 +466,17 @@ export type PartMetadata = {
action?: boolean;
auth?: string;
expires_at?: number;
/** Index indicating parallel sibling content (same stepIndex in multi-agent runs) */
siblingIndex?: number;
/** Agent ID for parallel agent rendering - identifies which agent produced this content */
agentId?: string;
/** Group ID for parallel content - parts with same groupId are displayed in columns */
groupId?: number;
};
/** Metadata for parallel content rendering - subset of PartMetadata */
export type ContentMetadata = Pick<PartMetadata, 'agentId' | 'groupId'>;
export type ContentPart = (
| CodeToolCall
| RetrievalToolCall
@ -482,18 +491,18 @@ export type ContentPart = (
export type TextData = (Text & PartMetadata) | undefined;
export type TMessageContentParts =
| {
| ({
type: ContentTypes.ERROR;
text?: string | TextData;
error?: string;
}
| { type: ContentTypes.THINK; think?: string | TextData }
| {
} & ContentMetadata)
| ({ type: ContentTypes.THINK; think?: string | TextData } & ContentMetadata)
| ({
type: ContentTypes.TEXT;
text?: string | TextData;
tool_call_ids?: string[];
}
| {
} & ContentMetadata)
| ({
type: ContentTypes.TOOL_CALL;
tool_call: (
| CodeToolCall
@ -503,10 +512,10 @@ export type TMessageContentParts =
| Agents.AgentToolCall
) &
PartMetadata;
}
| { type: ContentTypes.IMAGE_FILE; image_file: ImageFile & PartMetadata }
| Agents.AgentUpdate
| Agents.MessageContentImageUrl;
} & ContentMetadata)
| ({ type: ContentTypes.IMAGE_FILE; image_file: ImageFile & PartMetadata } & ContentMetadata)
| (Agents.AgentUpdate & ContentMetadata)
| (Agents.MessageContentImageUrl & ContentMetadata);
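For illustration (field values are assumptions), a streamed text part produced by a parallel agent now carries the optional metadata alongside its existing shape:

```ts
import { ContentTypes } from 'librechat-data-provider';
import type { TMessageContentParts } from 'librechat-data-provider';

// Text part from the added agent: agentId identifies the column, groupId groups siblings
const part: TMessageContentParts = {
  type: ContentTypes.TEXT,
  text: 'Response rendered in the second column',
  agentId: 'openAI__gpt-4o___GPT-4o____1',
  groupId: 1,
};
```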
export type StreamContentData = TMessageContentParts & {
/** The index of the current content part */

View file

@ -381,6 +381,20 @@ export type EditArtifactOptions = MutationOptions<
Error
>;
export type TBranchMessageRequest = {
messageId: string;
agentId: string;
};
export type TBranchMessageResponse = types.TMessage;
export type BranchMessageOptions = MutationOptions<
TBranchMessageResponse,
TBranchMessageRequest,
unknown,
Error
>;
export type TLogoutResponse = {
message: string;
redirect?: string;

View file

@ -1,5 +1,4 @@
import type { Logger as WinstonLogger } from 'winston';
import type { RunnableConfig } from '@langchain/core/runnables';
export type SearchRefType = 'search' | 'image' | 'news' | 'video' | 'ref';
@ -174,16 +173,6 @@ export interface CohereRerankerResponse {
export type SafeSearchLevel = 0 | 1 | 2;
export type Logger = WinstonLogger;
export interface SearchToolConfig extends SearchConfig, ProcessSourcesConfig, FirecrawlConfig {
logger?: Logger;
safeSearch?: SafeSearchLevel;
jinaApiKey?: string;
jinaApiUrl?: string;
cohereApiKey?: string;
rerankerType?: RerankerType;
onSearchResults?: (results: SearchResult, runnableConfig?: RunnableConfig) => void;
onGetHighlights?: (link: string) => void;
}
export interface MediaReference {
originalUrl: string;
title?: string;
@ -290,18 +279,6 @@ export interface FirecrawlScraperConfig {
logger?: Logger;
}
export type GetSourcesParams = {
query: string;
date?: DATE_RANGE;
country?: string;
numResults?: number;
safeSearch?: SearchToolConfig['safeSearch'];
images?: boolean;
videos?: boolean;
news?: boolean;
type?: 'search' | 'images' | 'videos' | 'news';
};
/** Serper API */
export interface VideoResult {
title?: string;
@ -609,12 +586,3 @@ export interface SearXNGResult {
publishedDate?: string;
img_src?: string;
}
export type ProcessSourcesFields = {
result: SearchResult;
numElements: number;
query: string;
news: boolean;
proMode: boolean;
onGetHighlights: SearchToolConfig['onGetHighlights'];
};