mirror of
https://github.com/danny-avila/LibreChat.git
synced 2026-01-07 02:58:50 +01:00
📜 refactor: Optimize Conversation History Nav with Cursor Pagination (#5785)
* ✨ feat: improve Nav/Conversations/Convo/NewChat component performance
* ✨ feat: implement cursor-based pagination for conversations API
* 🔧 refactor: remove createdAt from conversation selection in API and type definitions
* 🔧 refactor: include createdAt in conversation selection and update related types
* ✨ fix: search functionality and bugs with loadMoreConversations
* feat: move ArchivedChats to cursor and DataTable standard
* 🔧 refactor: add InfiniteQueryObserverResult type import in Nav component
* feat: enhance conversation listing with pagination, sorting, and search capabilities
* 🔧 refactor: remove unnecessary comment regarding lodash/debounce in ArchivedChatsTable
* 🔧 refactor: remove unused translation keys for archived chats and search results
* 🔧 fix: Archived Chats, Delete Convo, Duplicate Convo
* 🔧 refactor: improve conversation components with layout adjustments and new translations
* 🔧 refactor: simplify archive conversation mutation and improve unarchive handling; fix: update fork mutation
* 🔧 refactor: decode search query parameter in conversation route; improve error handling in unarchive mutation; clean up DataTable component styles
* 🔧 refactor: remove unused translation key for empty archived chats
* 🚀 fix: `archivedConversation` query key not updated correctly while archiving
* 🧠 feat: Bedrock Anthropic Reasoning & Update Endpoint Handling (#6163)
  * feat: Add thinking and thinkingBudget parameters for Bedrock Anthropic models
  * chore: Update @librechat/agents to version 2.1.8
  * refactor: change region order in params
  * refactor: Add maxTokens parameter to conversation preset schema
  * refactor: Update agent client to use bedrockInputSchema and improve error handling for model parameters
  * refactor: streamline/optimize llmConfig initialization and saving for bedrock
  * fix: ensure config titleModel is used for all endpoints
  * refactor: enhance OpenAIClient and agent initialization to support endpoint checks for OpenRouter
  * chore: bump @google/generative-ai
* ✨ feat: improve Nav/Conversations/Convo/NewChat component performance
* 🔧 refactor: remove unnecessary comment regarding lodash/debounce in ArchivedChatsTable
* 🔧 refactor: update translation keys for clarity; simplify conversation query parameters and improve sorting functionality in SharedLinks component
* 🔧 refactor: optimize conversation loading logic and improve search handling in Nav component
* fix: package-lock
* fix: package-lock 2
* fix: package lock 3
* refactor: remove unused utility files and exports to clean up the codebase
* refactor: remove i18n and useAuthRedirect modules to streamline codebase
* refactor: optimize Conversations component and remove unused ToggleContext
* refactor(Convo): add RenameForm and ConvoLink components; enhance Conversations component with responsive design
* fix: add missing @azure/storage-blob dependency in package.json
* refactor(Search): add error handling with toast notification for search errors
* refactor: make createdAt and updatedAt fields of tConvoUpdateSchema less restrictive if timestamps are missing
* chore: update @azure/storage-blob dependency to version 12.27.0, ensure package-lock is correct
* refactor(Search): improve conversation handling server side
* fix: eslint warnings and errors
* refactor(Search): improved search loading state and overall UX
* Refactors conversation cache management: centralizes conversation mutation logic into dedicated utility functions for adding, updating, and removing conversations from query caches. Improves reliability and maintainability by:
  - Consolidating duplicate cache manipulation code
  - Adding type safety for infinite query data structures
  - Implementing consistent cache update patterns across all conversation operations
  - Removing obsolete conversation helper functions in favor of standardized utilities
* fix: conversation handling and SSE event processing
  - Optimizes conversation state management with useMemo and proper hook ordering
  - Improves SSE event handler documentation and error handling
  - Adds reset guard flag for conversation changes
  - Removes redundant navigation call
  - Cleans up cursor handling logic and document structure
  Improves code maintainability and prevents potential race conditions in conversation state updates
* refactor: add type for SearchBar `onChange`
* fix: type tags
* style: rounded to xl all Header buttons
* fix: activeConvo in Convo not working
* style(Bookmarks): improved UI
* a11y(AccountSettings): fixed hover style not visible when using light theme
* style(SettingsTabs): improved tab switchers and dropdowns
* feat: add translation keys for Speech
* chore: fix package-lock
* fix(mutations): legacy import after rebase
* feat: refactor conversation navigation for accessibility
* fix(search): convo and message create/update date not returned
* fix(search): show correct iconURL and endpoint for searched messages
* fix: small UI improvements
* chore: console.log cleanup
* chore: fix tests
* fix(ChatForm): improve conversation ID handling and clean up useMemo dependencies
* chore: improve typing
* chore: improve typing
* fix(useSSE): clear conversation ID on submission to prevent draft restoration
* refactor(OpenAIClient): clean up abort handler
* refactor(abortMiddleware): change handleAbort to use function expression
* feat: add PENDING_CONVO constant and update conversation ID checks
* fix: final event handling on abort
* fix: improve title sync and query cache sync on final event
* fix: prevent overwriting cached conversation data if it already exists

---------

Co-authored-by: Danny Avila <danny@librechat.ai>
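The cursor-based pagination this commit introduces can be illustrated with a small pure function. This is a hypothetical sketch, not LibreChat's actual server code: the `paginateConversations` name and the use of `updatedAt` as the raw cursor are assumptions, though the `{ conversations, nextCursor }` result shape mirrors the `ConversationCursorData` type added in this diff.

```typescript
// Hypothetical sketch of cursor pagination: conversations are kept sorted
// newest-first by updatedAt, and the cursor is the updatedAt of the last item
// of the previous page. Real implementations usually also tie-break on an id.
type Convo = { conversationId: string; updatedAt: string };

function paginateConversations(
  all: Convo[],
  limit: number,
  cursor?: string | null,
): { conversations: Convo[]; nextCursor: string | null } {
  // ISO-8601 timestamps compare correctly as strings, so sort lexicographically.
  const sorted = [...all].sort((a, b) => (a.updatedAt < b.updatedAt ? 1 : -1));
  // Resume strictly after the cursor position.
  const remaining = cursor ? sorted.filter((c) => c.updatedAt < cursor) : sorted;
  const page = remaining.slice(0, limit);
  // Only hand back a cursor when more items remain beyond this page.
  const nextCursor = remaining.length > limit ? page[page.length - 1].updatedAt : null;
  return { conversations: page, nextCursor };
}
```

The client keeps requesting with the returned `nextCursor` until it comes back `null`, which is what drives the infinite query below.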
This commit is contained in:
parent
77a21719fd
commit
650e9b4f6c
69 changed files with 3434 additions and 2139 deletions
@@ -8,17 +8,12 @@ import {
   startOfYear,
   isWithinInterval,
 } from 'date-fns';
-import { EModelEndpoint, LocalStorageKeys } from 'librechat-data-provider';
-import type {
-  TConversation,
-  ConversationData,
-  GroupedConversations,
-  ConversationListResponse,
-} from 'librechat-data-provider';
-
-import { addData, deleteData, updateData, findPage } from './collection';
-import { InfiniteData } from '@tanstack/react-query';
+import { QueryClient } from '@tanstack/react-query';
+import { EModelEndpoint, LocalStorageKeys, QueryKeys } from 'librechat-data-provider';
+import type { TConversation, GroupedConversations } from 'librechat-data-provider';
+import type { InfiniteData } from '@tanstack/react-query';
 
+// Date group helpers
 export const dateKeys = {
   today: 'com_ui_date_today',
   yesterday: 'com_ui_date_yesterday',
@@ -73,11 +68,7 @@ const monthOrderMap = new Map([
   ['february', 1],
   ['january', 0],
 ]);
-
-const dateKeysReverse = Object.fromEntries(
-  Object.entries(dateKeys).map(([key, value]) => [value, key]),
-);
-
+const dateKeysReverse = Object.fromEntries(Object.entries(dateKeys).map(([k, v]) => [v, k]));
 const dateGroupsSet = new Set([
   dateKeys.today,
   dateKeys.yesterday,
@@ -91,7 +82,6 @@ export const groupConversationsByDate = (
   if (!Array.isArray(conversations)) {
     return [];
   }
-
   const seenConversationIds = new Set();
   const groups = new Map();
   const now = new Date(Date.now());
@@ -108,7 +98,6 @@ export const groupConversationsByDate = (
     } else {
       date = now;
     }
-
     const groupName = getGroupName(date);
     if (!groups.has(groupName)) {
       groups.set(groupName, []);
@@ -117,15 +106,12 @@ export const groupConversationsByDate = (
   });
-
   const sortedGroups = new Map();
-
   // Add date groups first
   dateGroupsSet.forEach((group) => {
     if (groups.has(group)) {
       sortedGroups.set(group, groups.get(group));
     }
   });
-
   // Sort and add year/month groups
   const yearMonthGroups = Array.from(groups.keys())
     .filter((group) => !dateGroupsSet.has(group))
     .sort((a, b) => {
@@ -133,141 +119,285 @@ export const groupConversationsByDate = (
       if (yearA !== yearB) {
         return yearB - yearA;
       }
 
       const [monthA, monthB] = [dateKeysReverse[a], dateKeysReverse[b]];
-      const bOrder = monthOrderMap.get(monthB) ?? -1,
-        aOrder = monthOrderMap.get(monthA) ?? -1;
+      const bOrder = monthOrderMap.get(monthB) ?? -1;
+      const aOrder = monthOrderMap.get(monthA) ?? -1;
       return bOrder - aOrder;
     });
 
   yearMonthGroups.forEach((group) => {
     sortedGroups.set(group, groups.get(group));
   });
 
   // Sort conversations within each group
   sortedGroups.forEach((conversations) => {
     conversations.sort(
       (a: TConversation, b: TConversation) =>
         new Date(b.updatedAt).getTime() - new Date(a.updatedAt).getTime(),
     );
   });
 
   return Array.from(sortedGroups, ([key, value]) => [key, value]);
 };
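For context on what `groupConversationsByDate` feeds into, the bucketing step can be sketched as a pure function. This is a simplified stand-in, not the real `getGroupName` (which uses date-fns helpers such as `isWithinInterval` and `startOfYear`); the `bucketFor` name and its fallback label format are illustrative assumptions, though the Today/Yesterday keys match the `dateKeys` map above.

```typescript
// Simplified sketch: assign a conversation date to a display bucket.
// All comparisons are done in UTC to keep the example deterministic.
function bucketFor(date: Date, now: Date): string {
  const day = (d: Date) => d.toISOString().slice(0, 10);
  const yesterday = new Date(now.getTime() - 24 * 60 * 60 * 1000);
  if (day(date) === day(now)) { return 'com_ui_date_today'; }
  if (day(date) === day(yesterday)) { return 'com_ui_date_yesterday'; }
  // Everything older falls back to a month/year label, e.g. "january 2025",
  // which is what the monthOrderMap / dateKeysReverse sorting operates on.
  return `${date.toLocaleString('en-US', { month: 'long', timeZone: 'UTC' }).toLowerCase()} ${date.getUTCFullYear()}`;
}
```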
 
-export const addConversation = (
-  data: InfiniteData<ConversationListResponse>,
-  newConversation: TConversation,
-): ConversationData => {
-  return addData<ConversationListResponse, TConversation>(
-    data,
-    'conversations',
-    newConversation,
-    (page) =>
-      page.conversations.findIndex((c) => c.conversationId === newConversation.conversationId),
-  );
-};
+export type ConversationCursorData = {
+  conversations: TConversation[];
+  nextCursor?: string | null;
+};
 
-export function findPageForConversation(
-  data: ConversationData,
-  conversation: TConversation | { conversationId: string },
-) {
-  return findPage<ConversationListResponse>(data, (page) =>
-    page.conversations.findIndex((c) => c.conversationId === conversation.conversationId),
-  );
-}
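The removed `findPageForConversation` delegated to `findPage` from `./collection`, which this diff does not show. Based on how it is called here (and how `updateConvoFields` consumes its `{ pageIndex, index }` result), a plausible minimal implementation looks like the following. This is an assumption for illustration, not the actual helper.

```typescript
type Page<T> = { conversations: T[] };
type Paged<T> = { pages: Page<T>[] };

// Hypothetical sketch of findPage: scan every page with the caller-supplied
// index function and report where the match lives; -1/-1 when absent.
function findPage<T>(
  data: Paged<T>,
  pageFinder: (page: Page<T>) => number,
): { pageIndex: number; index: number } {
  for (let pageIndex = 0; pageIndex < data.pages.length; pageIndex++) {
    const index = pageFinder(data.pages[pageIndex]);
    if (index !== -1) {
      return { pageIndex, index };
    }
  }
  return { pageIndex: -1, index: -1 };
}
```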
+// === InfiniteData helpers for cursor-based convo queries ===
 
-export const updateConversation = (
-  data: InfiniteData<ConversationListResponse>,
-  newConversation: TConversation,
-): ConversationData => {
-  return updateData<ConversationListResponse, TConversation>(
-    data,
-    'conversations',
-    newConversation,
-    (page) =>
-      page.conversations.findIndex((c) => c.conversationId === newConversation.conversationId),
-  );
-};
 
-export const updateConvoFields = (
-  data: ConversationData,
-  updatedConversation: Partial<TConversation> & Pick<TConversation, 'conversationId'>,
-  keepPosition = false,
-): ConversationData => {
-  const newData = JSON.parse(JSON.stringify(data));
-  const { pageIndex, index } = findPageForConversation(
-    newData,
-    updatedConversation as { conversationId: string },
-  );
-  if (pageIndex !== -1 && index !== -1) {
-    const oldConversation = newData.pages[pageIndex].conversations[index] as TConversation;
-
-    /**
-     * Do not change the position of the conversation if the tags are updated.
-     */
-    if (keepPosition) {
-      const updatedConvo = {
-        ...oldConversation,
-        ...updatedConversation,
-      };
-      newData.pages[pageIndex].conversations[index] = updatedConvo;
-    } else {
-      const updatedConvo = {
-        ...oldConversation,
-        ...updatedConversation,
-        updatedAt: new Date().toISOString(),
-      };
-      newData.pages[pageIndex].conversations.splice(index, 1);
-      newData.pages[0].conversations.unshift(updatedConvo);
-    }
-  }
-
-  return newData;
-};
 
-export const deleteConversation = (
-  data: ConversationData,
+export function findConversationInInfinite(
+  data: InfiniteData<ConversationCursorData> | undefined,
   conversationId: string,
-): ConversationData => {
-  return deleteData<ConversationListResponse, ConversationData>(data, 'conversations', (page) =>
-    page.conversations.findIndex((c) => c.conversationId === conversationId),
-  );
-};
 
-export const getConversationById = (
-  data: ConversationData | undefined,
-  conversationId: string | null,
-): TConversation | undefined => {
-  if (!data || !(conversationId ?? '')) {
+): TConversation | undefined {
+  if (!data) {
     return undefined;
   }
 
   for (const page of data.pages) {
-    const conversation = page.conversations.find((c) => c.conversationId === conversationId);
-    if (conversation) {
-      return conversation;
+    const found = page.conversations.find((c) => c.conversationId === conversationId);
+    if (found) {
+      return found;
     }
   }
   return undefined;
-};
+}
 
+export function updateInfiniteConvoPage(
+  data: InfiniteData<ConversationCursorData> | undefined,
+  conversationId: string,
+  updater: (c: TConversation) => TConversation,
+): InfiniteData<ConversationCursorData> | undefined {
+  if (!data) {
+    return data;
+  }
+  return {
+    ...data,
+    pages: data.pages.map((page) => ({
+      ...page,
+      conversations: page.conversations.map((c) =>
+        c.conversationId === conversationId ? updater(c) : c,
+      ),
+    })),
+  };
+}
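`updateInfiniteConvoPage` is a pure function, so its immutability guarantee is easy to exercise in isolation. The sketch below inlines the same logic with minimal local stand-in types (the real types come from `librechat-data-provider` and `@tanstack/react-query`; the names here are simplified for the example):

```typescript
type Convo = { conversationId: string; title: string };
type CursorPage = { conversations: Convo[]; nextCursor?: string | null };
type Infinite = { pages: CursorPage[]; pageParams: unknown[] };

// Same shape as updateInfiniteConvoPage above: map over every page and replace
// only the matching conversation, leaving every other object untouched.
function updatePage(
  data: Infinite | undefined,
  conversationId: string,
  updater: (c: Convo) => Convo,
): Infinite | undefined {
  if (!data) {
    return data;
  }
  return {
    ...data,
    pages: data.pages.map((page) => ({
      ...page,
      conversations: page.conversations.map((c) =>
        c.conversationId === conversationId ? updater(c) : c,
      ),
    })),
  };
}
```

Because every page and conversation object is copied rather than mutated, React Query's structural sharing and component re-rendering behave predictably.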
 
+export function addConversationToInfinitePages(
+  data: InfiniteData<ConversationCursorData> | undefined,
+  newConversation: TConversation,
+): InfiniteData<ConversationCursorData> {
+  if (!data) {
+    return {
+      pageParams: [undefined],
+      pages: [{ conversations: [newConversation], nextCursor: null }],
+    };
+  }
+  return {
+    ...data,
+    pages: [
+      { ...data.pages[0], conversations: [newConversation, ...data.pages[0].conversations] },
+      ...data.pages.slice(1),
+    ],
+  };
+}
 
+export function addConversationToAllConversationsQueries(
+  queryClient: QueryClient,
+  newConversation: TConversation,
+) {
+  // Find all keys that start with QueryKeys.allConversations
+  const queries = queryClient
+    .getQueryCache()
+    .findAll([QueryKeys.allConversations], { exact: false });
+
+  for (const query of queries) {
+    queryClient.setQueryData<InfiniteData<ConversationCursorData>>(query.queryKey, (old) => {
+      if (
+        !old ||
+        old.pages[0].conversations.some((c) => c.conversationId === newConversation.conversationId)
+      ) {
+        return old;
+      }
+      return {
+        ...old,
+        pages: [
+          {
+            ...old.pages[0],
+            conversations: [newConversation, ...old.pages[0].conversations],
+          },
+          ...old.pages.slice(1),
+        ],
+      };
+    });
+  }
+}
 
+export function removeConvoFromInfinitePages(
+  data: InfiniteData<ConversationCursorData> | undefined,
+  conversationId: string,
+): InfiniteData<ConversationCursorData> | undefined {
+  if (!data) {
+    return data;
+  }
+  return {
+    ...data,
+    pages: data.pages
+      .map((page) => ({
+        ...page,
+        conversations: page.conversations.filter((c) => c.conversationId !== conversationId),
+      }))
+      .filter((page) => page.conversations.length > 0),
+  };
+}
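One subtlety in `removeConvoFromInfinitePages`: a page left empty by the removal is dropped entirely, so the cached page list never contains zero-length pages. A minimal reproduction with local stand-in types (simplified from the real `librechat-data-provider` types):

```typescript
type Convo = { conversationId: string };
type CursorPage = { conversations: Convo[]; nextCursor?: string | null };
type Infinite = { pages: CursorPage[]; pageParams: unknown[] };

// Mirrors the function above: filter the convo out of every page, then drop
// any page that ends up empty.
function removeConvo(data: Infinite, conversationId: string): Infinite {
  return {
    ...data,
    pages: data.pages
      .map((page) => ({
        ...page,
        conversations: page.conversations.filter((c) => c.conversationId !== conversationId),
      }))
      .filter((page) => page.conversations.length > 0),
  };
}
```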
 
+// Used for partial update (e.g., title, etc.), updating AND possibly bumping to front of visible convos
+export function updateConvoFieldsInfinite(
+  data: InfiniteData<ConversationCursorData> | undefined,
+  updatedConversation: Partial<TConversation> & { conversationId: string },
+  keepPosition = false,
+): InfiniteData<ConversationCursorData> | undefined {
+  if (!data) {
+    return data;
+  }
+  let found: TConversation | undefined;
+  let pageIdx = -1,
+    convoIdx = -1;
+  for (let i = 0; i < data.pages.length; ++i) {
+    const idx = data.pages[i].conversations.findIndex(
+      (c) => c.conversationId === updatedConversation.conversationId,
+    );
+    if (idx !== -1) {
+      pageIdx = i;
+      convoIdx = idx;
+      found = data.pages[i].conversations[idx];
+      break;
+    }
+  }
+  if (!found) {
+    return data;
+  }
+
+  if (keepPosition) {
+    return {
+      ...data,
+      pages: data.pages.map((page, pi) =>
+        pi === pageIdx
+          ? {
+              ...page,
+              conversations: page.conversations.map((c, ci) =>
+                ci === convoIdx ? { ...c, ...updatedConversation } : c,
+              ),
+            }
+          : page,
+      ),
+    };
+  } else {
+    const patched = { ...found, ...updatedConversation, updatedAt: new Date().toISOString() };
+    const pages = data.pages.map((page) => ({
+      ...page,
+      conversations: page.conversations.filter((c) => c.conversationId !== patched.conversationId),
+    }));
+
+    pages[0].conversations = [patched, ...pages[0].conversations];
+
+    const finalPages = pages.filter((page) => page.conversations.length > 0);
+    return { ...data, pages: finalPages };
+  }
+}
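The `keepPosition = false` branch of `updateConvoFieldsInfinite` is the interesting one: the patched conversation is pulled out of whichever page it lived on and prepended to the first page with a refreshed `updatedAt`, matching how the server would order it on the next cursor fetch. A condensed sketch of just that move, with local stand-in types:

```typescript
type Convo = { conversationId: string; title: string; updatedAt: string };
type CursorPage = { conversations: Convo[]; nextCursor?: string | null };
type Infinite = { pages: CursorPage[]; pageParams: unknown[] };

// Condensed keepPosition=false path: patch, pull out, prepend to page 0.
function bumpToFront(data: Infinite, patch: Partial<Convo> & { conversationId: string }): Infinite {
  const found = data.pages
    .flatMap((p) => p.conversations)
    .find((c) => c.conversationId === patch.conversationId);
  if (!found) {
    return data;
  }
  const patched = { ...found, ...patch, updatedAt: new Date().toISOString() };
  const pages = data.pages.map((page) => ({
    ...page,
    conversations: page.conversations.filter((c) => c.conversationId !== patched.conversationId),
  }));
  pages[0].conversations = [patched, ...pages[0].conversations];
  // Like the original, pages emptied by the move are dropped.
  return { ...data, pages: pages.filter((p) => p.conversations.length > 0) };
}
```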
 
 export function storeEndpointSettings(conversation: TConversation | null) {
   if (!conversation) {
     return;
   }
   const { endpoint, model, agentOptions } = conversation;
 
   if (!endpoint) {
     return;
   }
 
   const lastModel = JSON.parse(localStorage.getItem(LocalStorageKeys.LAST_MODEL) ?? '{}');
   lastModel[endpoint] = model;
 
   if (endpoint === EModelEndpoint.gptPlugins) {
     lastModel.secondaryModel = agentOptions?.model ?? model ?? '';
   }
 
   localStorage.setItem(LocalStorageKeys.LAST_MODEL, JSON.stringify(lastModel));
 }
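`storeEndpointSettings` keeps a single JSON map of last-used models, keyed by endpoint, under one localStorage key. The sketch below factors the merge step over an injected Storage-like object so it runs outside the browser; the `rememberModel` name, the `StoreLike` interface, and the literal key are illustrative assumptions (the real code uses `window.localStorage` and `LocalStorageKeys.LAST_MODEL`).

```typescript
type StoreLike = { getItem(k: string): string | null; setItem(k: string, v: string): void };

const LAST_MODEL_KEY = 'lastSelectedModel'; // stand-in for LocalStorageKeys.LAST_MODEL

// Read-modify-write of the persisted endpoint → model map, as above.
function rememberModel(store: StoreLike, endpoint: string, model: string) {
  const lastModel = JSON.parse(store.getItem(LAST_MODEL_KEY) ?? '{}');
  lastModel[endpoint] = model;
  store.setItem(LAST_MODEL_KEY, JSON.stringify(lastModel));
}
```

Storing one map rather than one key per endpoint keeps localStorage tidy and makes the read-back on app load a single parse.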
 
+// Add
+export function addConvoToAllQueries(queryClient: QueryClient, newConvo: TConversation) {
+  const queries = queryClient
+    .getQueryCache()
+    .findAll([QueryKeys.allConversations], { exact: false });
+
+  for (const query of queries) {
+    queryClient.setQueryData<InfiniteData<ConversationCursorData>>(query.queryKey, (oldData) => {
+      if (!oldData) {
+        return oldData;
+      }
+      if (
+        oldData.pages.some((p) =>
+          p.conversations.some((c) => c.conversationId === newConvo.conversationId),
+        )
+      ) {
+        return oldData;
+      }
+      return {
+        ...oldData,
+        pages: [
+          {
+            ...oldData.pages[0],
+            conversations: [newConvo, ...oldData.pages[0].conversations],
+          },
+          ...oldData.pages.slice(1),
+        ],
+      };
+    });
+  }
+}
+
+// Update
+export function updateConvoInAllQueries(
+  queryClient: QueryClient,
+  conversationId: string,
+  updater: (c: TConversation) => TConversation,
+) {
+  const queries = queryClient
+    .getQueryCache()
+    .findAll([QueryKeys.allConversations], { exact: false });
+
+  for (const query of queries) {
+    queryClient.setQueryData<InfiniteData<ConversationCursorData>>(query.queryKey, (oldData) => {
+      if (!oldData) {
+        return oldData;
+      }
+      return {
+        ...oldData,
+        pages: oldData.pages.map((page) => ({
+          ...page,
+          conversations: page.conversations.map((c) =>
+            c.conversationId === conversationId ? updater(c) : c,
+          ),
+        })),
+      };
+    });
+  }
+}
+
+// Remove
+export function removeConvoFromAllQueries(queryClient: QueryClient, conversationId: string) {
+  const queries = queryClient
+    .getQueryCache()
+    .findAll([QueryKeys.allConversations], { exact: false });
+
+  for (const query of queries) {
+    queryClient.setQueryData<InfiniteData<ConversationCursorData>>(query.queryKey, (oldData) => {
+      if (!oldData) {
+        return oldData;
+      }
+      return {
+        ...oldData,
+        pages: oldData.pages
+          .map((page) => ({
+            ...page,
+            conversations: page.conversations.filter((c) => c.conversationId !== conversationId),
+          }))
+          .filter((page) => page.conversations.length > 0),
+      };
+    });
+  }
+}