🧠 feat: User Memories for Conversational Context (#7760)

* 🧠 feat: User Memories for Conversational Context

chore: mcp typing, use `t`

WIP: first pass, Memories UI

- Added MemoryViewer component for displaying, editing, and deleting user memories.
- Integrated data provider hooks for fetching, updating, and deleting memories.
- Implemented pagination and loading states for a better user experience.
- Created unit tests for MemoryViewer to ensure functionality and interaction with data provider.
- Updated translation files to include new UI strings related to memories.
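
The viewer is wired to the data provider through query and mutation hooks. A minimal sketch of that wiring follows; the hook names, endpoint paths, and memory shape are illustrative assumptions rather than the exact LibreChat data-provider API.

```ts
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';

// Assumed memory shape for illustration only
type TMemory = { key: string; value: string; updated_at?: string };

// Fetches the current user's memories for display in MemoryViewer
function useMemories() {
  return useQuery<TMemory[]>({
    queryKey: ['memories'],
    queryFn: () => fetch('/api/memories').then((res) => res.json()),
  });
}

// Deletes a memory by key and refreshes the list so the viewer stays in sync
function useDeleteMemory() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: (key: string) =>
      fetch(`/api/memories/${encodeURIComponent(key)}`, { method: 'DELETE' }),
    onSuccess: () => queryClient.invalidateQueries({ queryKey: ['memories'] }),
  });
}
```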

chore: move mcp-related files to own directory

chore: rename librechat-mcp to librechat-api

WIP: first pass, memory processing and data schemas

chore: linting in fileSearch.js query description

chore: rename librechat-api to @librechat/api across the project

WIP: first pass, functional memory agent

feat: add MemoryEditDialog and MemoryViewer components for managing user memories

- Introduced MemoryEditDialog for editing memory entries with validation and toast notifications.
- Updated MemoryViewer to support editing and deleting memories, including pagination and loading states.
- Enhanced data provider to handle memory updates with optional original key for better management.
- Added new localization strings for memory-related UI elements.
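
A rough sketch of an update payload carrying the optional original key described above; the field names and HTTP method are assumptions.

```ts
// Hypothetical payload shape; `originalKey` is only set when the key itself was renamed
type TUpdateMemoryPayload = {
  key: string;
  value: string;
  originalKey?: string;
};

async function updateMemory(payload: TUpdateMemoryPayload): Promise<Response> {
  // Target the existing entry by its original key so a rename replaces it cleanly
  const target = payload.originalKey ?? payload.key;
  return fetch(`/api/memories/${encodeURIComponent(target)}`, {
    method: 'PATCH',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
}
```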

feat: add memory permissions management

- Implemented memory permissions in the backend, allowing roles to have specific permissions for using, creating, updating, and reading memories.
- Added new API endpoints for updating memory permissions associated with roles.
- Created a new AdminSettings component for managing memory permissions in the frontend.
- Integrated memory permissions into the existing roles and permissions schemas.
- Updated the interface to include memory settings and permissions.
- Enhanced the MemoryViewer component to conditionally render admin settings based on user roles.
- Added localization support for memory permissions in the translation files.
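
To illustrate how the new permission type can gate the memory routes, here is a hedged middleware sketch; `getRolePermissions` and the permission key names are assumptions, not the exact helpers added in this PR.

```ts
import type { NextFunction, Request, Response } from 'express';

// Assumed lookup of a role's permission map, keyed by permission type
declare function getRolePermissions(
  role: string,
): Promise<Record<string, Record<string, boolean>> | undefined>;

const requireMemoryPermission =
  (permission: 'USE' | 'CREATE' | 'UPDATE' | 'READ') =>
  async (req: Request & { user?: { role?: string } }, res: Response, next: NextFunction) => {
    const rolePermissions = await getRolePermissions(req.user?.role ?? 'USER');
    if (rolePermissions?.MEMORIES?.[permission] === true) {
      return next();
    }
    res.status(403).json({ message: 'Insufficient memory permissions' });
  };

// Usage (illustrative): router.patch('/memories/:key', requireMemoryPermission('UPDATE'), handler);
```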

feat: move AdminSettings component to a new position in MemoryViewer for better visibility

refactor: clean up commented code in MemoryViewer component

feat: enhance MemoryViewer with search functionality and improve MemoryEditDialog integration

- Added a search input to filter memories in the MemoryViewer component.
- Refactored MemoryEditDialog to accept children for better customization.
- Updated MemoryViewer to utilize the new EditMemoryButton and DeleteMemoryButton components for editing and deleting memories.
- Improved localization support by adding new strings for memory filtering and deletion confirmation.

refactor: optimize memory filtering in MemoryViewer using match-sorter

- Replaced manual filtering logic with match-sorter for improved search functionality.
- Enhanced performance and readability of the filteredMemories computation.
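
For reference, the filtering this refactor relies on reduces to a single match-sorter call, sketched here; the memory shape and hook name are assumptions.

```ts
import { useMemo } from 'react';
import { matchSorter } from 'match-sorter';

type TMemory = { key: string; value: string };

function useFilteredMemories(memories: TMemory[], searchQuery: string): TMemory[] {
  return useMemo(() => {
    if (!searchQuery.trim()) {
      return memories;
    }
    // Ranks and filters entries by both the memory key and its value
    return matchSorter(memories, searchQuery, { keys: ['key', 'value'] });
  }, [memories, searchQuery]);
}
```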

feat: enhance MemoryEditDialog with triggerRef and improve updateMemory mutation handling

feat: implement access control for MemoryEditDialog and MemoryViewer components

refactor: remove commented out code and create runMemory method

refactor: rename role-based files

feat: implement access control for memory usage in AgentClient

refactor: simplify checkVisionRequest method in AgentClient by removing commented-out code

refactor: make `agents` dir in api package

refactor: migrate Azure utilities to TypeScript and consolidate imports

refactor: move sanitizeFilename function to a new file and update imports, add related tests

refactor: update LLM configuration types and consolidate Azure options in the API package

chore: linting

chore: import order

refactor: replace getLLMConfig with getOpenAIConfig and remove unused LLM configuration file

chore: update winston-daily-rotate-file to version 5.0.0 and add object-hash dependency in package-lock.json

refactor: move primeResources and optionalChainWithEmptyCheck functions to resources.ts and update imports

refactor: move createRun function to a new run.ts file and update related imports

fix: ensure safeAttachments is correctly typed as an array of TFile

chore: add node-fetch dependency and refactor fetch-related functions into packages/api/utils, removing the old generators file

refactor: enhance TEndpointOption type by using Pick to streamline endpoint fields and add new properties for model parameters and client options

feat: implement initializeOpenAIOptions function and update OpenAI types for enhanced configuration handling

fix: update types due to new TEndpointOption typing

fix: ensure safe access to group parameters in initializeOpenAIOptions function

fix: remove redundant API key validation comment in initializeOpenAIOptions function

refactor: rename initializeOpenAIOptions to initializeOpenAI for consistency and update related documentation

refactor: decouple req.body fields and tool loading from initializeAgentOptions

chore: linting

refactor: adjust column widths in MemoryViewer for improved layout

refactor: simplify agent initialization by creating loadAgent function and removing unused code

feat: add memory configuration loading and validation functions

WIP: first pass, memory processing with config

feat: implement memory callback and artifact handling

feat: implement memory artifacts display and processing updates

feat: add memory configuration options and schema validation for validKeys
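
A hedged sketch of what such a configuration schema could look like with zod; `validKeys` and `tokenLimit` come from the commits in this PR, while the `disabled` flag and the exact defaults are assumptions.

```ts
import { z } from 'zod';

export const memoryConfigSchema = z.object({
  disabled: z.boolean().optional(),
  // Restricts which keys the memory agent is allowed to write
  validKeys: z.array(z.string()).optional(),
  // Caps the total tokens a user's memories may consume
  tokenLimit: z.number().int().positive().optional(),
});

export type TMemoryConfig = z.infer<typeof memoryConfigSchema>;
```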

fix: update MemoryEditDialog and MemoryViewer to handle memory state and display improvements

refactor: remove padding from BookmarkTable and MemoryViewer headers for consistent styling

WIP: initial tokenLimit config and move Tokenizer to @librechat/api

refactor: update mongoMeili plugin methods to use callback for better error handling

feat: enhance memory management with token tracking and usage metrics

- Added token counting for memory entries to enforce limits and provide usage statistics.
- Updated memory retrieval and update routes to include total token usage and limit.
- Enhanced MemoryEditDialog and MemoryViewer components to display memory usage and token information.
- Refactored memory processing functions to handle token limits and provide feedback on memory capacity.
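
A minimal sketch of the usage accounting described above, assuming a per-entry `tokenCount` field and an optional configured limit:

```ts
type TMemoryEntry = { key: string; value: string; tokenCount?: number };

// Summarizes current usage against an optional token limit
function getMemoryUsage(memories: TMemoryEntry[], tokenLimit?: number) {
  const totalTokens = memories.reduce((sum, m) => sum + (m.tokenCount ?? 0), 0);
  return {
    totalTokens,
    tokenLimit,
    exceedsLimit: tokenLimit != null && totalTokens > tokenLimit,
  };
}

// Create/update handlers can then refuse writes that would exceed capacity
function canStoreMemory(
  usage: ReturnType<typeof getMemoryUsage>,
  newEntryTokens: number,
): boolean {
  if (usage.tokenLimit == null) {
    return true;
  }
  return usage.totalTokens + newEntryTokens <= usage.tokenLimit;
}
```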

feat: implement memory artifact handling in attachment handler

- Enhanced useAttachmentHandler to process memory artifacts when receiving updates.
- Introduced handleMemoryArtifact utility to manage memory updates and deletions.
- Updated query client to reflect changes in memory state based on incoming data.
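
In sketch form, folding a memory artifact into the react-query cache can look like the following; the artifact shape and query key are assumptions, not the exact types used by useAttachmentHandler.

```ts
import type { QueryClient } from '@tanstack/react-query';

type TMemory = { key: string; value: string; tokenCount?: number };
type TMemoryArtifact = { type: 'update' | 'delete'; key: string; value?: string };

function handleMemoryArtifact(queryClient: QueryClient, artifact: TMemoryArtifact): void {
  queryClient.setQueryData<TMemory[]>(['memories'], (prev = []) => {
    // Remove any existing entry for this key, then re-add it if this is an update
    const next = prev.filter((m) => m.key !== artifact.key);
    if (artifact.type === 'update') {
      next.push({ key: artifact.key, value: artifact.value ?? '' });
    }
    return next;
  });
}
```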

refactor: restructure web search key extraction logic

- Moved the logic for extracting API keys from the webSearchAuth configuration into a dedicated function, getWebSearchKeys.
- Updated webSearchKeys to utilize the new function for improved clarity and maintainability.
- Prevents build-time errors.
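
The pattern is roughly the following; the shape of `webSearchAuth` here is a simplified assumption and not the real configuration object.

```ts
// Simplified stand-in for the real webSearchAuth configuration
const webSearchAuth = {
  providers: { serper: { SERPER_API_KEY: true } },
  scrapers: { firecrawl: { FIRECRAWL_API_KEY: true, FIRECRAWL_API_URL: false } },
};

// Walks the nested auth config and collects every environment-variable key name
export function getWebSearchKeys(
  auth: Record<string, Record<string, Record<string, boolean>>>,
): string[] {
  const keys: string[] = [];
  for (const category of Object.values(auth)) {
    for (const service of Object.values(category)) {
      keys.push(...Object.keys(service));
    }
  }
  return keys;
}

// webSearchKeys now derives from the helper rather than inline object traversal
export const webSearchKeys = getWebSearchKeys(webSearchAuth);
```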

feat: add personalization settings and memory preferences management

- Introduced a new Personalization tab in settings to manage user memory preferences.
- Implemented API endpoints and client-side logic for updating memory preferences.
- Enhanced user interface components to reflect personalization options and memory usage.
- Updated permissions to allow users to opt out of memory features.
- Added localization support for new settings and messages related to personalization.
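
Sketched below is what the preference-update flow can look like end to end; the route path is an assumption, while `toggleUserMemories` refers to the data-schemas method added later in this diff.

```ts
import { Router, type Request, type Response } from 'express';

// Provided by @librechat/data-schemas (see the user methods diff below); declared here for the sketch
declare function toggleUserMemories(
  userId: string,
  memoriesEnabled: boolean,
): Promise<{ personalization?: { memories?: boolean } } | null>;

const router = Router();

router.put(
  '/memories/preferences',
  async (req: Request & { user?: { id: string } }, res: Response) => {
    const { memories } = (req.body ?? {}) as { memories?: boolean };
    if (typeof memories !== 'boolean') {
      return res.status(400).json({ message: '`memories` must be a boolean' });
    }
    const updated = await toggleUserMemories(req.user!.id, memories);
    return res.status(200).json({ personalization: updated?.personalization });
  },
);
```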

style: personalization switch class

feat: add PersonalizationIcon and align Side Panel UI

feat: implement memory creation functionality

- Added a new API endpoint for creating memory entries, including validation for key and value.
- Introduced MemoryCreateDialog component for user interface to facilitate memory creation.
- Integrated token limit checks to prevent exceeding user memory capacity.
- Updated MemoryViewer to include a button for opening the memory creation dialog.
- Enhanced localization support for new messages related to memory creation.
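
A hedged sketch of the creation endpoint's validation flow; the route path, token limit value, and helper signatures are assumptions (the key regex mirrors the MemoryEntry schema added in this diff).

```ts
import { Router, type Request, type Response } from 'express';

// Assumed helpers: a tokenizer plus the memory methods from @librechat/data-schemas
declare function countTokens(text: string): number;
declare function getAllUserMemories(userId: string): Promise<{ tokenCount?: number }[]>;
declare function createMemory(params: {
  userId: string;
  key: string;
  value: string;
  tokenCount: number;
}): Promise<{ ok: boolean }>;

const TOKEN_LIMIT = 2000; // hypothetical per-user limit from the memory config
const router = Router();

router.post('/memories', async (req: Request & { user?: { id: string } }, res: Response) => {
  const { key, value } = (req.body ?? {}) as { key?: string; value?: string };
  if (!key || !/^[a-z_]+$/.test(key)) {
    return res
      .status(400)
      .json({ message: 'Key must only contain lowercase letters and underscores' });
  }
  if (!value?.trim()) {
    return res.status(400).json({ message: 'Value is required' });
  }
  const tokenCount = countTokens(value);
  const existing = await getAllUserMemories(req.user!.id);
  const totalTokens = existing.reduce((sum, m) => sum + (m.tokenCount ?? 0), 0);
  if (totalTokens + tokenCount > TOKEN_LIMIT) {
    return res.status(400).json({ message: 'Memory capacity exceeded' });
  }
  const result = await createMemory({ userId: req.user!.id, key, value, tokenCount });
  return res.status(result.ok ? 201 : 400).json(result);
});
```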

feat: enhance message processing with configurable window size

- Updated AgentClient to use a configurable message window size for processing messages.
- Introduced messageWindowSize option in memory configuration schema with a default value of 5.
- Improved logic for selecting messages to process based on the configured window size.
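
In effect the selection reduces to a slice over the most recent messages, sketched here under the assumption of a simple message array; the real AgentClient logic also accounts for roles and token budgets.

```ts
type TMessage = { role: 'user' | 'assistant' | 'system'; content: string };

// Returns only the most recent `messageWindowSize` messages for the memory agent
function selectMessagesForMemory(messages: TMessage[], messageWindowSize = 5): TMessage[] {
  if (messageWindowSize <= 0) {
    return [];
  }
  return messages.slice(-messageWindowSize);
}
```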

chore: update librechat-data-provider version to 0.7.87 in package.json and package-lock.json

chore: remove OpenAPIPlugin and its associated tests

chore: remove MIGRATION_README.md as migration tasks are completed

ci: fix backend tests

chore: remove unused translation keys from localization file

chore: remove problematic test file and unused var in AgentClient

chore: remove unused import and import directly for JSDoc

* feat: add api package build stage in Dockerfile for improved modularity

* docs: reorder build steps in contributing guide for clarity
Danny Avila, 2025-06-07 18:52:22 -04:00 (committed by GitHub)
parent cd7dd576c1
commit 29ef91b4dd
170 changed files with 5700 additions and 3632 deletions


@ -1,114 +0,0 @@
# `@librechat/data-schemas`
Mongoose schemas and models for LibreChat. This package provides a comprehensive collection of Mongoose schemas used across the LibreChat project, enabling robust data modeling and validation for various entities such as actions, agents, messages, users, and more.
## Features
- **Modular Schemas:** Includes schemas for actions, agents, assistants, balance, banners, categories, conversation tags, conversations, files, keys, messages, plugin authentication, presets, projects, prompts, prompt groups, roles, sessions, shared links, tokens, tool calls, transactions, and users.
- **TypeScript Support:** Provides TypeScript definitions for type-safe development.
- **Ready for Mongoose Integration:** Easily integrate with Mongoose to create models and interact with your MongoDB database.
- **Flexible & Extensible:** Designed to support the evolving needs of LibreChat while being adaptable to other projects.
## Installation
Install the package via npm or yarn:
```bash
npm install @librechat/data-schemas
```
Or with yarn:
```bash
yarn add @librechat/data-schemas
```
## Usage
After installation, you can import and use the schemas in your project. For example, to create a Mongoose model for a user:
```js
import mongoose from 'mongoose';
import { userSchema } from '@librechat/data-schemas';
const UserModel = mongoose.model('User', userSchema);
// Now you can use UserModel to create, read, update, and delete user documents.
```
You can also import other schemas as needed:
```js
import { actionSchema, agentSchema, messageSchema } from '@librechat/data-schemas';
```
Each schema is designed to integrate seamlessly with Mongoose and provides indexes, timestamps, and validations tailored for LibreChat's use cases.
## Development
This package uses Rollup and TypeScript for building and bundling.
### Available Scripts
- **Build:**
Cleans the `dist` directory and builds the package.
```bash
npm run build
```
- **Build Watch:**
Rebuilds automatically on file changes.
```bash
npm run build:watch
```
- **Test:**
Runs tests with coverage in watch mode.
```bash
npm run test
```
- **Test (CI):**
Runs tests with coverage for CI environments.
```bash
npm run test:ci
```
- **Verify:**
Runs tests in CI mode to verify code integrity.
```bash
npm run verify
```
- **Clean:**
Removes the `dist` directory.
```bash
npm run clean
```
For those using Bun, equivalent scripts are available:
- **Bun Clean:** `bun run b:clean`
- **Bun Build:** `bun run b:build`
## Repository & Issues
The source code is maintained on GitHub.
- **Repository:** [LibreChat Repository](https://github.com/danny-avila/LibreChat.git)
- **Issues & Bug Reports:** [LibreChat Issues](https://github.com/danny-avila/LibreChat/issues)
## License
This project is licensed under the [MIT License](LICENSE).
## Contributing
Contributions to improve and expand the data schemas are welcome. If you have suggestions, improvements, or bug fixes, please open an issue or submit a pull request on the [GitHub repository](https://github.com/danny-avila/LibreChat/issues).
For more detailed documentation on each schema and model, please refer to the source code or visit the [LibreChat website](https://librechat.ai).


@ -3,5 +3,6 @@ export * from './schema';
export { createModels } from './models';
export { createMethods } from './methods';
export type * from './types';
export type * from './methods';
export { default as logger } from './config/winston';
export { default as meiliLogger } from './config/meiliLogger';


@ -2,6 +2,8 @@ import { createUserMethods, type UserMethods } from './user';
import { createSessionMethods, type SessionMethods } from './session';
import { createTokenMethods, type TokenMethods } from './token';
import { createRoleMethods, type RoleMethods } from './role';
/* Memories */
import { createMemoryMethods, type MemoryMethods } from './memory';
/**
* Creates all database methods for all collections
@ -12,7 +14,9 @@ export function createMethods(mongoose: typeof import('mongoose')) {
...createSessionMethods(mongoose),
...createTokenMethods(mongoose),
...createRoleMethods(mongoose),
...createMemoryMethods(mongoose),
};
}
export type AllMethods = UserMethods & SessionMethods & TokenMethods & RoleMethods;
export type { MemoryMethods };
export type AllMethods = UserMethods & SessionMethods & TokenMethods & RoleMethods & MemoryMethods;


@ -0,0 +1,168 @@
import { Types } from 'mongoose';
import logger from '~/config/winston';
import type * as t from '~/types';
/**
* Formats a date in YYYY-MM-DD format
*/
const formatDate = (date: Date): string => {
return date.toISOString().split('T')[0];
};
// Factory function that takes mongoose instance and returns the methods
export function createMemoryMethods(mongoose: typeof import('mongoose')) {
const MemoryEntry = mongoose.models.MemoryEntry;
/**
* Creates a new memory entry for a user
* Throws an error if a memory with the same key already exists
*/
async function createMemory({
userId,
key,
value,
tokenCount = 0,
}: t.SetMemoryParams): Promise<t.MemoryResult> {
try {
if (key?.toLowerCase() === 'nothing') {
return { ok: false };
}
const existingMemory = await MemoryEntry.findOne({ userId, key });
if (existingMemory) {
throw new Error('Memory with this key already exists');
}
await MemoryEntry.create({
userId,
key,
value,
tokenCount,
updated_at: new Date(),
});
return { ok: true };
} catch (error) {
throw new Error(
`Failed to create memory: ${error instanceof Error ? error.message : 'Unknown error'}`,
);
}
}
/**
* Sets or updates a memory entry for a user
*/
async function setMemory({
userId,
key,
value,
tokenCount = 0,
}: t.SetMemoryParams): Promise<t.MemoryResult> {
try {
if (key?.toLowerCase() === 'nothing') {
return { ok: false };
}
await MemoryEntry.findOneAndUpdate(
{ userId, key },
{
value,
tokenCount,
updated_at: new Date(),
},
{
upsert: true,
new: true,
},
);
return { ok: true };
} catch (error) {
throw new Error(
`Failed to set memory: ${error instanceof Error ? error.message : 'Unknown error'}`,
);
}
}
/**
* Deletes a specific memory entry for a user
*/
async function deleteMemory({ userId, key }: t.DeleteMemoryParams): Promise<t.MemoryResult> {
try {
const result = await MemoryEntry.findOneAndDelete({ userId, key });
return { ok: !!result };
} catch (error) {
throw new Error(
`Failed to delete memory: ${error instanceof Error ? error.message : 'Unknown error'}`,
);
}
}
/**
* Gets all memory entries for a user
*/
async function getAllUserMemories(
userId: string | Types.ObjectId,
): Promise<t.IMemoryEntryLean[]> {
try {
return (await MemoryEntry.find({ userId }).lean()) as t.IMemoryEntryLean[];
} catch (error) {
throw new Error(
`Failed to get all memories: ${error instanceof Error ? error.message : 'Unknown error'}`,
);
}
}
/**
* Gets and formats all memories for a user in two different formats
*/
async function getFormattedMemories({
userId,
}: t.GetFormattedMemoriesParams): Promise<t.FormattedMemoriesResult> {
try {
const memories = await getAllUserMemories(userId);
if (!memories || memories.length === 0) {
return { withKeys: '', withoutKeys: '', totalTokens: 0 };
}
const sortedMemories = memories.sort(
(a, b) => new Date(a.updated_at!).getTime() - new Date(b.updated_at!).getTime(),
);
const totalTokens = sortedMemories.reduce((sum, memory) => {
return sum + (memory.tokenCount || 0);
}, 0);
const withKeys = sortedMemories
.map((memory, index) => {
const date = formatDate(new Date(memory.updated_at!));
const tokenInfo = memory.tokenCount ? ` [${memory.tokenCount} tokens]` : '';
return `${index + 1}. [${date}]. ["key": "${memory.key}"]${tokenInfo}. ["value": "${memory.value}"]`;
})
.join('\n\n');
const withoutKeys = sortedMemories
.map((memory, index) => {
const date = formatDate(new Date(memory.updated_at!));
return `${index + 1}. [${date}]. ${memory.value}`;
})
.join('\n\n');
return { withKeys, withoutKeys, totalTokens };
} catch (error) {
logger.error('Failed to get formatted memories:', error);
return { withKeys: '', withoutKeys: '', totalTokens: 0 };
}
}
return {
setMemory,
createMemory,
deleteMemory,
getAllUserMemories,
getFormattedMemories,
};
}
export type MemoryMethods = ReturnType<typeof createMemoryMethods>;


@ -170,6 +170,35 @@ export function createUserMethods(mongoose: typeof import('mongoose')) {
});
}
/**
* Update a user's personalization memories setting.
* Handles the edge case where the personalization object doesn't exist.
*/
async function toggleUserMemories(
userId: string,
memoriesEnabled: boolean,
): Promise<IUser | null> {
const User = mongoose.models.User;
// First, ensure the personalization object exists
const user = await User.findById(userId);
if (!user) {
return null;
}
// Use $set to update the nested field, which will create the personalization object if it doesn't exist
const updateOperation = {
$set: {
'personalization.memories': memoriesEnabled,
},
};
return (await User.findByIdAndUpdate(userId, updateOperation, {
new: true,
runValidators: true,
}).lean()) as IUser | null;
}
// Return all methods
return {
findUser,
@ -179,6 +208,7 @@ export function createUserMethods(mongoose: typeof import('mongoose')) {
getUserById,
deleteUserById,
generateToken,
toggleUserMemories,
};
}


@ -20,6 +20,7 @@ import { createPromptGroupModel } from './promptGroup';
import { createConversationTagModel } from './conversationTag';
import { createSharedLinkModel } from './sharedLink';
import { createToolCallModel } from './toolCall';
import { createMemoryModel } from './memory';
/**
* Creates all database models for all collections
@ -48,5 +49,6 @@ export function createModels(mongoose: typeof import('mongoose')) {
ConversationTag: createConversationTagModel(mongoose),
SharedLink: createSharedLinkModel(mongoose),
ToolCall: createToolCallModel(mongoose),
MemoryEntry: createMemoryModel(mongoose),
};
}


@ -0,0 +1,6 @@
import memorySchema from '~/schema/memory';
import type { IMemoryEntry } from '~/types/memory';
export function createMemoryModel(mongoose: typeof import('mongoose')) {
return mongoose.models.MemoryEntry || mongoose.model<IMemoryEntry>('MemoryEntry', memorySchema);
}


@ -1,6 +1,14 @@
import _ from 'lodash';
import { MeiliSearch, Index } from 'meilisearch';
import type { FilterQuery, Types, Schema, Document, Model, Query } from 'mongoose';
import type {
CallbackWithoutResultAndOptionalError,
FilterQuery,
Document,
Schema,
Query,
Types,
Model,
} from 'mongoose';
import logger from '~/config/meiliLogger';
interface MongoMeiliOptions {
@ -24,12 +32,12 @@ interface ContentItem {
interface DocumentWithMeiliIndex extends Document {
_meiliIndex?: boolean;
preprocessObjectForIndex?: () => Record<string, unknown>;
addObjectToMeili?: () => Promise<void>;
updateObjectToMeili?: () => Promise<void>;
deleteObjectFromMeili?: () => Promise<void>;
postSaveHook?: () => void;
postUpdateHook?: () => void;
postRemoveHook?: () => void;
addObjectToMeili?: (next: CallbackWithoutResultAndOptionalError) => Promise<void>;
updateObjectToMeili?: (next: CallbackWithoutResultAndOptionalError) => Promise<void>;
deleteObjectFromMeili?: (next: CallbackWithoutResultAndOptionalError) => Promise<void>;
postSaveHook?: (next: CallbackWithoutResultAndOptionalError) => void;
postUpdateHook?: (next: CallbackWithoutResultAndOptionalError) => void;
postRemoveHook?: (next: CallbackWithoutResultAndOptionalError) => void;
conversationId?: string;
content?: ContentItem[];
messageId?: string;
@ -220,7 +228,7 @@ const createMeiliMongooseModel = ({
);
}
} catch (error) {
logger.error('[syncWithMeili] Error adding document to Meili', error);
logger.error('[syncWithMeili] Error adding document to Meili:', error);
}
}
@ -306,28 +314,48 @@ const createMeiliMongooseModel = ({
/**
* Adds the current document to the MeiliSearch index
*/
async addObjectToMeili(this: DocumentWithMeiliIndex): Promise<void> {
async addObjectToMeili(
this: DocumentWithMeiliIndex,
next: CallbackWithoutResultAndOptionalError,
): Promise<void> {
const object = this.preprocessObjectForIndex!();
try {
await index.addDocuments([object]);
} catch (error) {
logger.error('[addObjectToMeili] Error adding document to Meili', error);
logger.error('[addObjectToMeili] Error adding document to Meili:', error);
return next();
}
await this.collection.updateMany(
{ _id: this._id as Types.ObjectId },
{ $set: { _meiliIndex: true } },
);
try {
await this.collection.updateMany(
{ _id: this._id as Types.ObjectId },
{ $set: { _meiliIndex: true } },
);
} catch (error) {
logger.error('[addObjectToMeili] Error updating _meiliIndex field:', error);
return next();
}
next();
}
/**
* Updates the current document in the MeiliSearch index
*/
async updateObjectToMeili(this: DocumentWithMeiliIndex): Promise<void> {
const object = _.omitBy(_.pick(this.toJSON(), attributesToIndex), (v, k) =>
k.startsWith('$'),
);
await index.updateDocuments([object]);
async updateObjectToMeili(
this: DocumentWithMeiliIndex,
next: CallbackWithoutResultAndOptionalError,
): Promise<void> {
try {
const object = _.omitBy(_.pick(this.toJSON(), attributesToIndex), (v, k) =>
k.startsWith('$'),
);
await index.updateDocuments([object]);
next();
} catch (error) {
logger.error('[updateObjectToMeili] Error updating document in Meili:', error);
return next();
}
}
/**
@ -335,8 +363,17 @@ const createMeiliMongooseModel = ({
*
* @returns {Promise<void>}
*/
async deleteObjectFromMeili(this: DocumentWithMeiliIndex): Promise<void> {
await index.deleteDocument(this._id as string);
async deleteObjectFromMeili(
this: DocumentWithMeiliIndex,
next: CallbackWithoutResultAndOptionalError,
): Promise<void> {
try {
await index.deleteDocument(this._id as string);
next();
} catch (error) {
logger.error('[deleteObjectFromMeili] Error deleting document from Meili:', error);
return next();
}
}
/**
@ -345,11 +382,11 @@ const createMeiliMongooseModel = ({
* If the document is already indexed (i.e. `_meiliIndex` is true), it updates it;
* otherwise, it adds the document to the index.
*/
postSaveHook(this: DocumentWithMeiliIndex): void {
postSaveHook(this: DocumentWithMeiliIndex, next: CallbackWithoutResultAndOptionalError): void {
if (this._meiliIndex) {
this.updateObjectToMeili!();
this.updateObjectToMeili!(next);
} else {
this.addObjectToMeili!();
this.addObjectToMeili!(next);
}
}
@ -359,9 +396,14 @@ const createMeiliMongooseModel = ({
* This hook is triggered after a document update, ensuring that changes are
* propagated to the MeiliSearch index if the document is indexed.
*/
postUpdateHook(this: DocumentWithMeiliIndex): void {
postUpdateHook(
this: DocumentWithMeiliIndex,
next: CallbackWithoutResultAndOptionalError,
): void {
if (this._meiliIndex) {
this.updateObjectToMeili!();
this.updateObjectToMeili!(next);
} else {
next();
}
}
@ -371,9 +413,14 @@ const createMeiliMongooseModel = ({
* This hook is triggered after a document is removed, ensuring that the document
* is also removed from the MeiliSearch index if it was previously indexed.
*/
postRemoveHook(this: DocumentWithMeiliIndex): void {
postRemoveHook(
this: DocumentWithMeiliIndex,
next: CallbackWithoutResultAndOptionalError,
): void {
if (this._meiliIndex) {
this.deleteObjectFromMeili!();
this.deleteObjectFromMeili!(next);
} else {
next();
}
}
}
@ -429,16 +476,16 @@ export default function mongoMeili(schema: Schema, options: MongoMeiliOptions):
schema.loadClass(createMeiliMongooseModel({ index, attributesToIndex }));
// Register Mongoose hooks
schema.post('save', function (doc: DocumentWithMeiliIndex) {
doc.postSaveHook?.();
schema.post('save', function (doc: DocumentWithMeiliIndex, next) {
doc.postSaveHook?.(next);
});
schema.post('updateOne', function (doc: DocumentWithMeiliIndex) {
doc.postUpdateHook?.();
schema.post('updateOne', function (doc: DocumentWithMeiliIndex, next) {
doc.postUpdateHook?.(next);
});
schema.post('deleteOne', function (doc: DocumentWithMeiliIndex) {
doc.postRemoveHook?.();
schema.post('deleteOne', function (doc: DocumentWithMeiliIndex, next) {
doc.postRemoveHook?.(next);
});
// Pre-deleteMany hook: remove corresponding documents from MeiliSearch when multiple documents are deleted.
@ -486,13 +533,13 @@ export default function mongoMeili(schema: Schema, options: MongoMeiliOptions):
});
// Post-findOneAndUpdate hook
schema.post('findOneAndUpdate', async function (doc: DocumentWithMeiliIndex) {
schema.post('findOneAndUpdate', async function (doc: DocumentWithMeiliIndex, next) {
if (!meiliEnabled) {
return;
return next();
}
if (doc.unfinished) {
return;
return next();
}
let meiliDoc: Record<string, unknown> | undefined;
@ -509,9 +556,9 @@ export default function mongoMeili(schema: Schema, options: MongoMeiliOptions):
}
if (meiliDoc && meiliDoc.title === doc.title) {
return;
return next();
}
doc.postSaveHook?.();
doc.postSaveHook?.(next);
});
}


@ -21,3 +21,4 @@ export { default as tokenSchema } from './token';
export { default as toolCallSchema } from './toolCall';
export { default as transactionSchema } from './transaction';
export { default as userSchema } from './user';
export { default as memorySchema } from './memory';


@ -0,0 +1,33 @@
import { Schema } from 'mongoose';
import type { IMemoryEntry } from '~/types/memory';
const MemoryEntrySchema: Schema<IMemoryEntry> = new Schema({
userId: {
type: Schema.Types.ObjectId,
ref: 'User',
index: true,
required: true,
},
key: {
type: String,
required: true,
validate: {
validator: (v: string) => /^[a-z_]+$/.test(v),
message: 'Key must only contain lowercase letters and underscores',
},
},
value: {
type: String,
required: true,
},
tokenCount: {
type: Number,
default: 0,
},
updated_at: {
type: Date,
default: Date.now,
},
});
export default MemoryEntrySchema;


@ -13,6 +13,13 @@ const rolePermissionsSchema = new Schema(
[Permissions.USE]: { type: Boolean, default: true },
[Permissions.CREATE]: { type: Boolean, default: true },
},
[PermissionTypes.MEMORIES]: {
[Permissions.USE]: { type: Boolean, default: true },
[Permissions.CREATE]: { type: Boolean, default: true },
[Permissions.UPDATE]: { type: Boolean, default: true },
[Permissions.READ]: { type: Boolean, default: true },
[Permissions.OPT_OUT]: { type: Boolean, default: true },
},
[PermissionTypes.AGENTS]: {
[Permissions.SHARED_GLOBAL]: { type: Boolean, default: false },
[Permissions.USE]: { type: Boolean, default: true },
@ -45,6 +52,12 @@ const roleSchema: Schema<IRole> = new Schema({
[Permissions.USE]: true,
[Permissions.CREATE]: true,
},
[PermissionTypes.MEMORIES]: {
[Permissions.USE]: true,
[Permissions.CREATE]: true,
[Permissions.UPDATE]: true,
[Permissions.READ]: true,
},
[PermissionTypes.AGENTS]: {
[Permissions.SHARED_GLOBAL]: false,
[Permissions.USE]: true,


@ -129,6 +129,15 @@ const userSchema = new Schema<IUser>(
type: Boolean,
default: false,
},
personalization: {
type: {
memories: {
type: Boolean,
default: true,
},
},
default: {},
},
},
{ timestamps: true },
);


@ -1,3 +1,6 @@
import type { Types } from 'mongoose';
export type ObjectId = Types.ObjectId;
export * from './user';
export * from './token';
export * from './convo';
@ -10,3 +13,5 @@ export * from './role';
export * from './action';
export * from './assistant';
export * from './file';
/* Memories */
export * from './memory';


@ -0,0 +1,48 @@
import type { Types, Document } from 'mongoose';
// Base memory interfaces
export interface IMemoryEntry extends Document {
userId: Types.ObjectId;
key: string;
value: string;
tokenCount?: number;
updated_at?: Date;
}
export interface IMemoryEntryLean {
_id: Types.ObjectId;
userId: Types.ObjectId;
key: string;
value: string;
tokenCount?: number;
updated_at?: Date;
__v?: number;
}
// Method parameter interfaces
export interface SetMemoryParams {
userId: string | Types.ObjectId;
key: string;
value: string;
tokenCount?: number;
}
export interface DeleteMemoryParams {
userId: string | Types.ObjectId;
key: string;
}
export interface GetFormattedMemoriesParams {
userId: string | Types.ObjectId;
}
// Result interfaces
export interface MemoryResult {
ok: boolean;
}
export interface FormattedMemoriesResult {
withKeys: string;
withoutKeys: string;
totalTokens?: number;
}


@ -12,6 +12,12 @@ export interface IRole extends Document {
[Permissions.USE]?: boolean;
[Permissions.CREATE]?: boolean;
};
[PermissionTypes.MEMORIES]?: {
[Permissions.USE]?: boolean;
[Permissions.CREATE]?: boolean;
[Permissions.UPDATE]?: boolean;
[Permissions.READ]?: boolean;
};
[PermissionTypes.AGENTS]?: {
[Permissions.SHARED_GLOBAL]?: boolean;
[Permissions.USE]?: boolean;


@ -30,6 +30,9 @@ export interface IUser extends Document {
}>;
expiresAt?: Date;
termsAccepted?: boolean;
personalization?: {
memories?: boolean;
};
createdAt?: Date;
updatedAt?: Date;
}