📥 feat: Import Conversations from LibreChat, ChatGPT, Chatbot UI (#2355)

* Basic implementation of ChatGPT conversation import

* remove debug code

* Handle citations

* Fix updatedAt in import

* update default model

* Use job scheduler to handle import requests

* import job status endpoint

* Add wrapper around Agenda
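
A minimal sketch of scheduling an import through Agenda, for context; the job name, payload shape, and `scheduleImport` helper are illustrative, not the identifiers used in this PR:

```js
// Hypothetical Agenda wrapper for import jobs (names are illustrative).
const Agenda = require('agenda');

const agenda = new Agenda({ db: { address: process.env.MONGO_URI, collection: 'jobs' } });

// Register the job handler; the real PR defines its own job name and payload.
agenda.define('import conversations', async (job) => {
  const { filepath, userId } = job.attrs.data;
  // ...parse the uploaded file and bulk-save conversations/messages here
});

// Enqueue an import request and return the job id so the client can poll its status.
async function scheduleImport(filepath, userId) {
  await agenda.start();
  const job = await agenda.now('import conversations', { filepath, userId });
  return job.attrs._id.toString();
}

module.exports = { scheduleImport };
```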

* Rate limits for import endpoint
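
A hedged sketch of a per-route rate limit on the import endpoint using express-rate-limit; the window, max, and route path are assumptions, not the PR's configured values:

```js
const rateLimit = require('express-rate-limit');

// Illustrative limits; the actual window/max used by the PR are not shown here.
const importLimiter = rateLimit({
  windowMs: 60 * 1000, // 1-minute window
  max: 5, // at most 5 import requests per window per client
  message: { message: 'Too many import requests, please try again later' },
});

// Applied only to the import route:
// router.post('/import', importLimiter, upload.single('file'), importController);
```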

* rename import api path

* Batch save import to mongo

* Improve naming

* Add documenting comments

* Test for importers

* Change button for importing conversations

* Frontend changes

* Import job status endpoint

* Import endpoint response

* Add translations to new phrases

* Fix conversations refreshing

* cleanup unused functions

* set timeout for import job status polling
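
An illustrative sketch of status polling with a timeout; the endpoint path, interval, and timeout values are assumptions rather than the frontend's actual code:

```js
// Poll the import job status endpoint until it finishes or a deadline passes.
async function pollImportStatus(jobId, { intervalMs = 2000, timeoutMs = 60000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const res = await fetch(`/api/convos/import/jobs/${jobId}`); // path is illustrative
    const { status } = await res.json();
    if (status === 'completed' || status === 'failed') {
      return status;
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out while polling the import job status');
}
```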

* Add documentation

* get extra spaces back

* Improve error message

* Fix translation files after merge

* fix translation files 2

* Add zh translation for import functionality

* Sync Meilisearch index after import

* chore: add dummy uri for jest tests, as MONGO_URI should only be real for E2E tests

* docs: fix links

* docs: fix conversationsImport section

* fix: user role issue for librechat imports

* refactor: import conversations from json
- organize imports
- add additional jsdocs
- use multer with diskStorage to avoid loading the file into memory outside of the job (see the multer sketch after this list)
- use filepath instead of loading data string for imports
- replace console logs and some logger.info() with logger.debug
- only use multer for import route
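
A hedged sketch of the multer setup described above; the upload field name, destination directory, and route path are assumptions:

```js
const multer = require('multer');
const os = require('os');

// Write the uploaded file to disk instead of buffering it in memory.
const storage = multer.diskStorage({
  destination: (req, file, cb) => cb(null, os.tmpdir()),
  filename: (req, file, cb) => cb(null, `${Date.now()}-${file.originalname}`),
});
const upload = multer({ storage });

// Only the import route uses multer; the controller hands req.file.path to the job
// rather than reading the file contents into memory up front.
// router.post('/import', upload.single('file'), importController);
```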

* fix: undefined metadata edge case and replace ChatGtp -> ChatGpt

* Refactor importChatGptConvo function to handle undefined metadata edge case and replace ChatGtp with ChatGpt

* fix: chatgpt importer

* feat: maintain tree relationship for librechat messages
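
A minimal sketch of the idea behind keeping the message tree intact: give each imported message a new id and rewrite parentMessageId references through the same mapping. The function and field handling are illustrative, not the PR's exact code:

```js
const { v4: uuidv4 } = require('uuid');

// Map original messageIds to fresh ids, then remap parentMessageId so the
// parent/child relationships survive the import unchanged.
function remapMessageTree(messages) {
  const idMap = new Map();
  for (const msg of messages) {
    idMap.set(msg.messageId, uuidv4());
  }
  return messages.map((msg) => ({
    ...msg,
    messageId: idMap.get(msg.messageId),
    parentMessageId: idMap.get(msg.parentMessageId) ?? null,
  }));
}
```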

* chore: use enum

* refactor: saveMessage to use single object arg, replace console logs, add userId to log message

* chore: additional comment

* chore: multer edge case

* feat: first pass, maintain tree relationship

* chore: organize

* chore: remove log

* ci: add hierarchy test for chatgpt

* ci: test maintaining of hierarchy for librechat

* wip: allow non-text content type messages

* refactor: import content part object json string

* refactor: more content types to format

* chore: consolidate messageText formatting
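
Roughly, the consolidation collapses an array of content parts into one text string; this sketch assumes a shape for the parts and is not the PR's formatter:

```js
// Collapse message content parts into a single string; non-text parts are
// serialized as JSON so their data is not dropped.
function formatContentParts(content) {
  if (typeof content === 'string') {
    return content;
  }
  return content
    .map((part) => {
      if (part.type === 'text') {
        return typeof part.text === 'string' ? part.text : (part.text?.value ?? '');
      }
      return JSON.stringify(part);
    })
    .join('\n');
}
```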

* docs: update on changes, bump data-provider/config versions, update readme

* refactor(indexSync): singleton pattern for MeiliSearchClient
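
The singleton pattern named here amounts to constructing the MeiliSearch client once and reusing it across syncs; a sketch, with env variable names as assumptions:

```js
const { MeiliSearch } = require('meilisearch');

let client;

// Lazily create a single shared client instead of a new one per indexSync call.
function getMeiliSearchClient() {
  if (!client) {
    client = new MeiliSearch({
      host: process.env.MEILI_HOST,
      apiKey: process.env.MEILI_MASTER_KEY,
    });
  }
  return client;
}
```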

* refactor: debug log after batch is done

* chore: add back indexSync error handling

---------

Co-authored-by: jakubmieszczak <jakub.mieszczak@zendesk.com>
Co-authored-by: Danny Avila <danny@librechat.ai>
Denis Palnitsky 2024-05-02 08:48:26 +02:00 committed by GitHub
parent 3b44741cf9
commit ab6fbe48f1
64 changed files with 3795 additions and 98 deletions


@@ -30,6 +30,24 @@ module.exports = {
      return { message: 'Error saving conversation' };
    }
  },
  bulkSaveConvos: async (conversations) => {
    try {
      const bulkOps = conversations.map((convo) => ({
        updateOne: {
          filter: { conversationId: convo.conversationId, user: convo.user },
          update: convo,
          upsert: true,
          timestamps: false,
        },
      }));
      const result = await Conversation.bulkWrite(bulkOps);
      return result;
    } catch (error) {
      logger.error('[saveBulkConversations] Error saving conversations in bulk', error);
      throw new Error('Failed to save conversations in bulk.');
    }
  },
  getConvosByPage: async (user, pageNumber = 1, pageSize = 25) => {
    try {
      const totalConvos = (await Conversation.countDocuments({ user })) || 1;


@@ -74,6 +74,25 @@ module.exports = {
      throw new Error('Failed to save message.');
    }
  },
  async bulkSaveMessages(messages) {
    try {
      const bulkOps = messages.map((message) => ({
        updateOne: {
          filter: { messageId: message.messageId },
          update: message,
          upsert: true,
        },
      }));
      const result = await Message.bulkWrite(bulkOps);
      return result;
    } catch (err) {
      logger.error('Error saving messages in bulk:', err);
      throw new Error('Failed to save messages in bulk.');
    }
  },
  /**
   * Records a message in the database.
   *