Mirror of https://github.com/danny-avila/LibreChat.git (synced 2025-12-17 00:40:14 +01:00)
🔧 fix: Google Gemma Support & OpenAI Reasoning Instructions (#7196)
* 🔄 chore: Update @langchain/google-vertexai to version 0.2.5 in package.json and package-lock.json
* chore: temp remove agents
* 🔄 chore: Update @langchain/google-genai to version 0.2.5 in package.json and package-lock.json
* 🔄 chore: Update @langchain/community to version 0.3.42 in package.json and package-lock.json
* 🔄 chore: Add license information for @langchain/textsplitters in package-lock.json
* 🔄 chore: Update @langchain/core to version 0.3.51 in package.json and package-lock.json
* 🔄 chore: Update openai dependency to version 4.96.2 in package.json and package-lock.json
* chore: @librechat/agents to v2.4.30
* fix: streaming condition in ModelEndHandler to account for boundModel `disableStreaming` setting (see the sketch below)
* fix: update regex for noSystemModel and refactor message handling in AgentClient
* feat: Google Gemma models
* chore: remove unnecessary empty JSX fragment in PopoverButtons component
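Of the items above, the streaming fix is the one with non-obvious logic: the end-of-run handler should only treat output as streamed when the bound model has not explicitly disabled streaming. The real handler lives in @librechat/agents and is not part of this diff, so the snippet below is only a minimal sketch of that condition; `shouldStream`, `runConfig`, and `boundModel` are illustrative names, not the library's API.

// Minimal sketch (not the actual ModelEndHandler code): honor the bound
// model's disableStreaming flag before treating the run as streamed.
function shouldStream(runConfig, boundModel) {
  const streamingRequested = runConfig?.streaming === true;
  const streamingDisabled = boundModel?.disableStreaming === true;
  return streamingRequested && !streamingDisabled;
}

// A model bound with disableStreaming falls back to non-streaming handling
// even when the request asked for streaming.
console.log(shouldStream({ streaming: true }, { disableStreaming: true }));  // false
console.log(shouldStream({ streaming: true }, { disableStreaming: false })); // true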
parent 5d6d13efe8
commit 37b50736bc
11 changed files with 120 additions and 121 deletions
@@ -140,8 +140,7 @@ class GoogleClient extends BaseClient {
     this.options.attachments?.then((attachments) => this.checkVisionRequest(attachments));
 
     /** @type {boolean} Whether using a "GenerativeAI" Model */
-    this.isGenerativeModel =
-      this.modelOptions.model.includes('gemini') || this.modelOptions.model.includes('learnlm');
+    this.isGenerativeModel = /gemini|learnlm|gemma/.test(this.modelOptions.model);
 
     this.maxContextTokens =
       this.options.maxContextTokens ??
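The hunk above is the Gemma part of the fix: the two `includes` checks only recognized Gemini and LearnLM model names, so Gemma models were never flagged as "GenerativeAI". A quick standalone comparison (the model names below are just examples, not an exhaustive list):

// Old check: string includes for 'gemini' and 'learnlm' only.
const isGenerativeOld = (model) =>
  model.includes('gemini') || model.includes('learnlm');

// New check: a single regex that also matches Gemma model names.
const isGenerativeNew = (model) => /gemini|learnlm|gemma/.test(model);

console.log(isGenerativeOld('gemma-3-27b-it'));   // false
console.log(isGenerativeNew('gemma-3-27b-it'));   // true
console.log(isGenerativeNew('gemini-2.0-flash')); // true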