* feat: Add thinking and thinkingBudget parameters for Bedrock Anthropic models (a sketch follows this list)
* chore: Update @librechat/agents to version 2.1.8
* refactor: change region order in params
* refactor: Add maxTokens parameter to conversation preset schema
* refactor: Update agent client to use bedrockInputSchema and improve error handling for model parameters
* refactor: streamline and optimize llmConfig initialization and saving for Bedrock
* fix: ensure config titleModel is used for all endpoints
* refactor: enhance OpenAIClient and agent initialization to support endpoint checks for OpenRouter
* chore: bump @google/generative-ai
* feat: Refactor ModelEndHandler to collect usage metadata only if it exists
* feat: Google tool end handling, custom Anthropic class for better token UX
* refactor: differentiate between client and request options
* feat: initial support for Google agents
* feat: only cache messages with non-empty text
* feat: Cache non-empty messages in chatV2 controller (a sketch follows this list)
* fix: Anthropic LLM client options in llmConfig
* refactor: streamline client options handling in LLM configuration
* fix: VertexAI Agent Auth & Tool Handling
* fix: additional fields for llmConfig; customHeaders, however, are not supported by LangChain and require a PR
* feat: set default location for VertexAI LLM configuration
* fix: outdated OpenAI Client options for getLLMConfig
* chore: agent provider options typing
* chore: add note about currently unsupported customHeaders in the LangChain GenAI client
* fix: skip transaction creation when rawAmount is NaN
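A minimal sketch of the NaN guard from the last item, assuming a hypothetical `createTransactionIfValid` helper and transaction shape rather than the project's actual spend/transaction code:

```typescript
// Hypothetical shape of a token-spend transaction; field names are assumptions.
interface TxData {
  user: string;
  rawAmount: number;
  tokenType: 'prompt' | 'completion';
}

// Guard described in the commit: skip transaction creation when rawAmount is NaN,
// rather than persisting a record with an invalid amount.
async function createTransactionIfValid(
  txData: TxData,
  save: (tx: TxData) => Promise<void>,
): Promise<void> {
  if (Number.isNaN(txData.rawAmount)) {
    // A NaN amount would corrupt balance math, so no record is written.
    return;
  }
  await save(txData);
}
```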
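For the `thinking` / `thinkingBudget` item at the top of the list, a hedged sketch of how such options might be mapped onto Anthropic's extended-thinking request shape for a Bedrock model; the `buildLlmConfig` helper and the pass-through via `additionalModelRequestFields` are illustrative assumptions, not the project's confirmed wiring:

```typescript
// Illustrative only: mapping thinking / thinkingBudget onto a Bedrock Anthropic config.
interface BedrockOptions {
  model: string;
  maxTokens?: number;
  thinking?: boolean;
  thinkingBudget?: number;
}

function buildLlmConfig(options: BedrockOptions): Record<string, unknown> {
  const { model, maxTokens, thinking, thinkingBudget } = options;
  const llmConfig: Record<string, unknown> = { model, maxTokens };

  if (thinking) {
    // Anthropic's extended thinking expects a `thinking` block with a token budget;
    // the exact pass-through field for a given Bedrock client may differ.
    llmConfig.additionalModelRequestFields = {
      thinking: { type: 'enabled', budget_tokens: thinkingBudget ?? 2000 },
    };
  }
  return llmConfig;
}
```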
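For the two message-caching items, a small sketch of the non-empty-text guard; the `MessageCache` interface and `cacheIfNonEmpty` helper are hypothetical stand-ins for the chatV2 controller's actual cache:

```typescript
interface CachedMessage {
  messageId: string;
  text?: string;
}

// Hypothetical cache interface standing in for the controller's real cache.
interface MessageCache {
  set(key: string, value: CachedMessage): void;
}

function cacheIfNonEmpty(cache: MessageCache, message: CachedMessage): void {
  // Only cache messages whose text is non-empty; blank or whitespace-only
  // responses are skipped so they are never served back from the cache.
  if (!message.text || message.text.trim() === '') {
    return;
  }
  cache.set(message.messageId, message);
}
```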