🔧 Fix: Resolve Anthropic Client Issues 🧠 (#1226)

* fix: correct preset title for Anthropic endpoint

* fix(Settings/Anthropic): show correct default value for LLM temperature

* fix(AnthropicClient): use `getModelMaxTokens` to get the correct max context tokens for the model, set the default temperature to 1, reduce the class constructor to 2 params, and use `getResponseSender` to set the correct sender on the response message (see the first sketch after this list)

* refactor(/api/ask|edit/anthropic): save messages to the database only after the final response is sent to the client, and no longer save the conversation from the route controller (see the second sketch after this list)

* fix(initializeClient/anthropic): correctly pass client options (endpointOption) to class initialization

* feat(ModelService/Anthropic): add claude-1.2
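
A rough sketch of the client-side shape these fixes describe. The helper bodies, option names, and defaults below are illustrative stand-ins under assumed signatures, not LibreChat's actual implementation:

```js
// Stand-ins for the helpers named in the commit; real signatures may differ.
const getModelMaxTokens = (model) =>
  model.startsWith('claude-2.1') ? 200000 : 100000; // simplified stub; see the lookup sketch after the diff below
const getResponseSender = ({ modelLabel } = {}) => modelLabel || 'Anthropic';

class AnthropicClient {
  // The constructor now takes only two params: the API key and an options object
  // (the endpointOption forwarded by initializeClient).
  constructor(apiKey, options = {}) {
    this.apiKey = apiKey;
    const modelOptions = options.modelOptions ?? {};
    this.modelOptions = {
      model: modelOptions.model ?? 'claude-2',
      // Anthropic's default temperature is 1, matching the corrected Settings default.
      temperature: modelOptions.temperature ?? 1,
    };
    // Context window resolved from the shared token map instead of a hard-coded value.
    this.maxContextTokens = getModelMaxTokens(this.modelOptions.model) ?? 100000;
    // Response messages get the correct sender label for the Anthropic endpoint.
    this.sender = getResponseSender({ modelLabel: options.modelLabel });
  }
}
```

For example, `new AnthropicClient(key, { modelOptions: { model: 'claude-2.1' } })` would resolve a 200k context window and a temperature of 1 under these stand-ins.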
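A companion sketch, again only illustrative, of the route-level changes: `initializeClient` receives the `endpointOption`, and the controller finishes the client-facing response before persisting messages, with no conversation save in the controller. The injected helpers and payload shapes are assumptions, not LibreChat's real modules:

```js
// Illustrative controller for /api/ask/anthropic (and /api/edit/anthropic).
// `initializeClient` and `saveMessage` are injected stand-ins for this sketch.
async function askAnthropic(req, res, { initializeClient, saveMessage }) {
  // The fix: initializeClient is given the endpointOption, so the client is
  // constructed with the user's actual model settings.
  const client = await initializeClient({ req, res, endpointOption: req.body.endpointOption });

  const { requestMessage, responseMessage } = await client.sendMessage(req.body.text, {
    conversationId: req.body.conversationId,
    parentMessageId: req.body.parentMessageId,
  });

  // 1) Complete the response to the client first...
  res.write(`data: ${JSON.stringify({ final: true, requestMessage, responseMessage })}\n\n`);
  res.end();

  // 2) ...then persist both messages. No saveConvo(...) call here anymore;
  //    conversation bookkeeping is left to the client class.
  await saveMessage(requestMessage);
  await saveMessage(responseMessage);
}
```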
Danny Avila · 2023-11-26 14:44:57 -05:00 · committed by GitHub
commit d7ef4590ea (parent 4b289640f2)
9 changed files with 73 additions and 45 deletions

@@ -51,6 +51,8 @@ const maxTokensMap = {
   'gpt-3.5-turbo-16k-0613': 15999,
   'gpt-3.5-turbo-1106': 16380, // -5 from max
   'gpt-4-1106': 127995, // -5 from max
+  'claude-2.1': 200000,
+  'claude-': 100000,
 };
 /**
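
The two new entries pair an exact key (`'claude-2.1'`) with a bare `'claude-'` prefix, so Claude models without an explicit entry still resolve to a 100k context window. A hedged sketch of how a prefix-based lookup such as `getModelMaxTokens` can consume this map; the real implementation may differ in details:

```js
// Resolve a model's max context tokens from maxTokensMap: exact key first,
// then the longest matching prefix key (e.g. 'claude-').
function getModelMaxTokens(model, map) {
  if (map[model] !== undefined) {
    return map[model]; // e.g. 'claude-2.1' -> 200000
  }
  const prefix = Object.keys(map)
    .filter((key) => model.startsWith(key))
    .sort((a, b) => b.length - a.length)[0];
  return prefix !== undefined ? map[prefix] : undefined;
}

// Against the entries above:
//   getModelMaxTokens('claude-2.1', maxTokensMap)         -> 200000
//   getModelMaxTokens('claude-instant-1.2', maxTokensMap) -> 100000 (via 'claude-')
//   getModelMaxTokens('gpt-4-1106', maxTokensMap)         -> 127995
```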