* fix: correct preset title for Anthropic endpoint
* fix(Settings/Anthropic): show correct default value for LLM temperature
* fix(AnthropicClient): use `getModelMaxTokens` to determine the correct max context tokens for the model, set the default temperature to 1, reduce the class constructor to 2 params, and use `getResponseSender` to set the correct sender on the response message (sketch below)
* refactor(/api/ask|edit/anthropic): save messages to the database only after the final response is sent to the client, and no longer save the conversation from the route controller (sketch below)
* fix(initializeClient/anthropic): correctly pass the client options (`endpointOption`) through to class initialization (sketch below)
* feat(ModelService/Anthropic): add `claude-1.2` to the available Anthropic models (sketch below)
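
For the `AnthropicClient` fix, here is a minimal sketch of the constructor and option handling described above. `getModelMaxTokens` and `getResponseSender` are the helpers named in the bullet; the module paths, option shapes, and fallback values are assumptions for illustration only.

```js
// Sketch only: paths, option shapes, and fallbacks are assumptions.
const { getModelMaxTokens } = require('~/utils'); // assumed path
const { getResponseSender } = require('librechat-data-provider'); // assumed path

class AnthropicClient {
  // Two-parameter constructor: the API key and a single options object.
  constructor(apiKey, options = {}) {
    this.apiKey = apiKey;
    this.setOptions(options);
  }

  setOptions(options) {
    this.options = options;
    const modelOptions = options.modelOptions || {};

    this.modelOptions = {
      ...modelOptions,
      model: modelOptions.model || 'claude-1',
      // Default temperature is now 1 when none is provided.
      temperature: modelOptions.temperature ?? 1,
    };

    // Look up the max context tokens for the selected model instead of hard-coding it.
    this.maxContextTokens = getModelMaxTokens(this.modelOptions.model) ?? 100000;

    // Resolve the sender label used on the response message.
    this.sender = getResponseSender({
      endpoint: 'anthropic',
      model: this.modelOptions.model,
      modelLabel: options.modelLabel,
    });
  }
}

module.exports = AnthropicClient;
```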
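For the ask/edit route refactor, this rough sketch shows the reordered flow: the final response is written to the client first, and only then are the messages persisted; the conversation itself is no longer saved here. The handler shape, helper names (`saveMessage`, `initializeClient`, `getReqData`), and payload fields are assumptions.

```js
// Sketch only: helper names and payload shape are assumptions;
// the point is the ordering — respond to the client first, persist after.
const express = require('express');
const { saveMessage } = require('~/models'); // assumed persistence helper
const { initializeClient } = require('./initializeClient'); // assumed client factory

const router = express.Router();

router.post('/', async (req, res) => {
  const { text, conversationId, parentMessageId, ...endpointOption } = req.body;
  const { client } = await initializeClient({ req, res, endpointOption });

  let userMessage;
  const response = await client.sendMessage(text, {
    conversationId,
    parentMessageId,
    // Capture the created user message so it can be saved later.
    getReqData: (data) => {
      userMessage = data.userMessage;
    },
    onProgress: (partialText) => {
      // Intermediate tokens are streamed to the client in the real controller.
    },
  });

  // 1) Send the final response to the client first.
  res.json({ requestMessage: userMessage, responseMessage: response });

  // 2) Then save the messages to the database. Saving the conversation
  //    is no longer done from this route controller.
  await saveMessage(userMessage);
  await saveMessage(response);
});

module.exports = router;
```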
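For the `initializeClient` fix, a small sketch of the intent: `endpointOption` is now forwarded into the `AnthropicClient` constructor instead of being dropped. The credential lookup and module path are assumptions.

```js
// Sketch only: credential lookup and paths are assumptions; the fix is that
// `endpointOption` is actually forwarded to the client constructor.
const AnthropicClient = require('~/app/clients/AnthropicClient'); // assumed path

const initializeClient = async ({ req, res, endpointOption }) => {
  const anthropicApiKey = process.env.ANTHROPIC_API_KEY; // assumed key source

  const client = new AnthropicClient(anthropicApiKey, {
    req,
    res,
    ...endpointOption, // previously not passed through; now included
  });

  return { client, anthropicApiKey };
};

module.exports = initializeClient;
```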
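For the model service change, the addition boils down to one entry in the Anthropic model list; the other entries and the environment override shown here are illustrative assumptions, not necessarily the exact defaults.

```js
// Sketch only: surrounding entries and the env override are illustrative;
// the change is the addition of 'claude-1.2'.
const getAnthropicModels = () => {
  let models = [
    'claude-1',
    'claude-1-100k',
    'claude-instant-1',
    'claude-instant-1-100k',
    'claude-2',
    'claude-1.2', // newly added
  ];

  // An environment override, if configured, still takes precedence (assumed behavior).
  if (process.env.ANTHROPIC_MODELS) {
    models = process.env.ANTHROPIC_MODELS.split(',');
  }

  return models;
};

module.exports = { getAnthropicModels };
```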