diff --git a/docs/install/configuration/litellm.md b/docs/install/configuration/litellm.md
index 0cf0cd836e..87a2f67ac2 100644
--- a/docs/install/configuration/litellm.md
+++ b/docs/install/configuration/litellm.md
@@ -6,6 +6,7 @@ weight: -7
 
 # Using LibreChat with LiteLLM Proxy
 Use **[LiteLLM Proxy](https://docs.litellm.ai/docs/simple_proxy)** for:
+
 * Calling 100+ LLMs Huggingface/Bedrock/TogetherAI/etc. in the OpenAI ChatCompletions & Completions format
 * Load balancing - between Multiple Models + Deployments of the same model LiteLLM proxy can handle 1k+ requests/second during load tests
 * Authentication & Spend Tracking Virtual Keys
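
The bullets above describe LiteLLM Proxy's OpenAI-compatible endpoint, which is what lets LibreChat (or any OpenAI client) talk to it. Below is a minimal sketch of that idea, not part of the diff: the base URL, port, model alias, and virtual key are illustrative assumptions, not values taken from the LibreChat or LiteLLM docs.

```python
# Minimal sketch: calling a LiteLLM proxy through the OpenAI-compatible
# Chat Completions API. All concrete values here are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000/v1",  # assumed address of a locally running LiteLLM proxy
    api_key="sk-litellm-virtual-key",     # hypothetical virtual key issued by the proxy
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # model alias routed by the proxy's configured model list
    messages=[{"role": "user", "content": "Hello from LibreChat via LiteLLM!"}],
)
print(response.choices[0].message.content)
```

Because the proxy speaks the same ChatCompletions format, pointing a client at the proxy's URL instead of a provider's API is the only change needed to reach any of the backends the proxy routes to.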