
Fix issue #4912: [Bug]: BedrockException: "The number of toolResult blocks at messages.2.content exceeds the number of toolUse blocks of previous turn.". (#4937)

Co-authored-by: Xingyao Wang <xingyao@all-hands.dev>
Co-authored-by: Graham Neubig <neubig@gmail.com>
Co-authored-by: mamoodi <mamoodiha@gmail.com>
OpenHands 1 year ago
parent
commit
207df9dd30

+ 20 - 0
docs/modules/usage/llms/litellm-proxy.md

@@ -0,0 +1,20 @@
+# LiteLLM Proxy
+
+OpenHands supports using the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/quick_start) to access various LLM providers.
+
+## Configuration
+
+To use LiteLLM proxy with OpenHands, you need to:
+
+1. Set up a LiteLLM proxy server (see [LiteLLM documentation](https://docs.litellm.ai/docs/proxy/quick_start))
+2. When running OpenHands, set the following in the OpenHands UI through the Settings:
+  * Enable `Advanced Options`
+  * Set `Custom Model` to the prefix `litellm_proxy/` followed by the model you will use (e.g. `litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0`)
+  * Set `Base URL` to your LiteLLM proxy URL (e.g. `https://your-litellm-proxy.com`)
+  * Set `API Key` to your LiteLLM proxy API key
+
+## Supported Models
+
+The supported models depend on your LiteLLM proxy configuration: OpenHands can use any model that your proxy is configured to handle.
+
+Refer to your LiteLLM proxy configuration for the list of available models and their names.
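
As a point of reference for step 1, the proxy side can be a small `config.yaml` with a `model_list` entry. A minimal sketch, assuming a Bedrock-hosted Claude model (the model ID here is an illustrative placeholder, not a value OpenHands requires):

```yaml
# Hypothetical minimal LiteLLM proxy config.
# With this entry, OpenHands' `Custom Model` setting would be
# `litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0`.
model_list:
  - model_name: anthropic.claude-3-5-sonnet-20241022-v2:0
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
```

Starting the proxy with `litellm --config config.yaml` serves an OpenAI-compatible endpoint; its URL becomes the `Base URL` in the OpenHands settings. See the LiteLLM quick start linked above for the full configuration schema.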

+ 1 - 0
docs/modules/usage/llms/llms.md

@@ -63,6 +63,7 @@ We have a few guides for running OpenHands with specific model providers:
 - [Azure](llms/azure-llms)
 - [Google](llms/google-llms)
 - [Groq](llms/groq)
+- [LiteLLM Proxy](llms/litellm-proxy)
 - [OpenAI](llms/openai-llms)
 - [OpenRouter](llms/openrouter)
 

+ 5 - 0
docs/sidebars.ts

@@ -76,6 +76,11 @@ const sidebars: SidebarsConfig = {
                   label: 'Groq',
                   id: 'usage/llms/groq',
                 },
+                {
+                  type: 'doc',
+                  label: 'LiteLLM Proxy',
+                  id: 'usage/llms/litellm-proxy',
+                },
                 {
                   type: 'doc',
                   label: 'OpenAI',