
Missed adding Groq to sidebar so it shows up (#3885)

mamoodi 1 year ago
parent
commit
834296f7b4

+ 2 - 2
docs/modules/usage/llms/azure-llms.md

@@ -27,8 +27,8 @@ You will need your ChatGPT deployment name which can be found on the deployments
 
 * Enable `Advanced Options`
 * `Custom Model` to azure/<deployment-name>
-* `Base URL` to your Azure API Base URL (Example: https://example-endpoint.openai.azure.com)
-* `API Key`
+* `Base URL` to your Azure API Base URL (Example: `https://example-endpoint.openai.azure.com`)
+* `API Key` to your Azure API key
 
 ## Embeddings
 

+ 1 - 1
docs/modules/usage/llms/google-llms.md

@@ -11,7 +11,7 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
 * `LLM Provider` to `Gemini`
 * `LLM Model` to the model you will be using.
 If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. gemini/<model-name>).
-* `API Key`
+* `API Key` to your Gemini API key
 
 ## VertexAI - Google Cloud Platform Configs
 

+ 11 - 7
docs/modules/usage/llms/groq.md

@@ -1,4 +1,4 @@
-# Running LLMs on Groq
+# Groq
 
 OpenHands uses LiteLLM to make calls to chat models on Groq. You can find their full documentation on using Groq as provider [here](https://docs.litellm.ai/docs/providers/groq).
 
@@ -6,14 +6,18 @@ OpenHands uses LiteLLM to make calls to chat models on Groq. You can find their
 
 When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
 * `LLM Provider` to `Groq`
-* `LLM Model` to the model you will be using
-* `API key` to your Groq API key. To find or create your Groq API Key, [see **here**](https://console.groq.com/keys).
+* `LLM Model` to the model you will be using. [Visit **here** to see the list of
+models that Groq hosts](https://console.groq.com/docs/models). If the model is not in the list, toggle
+`Advanced Options`, and enter it in `Custom Model` (i.e. groq/<model-name>)
+* `API key` to your Groq API key. To find or create your Groq API Key, [see **here**](https://console.groq.com/keys)
+
 
-Visit [here](https://console.groq.com/docs/models) to see the list of models that Groq hosts.
 
 ## Using Groq as an OpenAI-Compatible Endpoint
 
-The Groq endpoint for chat completion is [mostly OpenAI-compatible](https://console.groq.com/docs/openai). Therefore, if you wish, you can access Groq models as you would access any OpenAI-compatible endpoint. You can toggle `Advanced Options` and set the following:
-* `Custom Model` to the prefix `openai/` + the model you will be using, e.g. `openai/llama3-8b-8192`
-* `API Key` to your Groq API key
+The Groq endpoint for chat completion is [mostly OpenAI-compatible](https://console.groq.com/docs/openai). Therefore, you can access Groq models as you
+would access any OpenAI-compatible endpoint. You can set the following in the OpenHands UI through the Settings:
+* Enable `Advanced Options`
+* `Custom Model` to the prefix `openai/` + the model you will be using (Example: `openai/llama3-8b-8192`)
 * `Base URL` to `https://api.groq.com/openai/v1`
+* `API Key` to your Groq API key

+ 6 - 5
docs/modules/usage/llms/openai-llms.md

@@ -9,7 +9,7 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
 * `LLM Model` to the model you will be using.
 [Visit **here** to see a full list of OpenAI models that LiteLLM supports.](https://docs.litellm.ai/docs/providers/openai#openai-chat-completion-models)
 If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. openai/<model-name>).
-* `API Key`. To find or create your OpenAI Project API Key, [see **here**](https://platform.openai.com/api-keys).
+* `API Key` to your OpenAI API key. To find or create your OpenAI Project API Key, [see **here**](https://platform.openai.com/api-keys).
 
 ## Using OpenAI-Compatible Endpoints
 
@@ -17,7 +17,8 @@ Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoi
 
 ## Using an OpenAI Proxy
 
-If you're using an OpenAI proxy, you'll need to toggle `Advanced Options` in the OpenHands Settings, and set the following:
-* `Custom Model` to the model you will be using, e.g. `openai/gpt-4o` or `openai/<proxy_prefix>/<model_name>`
-* `API Key` to your API key.
-* `Base URL` to the URL of your OpenAI proxy.
+If you're using an OpenAI proxy, you'll need to set the following in the OpenHands UI through the Settings:
+* Enable `Advanced Options`
+* `Custom Model` to openai/<model-name> (i.e.: `openai/gpt-4o` or openai/<proxy-prefix>/<model-name>)
+* `Base URL` to the URL of your OpenAI proxy
+* `API Key` to your OpenAI API key

+ 4 - 0
docs/sidebars.ts

@@ -52,6 +52,10 @@ const sidebars: SidebarsConfig = {
       type: 'doc',
       label: 'Google',
       id: 'usage/llms/google-llms',
+    }, {
+      type: 'doc',
+      label: 'Groq',
+      id: 'usage/llms/groq',
     }, {
       type: 'doc',
       label: 'Local/ollama',