
Add docs for Groq and for OpenAI proxies (#3877)

Engel Nyst, 1 year ago
parent commit f6709d08ef
2 files changed, 26 insertions, 0 deletions
  1. docs/modules/usage/llms/groq.md (+19, -0)
  2. docs/modules/usage/llms/openai-llms.md (+7, -0)

+ 19 - 0
docs/modules/usage/llms/groq.md

@@ -0,0 +1,19 @@
+# Running LLMs on Groq
+
+OpenHands uses LiteLLM to make calls to chat models on Groq. You can find their full documentation on using Groq as a provider [here](https://docs.litellm.ai/docs/providers/groq).
+
+## Configuration
+
+When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
+* `LLM Provider` to `Groq`
+* `LLM Model` to the model you will be using
+* `API Key` to your Groq API key. To find or create your Groq API key, [see **here**](https://console.groq.com/keys).
+
+Visit [here](https://console.groq.com/docs/models) to see the list of models that Groq hosts.
+
+## Using Groq as an OpenAI-Compatible Endpoint
+
+The Groq endpoint for chat completion is [mostly OpenAI-compatible](https://console.groq.com/docs/openai). Therefore, if you wish, you can access Groq models as you would access any OpenAI-compatible endpoint. You can toggle `Advanced Options` and set the following:
+* `Custom Model` to the prefix `openai/` + the model you will be using, e.g. `openai/llama3-8b-8192`
+* `API Key` to your Groq API key
+* `Base URL` to `https://api.groq.com/openai/v1`
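As a sanity check outside OpenHands, the same endpoint can be exercised directly with Python's standard library. This is a minimal sketch, not part of the commit; the API key and model name are placeholders you must substitute:

```python
import json
from urllib import request

# Placeholders -- substitute your own Groq API key and a model that Groq hosts.
GROQ_API_KEY = "YOUR_GROQ_API_KEY"
MODEL = "llama3-8b-8192"

# The OpenAI-compatible chat completions path under Groq's base URL.
url = "https://api.groq.com/openai/v1/chat/completions"
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Say hello"}],
}
req = request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {GROQ_API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# Requires a valid key; uncomment to actually send the request:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Note how the `Base URL` from the settings above (`https://api.groq.com/openai/v1`) is the prefix of the request URL, while the `openai/` prefix on the model name exists only to tell LiteLLM which provider format to use and is not sent to the server.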

+ 7 - 0
docs/modules/usage/llms/openai-llms.md

@@ -14,3 +14,10 @@ If the model is not in the list, toggle `Advanced Options`, and enter it in `Cus
 ## Using OpenAI-Compatible Endpoints
 
 Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoints. You can find their full documentation on this topic [here](https://docs.litellm.ai/docs/providers/openai_compatible).
+
+## Using an OpenAI Proxy
+
+If you're using an OpenAI proxy, you'll need to toggle `Advanced Options` in the OpenHands Settings, and set the following:
+* `Custom Model` to the model you will be using, e.g. `openai/gpt-4o` or `openai/<proxy_prefix>/<model_name>`
+* `API Key` to your API key.
+* `Base URL` to the URL of your OpenAI proxy.
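To illustrate how those three settings fit together, here is a sketch of the request OpenHands would ultimately issue through LiteLLM. The proxy URL, key, and model below are hypothetical placeholders, and the request is built with the standard library rather than LiteLLM itself:

```python
import json
from urllib import request

# Hypothetical placeholders -- substitute your proxy's URL, key, and model.
BASE_URL = "https://my-openai-proxy.example.com/v1"
API_KEY = "YOUR_PROXY_API_KEY"
MODEL = "gpt-4o"  # configured in OpenHands as "openai/gpt-4o"

# The "openai/" prefix only selects the provider format in LiteLLM; the
# request itself goes to <Base URL>/chat/completions with the bare model name.
url = f"{BASE_URL}/chat/completions"
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hi"}],
}
req = request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# with request.urlopen(req) as resp:  # requires a reachable proxy
#     print(json.load(resp)["choices"][0]["message"]["content"])
```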