OpenHands uses LiteLLM for completion calls, which supports Google's LLMs through both its Gemini and Vertex AI providers.
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:

- LLM Provider to Gemini
- LLM Model to the model you will be using. If the model is not in the list, toggle Advanced Options and enter it in Custom Model (e.g. gemini/<model-name>).
- API Key to your Gemini API key

To use Vertex AI through Google Cloud Platform when running OpenHands, you'll need to set the following environment variables using -e in the docker run command:
```
GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>"
VERTEXAI_PROJECT="<your-gcp-project-id>"
VERTEXAI_LOCATION="<your-gcp-location>"
```
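Put together, the variables above are passed on the command line. A minimal sketch, assuming a service-account file at ./service-account.json and the standard OpenHands image name; the project ID, location, and any other flags from your usual docker run command are placeholders to substitute:

```shell
# Sketch only: image name, file path, project ID, and location are assumptions.
# Other flags from your usual OpenHands docker run command are omitted for brevity.
docker run -it \
    -e GOOGLE_APPLICATION_CREDENTIALS="$(cat ./service-account.json)" \
    -e VERTEXAI_PROJECT="my-gcp-project-id" \
    -e VERTEXAI_LOCATION="us-central1" \
    docker.all-hands.dev/all-hands-ai/openhands:latest
```

Note that GOOGLE_APPLICATION_CREDENTIALS here takes the JSON contents of the service-account key, not a file path, which is why the file is inlined with `$(cat …)`.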
Then set the following in the OpenHands UI through the Settings:
- LLM Provider to VertexAI
- LLM Model to the model you will be using. If the model is not in the list, toggle Advanced Options, and enter it in Custom Model (e.g. vertex_ai/<model-name>).
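In both cases the Custom Model string follows LiteLLM's provider/model convention: the text before the first slash selects the provider, and the rest names the model. A tiny illustration (the helper function is ours, not part of LiteLLM):

```python
def split_provider(model_string):
    """Split a LiteLLM-style model string into (provider, model).

    Illustrative helper, not a LiteLLM API: everything before the first
    '/' is the provider prefix; the remainder is the model name.
    """
    provider, _, model = model_string.partition("/")
    return provider, model

print(split_provider("gemini/gemini-1.5-pro"))     # ('gemini', 'gemini-1.5-pro')
print(split_provider("vertex_ai/gemini-1.5-pro"))  # ('vertex_ai', 'gemini-1.5-pro')
```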