OpenHands uses LiteLLM to make calls to Google's chat models. You can find their documentation on using Google as a provider in the LiteLLM docs.
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
- LLM Provider to Gemini
- LLM Model to the model you will be using. If the model is not in the list, toggle Advanced Options and enter it in Custom Model (e.g. gemini/<model-name> like gemini/gemini-1.5-pro)
- API Key to your Gemini API key

To use Vertex AI through Google Cloud Platform when running OpenHands, you'll need to set the following environment variables using -e in the docker run command:
GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>"
VERTEXAI_PROJECT="<your-gcp-project-id>"
VERTEXAI_LOCATION="<your-gcp-location>"
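Putting the three variables together, the docker run invocation might look like the sketch below. The image tag and the path to the service-account JSON file are placeholders; use the image and flags from the OpenHands installation instructions.

```shell
# Sketch only: the <version> tag and the service-account path are placeholders.
# GOOGLE_APPLICATION_CREDENTIALS takes the JSON dump of the service account,
# so the file contents are passed inline via command substitution.
docker run -it \
    -e GOOGLE_APPLICATION_CREDENTIALS="$(cat /path/to/service-account.json)" \
    -e VERTEXAI_PROJECT="my-gcp-project" \
    -e VERTEXAI_LOCATION="us-central1" \
    docker.all-hands.dev/all-hands-ai/openhands:<version>
```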
Then set the following in the OpenHands UI through the Settings:
- LLM Provider to VertexAI
- LLM Model to the model you will be using. If the model is not in the list, toggle Advanced Options and enter it in Custom Model (e.g. vertex_ai/<model-name>)
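Both Custom Model formats above follow the same LiteLLM convention: a provider prefix, a slash, then the model name. A minimal sketch of that convention (the helper function is hypothetical, not part of OpenHands or LiteLLM):

```python
# Hypothetical helper illustrating the Custom Model naming convention that
# LiteLLM uses for Google's providers; not part of OpenHands or LiteLLM.
PROVIDER_PREFIXES = {
    "Gemini": "gemini",       # Google AI Studio
    "VertexAI": "vertex_ai",  # Google Cloud Platform
}

def custom_model_name(provider: str, model: str) -> str:
    """Return the LiteLLM-style model string, e.g. gemini/gemini-1.5-pro."""
    return f"{PROVIDER_PREFIXES[provider]}/{model}"

print(custom_model_name("Gemini", "gemini-1.5-pro"))    # gemini/gemini-1.5-pro
print(custom_model_name("VertexAI", "gemini-1.5-pro"))  # vertex_ai/gemini-1.5-pro
```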