This guide details all configuration options available for OpenHands, helping you customize its behavior and integrate it with other services.
:::note
If you are running in GUI Mode, the settings available in the Settings UI will always take precedence.
:::
## Core Configuration

The core configuration options are defined in the `[core]` section of the `config.toml` file.
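For reference, here is a minimal sketch of a `[core]` section; the values are illustrative overrides of the defaults documented below, not recommendations:

```toml
[core]
# Base path for the agent's workspace (default: "./workspace")
workspace_base = "./my-workspace"

# Hard cap on agent iterations per task (default: 100)
max_iterations = 50

# Stop a task once its spend exceeds this budget; 0.0 (the default) means no limit
max_budget_per_task = 2.5
```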
### API Keys
- `e2b_api_key`
  - Type: `str`
  - Default: `""`

- `modal_api_token_id`
  - Type: `str`
  - Default: `""`

- `modal_api_token_secret`
  - Type: `str`
  - Default: `""`

### Workspace
- `workspace_base`
  - Type: `str`
  - Default: `"./workspace"`

- `cache_dir`
  - Type: `str`
  - Default: `"/tmp/cache"`

### Debugging and Logging
- `debug`
  - Type: `bool`
  - Default: `false`

- `disable_color`
  - Type: `bool`
  - Default: `false`

### Trajectories
- `trajectories_path`
  - Type: `str`
  - Default: `"./trajectories"`

### File Store
- `file_store_path`
  - Type: `str`
  - Default: `"/tmp/file_store"`

- `file_store`
  - Type: `str`
  - Default: `"memory"`

- `file_uploads_allowed_extensions`
  - Type: `list of str`
  - Default: `[".*"]`

- `file_uploads_max_file_size_mb`
  - Type: `int`
  - Default: `0`

- `file_uploads_restrict_file_types`
  - Type: `bool`
  - Default: `false`

### Task Management
- `max_budget_per_task`
  - Type: `float`
  - Default: `0.0`

- `max_iterations`
  - Type: `int`
  - Default: `100`

### Sandbox Configuration
- `workspace_mount_path_in_sandbox`
  - Type: `str`
  - Default: `"/workspace"`

- `workspace_mount_path`
  - Type: `str`
  - Default: `""`

- `workspace_mount_rewrite`
  - Type: `str`
  - Default: `""`

### Miscellaneous
- `run_as_openhands`
  - Type: `bool`
  - Default: `true`

- `runtime`
  - Type: `str`
  - Default: `"eventstream"`

- `default_agent`
  - Type: `str`
  - Default: `"CodeActAgent"`

- `jwt_secret`
  - Type: `str`
  - Default: `uuid.uuid4().hex`

## LLM Configuration

The LLM (Large Language Model) configuration options are defined in the `[llm]` section of the `config.toml` file.
To use these with the docker command, pass in `-e LLM_<option>`. Example: `-e LLM_NUM_RETRIES`.
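As a sketch, assuming the defaults listed below, an `[llm]` section combining model, API, and retry settings might look like (the API key is a placeholder):

```toml
[llm]
model = "claude-3-5-sonnet-20241022"
api_key = "your-api-key"   # placeholder — use your provider's key
num_retries = 8            # how many times to retry failed LLM calls
retry_min_wait = 15        # seconds before the first retry
retry_max_wait = 120       # upper bound on backoff between retries
temperature = 0.0
```

The same options map onto docker environment variables, e.g. `-e LLM_MODEL="claude-3-5-sonnet-20241022"`.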
### AWS Credentials
- `aws_access_key_id`
  - Type: `str`
  - Default: `""`

- `aws_region_name`
  - Type: `str`
  - Default: `""`

- `aws_secret_access_key`
  - Type: `str`
  - Default: `""`

### API Configuration
- `api_key`
  - Type: `str`
  - Default: `None`

- `base_url`
  - Type: `str`
  - Default: `""`

- `api_version`
  - Type: `str`
  - Default: `""`

- `input_cost_per_token`
  - Type: `float`
  - Default: `0.0`

- `output_cost_per_token`
  - Type: `float`
  - Default: `0.0`

### Custom LLM Provider
- `custom_llm_provider`
  - Type: `str`
  - Default: `""`

### Embeddings
- `embedding_base_url`
  - Type: `str`
  - Default: `""`

- `embedding_deployment_name`
  - Type: `str`
  - Default: `""`

- `embedding_model`
  - Type: `str`
  - Default: `"local"`

### Message Handling
- `max_message_chars`
  - Type: `int`
  - Default: `30000`

- `max_input_tokens`
  - Type: `int`
  - Default: `0`

- `max_output_tokens`
  - Type: `int`
  - Default: `0`

### Model Selection
- `model`
  - Type: `str`
  - Default: `"claude-3-5-sonnet-20241022"`

### Retrying
- `num_retries`
  - Type: `int`
  - Default: `8`

- `retry_max_wait`
  - Type: `int`
  - Default: `120`

- `retry_min_wait`
  - Type: `int`
  - Default: `15`

- `retry_multiplier`
  - Type: `float`
  - Default: `2.0`

### Advanced Options
- `drop_params`
  - Type: `bool`
  - Default: `false`

- `caching_prompt`
  - Type: `bool`
  - Default: `true`

- `ollama_base_url`
  - Type: `str`
  - Default: `""`

- `temperature`
  - Type: `float`
  - Default: `0.0`

- `timeout`
  - Type: `int`
  - Default: `0`

- `top_p`
  - Type: `float`
  - Default: `1.0`

- `disable_vision`
  - Type: `bool`
  - Default: `None`

## Agent Configuration

The agent configuration options are defined in the `[agent]` and `[agent.<agent_name>]` sections of the `config.toml` file.
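For example, assuming the `[llm.<name>]` sub-section syntax for custom LLM config groups, a named group can be attached to a specific agent via `llm_config`; the group name `my-group` below is purely illustrative:

```toml
# A custom LLM config group (the name is arbitrary)
[llm.my-group]
model = "gpt-4o"
api_key = "your-api-key"

# Per-agent settings, referencing the group above
[agent.CodeActAgent]
llm_config = "my-group"
memory_enabled = false
```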
### Microagent Configuration
- `micro_agent_name`
  - Type: `str`
  - Default: `""`

### Memory Configuration
- `memory_enabled`
  - Type: `bool`
  - Default: `false`

- `memory_max_threads`
  - Type: `int`
  - Default: `3`

### LLM Configuration
- `llm_config`
  - Type: `str`
  - Default: `'your-llm-config-group'`

### ActionSpace Configuration
- `function_calling`
  - Type: `bool`
  - Default: `true`

- `codeact_enable_browsing`
  - Type: `bool`
  - Default: `false`

- `codeact_enable_llm_editor`
  - Type: `bool`
  - Default: `false`

- `codeact_enable_jupyter`
  - Type: `bool`
  - Default: `false`

### Microagent Usage
- `use_microagents`
  - Type: `bool`
  - Default: `true`

- `disabled_microagents`
  - Type: `list of str`
  - Default: `None`

## Sandbox Configuration

The sandbox configuration options are defined in the `[sandbox]` section of the `config.toml` file.
To use these with the docker command, pass in `-e SANDBOX_<option>`. Example: `-e SANDBOX_TIMEOUT`.
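As an illustration of the options below (the extra-dependency command and environment variable are hypothetical):

```toml
[sandbox]
timeout = 120
base_container_image = "nikolaik/python-nodejs:python3.12-nodejs22"
enable_auto_lint = true
# Hypothetical extra dependency installed into the runtime image
runtime_extra_deps = "pip install numpy"
# Hypothetical environment variable set at runtime startup
runtime_startup_env_vars = { DEBUG = "1" }
```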
### Execution
- `timeout`
  - Type: `int`
  - Default: `120`

- `user_id`
  - Type: `int`
  - Default: `1000`

### Container Image
- `base_container_image`
  - Type: `str`
  - Default: `"nikolaik/python-nodejs:python3.12-nodejs22"`

### Networking
- `use_host_network`
  - Type: `bool`
  - Default: `false`

### Linting and Plugins
- `enable_auto_lint`
  - Type: `bool`
  - Default: `false`

- `initialize_plugins`
  - Type: `bool`
  - Default: `true`

### Dependencies and Environment
- `runtime_extra_deps`
  - Type: `str`
  - Default: `""`

- `runtime_startup_env_vars`
  - Type: `dict`
  - Default: `{}`

### Evaluation
- `browsergym_eval_env`
  - Type: `str`
  - Default: `""`

## Security Configuration

The security configuration options are defined in the `[security]` section of the `config.toml` file.
To use these with the docker command, pass in `-e SECURITY_<option>`. Example: `-e SECURITY_CONFIRMATION_MODE`.
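For instance, confirmation mode can be enabled either in `config.toml` or through the docker flag above:

```toml
[security]
# Require user confirmation before the agent executes actions
confirmation_mode = true
```

or, equivalently, `-e SECURITY_CONFIRMATION_MODE=true`.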
### Confirmation Mode
- `confirmation_mode`
  - Type: `bool`
  - Default: `false`

### Security Analyzer
- `security_analyzer`
  - Type: `str`
  - Default: `""`

:::note
Adjust memory, security, and network-related settings carefully to maintain performance and security. Configuration options may change in future versions of OpenHands, so refer to the official documentation for the most up-to-date information.
:::