View source

Correct workspace directory config variable in README (#1282)

Jim Su 1 year ago
parent
commit
3193014ed8
3 changed files with 52 additions and 47 deletions
  1. README.md (+26, -23)
  2. docs/README-zh.md (+23, -21)
  3. docs/guides/LocalLLMs.md (+3, -3)

README.md (+26, -23)

@@ -1,6 +1,7 @@
 [English](README.md) | [中文](docs/README-zh.md)
 
 <a name="readme-top"></a>
+
 <!--
 *** Thanks for checking out the Best-README-Template. If you have a suggestion
 *** that would make this better, please fork the repo and create a pull request
@@ -9,8 +10,6 @@
 *** Thanks again! Now go create something AMAZING! :D
 -->
 
-
-
 <!-- PROJECT SHIELDS -->
 <!--
 *** I'm using markdown "reference style" links for readability.
@@ -37,9 +36,6 @@
   <h1 align="center">OpenDevin: Code Less, Make More</h1>
 </div>
 
-
-
-
 <!-- TABLE OF CONTENTS -->
 <details>
   <summary>🗂️ Table of Contents</summary>
@@ -69,7 +65,6 @@
 
 [Project Demo Video](https://github.com/OpenDevin/OpenDevin/assets/38853559/71a472cc-df34-430c-8b1d-4d7286c807c9)
 
-
 Welcome to OpenDevin, an open-source project aiming to replicate Devin, an autonomous AI software engineer who is capable of executing complex engineering tasks and collaborating actively with users on software development projects. This project aspires to replicate, enhance, and innovate upon Devin through the power of the open-source community.
 
 <p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -79,6 +74,7 @@ Welcome to OpenDevin, an open-source project aiming to replicate Devin, an auton
 </p>
 
 ## 🤔 What is Devin?
+
 Devin represents a cutting-edge autonomous agent designed to navigate the complexities of software engineering. It leverages a combination of tools such as a shell, code editor, and web browser, showcasing the untapped potential of LLMs in software development. Our goal is to explore and expand upon Devin's capabilities, identifying both its strengths and areas for improvement, to guide the progress of open code models.
 
 <p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -88,6 +84,7 @@ Devin represents a cutting-edge autonomous agent designed to navigate the comple
 </p>
 
 ## 🐚 Why OpenDevin?
+
 The OpenDevin project is born out of a desire to replicate, enhance, and innovate beyond the original Devin model. By engaging the open-source community, we aim to tackle the challenges faced by Code LLMs in practical scenarios, producing works that significantly contribute to the community and pave the way for future advancements.
 
 <p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -114,31 +111,35 @@ After completing the MVP, the team will focus on research in various areas, incl
 </p>
 
 ## ⚠️ Caveats and Warnings
-* OpenDevin is still an alpha project. It is changing very quickly and is unstable. We are working on getting a stable release out in the coming weeks.
-* OpenDevin will issue many prompts to the LLM you configure. Most of these LLMs cost money--be sure to set spending limits and monitor usage.
-* OpenDevin runs `bash` commands within a Docker sandbox, so it should not affect your machine. But your workspace directory will be attached to that sandbox, and files in the directory may be modified or deleted.
-* Our default Agent is currently the MonologueAgent, which has limited capabilities, but is fairly stable. We're working on other Agent implementations, including [SWE Agent](https://swe-agent.com/). You can [read about our current set of agents here](./docs/Agents.md).
+
+- OpenDevin is still an alpha project. It is changing very quickly and is unstable. We are working on getting a stable release out in the coming weeks.
+- OpenDevin will issue many prompts to the LLM you configure. Most of these LLMs cost money--be sure to set spending limits and monitor usage.
+- OpenDevin runs `bash` commands within a Docker sandbox, so it should not affect your machine. But your workspace directory will be attached to that sandbox, and files in the directory may be modified or deleted.
+- Our default Agent is currently the MonologueAgent, which has limited capabilities, but is fairly stable. We're working on other Agent implementations, including [SWE Agent](https://swe-agent.com/). You can [read about our current set of agents here](./docs/Agents.md).
 
 ## 🚀 Get Started
+
 The easiest way to run OpenDevin is inside a Docker container.
 
 To start the app, run these commands, replacing `$(pwd)/workspace` with the path to the code you want OpenDevin to work with.
+
 ```bash
 # Your OpenAI API key, or any other LLM API key
 export LLM_API_KEY="sk-..."
 
 # The directory you want OpenDevin to modify. MUST be an absolute path!
-export WORKSPACE_DIR=$(pwd)/workspace
+export WORKSPACE_BASE=$(pwd)/workspace
 
 docker run \
     -e LLM_API_KEY \
-    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_DIR \
-    -v $WORKSPACE_DIR:/opt/workspace_base \
+    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
+    -v $WORKSPACE_BASE:/opt/workspace_base \
     -v /var/run/docker.sock:/var/run/docker.sock \
     -p 3000:3000 \
     --add-host host.docker.internal=host-gateway \
     ghcr.io/opendevin/opendevin:0.3.1
 ```
+
 You'll find opendevin running at `http://localhost:3000`.
 
 If you want to use the (unstable!) bleeding edge, you can use `ghcr.io/opendevin/opendevin:main` as the image.
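The commands above stress that `WORKSPACE_BASE` must be an absolute path. A minimal sketch of a pre-flight check (the helper `is_absolute_path` is ours for illustration, not part of OpenDevin):

```shell
# Hypothetical helper (not part of OpenDevin): succeeds only when the given
# path is absolute, i.e. begins with "/".
is_absolute_path() {
  case "$1" in
    /*) return 0 ;;  # absolute: starts with a slash
    *)  return 1 ;;  # relative or empty: reject
  esac
}

# Example guard before launching the container:
# is_absolute_path "$WORKSPACE_BASE" || echo "WORKSPACE_BASE must be absolute" >&2
```

Using `$(pwd)/workspace`, as the README does, always produces an absolute path, so it passes this check.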
@@ -148,6 +149,7 @@ See [Development.md](Development.md) for instructions on running OpenDevin witho
 Having trouble? Check out our [Troubleshooting Guide](./docs/guides/Troubleshooting.md).
 
 ## 🤖 LLM Backends
+
 OpenDevin can work with any LLM backend.
 For a full list of the LM providers and models available, please consult the
 [litellm documentation](https://docs.litellm.ai/docs/providers).
@@ -157,15 +159,17 @@ But when using the OpenDevin UI, you'll need to choose your model in the setting
 wheel on the bottom left).
 
 The following environment variables might be necessary for some LLMs:
-* `LLM_API_KEY`
-* `LLM_BASE_URL`
-* `LLM_EMBEDDING_MODEL`
-* `LLM_EMBEDDING_DEPLOYMENT_NAME`
-* `LLM_API_VERSION`
+
+- `LLM_API_KEY`
+- `LLM_BASE_URL`
+- `LLM_EMBEDDING_MODEL`
+- `LLM_EMBEDDING_DEPLOYMENT_NAME`
+- `LLM_API_VERSION`
 
 We have a few guides for running OpenDevin with specific model providers:
-* [ollama](./docs/guides/LocalLLMs.md)
-* [Azure](./docs/guides/AzureLLMs.md)
+
+- [ollama](./docs/guides/LocalLLMs.md)
+- [Azure](./docs/guides/AzureLLMs.md)
 
 If you're using another provider, we encourage you to open a PR to share your setup!
 
@@ -177,7 +181,6 @@ poor responses, or errors about malformed JSON. OpenDevin
 can only be as powerful as the models driving it--fortunately folks on our team
 are actively working on building better open source models!
 
-
 **Note on API retries and rate limits:**
 Some LLMs have rate limits and may require retries. OpenDevin will automatically retry requests if it receives a 429 error or API connection error.
 You can set LLM_NUM_RETRIES, LLM_RETRY_MIN_WAIT, LLM_RETRY_MAX_WAIT environment variables to control the number of retries and the time between retries.
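The retry variables named above are set like any other environment variable before starting OpenDevin; the numbers below are illustrative only, not recommended defaults:

```shell
# Illustrative values -- tune them to your provider's rate limits.
export LLM_NUM_RETRIES=5      # retry a failed request up to 5 times
export LLM_RETRY_MIN_WAIT=3   # wait at least 3 seconds between retries
export LLM_RETRY_MAX_WAIT=60  # wait at most 60 seconds between retries
```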
@@ -218,8 +221,8 @@ For details, please check [this document](./CONTRIBUTING.md).
 
 Now we have both Slack workspace for the collaboration on building OpenDevin and Discord server for discussion about anything related, e.g., this project, LLM, agent, etc.
 
-* [Slack workspace](https://join.slack.com/t/opendevin/shared_invite/zt-2etftj1dd-X1fDL2PYIVpsmJZkqEYANw)
-* [Discord server](https://discord.gg/mBuDGRzzES)
+- [Slack workspace](https://join.slack.com/t/opendevin/shared_invite/zt-2etftj1dd-X1fDL2PYIVpsmJZkqEYANw)
+- [Discord server](https://discord.gg/mBuDGRzzES)
 
 If you would love to contribute, feel free to join our community (note that now there is no need to fill in the [form](https://forms.gle/758d5p6Ve8r2nxxq6)). Let's simplify software engineering together!
 

docs/README-zh.md (+23, -21)

@@ -3,6 +3,7 @@
 [English](../README.md) | [中文](README-zh.md)
 
 <a name="readme-top"></a>
+
 <!--
 *** Thanks for checking out the Best-README-Template. If you have a suggestion
 *** that would make this better, please fork the repo and create a pull request
@@ -11,8 +12,6 @@
 *** Thanks again! Now go create something AMAZING! :D
 -->
 
-
-
 <!-- PROJECT SHIELDS -->
 <!--
 *** I'm using markdown "reference style" links for readability.
@@ -39,9 +38,6 @@
   <h1 align="center">OpenDevin:少写代码,多创作</h1>
 </div>
 
-
-
-
 <!-- TABLE OF CONTENTS -->
 <details>
   <summary>🗂️ Table of Contents</summary>
@@ -71,7 +67,6 @@
 
 [Project Demo Video](https://github.com/OpenDevin/OpenDevin/assets/38853559/71a472cc-df34-430c-8b1d-4d7286c807c9)
 
-
 欢迎来到 OpenDevin,一个开源项目,旨在复制 Devin,一款自主的 AI 软件工程师,能够执行复杂的工程任务,并与用户积极合作,共同进行软件开发项目。该项目立志通过开源社区的力量复制、增强和创新 Devin。
 
 <p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -81,6 +76,7 @@
 </p>
 
 ## 🤔 Devin 是什么?
+
 Devin 代表着一种尖端的自主代理程序,旨在应对软件工程的复杂性。它利用诸如 shell、代码编辑器和 Web 浏览器等工具的组合,展示了在软件开发中利用 LLMs(大型语言模型)的未开发潜力。我们的目标是探索和拓展 Devin 的能力,找出其优势和改进空间,以指导开源代码模型的进展。
 
 <p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -90,6 +86,7 @@ Devin 代表着一种尖端的自主代理程序,旨在应对软件工程的
 </p>
 
 ## 🐚 为什么选择 OpenDevin?
+
 OpenDevin 项目源于对复制、增强和超越原始 Devin 模型的愿望。通过与开源社区的互动,我们旨在解决 Code LLMs 在实际场景中面临的挑战,创作出对社区有重大贡献并为未来进步铺平道路的作品。
 
 <p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -116,10 +113,11 @@ OpenDevin 目前仍在进行中,但您已经可以运行 alpha 版本来查看
 </p>
 
 ## ⚠️ 注意事项和警告
-* OpenDevin 仍然是一个 alpha 项目。它变化很快且不稳定。我们正在努力在未来几周发布稳定版本。
-* OpenDevin 会向您配置的 LLM 发出许多提示。大多数 LLM 都需要花费金钱,请务必设置花费限制并监控使用情况。
-* OpenDevin 在 Docker 沙箱中运行 `bash` 命令,因此不应影响您的计算机。但您的工作区目录将附加到该沙箱,并且目录中的文件可能会被修改或删除。
-* 我们默认的代理目前是 MonologueAgent,具有有限的功能,但相当稳定。我们正在开发其他代理实现,包括 [SWE 代理](https://swe-agent.com/)。您可以[在这里阅读我们当前的代理集合](./docs/documentation/Agents.md)。
+
+- OpenDevin 仍然是一个 alpha 项目。它变化很快且不稳定。我们正在努力在未来几周发布稳定版本。
+- OpenDevin 会向您配置的 LLM 发出许多提示。大多数 LLM 都需要花费金钱,请务必设置花费限制并监控使用情况。
+- OpenDevin 在 Docker 沙箱中运行 `bash` 命令,因此不应影响您的计算机。但您的工作区目录将附加到该沙箱,并且目录中的文件可能会被修改或删除。
+- 我们默认的代理目前是 MonologueAgent,具有有限的功能,但相当稳定。我们正在开发其他代理实现,包括 [SWE 代理](https://swe-agent.com/)。您可以[在这里阅读我们当前的代理集合](./docs/documentation/Agents.md)。
 
 ## 🚀 开始
 
@@ -127,21 +125,23 @@ OpenDevin 目前仍在进行中,但您已经可以运行 alpha 版本来查看
 
 运行 OpenDevin 最简单的方法是在 Docker 容器中。
 您可以运行:
+
 ```bash
 # 您的 OpenAI API 密钥,或任何其他 LLM API 密钥
 export LLM_API_KEY="sk-..."
 
 # 您想要 OpenDevin 修改的目录。必须是绝对路径!
-export WORKSPACE_DIR=$(pwd)/workspace
+export WORKSPACE_BASE=$(pwd)/workspace
 
 docker run \
     -e LLM_API_KEY \
-    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_DIR \
-    -v $WORKSPACE_DIR:/opt/workspace_base \
+    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
+    -v $WORKSPACE_BASE:/opt/workspace_base \
     -v /var/run/docker.sock:/var/run/docker.sock \
     -p 3000:3000 \
     ghcr.io/opendevin/opendevin:latest
 ```
+
 将 `$(pwd)/workspace` 替换为您希望 OpenDevin 使用的代码路径。
 
 您可以在 `http://localhost:3000` 找到正在运行的 OpenDevin。
@@ -149,6 +149,7 @@ docker run \
 请参阅[Development.md](Development.md)以获取在没有 Docker 的情况下运行 OpenDevin 的说明。
 
 ## 🤖 LLM 后端
+
 OpenDevin 可以与任何 LLM 后端配合使用。
 要获取提供的 LM 提供商和模型的完整列表,请参阅
 [litellm 文档](https://docs.litellm.ai/docs/providers)。
@@ -157,11 +158,12 @@ OpenDevin 可以与任何 LLM 后端配合使用。
 但在 OpenDevin UI 中选择模型将覆盖此设置。
 
 对于某些 LLM,可能需要以下环境变量:
-* `LLM_API_KEY`
-* `LLM_BASE_URL`
-* `LLM_EMBEDDING_MODEL`
-* `LLM_EMBEDDING_DEPLOYMENT_NAME`
-* `LLM_API_VERSION`
+
+- `LLM_API_KEY`
+- `LLM_BASE_URL`
+- `LLM_EMBEDDING_MODEL`
+- `LLM_EMBEDDING_DEPLOYMENT_NAME`
+- `LLM_API_VERSION`
 
 **关于替代模型的说明:**
 某些替代模型可能比其他模型更具挑战性。
@@ -206,12 +208,12 @@ OpenDevin 是一个社区驱动的项目,我们欢迎所有人的贡献。无
 
 现在我们既有 Slack 工作空间用于协作构建 OpenDevin,也有 Discord 服务器用于讨论与项目、LLM、Agent 等相关的任何事情。
 
-* [Slack 工作空间](https://join.slack.com/t/opendevin/shared_invite/zt-2etftj1dd-X1fDL2PYIVpsmJZkqEYANw)
-* [Discord 服务器](https://discord.gg/mBuDGRzzES)
+- [Slack 工作空间](https://join.slack.com/t/opendevin/shared_invite/zt-2etftj1dd-X1fDL2PYIVpsmJZkqEYANw)
+- [Discord 服务器](https://discord.gg/mBuDGRzzES)
 
 如果你愿意贡献,欢迎加入我们的社区(请注意,现在无需填写[表格](https://forms.gle/758d5p6Ve8r2nxxq6))。让我们一起简化软件工程!
 
-🐚 **少写代码,用OpenDevin创造更多。**
+🐚 **少写代码,用 OpenDevin 创造更多。**
 
 [![Star History Chart](https://api.star-history.com/svg?repos=OpenDevin/OpenDevin&type=Date)](https://star-history.com/#OpenDevin/OpenDevin&Date)
 

docs/guides/LocalLLMs.md (+3, -3)

@@ -39,14 +39,14 @@ For example:
 
 ```bash
 # The directory you want OpenDevin to modify. MUST be an absolute path!
-export WORKSPACE_DIR=$(pwd)/workspace
+export WORKSPACE_BASE=$(pwd)/workspace
 
 docker run \
     --add-host host.docker.internal=host-gateway \
     -e LLM_API_KEY="ollama" \
     -e LLM_BASE_URL="http://host.docker.internal:11434" \
-    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_DIR \
-    -v $WORKSPACE_DIR:/opt/workspace_base \
+    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
+    -v $WORKSPACE_BASE:/opt/workspace_base \
     -v /var/run/docker.sock:/var/run/docker.sock \
     -p 3000:3000 \
     ghcr.io/opendevin/opendevin:main
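As a small sketch of what the renamed variable feeds into, a hypothetical helper (ours, not part of the project) that composes the `-v` argument mounting the workspace at `/opt/workspace_base`:

```shell
# Hypothetical helper: build the docker volume argument, mirroring the
# "-v $WORKSPACE_BASE:/opt/workspace_base" line in the command above.
workspace_mount_arg() {
  printf '%s:/opt/workspace_base' "$1"
}

# e.g. docker run -v "$(workspace_mount_arg "$WORKSPACE_BASE")" ...
```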