
Merge pull request #378 from hellofinch/main

feat(translator): add new translator and fix bug.
Byaidu 1 year ago
parent commit ba7b3f2a4f

+ 3 - 0
docs/ADVANCED.md

@@ -66,6 +66,9 @@ We've provided a detailed table on the required [environment variables](https://
 | **Dify**             | `dify`         | `DIFY_API_URL`, `DIFY_API_KEY`                                        | `[Your DIFY URL]`, `[Your Key]`                          | See [Dify](https://github.com/langgenius/dify). Three variables, `lang_out`, `lang_in`, and `text`, need to be defined in Dify's workflow input. |
 | **AnythingLLM**      | `anythingllm`  | `AnythingLLM_URL`, `AnythingLLM_APIKEY`                               | `[Your AnythingLLM URL]`, `[Your Key]`                   | See [anything-llm](https://github.com/Mintplex-Labs/anything-llm)                                                                                                                                         |
 |**Argos Translate**|`argos`| | |See [argos-translate](https://github.com/argosopentech/argos-translate)|
+|**Grok**|`grok`| `GORK_API_KEY`, `GORK_MODEL` | `[Your GORK_API_KEY]`, `grok-2-1212` |See [Grok](https://docs.x.ai/docs/overview)|
+|**DeepSeek**|`deepseek`| `DEEPSEEK_API_KEY`, `DEEPSEEK_MODEL` | `[Your DEEPSEEK_API_KEY]`, `deepseek-chat` |See [DeepSeek](https://www.deepseek.com/)|
+|**OpenAI-like**|`openailiked`| `OPENAILIKED_BASE_URL`, `OPENAILIKED_API_KEY`, `OPENAILIKED_MODEL` | `url`, `[Your Key]`, `model name` | None |
 
 
 For large language models that are compatible with the OpenAI API but not listed in the table above, you can set environment variables using the same method outlined for OpenAI in the table.
 

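As a concrete illustration of the new table rows (the URL, key, and model below are placeholders, not values from this PR), the `openailiked` variables can be supplied through the process environment before invoking pdf2zh:

```python
import os

# Placeholder values for illustration only -- substitute a real
# endpoint, key, and model for your OpenAI-compatible backend.
envs = {
    "OPENAILIKED_BASE_URL": "http://localhost:8000/v1",
    "OPENAILIKED_API_KEY": "sk-placeholder",
    "OPENAILIKED_MODEL": "my-model",
}

# Exporting them makes the same configuration visible to the CLI,
# e.g. `pdf2zh example.pdf -s openailiked`.
os.environ.update(envs)
```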
+ 3 - 0
docs/README_ja-JP.md

@@ -224,6 +224,9 @@ pdf2zh example.pdf -li en -lo ja
 |**Dify**|`dify`|`DIFY_API_URL`, `DIFY_API_KEY`|`[Your DIFY URL]`, `[Your Key]`|See [Dify](https://github.com/langgenius/dify). Three variables, `lang_out`, `lang_in`, and `text`, need to be defined in Dify's workflow input.|
 |**AnythingLLM**|`anythingllm`|`AnythingLLM_URL`, `AnythingLLM_APIKEY`|`[Your AnythingLLM URL]`, `[Your Key]`|See [anything-llm](https://github.com/Mintplex-Labs/anything-llm)|
 |**Argos Translate**|`argos`| | |See [argos-translate](https://github.com/argosopentech/argos-translate)|
+|**Grok**|`grok`| `GORK_API_KEY`, `GORK_MODEL` | `[Your GORK_API_KEY]`, `grok-2-1212` |See [Grok](https://docs.x.ai/docs/overview)|
+|**DeepSeek**|`deepseek`| `DEEPSEEK_API_KEY`, `DEEPSEEK_MODEL` | `[Your DEEPSEEK_API_KEY]`, `deepseek-chat` |See [DeepSeek](https://www.deepseek.com/)|
+|**OpenAI-like**|`openailiked`| `OPENAILIKED_BASE_URL`, `OPENAILIKED_API_KEY`, `OPENAILIKED_MODEL` | `url`, `[Your Key]`, `model name` | None |
 
 
 (needs Japanese translation)
 For large language models that are compatible with the OpenAI API but not listed in the table above, you can set environment variables using the same method outlined for OpenAI in the table.

+ 3 - 0
docs/README_zh-CN.md

@@ -226,6 +226,9 @@ pdf2zh example.pdf -li en -lo ja
 |**Dify**|`dify`|`DIFY_API_URL`, `DIFY_API_KEY`|`[Your DIFY URL]`, `[Your Key]`|See [Dify](https://github.com/langgenius/dify). Three variables, `lang_out`, `lang_in`, and `text`, need to be defined in Dify's workflow input.|
 |**AnythingLLM**|`anythingllm`|`AnythingLLM_URL`, `AnythingLLM_APIKEY`|`[Your AnythingLLM URL]`, `[Your Key]`|See [anything-llm](https://github.com/Mintplex-Labs/anything-llm)|
 |**Argos Translate**|`argos`| | |See [argos-translate](https://github.com/argosopentech/argos-translate)|
+|**Grok**|`grok`| `GORK_API_KEY`, `GORK_MODEL` | `[Your GORK_API_KEY]`, `grok-2-1212` |See [Grok](https://docs.x.ai/docs/overview)|
+|**DeepSeek**|`deepseek`| `DEEPSEEK_API_KEY`, `DEEPSEEK_MODEL` | `[Your DEEPSEEK_API_KEY]`, `deepseek-chat` |See [DeepSeek](https://www.deepseek.com/)|
+|**OpenAI-like**|`openailiked`| `OPENAILIKED_BASE_URL`, `OPENAILIKED_API_KEY`, `OPENAILIKED_MODEL` | `url`, `[Your Key]`, `model name` | None |
 
 
 对于未在上述表格中的,并且兼容 OpenAI api 的大语言模型,可使用表格中的 OpenAI 的方式进行环境变量的设置。
 
 

+ 8 - 1
pdf2zh/converter.py

@@ -36,6 +36,9 @@ from pdf2zh.translator import (
     AnythingLLMTranslator,
     XinferenceTranslator,
     ArgosTranslator,
+    GorkTranslator,
+    DeepseekTranslator,
+    OpenAIlikedTranslator,
 )
 from pymupdf import Font
 
@@ -150,8 +153,12 @@ class TranslateConverter(PDFConverterEx):
         param = service.split(":", 1)
         service_name = param[0]
         service_model = param[1] if len(param) > 1 else None
+        if not envs:
+            envs = {}
+        if not prompt:
+            prompt = []
         for translator in [GoogleTranslator, BingTranslator, DeepLTranslator, DeepLXTranslator, OllamaTranslator, XinferenceTranslator, AzureOpenAITranslator,
-                           OpenAITranslator, ZhipuTranslator, ModelScopeTranslator, SiliconTranslator, GeminiTranslator, AzureTranslator, TencentTranslator, DifyTranslator, AnythingLLMTranslator, ArgosTranslator]:
+                           OpenAITranslator, ZhipuTranslator, ModelScopeTranslator, SiliconTranslator, GeminiTranslator, AzureTranslator, TencentTranslator, DifyTranslator, AnythingLLMTranslator, ArgosTranslator, GorkTranslator, DeepseekTranslator, OpenAIlikedTranslator]:
             if service_name == translator.name:
                 self.translator = translator(lang_in, lang_out, service_model, envs=envs, prompt=prompt)
         if not self.translator:

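The hunk above dispatches on each translator class's `name` attribute, with an optional model appended to the service string after a colon. A standalone sketch of that selection logic (stub classes, no pdf2zh import):

```python
# Stub translator classes standing in for the real ones.
class StubGrok:
    name = "grok"

class StubDeepseek:
    name = "deepseek"

def pick_translator(service, translators):
    # Mirrors TranslateConverter: service is "name" or "name:model".
    param = service.split(":", 1)
    service_name = param[0]
    service_model = param[1] if len(param) > 1 else None
    for translator in translators:
        if service_name == translator.name:
            return translator, service_model
    raise ValueError(f"Unknown translation service: {service_name}")

cls, model = pick_translator("deepseek:deepseek-chat", [StubGrok, StubDeepseek])
# cls is StubDeepseek, model is "deepseek-chat"
```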
+ 6 - 0
pdf2zh/gui.py

@@ -33,6 +33,9 @@ from pdf2zh.translator import (
     TencentTranslator,
     XinferenceTranslator,
     ZhipuTranslator,
+    GorkTranslator,
+    DeepseekTranslator,
+    OpenAIlikedTranslator,
 )
 
 # The following variables associate strings with translators
@@ -54,6 +57,9 @@ service_map: dict[str, BaseTranslator] = {
     "Dify": DifyTranslator,
     "AnythingLLM": AnythingLLMTranslator,
     "Argos Translate": ArgosTranslator,
+    "Grok": GorkTranslator,
+    "DeepSeek": DeepseekTranslator,
+    "OpenAI-like": OpenAIlikedTranslator,
 }
 
 # The following variables associate strings with specific languages

+ 9 - 5
pdf2zh/high_level.py

@@ -8,7 +8,7 @@ import tempfile
 import urllib.request
 from asyncio import CancelledError
 from pathlib import Path
-from typing import Any, BinaryIO, List, Optional
+from typing import Any, BinaryIO, List, Optional, Dict
 
 import numpy as np
 import requests
@@ -87,6 +87,8 @@ def translate_patch(
     callback: object = None,
     cancellation_event: asyncio.Event = None,
     model: OnnxModel = None,
+    envs: Optional[Dict] = None,
+    prompt: Optional[List] = None,
     **kwarg: Any,
 ) -> None:
     rsrcmgr = PDFResourceManager()
@@ -102,8 +104,8 @@ def translate_patch(
         service,
         resfont,
         noto,
-        kwarg.get("envs", {}),
-        kwarg.get("prompt", []),
+        envs,
+        prompt,
     )
 
     assert device is not None
@@ -179,6 +181,8 @@ def translate_stream(
     callback: object = None,
     cancellation_event: asyncio.Event = None,
     model: OnnxModel = None,
+    envs: Optional[Dict] = None,
+    prompt: Optional[List] = None,
     **kwarg: Any,
 ):
     font_list = [("tiro", None)]
@@ -313,6 +317,8 @@ def translate(
     compatible: bool = False,
     cancellation_event: asyncio.Event = None,
     model: OnnxModel = None,
+    envs: Optional[Dict] = None,
+    prompt: Optional[List] = None,
     **kwarg: Any,
 ):
     if not files:
@@ -367,8 +373,6 @@ def translate(
             os.unlink(file)
         s_mono, s_dual = translate_stream(
             s_raw,
-            envs=kwarg.get("envs", {}),
-            prompt=kwarg.get("prompt", []),
             **locals(),
         )
         file_mono = Path(output) / f"(unknown)-mono.pdf"

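Promoting `envs` and `prompt` to named parameters is presumably why the explicit `envs=`/`prompt=` arguments are dropped from the `translate_stream` call: once they are real parameters of `translate`, they already appear in `locals()`, and passing them again explicitly would send each keyword twice. A minimal reproduction of that conflict (simplified signatures, not the real pdf2zh functions):

```python
def translate_stream(s_raw, envs=None, prompt=None, **kwarg):
    return envs, prompt

def translate(s_raw, envs=None, prompt=None):
    try:
        # The pre-PR pattern: an explicit keyword alongside **locals(),
        # which now already contains "envs".
        return translate_stream(envs={}, **locals())
    except TypeError:
        # "got multiple values for keyword argument 'envs'" -- the
        # post-PR pattern forwards everything through locals() alone.
        return translate_stream(**locals())

result = translate("raw bytes", envs={"DEEPSEEK_MODEL": "deepseek-chat"})
```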
+ 80 - 9
pdf2zh/translator.py

@@ -255,7 +255,7 @@ class OllamaTranslator(BaseTranslator):
         self.prompttext = prompt
         self.add_cache_impact_parameters("temperature", self.options["temperature"])
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
     def do_translate(self, text):
         maxlen = max(2000, len(text) * 5)
@@ -298,7 +298,7 @@ class XinferenceTranslator(BaseTranslator):
         self.prompttext = prompt
         self.add_cache_impact_parameters("temperature", self.options["temperature"])
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
     def do_translate(self, text):
         maxlen = max(2000, len(text) * 5)
@@ -362,7 +362,7 @@ class OpenAITranslator(BaseTranslator):
         self.prompttext = prompt
         self.add_cache_impact_parameters("temperature", self.options["temperature"])
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
     def do_translate(self, text) -> str:
         response = self.client.chat.completions.create(
@@ -407,7 +407,7 @@ class AzureOpenAITranslator(BaseTranslator):
         self.prompttext = prompt
         self.add_cache_impact_parameters("temperature", self.options["temperature"])
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
     def do_translate(self, text) -> str:
         response = self.client.chat.completions.create(
@@ -445,7 +445,7 @@ class ModelScopeTranslator(OpenAITranslator):
         super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
         self.prompttext = prompt
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
 
 class ZhipuTranslator(OpenAITranslator):
@@ -466,7 +466,7 @@ class ZhipuTranslator(OpenAITranslator):
         super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
         self.prompttext = prompt
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
     def do_translate(self, text) -> str:
         try:
@@ -503,7 +503,7 @@ class SiliconTranslator(OpenAITranslator):
         super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
         self.prompttext = prompt
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
 
 class GeminiTranslator(OpenAITranslator):
@@ -524,7 +524,7 @@ class GeminiTranslator(OpenAITranslator):
         super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
         self.prompttext = prompt
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
 
 class AzureTranslator(BaseTranslator):
@@ -603,7 +603,7 @@ class AnythingLLMTranslator(BaseTranslator):
         }
         self.prompttext = prompt
         if prompt:
-            self.add_cache_impact_parameters("prompt", prompt)
+            self.add_cache_impact_parameters("prompt", prompt.template)
 
     def do_translate(self, text):
         messages = self.prompt(text, self.prompttext)
@@ -701,3 +701,74 @@ class ArgosTranslator(BaseTranslator):
         translation = from_lang.get_translation(to_lang)
         translatedText = translation.translate(text)
         return translatedText
+
+
+class GorkTranslator(OpenAITranslator):
+    # https://docs.x.ai/docs/overview#getting-started
+    name = "grok"
+    envs = {
+        "GORK_API_KEY": None,
+        "GORK_MODEL": "grok-2-1212",
+    }
+    CustomPrompt = True
+
+    def __init__(self, lang_in, lang_out, model, envs=None, prompt=None):
+        self.set_envs(envs)
+        base_url = "https://api.x.ai/v1"
+        api_key = self.envs["GORK_API_KEY"]
+        if not model:
+            model = self.envs["GORK_MODEL"]
+        super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
+        self.prompttext = prompt
+        if prompt:
+            self.add_cache_impact_parameters("prompt", prompt.template)
+
+
+class DeepseekTranslator(OpenAITranslator):
+    name = "deepseek"
+    envs = {
+        "DEEPSEEK_API_KEY": None,
+        "DEEPSEEK_MODEL": "deepseek-chat",
+    }
+    CustomPrompt = True
+
+    def __init__(self, lang_in, lang_out, model, envs=None, prompt=None):
+        self.set_envs(envs)
+        base_url = "https://api.deepseek.com/v1"
+        api_key = self.envs["DEEPSEEK_API_KEY"]
+        if not model:
+            model = self.envs["DEEPSEEK_MODEL"]
+        super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
+        self.prompttext = prompt
+        if prompt:
+            self.add_cache_impact_parameters("prompt", prompt.template)
+
+
+class OpenAIlikedTranslator(OpenAITranslator):
+    name = "openailiked"
+    envs = {
+        "OPENAILIKED_BASE_URL": None,
+        "OPENAILIKED_API_KEY": None,
+        "OPENAILIKED_MODEL": None,
+    }
+    CustomPrompt = True
+
+    def __init__(self, lang_in, lang_out, model, envs=None, prompt=None):
+        self.set_envs(envs)
+        if self.envs["OPENAILIKED_BASE_URL"]:
+            base_url = self.envs["OPENAILIKED_BASE_URL"]
+        else:
+            raise ValueError("The OPENAILIKED_BASE_URL is missing.")
+        if not model:
+            if self.envs["OPENAILIKED_MODEL"]:
+                model = self.envs["OPENAILIKED_MODEL"]
+            else:
+                raise ValueError("The OPENAILIKED_MODEL is missing.")
+        if self.envs["OPENAILIKED_API_KEY"] is None:
+            api_key = "openailiked"
+        else:
+            api_key = self.envs["OPENAILIKED_API_KEY"]
+        super().__init__(lang_in, lang_out, model, base_url=base_url, api_key=api_key)
+        self.prompttext = prompt
+        if prompt:
+            self.add_cache_impact_parameters("prompt", prompt.template)
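The three new classes follow one configuration pattern: read from the merged `envs` mapping, fall back to a default where one exists, and fail fast where none does. A standalone sketch of `OpenAIlikedTranslator`'s resolution order (plain dict in place of the real `set_envs` machinery):

```python
def resolve_openailiked(envs, model=None):
    # Base URL is mandatory -- no sensible default exists.
    base_url = envs.get("OPENAILIKED_BASE_URL")
    if not base_url:
        raise ValueError("The OPENAILIKED_BASE_URL is missing.")
    # An explicit model argument wins over the environment variable.
    if not model:
        model = envs.get("OPENAILIKED_MODEL")
        if not model:
            raise ValueError("The OPENAILIKED_MODEL is missing.")
    # A missing key degrades to the placeholder token used by the class.
    api_key = envs.get("OPENAILIKED_API_KEY") or "openailiked"
    return base_url, api_key, model
```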

+ 66 - 0
test/test_translator.py

@@ -1,5 +1,6 @@
 import unittest
 from pdf2zh.translator import BaseTranslator
+from pdf2zh.translator import OpenAIlikedTranslator
 from pdf2zh import cache
 
 
@@ -73,5 +74,70 @@ class TestTranslator(unittest.TestCase):
             translator.translate("Hello World")
 
 
+class TestOpenAIlikedTranslator(unittest.TestCase):
+    def setUp(self) -> None:
+        self.default_envs = {
+            "OPENAILIKED_BASE_URL": "https://api.openailiked.com",
+            "OPENAILIKED_API_KEY": "test_api_key",
+            "OPENAILIKED_MODEL": "test_model",
+        }
+
+    def test_missing_base_url_raises_error(self):
+        """A missing OPENAILIKED_BASE_URL should raise a ValueError"""
+        with self.assertRaises(ValueError) as context:
+            OpenAIlikedTranslator(
+                lang_in="en", lang_out="zh", model="test_model", envs={}
+            )
+        self.assertIn("The OPENAILIKED_BASE_URL is missing.", str(context.exception))
+
+    def test_missing_model_raises_error(self):
+        """A missing OPENAILIKED_MODEL should raise a ValueError"""
+        envs_without_model = {
+            "OPENAILIKED_BASE_URL": "https://api.openailiked.com",
+            "OPENAILIKED_API_KEY": "test_api_key",
+        }
+        with self.assertRaises(ValueError) as context:
+            OpenAIlikedTranslator(
+                lang_in="en", lang_out="zh", model=None, envs=envs_without_model
+            )
+        self.assertIn("The OPENAILIKED_MODEL is missing.", str(context.exception))
+
+    def test_initialization_with_valid_envs(self):
+        """Initialization succeeds with a fully populated envs mapping"""
+        translator = OpenAIlikedTranslator(
+            lang_in="en",
+            lang_out="zh",
+            model=None,
+            envs=self.default_envs,
+        )
+        self.assertEqual(
+            translator.envs["OPENAILIKED_BASE_URL"],
+            self.default_envs["OPENAILIKED_BASE_URL"],
+        )
+        self.assertEqual(
+            translator.envs["OPENAILIKED_API_KEY"],
+            self.default_envs["OPENAILIKED_API_KEY"],
+        )
+        self.assertEqual(translator.model, self.default_envs["OPENAILIKED_MODEL"])
+
+    def test_default_api_key_fallback(self):
+        """An absent OPENAILIKED_API_KEY falls back to the default value"""
+        envs_without_key = {
+            "OPENAILIKED_BASE_URL": "https://api.openailiked.com",
+            "OPENAILIKED_MODEL": "test_model",
+        }
+        translator = OpenAIlikedTranslator(
+            lang_in="en",
+            lang_out="zh",
+            model=None,
+            envs=envs_without_key,
+        )
+        self.assertEqual(
+            translator.envs["OPENAILIKED_BASE_URL"],
+            self.default_envs["OPENAILIKED_BASE_URL"],
+        )
+        self.assertEqual(translator.envs["OPENAILIKED_API_KEY"], None)
+
+
 if __name__ == "__main__":
     unittest.main()