Configuring an OpenAI API proxy address in the AI service provider settings affects the embedding model set via environment variables #7500
📦 Deployment environment
Docker
📦 Deployment mode
Server-side mode (lobe-chat-database image)
📌 Software version
v1.81.7
💻 System environment
macOS
🌐 Browser
Chrome
🐛 Problem description
Deployed in server mode with Docker, using environment variables so that file vectorization runs through the local Ollama bge-m3 model.
When an OpenAI API proxy address is configured in the AI service provider settings, embedding requests go to the OpenAI API proxy address (https://openai.proxy.example.com/v1) rather than the address set in OLLAMA_PROXY_URL.
Clearing the OpenAI API proxy address restores normal vectorization.
📷 Reproduction steps
No response
🚦 Expected results
No response
📝 Supplementary information
No response

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
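For reference, a minimal sketch of the configuration described in the issue above, assuming OLLAMA_PROXY_URL points LobeChat at the local Ollama instance and DEFAULT_FILES_CONFIG selects the embedding model (variable names as in the LobeChat self-hosting docs; the host address is an example, and the many other variables server mode needs are omitted):

```bash
# Sketch only: required server-mode variables (database, auth, S3, ...) omitted.
# OLLAMA_PROXY_URL      -> where the local Ollama instance listens
#                          (host.docker.internal is an example host address).
# DEFAULT_FILES_CONFIG  -> routes file vectorization to Ollama's bge-m3 model.
docker run -d --name lobe-chat-database -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  -e DEFAULT_FILES_CONFIG="embedding_model=ollama/bge-m3" \
  lobehub/lobe-chat-database
```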
I found a similar discussion regarding issues with proxy URL configurations when deploying LobeChat in server mode using Docker. The discussion titled 使用docker run和docker compose部署lobe,环境变量配置未生效 ("deploying lobe with docker run and docker compose; environment variable configuration not taking effect") is still open and may provide insight into your issue [1]. To address the problem, make sure the environment variables are set correctly and that the Ollama proxy URL takes precedence over the OpenAI API proxy address.
If clearing the OpenAI API proxy address resolves the issue, it indicates that the system defaults to the OpenAI proxy when both are configured. Adjusting the configuration so that the Ollama proxy URL takes priority for embedding requests should help.
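One way to narrow this down is to confirm that the Ollama endpoint itself is reachable and serving embeddings, so any remaining failure can be attributed to which base URL LobeChat picks rather than to connectivity. A sketch using Ollama's embeddings API (the host address is an example carried over from the setup above):

```bash
# Ask the local Ollama instance for a bge-m3 embedding directly.
# A non-empty "embedding" array in the response means Ollama itself works,
# pointing the problem at LobeChat's provider routing for embedding requests.
curl -s http://host.docker.internal:11434/api/embeddings \
  -d '{"model": "bge-m3", "prompt": "connectivity test"}'
```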