PyCharm: os.environ cannot read environment variables (`raise KeyError(key) from None`)

This post shows how to configure environment variables correctly in PyCharm, fixing the problem where variables exported in `.bashrc` are not visible to programs launched from PyCharm. Adding the variables to the Run configuration lets the program run normally.

Ubuntu

I exported the environment variables my program needs in `.bashrc` and read them in code with `os.environ['VAR_NAME']`. Launching the program from a terminal works fine, but running it in PyCharm via the green Run triangle fails with `raise KeyError(key) from None`. Printing `os.environ.keys()` showed that the variables set in `.bashrc` were missing.
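A minimal way to reproduce and inspect this situation (`MY_APP_TOKEN` is just a placeholder name for whatever `.bashrc` exports):

```python
import os

# os.environ behaves like a dict: accessing a missing key raises KeyError,
# which is exactly the "raise KeyError(key) from None" seen in PyCharm.
try:
    token = os.environ['MY_APP_TOKEN']  # placeholder variable name
except KeyError:
    token = None
    print("MY_APP_TOKEN is not visible to this process")

# List a few of the variables the process actually inherited:
print(sorted(os.environ.keys())[:10])
```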

A fix suggested online is to add the variables in PyCharm under Preferences (macOS) / Settings (Ubuntu) > Build, Execution, Deployment > Console > Python Console > Environment variables. I tried this and it had no effect, which makes sense in hindsight: that setting only applies to the interactive Python Console, not to programs started from a Run configuration.

What worked: next to the Run icon (the green triangle) there is a configuration dropdown with an "Edit Configurations" entry. Open it, add the variables under Environment variables, then Apply and OK.

After that, the program ran fine.
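Independently of the IDE fix, code can fail more gracefully by using `os.environ.get`, which returns `None` (or a supplied default) instead of raising. A small sketch, again with `MY_APP_TOKEN` as a placeholder name:

```python
import os

# os.environ.get returns None (or a supplied default) instead of raising KeyError.
token = os.environ.get('MY_APP_TOKEN')  # placeholder variable name
if token is None:
    print("MY_APP_TOKEN is not set; when running from PyCharm, add it under "
          "Run > Edit Configurations > Environment variables.")
else:
    print("MY_APP_TOKEN is visible to this process.")
```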

The interface code is as follows:

```python
# !/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright: Huawei Technologies Co., Ltd. (C) 2025-2099
"""
Purpose: Supply-chain SCM Agent -- interface/roles/order/supply_manager_assistant_app_langgraph,
the FastAPI interface for the supply manager assistant.
"""
import json
import os
import sys
from time import time

import uvicorn
from aipaas.logger_factory import logger
from fastapi import Request, FastAPI
from fastapi.responses import StreamingResponse
from langchain_core.messages import HumanMessage

from infrastructure.auth_fastapi import SoaAuth
from infrastructure.langfuse_telemetery.trace_langgraph import create_langfuse_callback
from scm_agent.src.application.roles.order.supply_manager_assistant_graph.main_graph.graph import \
    get_supply_manager_assistant_main_graph
from scm_agent.src.common.agent_name import AgentName
from scm_agent.src.common.constants import Status
from scm_agent.src.infrastructures.agent_config_download.config_download import dowload_agent_config_langgraph
from scm_agent.src.infrastructures.agent_config_read.read_yaml_config import read_project_config
from scm_agent.src.infrastructures.agent_state.agent_state_helper import get_redis_key
from scm_agent.src.infrastructures.app_postprocess.output_process import str_to_output_json
from scm_agent.src.infrastructures.app_postprocess.output_process import str_to_stream_output_langgraph
from scm_agent.src.infrastructures.memory.postgre_checkpointer.postgre_checkpointer import FrameworkAdapter
from scm_agent.src.infrastructures.read_config import app_config
from scm_agent.src.interface.input_output_parameters import SupplyManagerAssistantChatInput, ConfigUpdateInput, \
    ConfigUpdateOutput

os.environ['NO_PROXY'] = '127.0.0.1,localhost'
fastapi_app = FastAPI(lifespan=FrameworkAdapter.lifespan_wrapper)
env = os.environ.get("env")
soa = SoaAuth(env_type=env, skip_soa_auth=False, only_check_token=True)
agent_name = AgentName.SupplyManagerAssistantLangGraph
project_config = {}
common_prompt_config = {}


def preload_agent_config(name):
    """
    Preload the agent configuration.
    Args:
        name: assistant/skill name, configured in app_config
    """
    global project_config
    global common_prompt_config
    dowload_agent_config_langgraph(name)
    # Read the project configuration file
    project_config = read_project_config(agent_name, f"{agent_name}.yaml").get(env)
    # Read the shared prompt configuration file
    common_prompt_config = read_project_config("common", "prompt_config.yaml")


if 'PYCHARM_HOSTED' in os.environ or 'PYCHARM_DEBUG_PROCESS' in os.environ:
    logger.info("set a breakpoint here when debugging")
    # raise Exception("set a breakpoint here when debugging; comment this line out")

preload_agent_config(agent_name)


async def generator(graph, supply_manager_assistant_chat_input, initial_state, config):
    yield str_to_stream_output_langgraph('<think>')
    yield str_to_stream_output_langgraph('**Question**')
    question = supply_manager_assistant_chat_input.question.strip()
    yield str_to_stream_output_langgraph('\n' + question)
    async for chunk in graph.astream(
            input=initial_state,
            stream_mode="custom",
            config=config,
            subgraphs=True
    ):
        yield str_to_stream_output_langgraph(chunk[1])


@fastapi_app.post('/roles/supply_manager_assistant_chat_langgraph')
@soa.required
async def supply_manager_assistant_chat(request: Request,
                                        supply_manager_assistant_chat_input: SupplyManagerAssistantChatInput):
    # checkpointer = presit_param.get("checkpointer")
    checkpointer = request.app.state.presist_param.get("checkpointer")
    thread_id = get_redis_key(supply_manager_assistant_chat_input)
    user_id = supply_manager_assistant_chat_input.user_id
    session_id = supply_manager_assistant_chat_input.session_id
    langfuse_callback = create_langfuse_callback(user_id=user_id, session_id=session_id,
                                                 trace_name=AgentName.SupplyManagerAssistantLangGraph)
    config = {"configurable": {"thread_id": thread_id},
              "metadata": {"user_id": supply_manager_assistant_chat_input.user_id,
                           "project_config": project_config,
                           "common_prompt_config": common_prompt_config,
                           "ctx_params": supply_manager_assistant_chat_input.ctxParams},
              "callbacks": [langfuse_callback],
              }
    try:
        graph = get_supply_manager_assistant_main_graph(checkpointer)
        initial_state = {"messages": [HumanMessage(content=supply_manager_assistant_chat_input.question)]}
        return StreamingResponse(
            generator(graph, supply_manager_assistant_chat_input, initial_state, config),
            media_type="text/event-stream",
            headers={"Cache-Control": "no-cache", "Connection": "keep-alive"}
        )
    except Exception as e:
        return str_to_output_json(f'Processing failed, reason: {e}')


@fastapi_app.post('/config_update')
@soa.required
async def config_update(request: Request, config_update_input: ConfigUpdateInput):
    start_time = time()
    config_update_output = ConfigUpdateOutput()
    try:
        preload_agent_config(config_update_input.agent_name)
        config_update_output.status = Status.SUCCESS
    except Exception as e:
        config_update_output.error_message = "[SCM-Agent] Update config error."
    # Elapsed-time statistics
    config_update_output.elapsed_time = str(time() - start_time)
    return config_update_output.to_dict()


@fastapi_app.get('/health')
@soa.required
async def health(request: Request, ):
    return json.dumps({"success": True}, ensure_ascii=False)


if __name__ == '__main__':
    uvicorn.run("supply_manager_assistant_app_langgraph:fastapi_app",
                host=app_config.get('host', '0.0.0.0'),
                # port=app_config.get('port', 8080),
                loop="asyncio",
                port=8080)
    # workers=app_config.get('workers', 4))
```

postgre_checkpointer.py is fine now, but the interface code, which worked before, now fails with the error below:

```text
ERROR: Exception in ASGI application
+ Exception Group Traceback (most recent call last):
  ... (uvicorn / starlette middleware frames omitted) ...
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
  | Traceback (most recent call last):
  |   File "...\starlette\responses.py", line 250, in stream_response
  |     async for chunk in self.body_iterator:
  |   File "...\scm_agent\src\interface\roles\order\supply_manager_assistant_app_langgraph.py", line 71, in generator
  |     async for chunk in graph.astream(
  |   File "...\langgraph\pregel\_loop.py", line 1186, in __aenter__
  |     saved = await self.checkpointer.aget_tuple(self.checkpoint_config)
  |   File "...\langgraph\checkpoint\postgres\aio.py", line 388, in _cursor
  |     async with conn.cursor(binary=True, row_factory=dict_row) as cur:
  |   File "...\psycopg\_connection_base.py", line 528, in _check_connection_ok
  |     raise e.OperationalError("the connection is closed")
  | psycopg.OperationalError: the connection is closed
+------------------------------------
```
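One common way to hit `psycopg.OperationalError: the connection is closed` with a Postgres checkpointer (not necessarily the cause here) is opening the connection inside a context manager that exits during startup, while the checkpointer object is stored and only used later, at request time. A stdlib-only sketch of that failure mode; `FakeConn` is a stand-in and does not model the real psycopg connection:

```python
class FakeConn:
    """Stand-in for a DB connection with context-manager semantics."""
    def __init__(self):
        self.closed = False
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        self.closed = True  # the real connection closes here too

# Broken pattern: the connection is created inside a context manager
# that exits before any request arrives.
with FakeConn() as conn:
    checkpointer = conn          # stored in app state for later use
print(checkpointer.closed)       # True: requests will see a closed connection

# Working pattern: keep the connection open for the app's lifetime
# (e.g. inside a FastAPI lifespan that only exits on shutdown).
conn = FakeConn()
conn.__enter__()
checkpointer = conn
print(checkpointer.closed)       # False while the app is running
conn.__exit__(None, None, None)  # close on shutdown
```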
Comment (09-25):

```python
from openai import OpenAI

client = OpenAI(api_key="<sk-a3fedf84d0364247a2103fffbd0c2644>",
                base_url="https://api.deepseek.com/v1")
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a humorous customer-service agent; answer in a funny way"},
        {"role": "user", "content": "Introduce yourself"},
    ],
    stream=False
)
print(response.choices[0].message.content)
```

This fails with:

```text
Traceback (most recent call last):
  ... (PyCharm pydev console frames omitted) ...
  File "D:\pythonProject\xijiao\venv\Lib\site-packages\openai\_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Authentication Fails, Your api key: ****644> is invalid', 'type': 'authentication_error', 'param': None, 'code': 'invalid_request_error'}}
```
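Note that the 401 message echoes the key as `****644>`, with a trailing `>`: the angle brackets in `api_key="<sk-...>"` are placeholder markers and are being sent to the server as part of the key. A quick check using the key string from the snippet above:

```python
key = "<sk-a3fedf84d0364247a2103fffbd0c2644>"
print(key[-4:])          # '644>' -- matches the '****644>' echoed in the 401 error
clean = key.strip("<>")  # drop the placeholder angle brackets
print(clean[-4:])        # '2644'
```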
Answer (08-12):
### Solution: variables loaded with `load_dotenv` are not readable via `os.environ` in FastAPI

When `load_dotenv()` has been called in a FastAPI project but the variables still cannot be read via `os.environ`, the usual causes are:

#### Common causes and fixes

1. **Wrong `.env` file location**
   - ✅ Make sure `.env` sits in the **project root** (the directory the FastAPI app is started from)
   - ✅ Or pass the path explicitly: `load_dotenv('.env')` or `load_dotenv('/absolute/path/.env')`

2. **Variables read before they are loaded**
   ```python
   # Wrong: the variable is read at import time, before load_dotenv() runs
   from config import DB_URL  # load_dotenv() has not executed yet
   load_dotenv()              # loaded too late

   # Right: read only after load_dotenv()
   load_dotenv()
   DB_URL = os.getenv("DB_URL")  # read after loading
   ```

3. **Case-sensitive variable names**
   - 🐞 `.env` defines: `db_url=postgres://...`
   - ❌ `os.getenv("DB_URL")` fails: the case does not match
   - ✅ Keep the case consistent: `DB_URL=postgres://...` with `os.getenv("DB_URL")`

4. **An existing variable takes precedence**
   - System environment variables take precedence over `.env` values[^3]
   - Check whether a variable with the same name already exists:
   ```python
   print("Pre-load:", os.getenv("DB_URL"))  # check before loading
   load_dotenv()
   print("Post-load:", os.getenv("DB_URL"))
   ```

5. **File encoding problems**
   - Make sure `.env` is saved as **UTF-8**
   - Avoid unescaped special characters (e.g. non-ASCII text)

#### Full debugging example

```python
from fastapi import FastAPI
import os
from dotenv import load_dotenv

app = FastAPI()

# 1. Load with an explicit path
env_path = '.env'  # or an absolute path
load_dotenv(env_path)

# 2. Debug output
print("Loaded variables:", list(os.environ.keys()))  # all variables
print("DB_URL exists:", "DB_URL" in os.environ)      # a specific variable

# 3. Read the variable
DB_URL = os.environ.get("DB_URL")  # dict-style access
# or DB_URL = os.getenv("DB_URL")

@app.get("/config")
def read_config():
    return {"db_url": DB_URL}
```

#### Verification steps

1. Check the variable in a terminal:
   ```bash
   # before starting
   echo $DB_URL
   # then start the app (in the same terminal session)
   uvicorn main:app --reload
   ```
2. Hit the `/config` endpoint and inspect the output
3. Check the application log for the list of loaded variables

> **Key point**: `os.environ` is the live dictionary of the process's environment; `load_dotenv()` essentially writes variables into it via `os.environ.update()`[^3]. If a variable is still missing after loading, check the file path and the variable name.

### Related questions

1. How do `.env` files work with FastAPI inside a Docker container?
2. Does FastAPI's hot-reload mode (`--reload`) affect environment-variable loading?
3. What alternatives to `python-dotenv` exist for managing FastAPI environment variables?
4. How should sensitive configuration (e.g. API keys) be managed safely in production?

[^1]: From best practices for FastAPI environment-variable management
[^2]: From standard usage of `.env` files with FastAPI
[^3]: Based on the environment-variable injection mechanism in the `dotenv` package source
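The precedence point above (an existing process variable wins unless it is explicitly overridden) can be demonstrated without installing python-dotenv at all. This toy loader mimics the non-override behavior with `os.environ.setdefault`; it is a sketch, not the real `load_dotenv` implementation:

```python
import os
import tempfile

def load_env_file(path, override=False):
    """Toy dotenv-style loader: KEY=VALUE lines, '#' comments."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            key, value = key.strip(), value.strip()
            if override:
                os.environ[key] = value          # file value wins
            else:
                os.environ.setdefault(key, value)  # existing value wins

# Write a temporary .env file for the demo
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("DB_URL=postgres://from-dotenv\n")
    env_path = f.name

os.environ["DB_URL"] = "postgres://from-system"  # pre-existing variable
load_env_file(env_path)
print(os.environ["DB_URL"])   # postgres://from-system (system value wins)
load_env_file(env_path, override=True)
print(os.environ["DB_URL"])   # postgres://from-dotenv (override wins)
```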