【flash-attention】Building wheel for flash-attn (pyproject.toml) did not run successfully

Error

Building wheel for flash-attn (pyproject.toml) did not run successfully

This generic pip message means no usable prebuilt wheel was installed, so pip fell back to compiling flash-attn from source and the compilation failed; typical causes are a missing or mismatched CUDA toolchain, or a failed download of the cutlass dependency.

Solution

Method 1

git clone git@github.com:Dao-AILab/flash-attention.git
cd flash-attention
python setup.py install

Note: the build may fail here with an error that flash-attention/csrc/cutlass cannot be found, because git failed to download the cutlass dependency.
In that case, cd into flash-attention/csrc/ and run git clone git@github.com:NVIDIA/cutlass.git, as sketched below.
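
For reference, here is a minimal sketch of two ways to get cutlass in place. It assumes cutlass is registered as a git submodule of the flash-attention repo (the .gitmodules file confirms the exact path); a fresh clone with git clone --recursive would also fetch it up front.

# Option A: fetch cutlass via the submodule mechanism
cd flash-attention
git submodule update --init --recursive

# Option B: clone it manually into csrc/
cd flash-attention/csrc
git clone git@github.com:NVIDIA/cutlass.git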

Re-run python setup.py install and the compilation should now succeed.

Method 2 (recommended)

Find the prebuilt wheel from the flash-attention releases page that matches your configuration (a quick way to check each version is sketched after this list), for example:
cuda: 12.2
torch: 2.2
python: 3.10
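
If you are unsure which versions you have, the following sketch reports each field of the wheel filename (the cxx11abi part should match how your torch build was compiled; torch exposes this as torch._C._GLIBCXX_USE_CXX11_ABI):

python -V                                            # 3.10 -> cp310
nvcc --version                                       # 12.2 -> cu122
python -c "import torch; print(torch.__version__)"   # 2.2 -> torch2.2
python -c "import torch; print(torch._C._GLIBCXX_USE_CXX11_ABI)"   # False -> cxx11abiFALSE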

pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
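
After the wheel installs, a quick import check (a minimal sketch; the flash_attn package exposes __version__) confirms the binary matches your environment. If the import fails with an undefined-symbol error, the wheel's torch/CUDA/ABI tags most likely do not match your installed torch.

python -c "import flash_attn; print(flash_attn.__version__)"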