Version mismatch issue
Error:
Building wheels for collected packages: flash_attn
Building wheel for flash_attn (setup.py) ... error
error: subprocess-exited-with-error
1. Check your torch, CUDA, and Python versions, then go to https://github.com/Dao-AILab/flash-attention/releases/
and download the matching prebuilt wheel. For example, the wheel below is for Python 3.11, CUDA 11.8, and torch 2.4 (a quick way to check these versions is sketched after this list).
2. Then install it: pip install flash_attn-2.6.2+cu118torch2.4cxx11abiFALSE-cp311-cp311-linux_x86_64.whl --no-build-isolation
Note: the --no-build-isolation flag at the end is required; without it, pip will try to rebuild flash_attn from source and fail with the error above.
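
For step 1, a minimal sketch of how to read off each version mentioned above (assuming torch is already installed in the current environment):

# Python version (must match the wheel's cpXXX tag, e.g. cp311 = Python 3.11)
python --version
# torch version and the CUDA version it was built against (e.g. 2.4.0 and 11.8)
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# whether torch was built with the C++11 ABI, matching the cxx11abiTRUE/FALSE part of the wheel name
python -c "import torch; print(torch.compiled_with_cxx11_abi())"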
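
As a quick sanity check after step 2 (a sketch; flash_attn is the package's import name, and recent releases expose a __version__ attribute):

# If this prints the version without errors, the wheel installed correctly
python -c "import flash_attn; print(flash_attn.__version__)"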