Install as described in the GPU — vLLM section above.
Create the environment:
conda create -n vllm python=3.12 -y
conda activate vllm
Install vLLM:
pip install vllm
At this point I ran into the following error:
RuntimeError:
The detected CUDA version (11.7) mismatches the version that was used to compile
PyTorch (12.4). Please make sure to use the same CUDA versions.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for xformers
Running setup.py clean for xformers
Failed to build xformers
We can see that the CUDA versions do not match.
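For context, the check that raises this error is essentially a comparison between the CUDA toolkit detected on the machine and the CUDA version PyTorch was compiled against (the real logic lives in `torch.utils.cpp_extension`). A simplified sketch, where `check_cuda_match` is a made-up name for illustration:

```python
# Simplified sketch of the CUDA-version check behind the error above.
# The real check lives in torch.utils.cpp_extension; check_cuda_match
# is a hypothetical name used only for illustration.
def check_cuda_match(detected: str, compiled: str) -> None:
    """Raise if the detected CUDA toolkit and the CUDA version PyTorch
    was compiled with differ in their major version."""
    if detected.split(".")[0] != compiled.split(".")[0]:
        raise RuntimeError(
            f"The detected CUDA version ({detected}) mismatches the version "
            f"that was used to compile PyTorch ({compiled})."
        )

check_cuda_match("11.7", "11.8")   # same major version: passes
# check_cuda_match("11.7", "12.4") # raises RuntimeError, as in the log above
```

Here the toolkit is CUDA 11.7 but the installed PyTorch wheel was built against 12.4, so the major versions differ and the build aborts.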
Solution
My environment already has a working PyTorch, and I want the vLLM build to use that existing torch instead of pulling in its own:
git clone https://github.com/vllm-project/vllm.git
cd vllm
python use_existing_torch.py
pip install -r requirements-build.txt
pip install -e . --no-build-isolation
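Roughly speaking, `use_existing_torch.py` strips the torch-related pins out of vLLM's requirements files so that pip does not try to reinstall a differently built torch over the one already in the environment. A simplified, hypothetical sketch of that filtering step (`strip_torch_pins` and the exact package list are assumptions, not vLLM's actual code):

```python
# Simplified sketch of what use_existing_torch.py does: drop torch-related
# pins from a requirements list so the build reuses the environment's torch.
# strip_torch_pins and TORCH_PKGS are hypothetical names for illustration.
TORCH_PKGS = ("torch", "torchvision", "torchaudio", "xformers")

def strip_torch_pins(lines):
    kept = []
    for line in lines:
        # Take the package name before any "==" or ">=" version specifier.
        pkg = line.strip().split("==")[0].split(">=")[0].lower()
        if pkg not in TORCH_PKGS:
            kept.append(line)
    return kept

reqs = ["ninja", "torch==2.5.1", "xformers==0.0.28", "cmake>=3.26"]
print(strip_torch_pins(reqs))  # ['ninja', 'cmake>=3.26']
```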
subprocess.CalledProcessError: Command '['cmake', '/home/chenzehao/project/researchProjects/Mywork/TableCoT/vllm', '-G', 'Ninja', '-DCMAKE_BUILD_TYPE=RelWithDebInfo', '-DVLLM_TARGET_DEVICE=cuda', '-DVLLM_PYTHON_EXECUTABLE=/home/chenzehao/anaconda3/envs/vllm/bin/python', '-DVLLM_PYTHON_PATH=/home/chenzehao/anaconda3/envs/vllm/lib/python312.zip:/home/chenzehao/anaconda3/envs/vllm/lib/python3.12:/home/chenzehao/anaconda3/envs/vllm/lib/python3.12/lib-dynload:/home/chenzehao/anaconda3/envs/vllm/lib/python3.12/site-packages:/home/chenzehao/anaconda3/envs/vllm/lib/python3.12/site-packages/setuptools/_vendor', '-DFETCHCONTENT_BASE_DIR=/home/chenzehao/project/researchProjects/Mywork/TableCoT/vllm/.deps', '-DNVCC_THREADS=1', '-DCMAKE_JOB_POOL_COMPILE:STRING=compile', '-DCMAKE_JOB_POOLS:STRING=compile=96']' returned non-zero exit status 1.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building editable for vllm
Failed to build vllm
ERROR: Failed to build installable wheels for some pyproject.toml based projects (vllm)
But the last step errors out again.
Solution
Following a method from a vLLM GitHub issue:
pip install --no-build-isolation -e .
But the same error persists.
Now trying:
pip install -U torch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 --index-url https://download.pytorch.org/whl/cu118
Also trying:
conda install nvidia/label/cuda-11.7.0::cuda-nvcc
Neither works.
Next, I uninstall torch and then reinstall vLLM:
pip install -e .
This fails the same way.
Starting over: rebuild the environment and begin again from scratch.
pip install vllm --pre --extra-index-url https://wheels.vllm.ai/nightly
Same error.
Tried changing the xformers version; that did not work either.
I then noticed that every `pip install vllm` automatically pulls in the cu117 build, so I searched for the cu118 build instead, downloaded the wheel, installed it locally, and moved PyTorch to cu118 as well.
The same error appears yet again, so I suspect the g++ version and upgrade it from the original 7.5.0 to 11.2.0:
conda install -c conda-forge gxx=11.2.0
conda install -c conda-forge gcc=11.2.0
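To confirm the upgrade actually took effect inside the conda environment, one can parse the first line of `gcc --version` (obtained in practice via `subprocess.check_output(["gcc", "--version"], text=True)`). A sketch using pure string parsing, assuming a GNU-style banner; `parse_gcc_major` is a hypothetical helper:

```python
# Parse a GNU-style "gcc --version" banner and extract the major version.
# parse_gcc_major is a hypothetical helper name for illustration.
def parse_gcc_major(banner: str) -> int:
    # First line looks like: "gcc (conda-forge gcc 11.2.0-16) 11.2.0"
    version = banner.splitlines()[0].rsplit(" ", 1)[-1]
    return int(version.split(".")[0])

banner = "gcc (conda-forge gcc 11.2.0-16) 11.2.0\nCopyright (C) 2021 ..."
major = parse_gcc_major(banner)
print(major >= 11)  # True: matches the 11.2.0 install above
```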
Then install:
pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cu118
pip install vllm==0.7.3
Successfully installed!!!!