ChatGLM-6B Model Deployment [Bug Log]: AssertionError: Torch not compiled with CUDA enabled

(love) E:\jupyter-notebook\jupyter\chatglm-6b-main>python cli_demo.py
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████| 8/8 [00:18<00:00, 2.35s/it]
Traceback (most recent call last):
  File "E:\jupyter-notebook\jupyter\chatglm-6b-main\cli_demo.py", line 8, in <module>
    model = AutoModel.from_pretrained("model", trust_remote_code=True).half().quantize(4).cuda()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator/.cache\huggingface\modules\transformers_modules\model\modeling_chatglm.py", line 1434, in quantize
    self.transformer = quantize(self.transformer, bits, empty_init=empty_init, **kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator/.cache\huggingface\modules\transformers_modules\model\quantization.py", line 159, in quantize
    weight_tensor=layer.attention.query_key_value.weight.to(torch.cuda.current_device()),
                                                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\anaconda3\envs\love\Lib\site-packages\torch\cuda\__init__.py", line 769, in current_device
    _lazy_init()
  File "E:\anaconda3\envs\love\Lib\site-packages\torch\cuda\__init__.py", line 289, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
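The traceback points at the cause: line 8 of cli_demo.py calls .half().quantize(4).cuda(), and the int4 quantization code in quantization.py moves weights with torch.cuda.current_device(), which asserts because the PyTorch build installed in the love environment is CPU-only. The fix used below is to install a CUDA-enabled wheel. On a machine without a usable NVIDIA GPU, the model can instead be loaded on the CPU; the following is only a minimal sketch (not the original cli_demo.py), assuming the local "model" directory referenced in the traceback:

from transformers import AutoModel, AutoTokenizer

# Load the local "model" directory on the CPU.
# The int4 quantization path (.quantize(4)) requires CUDA, so it is skipped here
# and the weights stay in fp32 via .float().
tokenizer = AutoTokenizer.from_pretrained("model", trust_remote_code=True)
model = AutoModel.from_pretrained("model", trust_remote_code=True).float()
model = model.eval()

CPU inference in fp32 is slow and needs roughly 32 GB of RAM for ChatGLM-6B, so the CUDA route below is the one actually taken.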

Verify that CUDA installed correctly: after installation, run the following command in a terminal to confirm that the CUDA toolkit is available:

 


nvcc --version
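If the toolkit is installed and on PATH, this prints the compiler release (12.2 for this setup). A separate, driver-level check, assuming the NVIDIA driver is installed, is:

nvidia-smi

which lists the GPU and the CUDA version the driver supports; that version has to cover the CUDA build of the PyTorch wheel installed later (cu121).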

Aside: what is the difference between the cuda_12.2.2_windows_network installer and the local one? The network installer is a small stub that downloads the selected components during installation, while the local installer is a single full offline package; the toolkit that ends up installed is the same.

Method 2: use the --user option

The --user option installs the packages into the user directory, so administrator privileges are not required:

pip install torch torchvision torchaudio --user -f https://download.pytorch.org/whl/torch_stable.html
 

(love) E:\jupyter-notebook\jupyter\chatglm-6b-main>pip install torch torchvision torchaudio --user -f https://download.pytorch.org/whl/torch_stable.html
WARNING: Ignoring invalid distribution ~orch (E:\anaconda3\envs\love\Lib\site-packages)
Looking in links: https://download.pytorch.org/whl/torch_stable.html
Collecting torch
  Using cached https://download.pytorch.org/whl/cu121/torch-2.1.0%2Bcu121-cp311-cp311-win_amd64.whl (2473.9 MB)
Collecting torchvision
  Using cached https://download.pytorch.org/whl/cu121/torchvision-0.16.0%2Bcu121-cp311-cp311-win_amd64.whl (5.8 MB)
Collecting torchaudio
  Using cached https://download.pytorch.org/whl/cu121/torchaudio-2.1.0%2Bcu121-cp311-cp311-win_amd64.whl (4.0 MB)
Requirement already satisfied: filelock in e:\anaconda3\envs\love\lib\site-packages (from torch) (3.12.4)
Requirement already satisfied: typing-extensions in e:\anaconda3\envs\love\lib\site-packages (from torch) (4.7.1)
Requirement already satisfied: sympy in e:\anaconda3\envs\love\lib\site-packages (from torch) (1.11.1)
Requirement already satisfied: networkx in e:\anaconda3\envs\love\lib\site-packages (from torch) (3.1)
Requirement already satisfied: jinja2 in e:\anaconda3\envs\love\lib\site-packages (from torch) (3.1.2)
Requirement already satisfied: fsspec in e:\anaconda3\envs\love\lib\site-packages (from torch) (2023.9.2)
Requirement already satisfied: numpy in e:\anaconda3\envs\love\lib\site-packages (from torchvision) (1.26.0)
Requirement already satisfied: requests in e:\anaconda3\envs\love\lib\site-packages (from torchvision) (2.31.0)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in e:\anaconda3\envs\love\lib\site-packages (from torchvision) (10.0.1)
Requirement already satisfied: MarkupSafe>=2.0 in e:\anaconda3\envs\love\lib\site-packages (from jinja2->torch) (2.1.1)
Requirement already satisfied: charset-normalizer<4,>=2 in e:\anaconda3\envs\love\lib\site-packages (from requests->torchvision) (2.0.4)
Requirement already satisfied: idna<4,>=2.5 in e:\anaconda3\envs\love\lib\site-packages (from requests->torchvision) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in e:\anaconda3\envs\love\lib\site-packages (from requests->torchvision) (1.26.16)
Requirement already satisfied: certifi>=2017.4.17 in e:\anaconda3\envs\love\lib\site-packages (from requests->torchvision) (2023.7.22)
Requirement already satisfied: mpmath>=0.19 in e:\anaconda3\envs\love\lib\site-packages (from sympy->torch) (1.3.0)
WARNING: Ignoring invalid distribution ~orch (E:\anaconda3\envs\love\Lib\site-packages)
Installing collected packages: torch, torchvision, torchaudio
  WARNING: The scripts convert-caffe2-to-onnx.exe, convert-onnx-to-caffe2.exe and torchrun.exe are installed in 'C:\Users\Administrator\AppData\Roaming\Python\Python311\Scripts' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed torch-2.1.0+cu121 torchaudio-2.1.0+cu121 torchvision-0.16.0+cu121

(love) E:\jupyter-notebook\jupyter\chatglm-6b-main>
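Because --user puts the packages into the per-user site-packages directory (under the C:\Users\Administrator\AppData\Roaming\Python\Python311 path mentioned in the warning) while the old CPU-only torch stays inside the conda environment, it is worth confirming which copy Python now imports. A quick sanity check, not part of the original demo:

import torch

# Show where torch was imported from and which build it is;
# the version should carry a "+cu121" suffix for the CUDA-enabled wheel.
print(torch.__file__)
print(torch.__version__)

The user-site directory normally comes before the environment's site-packages on sys.path, so the new wheel shadows the old one.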
 

Test code


Once PyTorch has been installed, the following short test script verifies that the installation works. Run it in a Python environment:

import torch

# Check whether CUDA (GPU acceleration) is available
print("CUDA available" if torch.cuda.is_available() else "CUDA not available")

# Print the installed PyTorch version
print("PyTorch version:", torch.__version__)

# Create a random tensor and, if CUDA is available, move it to the GPU
x = torch.rand(5, 3)
if torch.cuda.is_available():
    x = x.to('cuda')
    print("Tensor moved to the GPU.")
else:
    print("CUDA not available, tensor stays on the CPU.")
Running the script in this environment prints:

CUDA available
PyTorch version: 2.1.0+cu121
Tensor moved to the GPU.
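With the CUDA-enabled build now being picked up, re-running python cli_demo.py should get past the .half().quantize(4).cuda() call that raised the assertion above.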