ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)

Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [34 lines of output]
      *** scikit-build-core 0.10.5 using CMake 3.30.2 (wheel)
      *** Configuring CMake...
      loading initial cache file /tmp/tmp12mmpfoy/build/CMakeInit.txt
      -- The C compiler identification is GNU 11.4.0
      -- The CXX compiler identification is GNU 11.4.0
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: /usr/bin/gcc - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /usr/bin/g++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Could NOT find Git (missing: GIT_EXECUTABLE)
      CMake Warning at vendor/llama.cpp/cmake/build-info.cmake:14 (message):
        Git not found.  Build info will not be accurate.
      Call Stack (most recent call first):
        vendor/llama.cpp/CMakeLists.txt:74 (include)
      
      
      CMake Error at vendor/llama.cpp/CMakeLists.txt:95 (message):
        LLAMA_CUBLAS is deprecated and will be removed in the future.
      
        Use GGML_CUDA instead
      
      Call Stack (most recent call first):
        vendor/llama.cpp/CMakeLists.txt:100 (llama_option_depr)
      
      
      -- Configuring incomplete, errors occurred!
      
      *** CMake configuration failed
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)

Installing Xinference on Ubuntu 22.04 fails with the error shown above.

Some context: this machine is a server for running large models, with two NVIDIA RTX 4090 GPUs; CUDA and the PyTorch compute stack are already installed in the base environment. Installing Xinference directly into the base environment works fine, but the installed xinference package conflicts with the environment already serving the models. So I installed conda, created a fresh environment called xin_env, and hit the error above when installing Xinference inside xin_env.
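The workflow described above can be sketched as follows (the Python version and the `[all]` extras are assumptions on my part, not taken from the post):

```shell
# Create an isolated conda environment so Xinference's dependencies
# cannot clash with the existing model-serving setup
conda create -n xin_env python=3.10 -y
conda activate xin_env

# Installing Xinference pulls in llama-cpp-python, whose wheel
# build is what fails in the log above
pip install "xinference[all]"
```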

Solution:

1. Install CUDA and PyTorch inside the xin_env environment.
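A minimal sketch of that step, plus the rebuild of llama-cpp-python (the CUDA version, the cu121 wheel index, and the use of the nvidia conda channel are assumptions; match them to your driver). Note the log also shows two other issues worth addressing: CMake cannot find git, and the build was passed the deprecated LLAMA_CUBLAS flag, which recent llama.cpp versions reject with exactly this CMake error; its replacement is GGML_CUDA:

```shell
# Install git so CMake can embed build info
# (fixes the "Could NOT find Git" warning in the log)
sudo apt-get install -y git

# Install the CUDA toolkit and PyTorch inside xin_env
# (versions are assumptions; pick the ones matching your driver)
conda install -y -c nvidia cuda-toolkit
pip install torch --index-url https://download.pytorch.org/whl/cu121

# Rebuild llama-cpp-python from source with CUDA enabled.
# LLAMA_CUBLAS was removed upstream; GGML_CUDA is its replacement.
CMAKE_ARGS="-DGGML_CUDA=on" pip install --no-cache-dir --force-reinstall llama-cpp-python
```

If some tool in your environment still exports `-DLLAMA_CUBLAS=on` (an older tutorial, a cached `CMAKE_ARGS`), the CMake configure step will keep failing with the "LLAMA_CUBLAS is deprecated" error until that flag is replaced.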
