Download the Chinese version of the Llama 3 model file:
Reference: https://www.cnblogs.com/obullxl/p/18204970/NTopic2024052101
Deploying and running a large model locally (GGUF file)
I created an environment named "llm" with miniconda and installed the libraries required to run the model:
pip install llama-cpp-python
pip install openai
pip install uvicorn
pip install starlette
pip install fastapi
pip install sse_starlette
pip install starlette_context
pip install pydantic_settings
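Before launching anything, you can sanity-check that all of the packages above actually installed. The helper below is my own small sketch (not part of any of these libraries); note that the llama-cpp-python distribution is imported as `llama_cpp`:

```python
from importlib.util import find_spec

# Import names corresponding to the pip packages installed above.
REQUIRED = [
    "llama_cpp", "openai", "uvicorn", "starlette",
    "fastapi", "sse_starlette", "starlette_context", "pydantic_settings",
]

def missing_packages(names=REQUIRED):
    """Return the import names that cannot be found in this environment."""
    return [name for name in names if find_spec(name) is None]

if __name__ == "__main__":
    gone = missing_packages()
    if gone:
        print("Missing packages:", ", ".join(gone))
    else:
        print("All required packages are installed.")
```

Running this inside the "llm" conda environment should print that everything is installed; anything it lists as missing can be reinstalled with pip.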
For details, see https://mp.weixin.qq.com/s/MekCUJDhKzuUnoykkGoH2g
While installing llama-cpp-python I hit an error that took me a long time to resolve. Here are the error details and the fix:
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
*** scikit-build-core 0.9.7 using CMake 3.29.6 (wheel)
*** Configuring CMake...
2024-06-28 09:15:22,541 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
loading initial cache file C:\Users\01\AppData\Local\Temp\tmp6261tyw8\build\CMakeInit.txt
-- Building for: NMake Makefiles
CMake Error at CMakeLists.txt:3 (project):
Running 'nmake' '-?'
failed with:
no such file or directory
CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
-- Configuring incomplete, errors occurred!
*** CMake configuration failed
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)
Solution:
(An LLM's analysis of the error) The error message indicates that while configuring the build, CMake could not find nmake, and CMAKE_C_COMPILER and CMAKE_CXX_COMPILER were not set. This usually means the build environment is incomplete or not configured correctly. A possible fix is to install the Visual Studio C++ tools.
On Windows, nmake is normally provided by the Visual Studio C++ toolchain. You can install it as follows:
- Download and install Visual Studio Community Edition.
- During installation, select the "Desktop development with C++" workload, which installs the build tools, including nmake.
After these two steps the error is resolved and the model starts; from here you can call the local model directly and build your own LangChain projects.
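The FastAPI/uvicorn/sse_starlette packages installed earlier are what llama-cpp-python's built-in server uses: launching it with `python -m llama_cpp.server --model <path-to-your-gguf-file>` exposes an OpenAI-compatible API (on localhost:8000 by default). As a minimal stdlib-only sketch, assuming the server is running with those defaults, you can call it like this (the prompt and the "local-model" name are placeholders; a single-model server generally does not care about the model name):

```python
import json
from urllib import request

def build_chat_request(prompt, base_url="http://localhost:8000/v1"):
    """Build an OpenAI-style chat-completions request for the local server."""
    payload = {
        "model": "local-model",  # placeholder name
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request("Please introduce yourself in Chinese.")
    # Requires the local llama_cpp.server to be running.
    with request.urlopen(req) as resp:
        answer = json.loads(resp.read())
        print(answer["choices"][0]["message"]["content"])
```

The same endpoint also works through the `openai` Python client by pointing its `base_url` at `http://localhost:8000/v1`, which is convenient when wiring the local model into a LangChain project.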