Base environment: Ubuntu 22.04.1 LTS

Installing Ollama

1. Go to the official site (ollama.com) and download Ollama

2. Or run the install script directly from the command line:

curl -fsSL https://ollama.com/install.sh | sh

3. Check that Ollama installed successfully:

ollama

If the following output appears, the installation succeeded:

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
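The same check can be scripted. Below is a minimal sketch (not from the original guide) that probes for the binary on PATH and prints its version via `ollama -v`:

```shell
# Sketch: check whether the ollama binary is on PATH and record its version.
if command -v ollama >/dev/null 2>&1; then
    status="ollama installed: $(ollama -v)"
else
    status="ollama not found - rerun the install script"
fi
echo "$status"
```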

4. Start Ollama:

ollama serve
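Run this way, the server occupies the terminal. One common alternative (an assumption, not part of the original guide) is to background it and poll its default port, 11434, until it answers:

```shell
# Sketch: start the server in the background, then poll its default
# port (11434) for up to ~10 seconds until it responds.
nohup ollama serve > /tmp/ollama.log 2>&1 &
server_state="not reachable"
for _ in 1 2 3 4 5 6 7 8 9 10; do
    if curl -s --max-time 1 http://127.0.0.1:11434 >/dev/null; then
        server_state="up"
        break
    fi
    sleep 1
done
echo "ollama server: $server_state"
```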

5. List the models already downloaded:

ollama list

6. Run a model (Qwen2 as the example; this step both pulls the model and starts it):

ollama run qwen2:7b
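Besides the interactive prompt, the running server also answers over HTTP on port 11434. A hedged sketch of querying qwen2:7b through the `/api/generate` endpoint (assumes `ollama serve` is running and the model has been pulled; `"stream": false` returns a single JSON object instead of a token stream):

```shell
# Sketch: call Ollama's REST API instead of the interactive REPL.
payload='{"model": "qwen2:7b", "prompt": "Hello", "stream": false}'
if reply=$(curl -s --max-time 5 http://127.0.0.1:11434/api/generate -d "$payload"); then
    echo "$reply"
else
    reply="server not reachable; start it first with: ollama serve"
    echo "$reply"
fi
```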

7. Models pulled via Ollama are stored by default at:

/root/.ollama/models
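If the root partition is small, the storage location can be redirected with the `OLLAMA_MODELS` environment variable, which `ollama serve` reads at startup. A sketch, where `$HOME/ollama-models` is an assumed example path:

```shell
# Sketch: point Ollama at a different models directory. Set the variable
# before starting the server; the path below is only an example.
export OLLAMA_MODELS="$HOME/ollama-models"
mkdir -p "$OLLAMA_MODELS"
echo "models dir: $OLLAMA_MODELS"
```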

Installing open-webui

1. Configure a pip mirror (Tsinghua index):

pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple

2. Install open-webui with pip:

pip install open-webui

3. Start open-webui:

open-webui serve

4. If startup reports an error, run the following (the cause is being unable to reach Hugging Face):

export HF_ENDPOINT="https://hf-mirror.com"
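That export only lasts for the current shell. To make the mirror setting persistent, one might append it to `~/.bashrc` (a sketch, using the https form of the mirror; adjust for your shell):

```shell
# Sketch: persist the Hugging Face mirror endpoint across shells.
export HF_ENDPOINT="https://hf-mirror.com"
# Append once to ~/.bashrc so new login shells inherit it.
grep -q 'HF_ENDPOINT' ~/.bashrc 2>/dev/null || \
    echo 'export HF_ENDPOINT="https://hf-mirror.com"' >> ~/.bashrc
echo "HF_ENDPOINT=$HF_ENDPOINT"
```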