Deploying and Running the Local Version of AlphaFold (localcolabfold) on a Server

Our lab server was upgraded with an RTX 3090 to run AlphaFold; this post documents the deployment process.

 

1. Environment Preparation

 

# Download and install the latest Miniconda

mkdir -p ~/miniconda3

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh

bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3

rm -rf ~/miniconda3/miniconda.sh
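
The installer above runs in batch mode (-b), so conda is not registered in the shell automatically. A minimal sketch to initialize it for bash and confirm it works, assuming the default ~/miniconda3 prefix used above:

# Register conda in ~/.bashrc and reload the shell configuration
~/miniconda3/bin/conda init bash
source ~/.bashrc

# Verify that conda is now on PATH
conda --version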

 

# Install or update `curl`, `git`, and `wget`

sudo apt update

sudo apt -y install curl git wget

 

# Install gcc

sudo apt update

sudo apt install build-essential

 

# Check the gcc version

gcc --version

gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0

Copyright (C) 2021 Free Software Foundation, Inc.

This is free software; see the source for copying conditions.  There is NO

warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

 

# Download and install CUDA

#https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&Distribution=WSL-Ubuntu&target_version=2.0&target_type=deb_local

# Selections on the CUDA download page: Linux / x86_64 / WSL-Ubuntu / 2.0 / deb (local)

wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin

sudo mv cuda-wsl-ubuntu.pin /etc/apt/preferences.d/cuda-repository-pin-600

wget https://developer.download.nvidia.com/compute/cuda/12.3.2/local_installers/cuda-repo-wsl-ubuntu-12-3-local_12.3.2-1_amd64.deb

sudo dpkg -i cuda-repo-wsl-ubuntu-12-3-local_12.3.2-1_amd64.deb

sudo cp /var/cuda-repo-wsl-ubuntu-12-3-local/cuda-*-keyring.gpg /usr/share/keyrings/

sudo apt-get update

sudo apt-get -y install cuda-toolkit-12-3
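
Before moving on, it is worth verifying that both the driver and the toolkit are visible. A minimal sketch, assuming the toolkit landed in the default /usr/local/cuda-12.3 prefix (adjust the path if yours differs):

# Driver side: should list the RTX 3090 and the driver/CUDA runtime versions
nvidia-smi

# Toolkit side: nvcc is not on PATH by default, so point to the install prefix first
export PATH="/usr/local/cuda-12.3/bin:$PATH"
export LD_LIBRARY_PATH="/usr/local/cuda-12.3/lib64:$LD_LIBRARY_PATH"
nvcc --version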

 

# Adjust the jax version to match what the current ColabFold release requires

# Upgrade jax: cuda12_pip pulls in the CUDA libraries via pip, while cuda12_local uses the CUDA toolkit installed above (pick one of the two commands)

pip install -U "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html

pip install --upgrade "jax[cuda12_local]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html

# Or pin jax to 0.4.23 if the current ColabFold requires it

pip install --upgrade "jax[cuda12_pip]==0.4.23" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
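
Whichever of the commands above you use, a quick check that jax actually sees the GPU saves debugging later. Run this inside the same Python environment the pip installs targeted:

# Should list a CUDA/GPU device rather than only a CPU device
python -c "import jax; print(jax.devices())"

# Confirm the installed jax version
python -c "import jax; print(jax.__version__)"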


2. Deploying LocalColabFold

# Install LocalColabFold

wget https://raw.githubusercontent.com/YoshitakaMo/localcolabfold/main/install_colabbatch_linux.sh

bash install_colabbatch_linux.sh

 

# Add the ColabFold bin directory to the PATH environment variable

nano ~/.bashrc

# Append the following lines to ~/.bashrc (the install prefix here is /home/localcolabfold; adjust it to wherever install_colabbatch_linux.sh was run)

export PATH="/home/localcolabfold/colabfold-conda/bin:$PATH"

export TF_FORCE_UNIFIED_MEMORY="1"

export XLA_PYTHON_CLIENT_MEM_FRACTION="4.0"

export XLA_PYTHON_CLIENT_ALLOCATOR="platform"

export TF_FORCE_GPU_ALLOW_GROWTH="true"

source ~/.bashrc
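
With the new PATH in place, the ColabFold executables should resolve to the local installation (the expected path below follows the /home/localcolabfold prefix used above):

# Expected output: /home/localcolabfold/colabfold-conda/bin/colabfold_batch
which colabfold_batch

# Printing the help text is a cheap sanity check that the binary runs
colabfold_batch --help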

 

# Test run

conda activate /home/localcolabfold/colabfold-conda

colabfold_batch <directory_with_fasta_files> <result_dir>
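
colabfold_batch accepts either a single FASTA file or a directory of FASTA files as its first argument. As a concrete end-to-end test, the sketch below predicts one short sequence; the sequence and paths are placeholders, not a real target:

# Create a throwaway input file with an arbitrary placeholder sequence
mkdir -p ~/colabfold_test
cat > ~/colabfold_test/test.fasta << 'EOF'
>test_protein
MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ
EOF

# Run the prediction; PDB models, PAE plots, and logs are written to the result directory
colabfold_batch ~/colabfold_test/test.fasta ~/colabfold_test/results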

 
