Deploying TensorFlow Serving (CPU + GPU) with nvidia-docker: environment setup
Official guide: TensorFlow Serving with Docker
1. Deploying the TensorFlow Serving CPU version with Docker
# Download the TensorFlow Serving Docker image and repo
docker pull tensorflow/serving
git clone https://github.com/tensorflow/serving
# Location of demo models
TESTDATA="$(pwd)/serving/tensorflow_serving/servables/tensorflow/testdata"
# Start TensorFlow Serving container and open the REST API port
# Option 1: run in the foreground, remove the container on exit
sudo docker run -t --rm -p 8501:8501 \
  -v "/home/mochen/tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu:/models/half_plus_two" \
  -e MODEL_NAME=half_plus_two tensorflow/serving
# Option 2: run detached (-d) with a pseudo-TTY
sudo docker run -dt -p 8501:8501 \
  -v "/home/mochen/tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu:/models/half_plus_two" \
  -e MODEL_NAME=half_plus_two tensorflow/serving
# Option 3: run detached, using --mount instead of -v, and name the container
sudo docker run -d -p 8501:8501 \
  --mount type=bind,source=/home/mochen/tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu/,target=/models/half_plus_two \
  -e MODEL_NAME=half_plus_two -t --name testserver tensorflow/serving
# Query the model using the predict API
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
  -X POST http://localhost:8501/v1/models/half_plus_two:predict
# Returns => { "predictions": [2.5, 3.0, 4.5] }
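The same predict call can be made from Python. A minimal standard-library sketch; the URL assumes the container above is running locally with port 8501 mapped, and the local `expected` values follow from the demo model's function y = 0.5x + 2:

```python
import json
import urllib.request

# REST predict endpoint exposed by the container started above
URL = "http://localhost:8501/v1/models/half_plus_two:predict"

instances = [1.0, 2.0, 5.0]
payload = json.dumps({"instances": instances}).encode("utf-8")

# half_plus_two computes y = 0.5 * x + 2, so the answer is known in advance
expected = [0.5 * x + 2.0 for x in instances]  # [2.5, 3.0, 4.5]

def predict(url: str = URL, data: bytes = payload):
    """POST the JSON payload to TensorFlow Serving, return 'predictions'."""
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]

if __name__ == "__main__":
    print("expected:", expected)
    try:
        print("served:  ", predict())
    except OSError as err:  # container not running / port not mapped
        print("server not reachable:", err)
```

The gRPC port (8500) can be used instead for the mnist client mentioned below, but it requires the `tensorflow-serving-api` package rather than plain HTTP.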

This article covers deploying both the CPU and GPU versions of TensorFlow Serving with NVIDIA-Docker. The CPU section deals with the Docker image, port mapping, and environment-variable configuration. The GPU section details the nvidia-docker installation steps and how to launch a GPU-enabled TensorFlow Serving service. It also walks through serving the handwritten-digit mnist model over gRPC: client configuration, model training and saving, and running the service.