README

Export and serve a model with TensorFlow Serving

A lightweight, RESTful remote inference library for decoupling deep learning development from deployment.
Includes serving a trained Keras model for pointer-based meter reading.

Usage

  1. Follow the instructions in save_keras_model.py to export the model to a SavedModel/1 folder. Run python save_keras_model.py --output_dir $(path) --model_version $(model_version) in your shell to specify the output directory and the model version for TensorFlow Serving.
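The export script itself is not reproduced here, but the versioned-directory layout it must produce can be sketched as follows. The build_export_path and export_keras_model helper names are illustrative assumptions, not part of the repo; the TensorFlow calls assume a TF 2.x installation.

```python
import os


def build_export_path(output_dir, model_version):
    # TensorFlow Serving expects each model version in a numeric
    # subdirectory, e.g. <output_dir>/1/saved_model.pb plus variables/.
    return os.path.join(output_dir, str(model_version))


def export_keras_model(keras_model_path, output_dir, model_version):
    # Hypothetical export step (assumption: TensorFlow 2.x available).
    import tensorflow as tf

    model = tf.keras.models.load_model(keras_model_path)
    tf.saved_model.save(model, build_export_path(output_dir, model_version))
```

Passing --model_version 1 therefore yields $(path)/1, which is the SavedModel/1 layout the next steps copy into the container.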

  2. Make sure the tensorflow/serving Docker image has been pulled: run docker images in your shell and check whether tensorflow/serving is listed. If not, run docker pull tensorflow/serving; the download takes a couple of minutes.

  3. In your terminal, run docker run -d --name serving_base tensorflow/serving to start a Docker container on your local machine.

  4. Make a directory for the exported model and copy the SavedModel into it:
    mkdir -p /tmp/pointer_model
    cp -r $(Path_to_SavedModel) /tmp/pointer_model

  5. Run docker cp /tmp/pointer_model serving_base:/models/pointer_model to copy the exported model into the Docker container.

  6. (Optional) Stop the serving_base container by running docker kill serving_base.

  7. Run docker run -p 8501:8501 --mount type=bind,source=/tmp/pointer_model,target=/models/pointer_model -e MODEL_NAME=pointer_model -t tensorflow/serving. Inside the container, this command starts tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME}, exposing the REST API on port 8501.
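Once the container is up, you can confirm the model loaded by querying TensorFlow Serving's model-status REST endpoint. The host and port below match the -p 8501:8501 mapping above; check_model_status is an illustrative helper name, not part of the repo.

```python
import json
from urllib.request import urlopen


def status_url(host, port, model_name):
    # TF Serving's REST model-status endpoint: GET /v1/models/<name>
    return "http://{}:{}/v1/models/{}".format(host, port, model_name)


def check_model_status(host="localhost", port=8501, model_name="pointer_model"):
    # Returns the parsed status response; the reported state should be
    # "AVAILABLE" once the model version has finished loading.
    with urlopen(status_url(host, port, model_name)) as resp:
        return json.load(resp)
```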

  8. Now you can run inference on an image from your terminal: python pointer_server_client.py -i $(input_image_path)
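If you prefer to call the REST API directly rather than going through pointer_server_client, the request follows TensorFlow Serving's standard predict payload. This is a minimal stdlib-only sketch; the build_predict_request and predict helper names are assumptions, and any image preprocessing must match what the exported model actually expects.

```python
import json
from urllib.request import Request, urlopen


def build_predict_request(instances, host="localhost", port=8501,
                          model_name="pointer_model"):
    # TF Serving's REST predict endpoint and JSON body:
    #   POST /v1/models/<name>:predict   {"instances": [...]}
    url = "http://{}:{}/v1/models/{}:predict".format(host, port, model_name)
    body = json.dumps({"instances": instances}).encode("utf-8")
    return Request(url, data=body,
                   headers={"Content-Type": "application/json"})


def predict(instances):
    # instances: nested lists matching the model's input shape,
    # e.g. one HxWx3 image with pixel values scaled to [0, 1].
    with urlopen(build_predict_request(instances)) as resp:
        return json.load(resp)["predictions"]
```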
