How to run TF Hub locally without an internet connection

Thanks to Xianbo QIAN

https://blog.csdn.net/bleedingfight/article/details/80768306

https://github.com/tensorflow/hub/issues/53

TF Hub provides access to a collection of models made freely available by Google. However, it's sometimes hard to use because of the long delay in downloading a model, especially for production deployment.

Here I'm going to show you an undocumented hack that helps you fetch a model and put it in the cache, so that you won't need to download it again even after moving to another machine.

I personally tested this approach with TF 1.9, but I have no guarantee that it will keep working indefinitely.

Step 0: Find the model

Go to tfhub.dev, then Modules, pick a module you'd like to use, and copy its URL. It looks something like this: https://tfhub.dev/google/imagenet/inception_v1/feature_vector/1

Step 1: Get the real download path

The module URL above is a convenient way to find more information about the model: if you copy and paste it into your browser, you'll be redirected to the module's home page.

So how can we find the real download path of that module? Easy: just replace tfhub.dev with storage.googleapis.com/tfhub-modules and append .tar.gz as a suffix.

For example, the real download path for the module above will be https://storage.googleapis.com/tfhub-modules/google/imagenet/inception_v1/feature_vector/1.tar.gz
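
If you'd rather script the transformation, here is a minimal sketch (the helper name download_url_for is my own, not part of any library):

def download_url_for(handle):
    # Swap the host and append the archive suffix, as described above.
    return handle.replace("https://tfhub.dev/",
                          "https://storage.googleapis.com/tfhub-modules/") + ".tar.gz"

print(download_url_for("https://tfhub.dev/google/imagenet/inception_v1/feature_vector/1"))
# https://storage.googleapis.com/tfhub-modules/google/imagenet/inception_v1/feature_vector/1.tar.gz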

Step 2: Prepare the cache

On some platforms TF Hub will log the cache directory, but on others it won't. Specifying the cache location in code is much more reliable. Just put the following lines in your file before calling TF Hub:

import os
os.environ["TFHUB_CACHE_DIR"] = "/tmp/tfhub"

It's important to understand the structure of this cache directory before putting anything into it. Each module is saved in a standalone folder named after the SHA-1 hash of its handle, which can be calculated with the following Python code:

import hashlib

handle = "https://tfhub.dev/google/imagenet/inception_v1/feature_vector/1"
print(hashlib.sha1(handle.encode("utf8")).hexdigest())

The output will be something like f002061d9dee6acda3f90d591a65dbab7627f665.

To use modules from the cache directory instead of downloading them from the cloud, you'll need to uncompress the .tar.gz file and put its contents under the cache directory, in a folder named with the hash calculated above (a scripted version of this follows the directory listing below).

Your final cache directory should look like the following:

/tmp/tfhub/                                    # your cache directory
    f002061d9dee6acda3f90d591a65dbab7627f665/  # a module folder
        assets/
        variables/
        saved_model.pb
        tfhub_module.pb
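
Putting the pieces together, here is a minimal Python 3 sketch that downloads a module and unpacks it straight into the cache. Run it once on a machine with internet access, then copy /tmp/tfhub to the offline machine:

import hashlib
import os
import tarfile
import urllib.request

handle = "https://tfhub.dev/google/imagenet/inception_v1/feature_vector/1"
cache_dir = "/tmp/tfhub"

# The folder name inside the cache is the SHA-1 hash of the handle.
module_dir = os.path.join(cache_dir, hashlib.sha1(handle.encode("utf8")).hexdigest())
os.makedirs(module_dir, exist_ok=True)

# Fetch the .tar.gz from the real download path and unpack it in place.
# The archive's files sit at the top level, so they land directly in module_dir.
url = handle.replace("https://tfhub.dev/",
                     "https://storage.googleapis.com/tfhub-modules/") + ".tar.gz"
archive_path, _ = urllib.request.urlretrieve(url)
with tarfile.open(archive_path) as archive:
    archive.extractall(module_dir)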

Well done. Now get back to your TF Hub program and feel the magic!
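
For completeness, here is a sketch of the calling side in TF 1.x. With the cache pre-populated, hub.Module resolves the handle locally instead of downloading it (the 224x224 input shape is what this inception_v1 module expects):

import os
os.environ["TFHUB_CACHE_DIR"] = "/tmp/tfhub"  # must be set before the module is resolved

import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

# Resolves from the local cache, no network needed.
module = hub.Module("https://tfhub.dev/google/imagenet/inception_v1/feature_vector/1")
images = tf.placeholder(tf.float32, shape=[None, 224, 224, 3])
features = module(images)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(tf.tables_initializer())
    batch = np.zeros([1, 224, 224, 3], dtype=np.float32)
    print(sess.run(features, feed_dict={images: batch}).shape)  # (1, 1024) for this module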
