A quick fix for tensorflow-hub usage problems!!
After countless rounds of trial and error I finally got tensorflow-hub running, so I am writing it up here to save you the same detours.
1. Install the libraries with pip
First, make sure you install the following packages with pip. (Reason: tensorflow-text has to be installed with pip, conda cannot find it in its channels; installing tensorflow-text also pulls in tensorflow automatically.)
tensorflow 2.6.2
tensorflow-addons 0.16.1
tensorflow-estimator 2.6.0
tensorflow-hub 0.8.0
tensorflow-text 2.6.0
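If you want to double-check that these exact versions landed in your environment, a quick check like the one below works (my own sketch, not part of the original setup; it just reads the installed package metadata):
from importlib.metadata import version

# Print the installed version of each package pinned above; a mismatch here
# usually explains the import errors discussed later.
for pkg in ["tensorflow", "tensorflow-addons", "tensorflow-estimator",
            "tensorflow-hub", "tensorflow-text"]:
    print(pkg, version(pkg))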
2. Install tensorflow-gpu with conda
Use conda to install the GPU build of tensorflow:
tensorflow 2.6.0
That's it, you can go run your code now!
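Before moving on, it is worth confirming that the conda-installed GPU build is the one actually being picked up. A minimal check (my own sketch, assuming the TensorFlow 2.6 install described above):
import tensorflow as tf

print(tf.__version__)                          # should print 2.6.x
print(tf.config.list_physical_devices("GPU"))  # an empty list means the CPU-only build is in use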
Error analysis:
- No module named 'keras'; fix: tensorflow was not installed with pip (see step 1).
- KeyError: 'CaseFoldUTF8'; fix: tensorflow-text was not installed, and you also need import tensorflow_text as text in your script.
- hub.KerasLayer(URL): the model behind the URL cannot be downloaded, for example:
import tensorflow_hub as hub
import tensorflow_text as text  # registers the preprocessing ops (e.g. CaseFoldUTF8)

# Load the text preprocessing layer for BERT inputs.
preprocess = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/2",
    name="text_preprocessing",
)
# Load the pre-trained BERT model to be used as the base encoder.
bert = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/1",
    name="bert",
)
Fixes:
(1) The links point to servers hosted abroad, so you need a way to reach them for the download (you know what I mean).
(2) Paste the URL into a browser manually; you can download the model as a folder (extract the archive), then change the URL in the code to the local path.
preprocess = hub.KerasLayer(
    r"D:\bert_en_uncased_preprocess_3",
    name="text_preprocessing",
)
# Load the pre-trained BERT model to be used as the base encoder.
bert = hub.KerasLayer(
    r"D:\small_bert_bert_en_uncased_L-4_H-512_A-8_2",
    name="bert",
)
(3) Change the URL:
#tfhub_bert="https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1"
tfhub_bert="https://storage.googleapis.com/tfhub-modules/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/1.tar.gz"
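To confirm the whole setup works end to end, here is a minimal sketch (my own, not from the original post): it loads the preprocessing layer and the small BERT encoder, pushes one sentence through them, and prints the pooled output shape. Swap the tfhub.dev handles for your local folders (fix 2) or the storage.googleapis.com URL pattern (fix 3) if the original links are unreachable in your environment.
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text as text  # registers the CaseFoldUTF8 op used by the preprocessor

preprocess = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/2",
    name="text_preprocessing",
)
bert = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/1",
    name="bert",
)

sentences = tf.constant(["tensorflow hub finally works"])
encoder_inputs = preprocess(sentences)   # dict with input_word_ids / input_mask / input_type_ids
outputs = bert(encoder_inputs)           # dict with pooled_output / sequence_output / ...
print(outputs["pooled_output"].shape)    # expected: (1, 512) for this small BERT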