Source: the BERT pretrained-model path page has a more complete set of links.
The first time you run BertModel.from_pretrained('bert-base-chinese') directly, the model download is very slow, so download the model in advance.
Pitfall: downloading these external links in a browser is also very slow T_T; pasting the link into Thunder (迅雷) is much faster.
Saving the model download links here:
'bert-base-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz",
'bert-large-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz",
'bert-base-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz",
'bert-large-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz",
'bert-base-multilingual-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz",
'bert-base-multilingual-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz",
'bert-base-chinese': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz"
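The links above can be bundled into a small helper so that a model archive is fetched once, ahead of time, instead of during the first from_pretrained call. A minimal sketch, assuming plain urllib is acceptable for the download (the dict simply mirrors the list above; the predownload helper and its names are my own, not part of the BERT library):

```python
import urllib.request
from pathlib import Path

# Model-name → archive-URL map, copied from the list above.
PRETRAINED_MODEL_URLS = {
    'bert-base-uncased': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz',
    'bert-large-uncased': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz',
    'bert-base-cased': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz',
    'bert-large-cased': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz',
    'bert-base-multilingual-uncased': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz',
    'bert-base-multilingual-cased': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz',
    'bert-base-chinese': 'https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz',
}

def predownload(model_name: str, target_dir: str = '.') -> Path:
    """Fetch the .tar.gz archive for model_name into target_dir and
    return its local path; skip the download if the file already exists."""
    url = PRETRAINED_MODEL_URLS[model_name]
    dest = Path(target_dir) / url.rsplit('/', 1)[-1]
    if not dest.exists():
        urllib.request.urlretrieve(url, dest)
    return dest
```

The returned local path can then be passed to BertModel.from_pretrained in place of the model name, as shown below. (In practice a download manager like Thunder may still be faster than urllib for these links.)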
Usage: pass the local archive path instead of the model name (note the raw string, since the path contains backslashes):
model = BertModel.from_pretrained(r'F:\1-数据库\1-1-自然语言\bert-base-uncased.tar.gz')
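One Windows-specific pitfall worth spelling out: in an ordinary Python string literal, backslash sequences such as \1 are parsed as escapes and silently corrupt the path, which is why the call above uses a raw string. A quick illustration (the F:\1-data path here is a made-up example):

```python
# In a normal string literal, "\1" is the octal escape for chr(1),
# so the backslash and the digit disappear from the path.
broken = 'F:\1-data'     # 'F:' + '\x01' + '-data' → 8 characters
fixed = r'F:\1-data'     # raw string keeps '\' and '1' → 9 characters
also_ok = 'F:/1-data'    # forward slashes also work on Windows
print(len(broken), len(fixed))
```

Either the raw-string form or forward slashes avoids the problem; doubled backslashes ('F:\\1-data') work as well.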