from datasets import load_dataset
from transformers import AutoTokenizer, DataCollatorWithPadding
raw_datasets = load_dataset("glue", "mrpc")
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, cache_dir="/root/userfolder/HuggingFace/cache")
def tokenize_function(example):
    return tokenizer(example["sentence1"], example["sentence2"], truncation=True)
tokenized_datasets = raw_datasets.map(tokenize_function, batched=True)
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)
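Because `tokenize_function` only truncates and does not pad, padding is deferred to the collator, so each batch is padded only to its own longest sequence. A minimal pure-Python sketch of that dynamic-padding behaviour (not the real `DataCollatorWithPadding` implementation, which returns tensors):

```python
def pad_batch(sequences, pad_id=0):
    """Pad variable-length token-id lists to the longest one in this batch,
    mirroring what DataCollatorWithPadding does (sketch only, no tensors)."""
    max_len = max(len(s) for s in sequences)
    input_ids = [s + [pad_id] * (max_len - len(s)) for s in sequences]
    # 1 marks real tokens, 0 marks padding positions.
    attention_mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in sequences]
    return {"input_ids": input_ids, "attention_mask": attention_mask}

batch = pad_batch([[101, 2023, 2003, 102], [101, 102]])
print(batch["input_ids"])       # → [[101, 2023, 2003, 102], [101, 102, 0, 0]]
print(batch["attention_mask"])  # → [[1, 1, 1, 1], [1, 1, 0, 0]]
```
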
Error:
cannot import name 'create_repo' from 'huggingface_hub'
Following the answer in this GitHub issue:
https://github.com/huggingface/transformers/issues/15062
The problem was solved after upgrading datasets, transformers, and huggingface_hub to their latest versions.
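After upgrading (e.g. `pip install -U datasets transformers huggingface_hub`), a quick stdlib check confirms which versions are actually installed in the active environment, which is useful when two machines are involved as in the Win10-vs-server comparison here:

```python
import importlib.metadata


def installed_version(package):
    """Return the installed version of `package`, or None if it is absent."""
    try:
        return importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return None


for pkg in ("datasets", "transformers", "huggingface_hub"):
    print(pkg, installed_version(pkg) or "not installed")
```
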
The next problem showed up when defining the hyperparameters.
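The post doesn't show the hyperparameter code; in the HuggingFace course chapter this snippet comes from, that step is presumably `TrainingArguments` (an assumption, since the original code isn't shown), roughly:

```python
from transformers import TrainingArguments

# Hypothetical sketch: the output directory name "test-trainer" is an
# assumption, not something shown in the original post.
training_args = TrainingArguments("test-trainer")
```
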
This is the import speed on the Win10 machine: a little over a minute.
But the server has already been at it a whole night, which is ridiculous, folks.
What on earth is the problem? Both sides run Python 3.9.7.
No idea. Changing the environment, changing the paths, and rewriting the parameters all failed, so I'm giving up for now.