The error is caused by using multiple worker processes in the data iterator. It is triggered by this line:

for (tokens_X, segments_X, valid_lens_x, pred_positions_X, mlm_weights_X, mlm_Y, nsp_y) in train_iter:

where train_iter was constructed with:
num_workers = d2l.get_dataloader_workers()
train_iter = torch.utils.data.DataLoader(train_set, batch_size, shuffle=True, num_workers=num_workers)
Fix: the simplest workaround is to drop the num_workers=num_workers argument, so the data is loaded in the main process:
train_iter = torch.utils.data.DataLoader(train_set, batch_size, shuffle=True)
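A minimal, self-contained sketch of the workaround is below. The toy TensorDataset here is a hypothetical stand-in for the real train_set; the point is only that omitting num_workers (which then defaults to 0) keeps data loading in the main process, so no worker subprocesses are spawned:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset standing in for the real train_set
X = torch.arange(20, dtype=torch.float32).reshape(10, 2)
y = torch.zeros(10, dtype=torch.long)
train_set = TensorDataset(X, y)

batch_size = 4
# num_workers omitted -> defaults to 0: batches are produced in the
# main process, avoiding the multiprocessing error from worker spawning
train_iter = DataLoader(train_set, batch_size, shuffle=True)

for features, labels in train_iter:
    print(features.shape, labels.shape)
```

If you do want to keep num_workers > 0 (e.g. for speed), the usual alternative on platforms that spawn worker processes is to guard the training code with an `if __name__ == '__main__':` block, though dropping the workers is the quicker fix.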