Check the log to locate where the error is raised. A quick workaround is to call torch.use_deterministic_algorithms(False) right before scaler.scale(loss).backward(), so ops that lack deterministic implementations no longer raise:
torch.use_deterministic_algorithms(False) # added
scaler.scale(loss).backward()
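The two lines above sit inside an AMP training step. The following is a minimal, self-contained sketch of such a step showing where the call goes; the model, optimizer, and data here are hypothetical stand-ins, and the GradScaler is disabled on CPU-only machines so the snippet still runs there.

```python
import torch

# Hypothetical stand-ins for the real model, optimizer, and batch.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# GradScaler only does real scaling on CUDA; disabled it is a no-op passthrough.
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

x = torch.randn(8, 4)
target = torch.randn(8, 2)
loss = torch.nn.functional.mse_loss(model(x), target)

# Relax the determinism requirement just before the backward pass,
# so ops without deterministic implementations no longer raise.
torch.use_deterministic_algorithms(False)  # added
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```

Note that this trades reproducibility for the error going away: any ops that previously raised will now run non-deterministically.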