PyTorch: convert every BatchNorm2d layer in a model to a SyncBatchNorm layer (single-machine, multi-GPU setup):
import torch
import torch.distributed as dist

# Initialize the default process group. For real multi-GPU training, each
# process should use the NCCL backend with its own rank and the actual
# world_size; the gloo/world_size=1 call below is only a single-process
# placeholder (e.g. for converting a model before export or debugging).
dist.init_process_group(backend='gloo', init_method='file:///tmp/somefile', rank=0, world_size=1)

# Recursively replace all BatchNorm*d layers in `model` with SyncBatchNorm.
model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)
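For reference, `convert_sync_batchnorm` walks the module tree recursively, so batch-norm layers inside nested containers are replaced as well, and the replacement itself needs neither a GPU nor an initialized process group (those are only required when a SyncBatchNorm layer actually runs a forward pass under DDP). A minimal sketch with a hypothetical toy model, verifying the conversion:

```python
import torch
import torch.nn as nn

# Hypothetical toy model with BatchNorm2d layers, one inside a
# nested container, to show that conversion is recursive.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),
    nn.Sequential(
        nn.Conv2d(8, 8, kernel_size=3, padding=1),
        nn.BatchNorm2d(8),
    ),
)

# Replace every BatchNorm*d layer with SyncBatchNorm, recursively.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)

# Collect the class names of all remaining batch-norm layers.
bn_types = [type(m).__name__ for m in model.modules()
            if isinstance(m, nn.modules.batchnorm._BatchNorm)]
print(bn_types)  # every entry should be 'SyncBatchNorm'
```

In an actual multi-GPU training script, this conversion is typically done once on each process, before wrapping the model in `torch.nn.parallel.DistributedDataParallel` with the NCCL backend.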