PyTorch (6): Model Tuning Tricks

1. Regularization

1.1 L1 Regularization

import torch
from torch import nn

device = torch.device("cuda:0")
MLP = nn.Sequential(nn.Linear(128, 64),
                    nn.ReLU(inplace=True),
                    nn.Linear(64, 32),
                    nn.ReLU(inplace=True),
                    nn.Linear(32, 10)
)
MLP.to(device)
loss_classify = nn.CrossEntropyLoss().to(device)

x = torch.randn(8, 128, device=device)         # dummy batch so the snippet runs end to end
y = torch.randint(0, 10, (8,), device=device)  # dummy labels

# L1 norm: sum of the absolute values of all parameters
lambda_l1 = 0.01  # example regularization strength
l1_loss = 0
for param in MLP.parameters():
    l1_loss += torch.sum(torch.abs(param))
# total loss = classification loss + weighted L1 penalty
loss = loss_classify(MLP(x), y) + lambda_l1 * l1_loss

1.2 L2 Regularization

import torch
from torch import nn

device = torch.device("cuda:0")
MLP = nn.Sequential(nn.Linear(128, 64),
                    nn.ReLU(inplace=True),
                    nn.Linear(64, 32),
                    nn.ReLU(inplace=True),
                    nn.Linear(32, 10)
)
MLP.to(device)

# L2 norm: implemented through the optimizer's weight_decay argument
opt = torch.optim.SGD(MLP.parameters(), lr=0.001, weight_decay=0.1)  # weight_decay applies L2 regularization
criterion = nn.CrossEntropyLoss().to(device)
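For reference (not in the original post), the same effect can be sketched by adding the squared-norm penalty to the loss by hand. Note that weight_decay adds weight_decay * w to each gradient, which corresponds to (weight_decay / 2) * ||w||^2 in the objective; lambda_l2 and the dummy x, y below are illustrative values only.

lambda_l2 = 0.1
l2_loss = sum(torch.sum(p.pow(2)) for p in MLP.parameters())  # squared L2 norm of all parameters
x = torch.randn(8, 128, device=device)          # dummy batch
y = torch.randint(0, 10, (8,), device=device)   # dummy labels
total_loss = criterion(MLP(x), y) + 0.5 * lambda_l2 * l2_loss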

2. Momentum and Learning Rate Decay

2.1 Momentum

opt = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.78, weight_decay=0.1)
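As a quick reference (not in the original post) for what the momentum argument does: with the default dampening=0 and nesterov=False, SGD keeps a running velocity of past gradients and steps along it. The sketch below is an illustration of that update rule with hypothetical names (sgd_momentum_step, p, v, g), not PyTorch's internal implementation.

import torch

def sgd_momentum_step(param, grad, velocity, lr=0.001, mu=0.78):
    # velocity accumulates an exponentially weighted sum of past gradients
    velocity.mul_(mu).add_(grad)   # v <- mu * v + g
    param.sub_(lr * velocity)      # p <- p - lr * v
    return param, velocity

# one illustrative step on a toy parameter
p, v, g = torch.zeros(3), torch.zeros(3), torch.ones(3)
p, v = sgd_momentum_step(p, g, v)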

2.2 Learning Rate Tuning

  • torch.optim.lr_scheduler.ReduceLROnPlateau(): reduce the learning rate when the monitored loss stops decreasing
  • torch.optim.lr_scheduler.StepLR(): decay the learning rate by a fixed factor every fixed number of epochs
# ReduceLROnPlateau: reduce the LR when the monitored loss has stopped improving
opt = torch.optim.SGD(net.parameters(), lr=1)
lr_scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer=opt, mode="min", factor=0.1, patience=10)
for epoch in range(1000):
    loss_val = train(...)
    lr_scheduler.step(loss_val)  # pass the monitored loss to the scheduler

# StepLR: multiply the LR by gamma every step_size epochs
opt = torch.optim.SGD(net.parameters(), lr=1)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer=opt, step_size=30, gamma=0.1)
for epoch in range(1000):
    train(...)
    lr_scheduler.step()  # call once per epoch, after the training step
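A quick way to confirm the decay is taking effect (not part of the original post) is to read the learning rate back from the optimizer inside the epoch loop:

    print(f"epoch {epoch}: lr = {opt.param_groups[0]['lr']}")  # current LR of the first param group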

3. Early Stopping

Stop training once the validation metric has not improved for a set number of epochs, so the model does not keep overfitting the training set; a minimal sketch follows.
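The sketch below assumes you track validation loss once per epoch; the EarlyStopping class and the train()/validate() placeholders are illustrative, not from the original post.

import torch

class EarlyStopping:
    """Stop training when the monitored loss has not improved for `patience` epochs."""
    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss   # new best value, reset the counter
            self.counter = 0
        else:
            self.counter += 1      # no improvement this epoch
        return self.counter >= self.patience  # True -> stop training

# usage (train/validate stand in for your own loops):
# stopper = EarlyStopping(patience=10)
# for epoch in range(1000):
#     train(...)
#     if stopper.step(validate(...)):
#         break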

4. Dropout

from torch import nn

model = nn.Sequential(
    nn.Linear(256, 128),
    nn.Dropout(p=0.5),  # randomly zero 50% of activations during training
    nn.ReLU(),
)
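Not in the original post, but worth noting: nn.Dropout is only active in training mode, so switch the module with train()/eval() around evaluation. A minimal usage sketch for the model above:

import torch

model.train()                      # dropout zeroes activations with probability p=0.5
out = model(torch.randn(4, 256))   # stochastic during training
model.eval()                       # dropout is a no-op in eval mode
with torch.no_grad():
    preds = model(torch.randn(4, 256))  # deterministic at inference time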

by CyrusMay 2022 07 03
