Among PyTorch's optimizers, if you call LBFGS the same way you would call any other optimizer, like this:
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from tqdm import tqdm

criterion = nn.MSELoss()
optimizer = torch.optim.LBFGS(params=densenet2.parameters(), lr=1e-5)
dataset = DataLoader(dataset=data_gen,
                     batch_size=32,
                     shuffle=True)
epochs = 50
for epoch in tqdm(range(epochs)):
    for data in dataset:
        optimizer.zero_grad()
        input, output = data
        pred_output = densenet2(input)
        loss = criterion(pred_output, output)
        loss.backward()
        optimizer.step()
then you will get this error:
TypeError: LBFGS.step() missing 1 required positional argument: 'closure'
This is because this optimizer's calling convention differs from the usual one: L-BFGS may need to re-evaluate the loss and gradients several times within a single step, so step() requires a closure argument that clears the gradients, recomputes the loss, calls backward(), and returns the loss. It can be written like this:
for data in dataset:
    input, output = data

    def closure():
        # LBFGS may call this several times per step, so clear gradients here
        optimizer.zero_grad()
        pred_output = densenet2(input)
        loss = criterion(pred_output, output)
        # print("batch loss: {:.9f}".format(loss.item()))
        loss.backward()
        return loss

    optimizer.step(closure=closure)
With the closure in place, you can use LBFGS for optimization.
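For reference, here is a minimal self-contained sketch that puts the pieces together. The densenet2 model and data_gen dataset are carried over from the snippets above, but the tiny linear network and random tensors standing in for them are assumptions made purely so the example runs end to end:

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from tqdm import tqdm

# Hypothetical stand-ins for the densenet2 model and data_gen dataset above
densenet2 = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
data_gen = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))

criterion = nn.MSELoss()
optimizer = torch.optim.LBFGS(params=densenet2.parameters(), lr=1e-5)
dataset = DataLoader(dataset=data_gen, batch_size=32, shuffle=True)

epochs = 50
for epoch in tqdm(range(epochs)):
    for input, output in dataset:
        # Define the closure inside the loop so it captures the current batch
        def closure():
            optimizer.zero_grad()
            pred_output = densenet2(input)
            loss = criterion(pred_output, output)
            loss.backward()
            return loss

        optimizer.step(closure)

Note that optimizer.zero_grad() sits inside the closure: LBFGS may invoke the closure multiple times during a single step(), and each evaluation needs freshly computed gradients.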