Bilibili course by 刘二大人, video link: PyTorch Deep Learning Practice, Logistic Regression.
Reference: 错错莫's notes for *PyTorch Deep Learning Practice, Lecture 6*: https://blog.csdn.net/bit452/article/details/109680909
Notes:
1. Logistic regression is a classification model. Compared with the earlier linear regression, it simply adds a sigmoid function after the Linear Unit.
Measuring the difference between two distributions: KL divergence and cross-entropy (see the formulas after these notes).
BCELoss is a special case of CrossEntropyLoss: it only handles binary classification, while CrossEntropyLoss works for both binary and multi-class problems. For binary classification, BCELoss is recommended. Note that BCELoss expects probabilities produced by a sigmoid, whereas CrossEntropyLoss expects raw logits and applies softmax internally (see the comparison sketch below).
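For reference, the model in note 1 and the per-sample binary cross-entropy computed by BCELoss are the standard forms:

```latex
\hat{y} = \sigma(wx + b), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}
```

```latex
\mathrm{BCE}(y, \hat{y}) = -\big[\, y \log \hat{y} + (1 - y)\log(1 - \hat{y}) \,\big]
```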
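To illustrate the special-case relationship, a minimal sketch (the logit value 0.7 is made up for illustration): with two-class logits [0, z] and target class 1, CrossEntropyLoss produces the same value as BCELoss applied to sigmoid(z).

```python
import torch

# one raw score (logit) and a binary target; the value 0.7 is made up for illustration
z = torch.tensor([0.7])
y = torch.tensor([1.0])

# BCELoss expects probabilities, so apply sigmoid first
bce = torch.nn.BCELoss()(torch.sigmoid(z), y)

# CrossEntropyLoss expects one logit per class and applies softmax internally;
# with logits [0, z], the probability of class 1 is exactly sigmoid(z)
logits = torch.stack([torch.zeros_like(z), z], dim=1)   # shape (1, 2)
ce = torch.nn.CrossEntropyLoss()(logits, torch.tensor([1]))

print(bce.item(), ce.item())  # both print the same cross-entropy value
```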
Full source code (with loss-curve visualization):
```python
import torch
import matplotlib.pyplot as plt

# prepare dataset
x_data = torch.Tensor([[1.0], [2.0], [3.0]])
y_data = torch.Tensor([[0], [0], [1]])

# design model using class
class LogisticRegressionModel(torch.nn.Module):
    def __init__(self):
        super(LogisticRegressionModel, self).__init__()
        self.linear = torch.nn.Linear(1, 1)

    def forward(self, x):
        # y_pred = F.sigmoid(self.linear(x))  # F.sigmoid is deprecated; use torch.sigmoid
        y_pred = torch.sigmoid(self.linear(x))
        return y_pred

model = LogisticRegressionModel()

# construct loss and optimizer
# By default the loss is averaged over elements; reduction='sum' accumulates it instead
# (the older size_average=False argument is deprecated).
criterion = torch.nn.BCELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

loss_list = []
epoch_list = []

# training cycle: forward, backward, update
for epoch in range(1000):
    y_pred = model(x_data)
    loss = criterion(y_pred, y_data)
    print(epoch, loss.item())

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    loss_list.append(loss.item())
    epoch_list.append(epoch)

print('w = ', model.linear.weight.item())
print('b = ', model.linear.bias.item())

x_test = torch.Tensor([[4.0]])
y_test = model(x_test)
print('y_pred = ', y_test.data)

plt.plot(epoch_list, loss_list)
plt.xlabel('epoch')
plt.ylabel('cost')
plt.show()
```
Partial output:
997 1.1217535734176636
998 1.1211905479431152
999 1.1206283569335938
w = 1.0719727277755737
b = -2.595684766769409
y_pred = tensor([[0.8445]])
Visualization of the loss curve:
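Beyond the cost curve, the learned probability curve can also be plotted. A minimal sketch, assuming the trained `model` from the script above is still in scope (the x range 0 to 10 is an arbitrary choice):

```python
import numpy as np
import torch
import matplotlib.pyplot as plt

# assumes `model` has already been trained by the script above
x = np.linspace(0, 10, 200, dtype=np.float32)
x_t = torch.from_numpy(x).view(-1, 1)        # shape (200, 1) to match Linear(1, 1)
with torch.no_grad():
    y = model(x_t).numpy().flatten()         # predicted probability for each x

plt.plot(x, y)
plt.axhline(0.5, c='r', linestyle='--')      # decision threshold at p = 0.5
plt.xlabel('x')
plt.ylabel('probability of class 1')
plt.grid()
plt.show()
```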