PyTorch Basics: Multi-class Classification
Softmax Classifier
Output a Distribution of Predictions with Softmax
Softmax Layer
Softmax Layer - Example
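The Softmax layer turns the raw outputs (logits) z_1, ..., z_K of the last linear layer into a probability distribution over the K classes:

P(y = i) = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \quad i = 1, \dots, K

Every output is positive and the outputs sum to 1. As a worked example with an assumed logit vector z = (0.2, 0.1, -0.1): e^{0.2} ≈ 1.221, e^{0.1} ≈ 1.105, e^{-0.1} ≈ 0.905, their sum is ≈ 3.231, so Softmax(z) ≈ (0.38, 0.34, 0.28).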
Loss function - Cross Entropy
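With a one-hot target Y (Y_c = 1 for the correct class c, 0 elsewhere) and the Softmax output \hat{Y}, the cross-entropy loss for one sample is

\text{Loss} = -\sum_{i=1}^{K} Y_i \log \hat{Y}_i = -\log \hat{Y}_c

so only the predicted probability of the correct class contributes, and the loss is small exactly when that probability is close to 1. Continuing the example above with the first class as the correct one, Loss ≈ -log 0.38 ≈ 0.97.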
Cross Entropy in PyTorch
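In PyTorch, torch.nn.CrossEntropyLoss combines LogSoftmax and NLLLoss, so it expects the raw logits from the last linear layer (no Softmax applied beforehand) and integer class indices as targets. A minimal sketch, reusing the made-up logits from the example above:

import torch

criterion = torch.nn.CrossEntropyLoss()

z = torch.tensor([[0.2, 0.1, -0.1]])  # raw logits for one sample over 3 classes
y = torch.tensor([0])                 # target is the class index, not a one-hot vector

loss = criterion(z, y)
print(loss.item())                    # ≈ 0.97 for these made-up values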
Implementation of the classifier on the MNIST dataset
Training Procedure
1. Prepare dataset
2. Design model using Class inherited from nn.Module
3. Construct loss and optimizer using PyTorch API
4. Training cycle and Test (forward, backward, update)
1. Prepare Dataset
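The notes do not show the data-loading code at this step, so the following is a minimal sketch assuming torchvision's MNIST dataset, a batch size of 64, a local root directory of '../dataset/mnist/', and the commonly used MNIST normalization constants (mean 0.1307, std 0.3081). The resulting train_loader and test_loader are the loaders used by the training and test code below.

# Prepare dataset (sketch)
import torch
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

batch_size = 64
transform = transforms.Compose([
    transforms.ToTensor(),                      # PIL image -> (1, 28, 28) tensor in [0, 1]
    transforms.Normalize((0.1307,), (0.3081,))  # commonly cited MNIST mean / std
])

train_dataset = datasets.MNIST(root='../dataset/mnist/', train=True,
                               download=True, transform=transform)
train_loader = DataLoader(train_dataset, shuffle=True, batch_size=batch_size)

test_dataset = datasets.MNIST(root='../dataset/mnist/', train=False,
                              download=True, transform=transform)
test_loader = DataLoader(test_dataset, shuffle=False, batch_size=batch_size)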
2. Design model using a class inherited from nn.Module
(1) The last layer (64 to 10) uses no activation function; the final Softmax transform is handled inside the loss function.
(2) Flatten the image pixels into row vectors with x = x.view(-1, 784).
# Design model
import torch
import torch.nn.functional as F

class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.l1 = torch.nn.Linear(784, 512)
        self.l2 = torch.nn.Linear(512, 256)
        self.l3 = torch.nn.Linear(256, 128)
        self.l4 = torch.nn.Linear(128, 64)
        self.l5 = torch.nn.Linear(64, 10)

    def forward(self, x):
        x = x.view(-1, 784)     # flatten (N, 1, 28, 28) into (N, 784)
        x = F.relu(self.l1(x))
        x = F.relu(self.l2(x))
        x = F.relu(self.l3(x))
        x = F.relu(self.l4(x))
        return self.l5(x)       # no activation here: CrossEntropyLoss applies the Softmax itself

model = Net()
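As a quick sanity check (not part of the original notes), a dummy MNIST-shaped batch can be pushed through the model to confirm that the forward pass returns one logit per class:

dummy = torch.randn(8, 1, 28, 28)  # a made-up batch of 8 grayscale 28x28 images
logits = model(dummy)
print(logits.shape)                # torch.Size([8, 10]): one logit per class per sample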
3. Construct loss and optimizer using PyTorch API
# Construct loss and optimizer
import torch.optim as optim

criterion = torch.nn.CrossEntropyLoss()
# momentum helps the optimizer escape local minima and saddle points
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.5)
4. Training cycle and Test
# Training cycle
def train(epoch):
    running_loss = 0.0
    for batch_idx, data in enumerate(train_loader, 0):
        inputs, target = data
        optimizer.zero_grad()
        # forward + backward + update
        outputs = model(inputs)
        loss = criterion(outputs, target)
        loss.backward()
        optimizer.step()

        running_loss += loss.item()
        if batch_idx % 300 == 299:  # report the average loss every 300 mini-batches
            print('[%d, %5d] loss: %.3f' % (epoch + 1, batch_idx + 1, running_loss / 300))
            running_loss = 0.0

def test():
    correct = 0
    total = 0
    with torch.no_grad():  # no gradients needed during evaluation
        for data in test_loader:
            images, labels = data
            outputs = model(images)
            _, predicted = torch.max(outputs.data, dim=1)  # index of the largest logit = predicted class
            total += labels.size(0)
            correct += (predicted == labels).sum().item()
    print('Accuracy on test set: %d %%' % (100 * correct / total))
# Train for 10 epochs, evaluating on the test set after each epoch
if __name__ == '__main__':
    for epoch in range(10):
        train(epoch)
        test()