Implementing a simple two-layer neural network with Anaconda + PyTorch

The following code has been tested and implements the intended functionality:

```python
import torch
from torch.autograd import Variable  # note: Variable is deprecated in modern PyTorch

# Batch size, input dimension, hidden dimension, output dimension
N, D_in, H, D_out = 64, 1000, 100, 10

# Random input and target data (no gradients needed for data)
x = Variable(torch.randn(N, D_in), requires_grad=False)
y = Variable(torch.randn(N, D_out), requires_grad=False)

# Randomly initialized weights for the two layers (gradients required)
w1 = Variable(torch.randn(D_in, H), requires_grad=True)
w2 = Variable(torch.randn(H, D_out), requires_grad=True)

learning_rate = 1e-6
for t in range(500):
    # Forward pass: linear -> ReLU (clamp at 0) -> linear
    y_pred = x.mm(w1).clamp(min=0).mm(w2)

    # Sum-of-squares loss
    loss = (y_pred - y).pow(2).sum()

    # Zero the gradients before backprop, since they accumulate across iterations;
    # .grad is None until the first backward pass, hence the guards
    if w1.grad is not None:
        w1.grad.data.zero_()
    if w2.grad is not None:
        w2.grad.data.zero_()

    # Backward pass: compute gradients of the loss w.r.t. w1 and w2
    loss.backward()

    # Plain gradient-descent update on the raw tensors
    w1.data -= learning_rate * w1.grad.data
    w2.data -= learning_rate * w2.grad.data

# Inspect the trained weights, the final predictions, and the targets
print("w1:", w1.data)
print("w2:", w2.data)
print("y_pred:", y_pred)
print("y:", y)
```
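Since torch.autograd.Variable has been deprecated (tensors carry requires_grad directly since PyTorch 0.4), here is a minimal sketch of the same training loop in the modern tensor API; it also replaces the hand-written squared-error sum with torch.nn.functional.mse_loss using reduction='sum', which computes the same quantity:

```python
import torch

N, D_in, H, D_out = 64, 1000, 100, 10

# Data tensors: no gradient tracking needed
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

# Weight tensors: track gradients directly, no Variable wrapper
w1 = torch.randn(D_in, H, requires_grad=True)
w2 = torch.randn(H, D_out, requires_grad=True)

learning_rate = 1e-6
for t in range(500):
    # Forward pass: linear -> ReLU -> linear
    y_pred = x.mm(w1).clamp(min=0).mm(w2)

    # Sum-of-squares loss via the built-in functional API
    loss = torch.nn.functional.mse_loss(y_pred, y, reduction='sum')

    # Zero accumulated gradients before the backward pass
    if w1.grad is not None:
        w1.grad.zero_()
    if w2.grad is not None:
        w2.grad.zero_()

    loss.backward()

    # Update the weights outside the autograd graph
    with torch.no_grad():
        w1 -= learning_rate * w1.grad
        w2 -= learning_rate * w2.grad

print("final loss:", loss.item())
```

The torch.no_grad() block matters here: without it, the in-place updates on w1 and w2 would themselves be recorded by autograd, which is exactly what the older .data trick was working around.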
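The commented-out loss.MSELoss() line in the original code would not work as written: MSELoss is a class in torch.nn, not a method on the loss tensor. For reference, a sketch of the same network built from torch.nn.Sequential with nn.MSELoss and an SGD optimizer; the learning rate of 1e-4 is an assumption, since nn.Linear initializes its weights on a much smaller scale than raw torch.randn:

```python
import torch
import torch.nn as nn

N, D_in, H, D_out = 64, 1000, 100, 10
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

# Two linear layers with a ReLU in between,
# mirroring x.mm(w1).clamp(min=0).mm(w2)
model = nn.Sequential(
    nn.Linear(D_in, H),
    nn.ReLU(),
    nn.Linear(H, D_out),
)

# reduction='sum' matches the hand-written (y_pred - y).pow(2).sum()
loss_fn = nn.MSELoss(reduction='sum')

# assumed learning rate for this parameterization
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)

for t in range(500):
    y_pred = model(x)
    loss = loss_fn(y_pred, y)

    optimizer.zero_grad()  # replaces the manual w.grad.zero_() guards
    loss.backward()
    optimizer.step()       # replaces the manual w -= lr * w.grad updates

print("final loss:", loss.item())
```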