A fully connected ReLU neural network with one hidden layer and no bias, used to predict y from x.
Requirements:
- Implement the two-layer network in PyTorch and fit it to the data.
- Two linear layers (an equivalent torch.nn sketch follows this list):
h = W1 * x
h_relu = relu(h)
y = W2 * h_relu
- Use stochastic gradient descent (SGD) as the optimization method.
- Use mean squared error (MSE) as the loss function.
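The same network can also be written with the built-in torch.nn layers. This is only an illustrative sketch; the exercise below keeps the weight matrices explicit and updates them by hand:

import torch

D_in, H, D_out = 1000, 100, 10

# bias=False matches the "no bias" requirement; nn.Linear stores each weight
# matrix transposed internally, but computes the same mapping as W1 / W2 above
model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H, bias=False),    # h = W1 * x
    torch.nn.ReLU(),                          # h_relu = relu(h)
    torch.nn.Linear(H, D_out, bias=False),    # y = W2 * h_relu
)
loss_function = torch.nn.MSELoss()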
Code implementation:
import torch

EPOCH = 500
LEARNING_RATE = 1e-6

# N: batch size, D_in: input dimension, H: hidden dimension, D_out: output dimension
N, D_in, H, D_out = 64, 1000, 100, 10

# Random input data and targets
X = torch.randn(N, D_in)
y = torch.randn(N, D_out)

# Weight matrices of the two linear layers (no bias), tracked by autograd
W1 = torch.randn(D_in, H, requires_grad=True)
W2 = torch.randn(H, D_out, requires_grad=True)

relu = torch.nn.ReLU()
loss_function = torch.nn.MSELoss()

for i in range(EPOCH):
    # Full-batch forward pass, used only to monitor the training loss
    h = torch.mm(X, W1)
    h_relu = relu(h)
    y_pred = torch.mm(h_relu, W2)
    loss = loss_function(y_pred, y)
    if i % 50 == 0:
        print('epoch:', i, 'loss:', loss.item())

    # SGD: update the weights once per training sample
    for xi, yi in zip(X, y):
        xi = xi.unsqueeze(dim=0)                 # (D_in,) -> (1, D_in)
        hi = torch.mm(xi, W1)
        hi_relu = relu(hi)
        yi_pred = torch.mm(hi_relu, W2)
        lossi = loss_function(yi_pred, yi.unsqueeze(dim=0))
        lossi.backward()
        # Apply the update outside the autograd graph and clear the gradients,
        # otherwise they would keep accumulating across samples
        with torch.no_grad():
            W1 -= LEARNING_RATE * W1.grad
            W2 -= LEARNING_RATE * W2.grad
            W1.grad.zero_()
            W2.grad.zero_()

print("y:", y)
print("y_pred:", y_pred)