✨PyTorch Basics 3 -- Machine Learning Pipeline and Demo Model

This article walks through the steps of building a machine learning pipeline in PyTorch: designing the model, constructing the loss function and optimizer, and writing the training loop. Starting from a manually built model, it demonstrates how to eliminate the hand-computed gradient, replace the loss function and the weight-update step, and improve the forward pass. It also touches on the usage of Python's `super()` and lists related video tutorials as learning resources.

Contents

Machine Learning Pipeline

Demo Model: manually build a model

Eliminate manually computed gradient

Replace loss function and gradient update process

Replace forward pass

Ref

Closing Remarks


Machine Learning Pipeline

Step 1: Design model (input size, output size, forward pass)

Step 2: Construct loss and optimizer

Step 3: Training Loop

      — forward pass: compute prediction

      — backward pass: gradients

      — update the weights
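
To make these three steps concrete, here is a minimal PyTorch sketch of the pipeline. The nn.Linear model, nn.MSELoss loss, and SGD optimizer are illustrative choices for this sketch, not part of the manual demo that follows.

# Illustrative sketch of the three-step pipeline (assumed choices, not the demo below)
import torch
import torch.nn as nn

# toy data for f(x) = 2*x, shaped (n_samples, n_features) as nn.Linear expects
X = torch.tensor([[1.0], [2.0], [3.0], [4.0]])
Y = torch.tensor([[2.0], [4.0], [6.0], [8.0]])

# Step 1: design model (input size, output size, forward pass)
model = nn.Linear(in_features=1, out_features=1)

# Step 2: construct loss and optimizer
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Step 3: training loop
for epoch in range(100):
    y_pred = model(X)          # forward pass: compute prediction
    l = loss_fn(y_pred, Y)     # compute the loss
    l.backward()               # backward pass: compute gradients
    optimizer.step()           # update the weights
    optimizer.zero_grad()      # clear gradients before the next iteration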

Demo Model: manually build a model

# Manually build a model
import numpy as np
# True function: f = 2*x
# using linear regression model: y = w*x
X = np.array([1, 2, 3, 4], dtype=np.float32)
Y = np.array([2, 4, 6, 8], dtype=np.float32)

w = 0.0  # initial weight, to be learned

# model prediction 
def forward(x):
	return w*x

# loss function: MSE = 1/N * sum((w*x - y)**2)
def loss(y, y_pre):
	return ((y_pre - y)**2).mean()

# gradient
# dJ/dw = 1/N * sum(2*x * (w*x - y))
def gradient(x, y, y_pre):
	return (2*x * (y_pre - y)).mean()

print(f"prediction before training {forward(5):.3f}")
# Training loop
learning_rate = 0.01
n_iters = 10
for epoch in range(n_iters):
	# prediction
	y_pre = forward(X)
	l = loss(Y, y_pre)
	dw = gradient(X, Y, y_pre)

	# update the weights
	w -= learning_rate * dw
	if epoch % 2 == 0:
		print(f"epoch {epoch+1}: w = {w:.3f}, loss = {l:.8f}")
	
print(f"prediction after training {forward(5):.3f}")

Eliminate manually computed gradient
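
Instead of deriving dJ/dw by hand, the weight can be stored in a tensor created with requires_grad=True, so that calling l.backward() lets autograd compute the gradient and the manual gradient() function can be dropped. The following is a minimal sketch of that replacement, mirroring the numpy demo above; it is an assumed reconstruction, not verbatim code from the original tutorial.

# Sketch (assumed): same linear-regression demo, gradient computed by autograd
import torch

X = torch.tensor([1, 2, 3, 4], dtype=torch.float32)
Y = torch.tensor([2, 4, 6, 8], dtype=torch.float32)

# requires_grad=True lets autograd track dJ/dw, so no manual gradient() is needed
w = torch.tensor(0.0, requires_grad=True)

def forward(x):
    return w * x

def loss(y, y_pre):
    return ((y_pre - y)**2).mean()

learning_rate = 0.01
n_iters = 100
for epoch in range(n_iters):
    y_pre = forward(X)
    l = loss(Y, y_pre)
    l.backward()                     # autograd fills w.grad with dJ/dw

    with torch.no_grad():            # the update itself must not be tracked
        w -= learning_rate * w.grad
    w.grad.zero_()                   # reset the accumulated gradient

    if epoch % 10 == 0:
        print(f"epoch {epoch+1}: w = {w:.3f}, loss = {l:.8f}")

print(f"prediction after training {forward(5):.3f}")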
