Deep Learning for Computer Vision with Python: Gradient Descent (GD)
Preface
As artificial intelligence keeps advancing, machine learning has become more and more important, and many people have started learning it. This post introduces one of its fundamentals: gradient descent.
I. What is GD?
GD is the classic gradient descent method.
Gradient: the vector of partial derivatives of a multivariate function at a point. Locally, the function decreases fastest along the direction opposite to the gradient.
As an example, simplify the model to F = w1x1 + w2x2 + b1 + b2. Taking the partial derivative of the loss with respect to each of the four parameters w1, w2, b1, b2 gives the components of the gradient.
To update a parameter, take w1 as an example: w1 = w1 - lr * (partial derivative of the loss w.r.t. w1). Here lr is the learning rate, also called the step size: how far to move along the negative gradient direction.
Once every parameter has been updated this way, one GD step is complete.
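The update rule above can be sketched numerically for one step. The data point, initial weights, and learning rate below are made up for illustration; the bias is folded into the weight vector via a constant-1 input, the same trick the full script uses:

```python
import numpy as np

# One gradient-descent step on the simple model F = w1*x1 + w2*x2 + b
# with squared-error loss L = 1/2 * (y - F)^2. All values are made up.
x = np.array([1.0, 2.0, 1.0])   # [x1, x2, 1] -- the trailing 1 absorbs the bias
w = np.array([0.5, -0.3, 0.1])  # [w1, w2, b]
y = 1.0
lr = 0.1

pred = w.dot(x)        # forward pass: 0.5 - 0.6 + 0.1 = 0.0
error = y - pred       # residual
grad = -error * x      # dL/dw = -(y - F) * x
w = w - lr * grad      # step against the gradient
```

After this single step w moves from [0.5, -0.3, 0.1] to [0.6, -0.1, 0.2], and the prediction moves closer to the target y = 1.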
II. Steps
1. Import the libraries
Sample code:
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn.datasets import make_blobs
import matplotlib.pyplot as plt
import argparse
2. Code
Sample code:
parse = argparse.ArgumentParser()
parse.add_argument("--lr", type=float, default=0.01, help="learning rate (step size)")
parse.add_argument("--epoch", type=int, default=500, help="number of training epochs")
arg = parse.parse_args()

def activation_function(x):
    # sigmoid: squashes raw scores into (0, 1)
    return 1 / (1 + np.exp(-x))

def predict(X, W):
    # threshold the sigmoid output at 0.5 to get hard 0/1 labels
    preds = activation_function(X.dot(W))
    preds[preds <= 0.5] = 0
    preds[preds > 0.5] = 1
    return preds

if __name__ == '__main__':
    # generate a 2-class toy dataset: X has shape (1000, 2), Y has shape (1000,)
    (X, Y) = make_blobs(n_samples=1000, n_features=2, centers=2,
                        cluster_std=1.5, random_state=1)
    Y = Y.reshape(Y.shape[0], 1)
    # append a column of ones so the bias is folded into the weight matrix
    X = np.c_[X, np.ones((X.shape[0], 1))]
    # random train/test split
    trainx, testx, trainy, testy = train_test_split(X, Y, test_size=0.5,
                                                    random_state=42)
    # initialize the weight matrix W
    W = np.random.randn(3, 1)
    Loss = []
    for i in range(arg.epoch):
        # forward pass: keep the raw sigmoid output here; thresholding it
        # inside the loop would turn this into a perceptron-style update
        preds = activation_function(trainx.dot(W))
        error = trainy - preds
        loss = np.sum(0.5 * error ** 2)
        Loss.append(loss)
        # W += lr * X^T (y - p) moves W against the gradient of the loss
        gradient = trainx.T.dot(error)
        W += arg.lr * gradient
        if (i + 1) % 5 == 0:
            print(f"epoch {i + 1}/{arg.epoch}: loss = {loss}")
    preds = predict(testx, W)
    print(classification_report(testy, preds))
    color = ['r' if s == 1 else 'b' for s in testy]
    # first figure: scatter plot of the test set
    plt.style.use("ggplot")
    plt.figure()
    plt.title("data")
    plt.scatter(testx[:, 0], testx[:, 1], c=color)
    # second figure: training loss curve
    plt.figure()
    plt.title("Train Loss")
    plt.xlabel("epoch")
    plt.ylabel("Loss")
    plt.plot(range(arg.epoch), Loss)
    plt.show()
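One detail in the training loop that can look like gradient *ascent* is `W += arg.lr * gradient` with no explicit minus sign. The minus is already carried by the residual: with error = y - pred, the gradient of the squared loss is -X^T error, so adding lr * X^T error steps downhill. A minimal sanity check on made-up data (shown here for a plain linear model, where this is exactly the gradient) confirms the loss decreases:

```python
import numpy as np

# Toy check of the sign convention: for loss(W) = 0.5 * sum((y - X @ W)**2),
# dL/dW = -X.T @ (y - X @ W), so W += lr * X.T @ (y - X @ W) is a descent step.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))    # made-up design matrix
y = rng.normal(size=(8, 1))    # made-up targets
W = rng.normal(size=(3, 1))    # made-up starting weights

def loss(W):
    return 0.5 * np.sum((y - X @ W) ** 2)

before = loss(W)
W = W + 0.01 * X.T @ (y - X @ W)   # same sign convention as the listing
after = loss(W)                     # smaller than before
```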
Summary
That's it: a minimal gradient-descent classifier trained on a toy two-blob dataset.