Our data mining class covered neural networks and asked us to write a simple example by hand. I found the exercise really helpful for building intuition about how neural networks work, so I'm sharing it here. Let's learn and improve together.
1. Problem description:
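Given user-supplied values of x and y, build a small computational graph that evaluates

z = min(5, 8 / (sin(3x) + e^(-2y)))

and computes the partial derivatives ∂z/∂x and ∂z/∂y with a forward pass followed by a backward (chain-rule) pass. Here x is read in units of π, i.e. the program multiplies the input by π before using it. (The original statement is not preserved; this restatement is reconstructed from the code below.)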
2. The code
dxdy.py
'''
coding:utf-8
@author: Li Sentan
@time:2021.10.24
@file:dxdy.py
'''
import math
from layer_naive import *

x = eval(input("Please input the value of x:"))   # eval allows expressions such as 1/6
y = eval(input("Please input the value of y:"))
x = x * math.pi    # x is read in units of pi: the graph evaluates sin(3*pi*x_input)

# one layer per elementary operation in z = min(5, 8 / (sin(3x) + e^(-2y)))
threex = MulLayer()    # 3 * x
sin = sinLayer()       # sin(3x)
netwoy = MulLayer()    # -2 * y
e = eLayer()           # exp(-2y)
twoadd = AddLayer()    # sin(3x) + exp(-2y)
div = divLayer()       # 8 / (sin(3x) + exp(-2y))

# forward
threex1 = threex.forward(3, x)
sin1 = sin.forward(threex1)
netwoy1 = netwoy.forward(-2, y)
e1 = e.forward(netwoy1)
twoadd1 = twoadd.forward(sin1, e1)
div1 = div.forward(twoadd1, 8)
result = min(5, div1)

# backward
# min(5, div1) only passes a gradient through when div1 is the smaller branch
dresult = 1 if div1 < 5 else 0
dtwoadd, dconst8 = div.backward(dout=dresult)   # dconst8: gradient w.r.t. the constant 8 (unused)
dsin, de = twoadd.backward(dtwoadd)
dnetwoy = e.backward(de)
dconst2, dy = netwoy.backward(dnetwoy)          # dconst2: gradient w.r.t. the constant -2 (unused)
dthreex = sin.backward(dsin)
dconst3, dx = threex.backward(dthreex)          # dconst3: gradient w.r.t. the constant 3 (unused)

print("result:", result)
print("dx:", int(round(dx, 0)))
print("dy:", int(round(dy, 0)))
layer_naive.py
'''
coding:utf-8
@author: Li Sentan
@time:2021.10.24
@file:layer_naive.py
'''
import math

class sinLayer:
    '''out = sin(x)'''
    def __init__(self):
        pass

    def forward(self, x):
        self.x = x    # cache the input for the backward pass
        out = math.sin(x)
        return out

    def backward(self, dout):
        dx = dout * math.cos(self.x)    # d(sin x)/dx = cos x
        return dx

class divLayer:
    '''out = y / x: the second argument divided by the first'''
    def __init__(self):
        pass

    def forward(self, x, y):
        self.x = x
        self.y = y
        out = y / x
        return out

    def backward(self, dout):
        dx = dout * (-self.y) / (self.x ** 2)    # d(y/x)/dx = -y/x^2
        dy = dout * (1 / self.x)                 # d(y/x)/dy = 1/x
        return dx, dy

class eLayer:
    '''out = exp(x)'''
    def __init__(self):
        pass

    def forward(self, x):
        self.x = x
        out = math.exp(x)
        return out

    def backward(self, dout):
        dx = dout * math.exp(self.x)    # d(e^x)/dx = e^x
        return dx

class MulLayer:
    '''out = x * y'''
    def __init__(self):
        self.x = None
        self.y = None

    def forward(self, x, y):
        self.x = x
        self.y = y
        out = x * y
        return out

    def backward(self, dout):
        dx = dout * self.y    # gradient flows to x scaled by y
        dy = dout * self.x    # and to y scaled by x
        return dx, dy

class AddLayer:
    '''out = x + y'''
    def __init__(self):
        pass

    def forward(self, x, y):
        out = x + y
        return out

    def backward(self, dout):
        # addition passes the upstream gradient through unchanged
        dx = dout * 1
        dy = dout * 1
        return dx, dy
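Every layer here follows the same contract: forward computes the output and caches whatever backward will need, and backward multiplies the upstream gradient dout by the local derivative. Adding a new operation is mechanical; for example, a hypothetical SquareLayer (not in the original code) would look like this:

class SquareLayer:
    '''out = x ** 2, illustrating the layer contract'''
    def forward(self, x):
        self.x = x                    # cache the input for the backward pass
        return x ** 2

    def backward(self, dout):
        return dout * 2 * self.x      # local derivative d(x^2)/dx = 2x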
3. Results:
The printed results agree with the values computed by hand.
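For example (inputs chosen here for illustration), entering x = 1/6 and y = 0 gives sin(3πx) = sin(π/2) = 1 and e^(-2y) = 1, so z = 8/2 = 4, ∂z/∂x = -24·cos(π/2)/4 ≈ 0, and ∂z/∂y = 16/4 = 4:

Please input the value of x:1/6
Please input the value of y:0
result: 4.0
dx: 0
dy: 4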
To put it plainly, what we did was split a composite function into a chain of simple functions: forward substitutes values through those simple functions in order, and backward differentiates each one in reverse order and multiplies the results together, which is exactly the chain rule we use when differentiating composite functions by hand. Haha, deep learning, so easy!
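Concretely, for the function above the decomposition is

t1 = 3x,  t2 = sin t1,  t3 = -2y,  t4 = e^(t3),  t5 = t2 + t4,  z = 8/t5,

and, for instance,

∂z/∂y = (∂z/∂t5)(∂t5/∂t4)(∂t4/∂t3)(∂t3/∂y) = (-8/t5^2) · 1 · e^(t3) · (-2) = 16e^(-2y) / (sin 3x + e^(-2y))^2,

which is exactly the product that the chain of backward calls accumulates.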