Reposted from the blog of 听风的蜗牛 — thanks to the author for sharing (will be removed on notice of infringement).
Blog posts:
https://blog.csdn.net/zhouzx2010/article/details/71126800/
https://www.cnblogs.com/xuhongbin/p/6666826.html
Code:
https://github.com/tingfengjushi/mymllib/tree/master/NeuralNetwork
Code implementation:
#!/usr/bin/env python
# -*- coding:utf-8 -*-
import numpy as np

# 3.1 Define the activation functions; the usual choices are the hyperbolic
# tangent and the logistic function.

# Hyperbolic tangent: tanh(x) = sinh(x) / cosh(x)
def tanh(x):
    return np.tanh(x)

# Derivative of tanh: (tanh x)' = (sech x)^2 = 1 - (tanh x)^2
def tanh_derivative(x):
    return 1.0 - np.tanh(x) * np.tanh(x)

# Logistic function: f(x) = 1 / (1 + e^(-x))
def logistic(x):
    return 1 / (1 + np.exp(-x))

# Derivative of the logistic function: f'(x) = f(x) * (1 - f(x))
def logistic_derivative(x):
    return logistic(x) * (1 - logistic(x))
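As a quick sanity check (not part of the original post), the two derivative identities above can be verified numerically by comparing them against a central finite difference:

```python
import numpy as np

# Activation functions and their derivatives, as defined above
def tanh(x):
    return np.tanh(x)

def tanh_derivative(x):
    return 1.0 - np.tanh(x) * np.tanh(x)

def logistic(x):
    return 1 / (1 + np.exp(-x))

def logistic_derivative(x):
    return logistic(x) * (1 - logistic(x))

x = np.linspace(-3, 3, 7)
h = 1e-5  # step for the central difference (f(x+h) - f(x-h)) / (2h)
fd_tanh = (tanh(x + h) - tanh(x - h)) / (2 * h)
fd_logistic = (logistic(x + h) - logistic(x - h)) / (2 * h)

print(np.allclose(tanh_derivative(x), fd_tanh, atol=1e-8))       # True
print(np.allclose(logistic_derivative(x), fd_logistic, atol=1e-8))  # True
```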
# 3.2 Next, define the two-layer neural-network class and initialize its
# activation function and weights:
class NeuralNetwork:
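The class body is cut off here; the complete version is in the linked GitHub repository. As a hedged sketch (not the author's verbatim code), a constructor matching the description in 3.2 might select the activation pair by name and draw small random initial weights, with an extra unit per layer reserved for the bias:

```python
import numpy as np

def tanh(x):
    return np.tanh(x)

def tanh_derivative(x):
    return 1.0 - np.tanh(x) * np.tanh(x)

def logistic(x):
    return 1 / (1 + np.exp(-x))

def logistic_derivative(x):
    return logistic(x) * (1 - logistic(x))

class NeuralNetwork:
    def __init__(self, layers, activation='tanh'):
        # layers: list of layer sizes, e.g. [2, 2, 1]
        # activation: 'tanh' or 'logistic'
        if activation == 'logistic':
            self.activation = logistic
            self.activation_deriv = logistic_derivative
        else:
            self.activation = tanh
            self.activation_deriv = tanh_derivative
        # Random weights in (-0.25, 0.25); the +1 rows/columns hold bias units
        # (the sketch's choice of range and bias handling is an assumption)
        self.weights = []
        for i in range(1, len(layers) - 1):
            self.weights.append(
                (2 * np.random.random((layers[i - 1] + 1, layers[i] + 1)) - 1) * 0.25)
        self.weights.append(
            (2 * np.random.random((layers[-2] + 1, layers[-1])) - 1) * 0.25)

nn = NeuralNetwork([2, 2, 1])
print([w.shape for w in nn.weights])  # [(3, 3), (3, 1)]
```

With layers `[2, 2, 1]` this yields one 3×3 hidden-layer weight matrix and one 3×1 output matrix, the shapes a later feed-forward/backpropagation step would consume.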