Comparing three-layer and four-layer BP neural networks, based on Tariq Rashid's *Make Your Own Neural Network* (《Python神经网络编程》)

- Three-layer neural network
  - Nodes: 784 × 100 × 10
  - Learning rate: 0.1
  - Prediction scores over five runs (the scoring procedure is sketched at the end of this post):
    - 0.9512
    - 0.9497
    - 0.9506
    - 0.9505
    - 0.9464
  - Average prediction score: 0.94968
- Four-layer neural network
  - Nodes: 784 × 100 × 100 × 10 (a sketch of this variant follows the three-layer code below)
  - Learning rate: 0.1
  - Prediction scores over five runs:
    - 0.9095
    - 0.9142
    - 0.9033
    - 0.9130
    - 0.9046
  - Average prediction score: 0.90892
- Conclusion: for this simple fully connected network on the MNIST dataset, adding a hidden layer did not improve learning; the four-layer network in fact scored about four points lower on average. A plausible explanation (not tested in the original post) is that with plain sigmoid activations and an unchanged learning rate and training schedule, the error signal shrinks as it is backpropagated through each extra sigmoid layer, so the deeper network learns more slowly rather than better.
- Code (adapted from Tariq Rashid's *Make Your Own Neural Network*)
- Three-layer neural network
```python
# python notebook for Make Your Own Neural Network
# code for 3-layer neural network, and code for learning the MNIST dataset
# 20190603
import numpy as np
import matplotlib.pyplot as plt
# scipy.special for the sigmoid function expit()
import scipy.special as special
# ensure the plots are inside this notebook, not an external window
%matplotlib inline

# neural network class definition
class neuralNetwork(object):

    # initialise the neural network
    def __init__(self, inputNodes, hiddenNodes, outputNodes, learningRate=0.5):
        # set number of nodes in each input, hidden, output layer
        self.iNodes = inputNodes
        self.hNodes = hiddenNodes
        self.oNodes = outputNodes

        # link weight matrices, wih and who
        # weights inside the arrays are w_i_j, where link is from node i
        # to node j in the next layer
        # w11 w21
        # w12 w22 etc
        # pow(x, y) returns x raised to the power y
        self.wih = np.random.normal(0.0, pow(self.hNodes, -0.5), (self.hNodes, self.iNodes))
        self.who = np.random.normal(0.0, pow(self.oNodes, -0.5), (self.oNodes, self.hNodes))

        # learning rate
        self.lr = learningRate

        # activation function is the sigmoid function
        # lambda x: special.expit(x) takes x and returns special.expit(x)
        self.activation_function = lambda x: special.expit(x)

    # train the neural network
    def train(self, inputs_list, targets_list):
        # convert inputs list to 2d array
        inputs = np.array(inputs_list, ndmin=2).T
        targets = np.array(targets_list, ndmin=2).T

        # calculate signals into hidden layer
        hidden_inputs = np.dot(self.wih, inputs)
        # calculate the signals emerging from hidden layer
        hidden_outputs = self.activation_function(hidden_inputs)

        # calculate signals into final output layer
        final_inputs = np.dot(self.who, hidden_outputs)
        # calculate signals emerging from final output layer
        final_outputs = self.activation_function(final_inputs)

        # error is the (targets - final_outputs)
        output_errors = targets - final_outputs
        # hidden layer error is the output_errors, split by weights,
        # recombined at hidden nodes
        hidden_errors = np.dot(self.who.T, output_errors)

        # update the weights for the links between the hidden and output layers
        self.who += self.lr * np.dot(output_errors * final_outputs * (1.0 - final_outputs),
                                     np.transpose(hidden_outputs))
        # update the weights for the links between the input and hidden layers
        self.wih += self.lr * np.dot(hidden_errors * hidden_outputs * (1.0 - hidden_outputs),
                                     np.transpose(inputs))

    # query the neural network
    def query(self, inputs_list):
        # convert inputs list to 2d array
        inputs = np.array(inputs_list, ndmin=2).T

        # forward pass: input -> hidden -> output
        hidden_outputs = self.activation_function(np.dot(self.wih, inputs))
        final_outputs = self.activation_function(np.dot(self.who, hidden_outputs))

        return final_outputs
```
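The post reports four-layer results but shows only the three-layer code. The sketch below is a hypothetical reconstruction of the 784 × 100 × 100 × 10 variant, assuming the same sigmoid activation, the same fan-in-based weight initialisation, and backpropagation simply chained through the second hidden layer; the class name `neuralNetwork4` and the `whh` weight matrix are illustrative names, not from the original.

```python
import numpy as np
import scipy.special as special

# hypothetical four-layer network: input -> hidden1 -> hidden2 -> output
class neuralNetwork4(object):

    def __init__(self, inputNodes, hiddenNodes1, hiddenNodes2, outputNodes, learningRate=0.1):
        # three weight matrices instead of two, initialised as in the class above
        self.wih = np.random.normal(0.0, pow(hiddenNodes1, -0.5), (hiddenNodes1, inputNodes))
        self.whh = np.random.normal(0.0, pow(hiddenNodes2, -0.5), (hiddenNodes2, hiddenNodes1))
        self.who = np.random.normal(0.0, pow(outputNodes, -0.5), (outputNodes, hiddenNodes2))
        self.lr = learningRate
        self.activation_function = lambda x: special.expit(x)

    def train(self, inputs_list, targets_list):
        inputs = np.array(inputs_list, ndmin=2).T
        targets = np.array(targets_list, ndmin=2).T

        # forward pass through both hidden layers
        hidden1_outputs = self.activation_function(np.dot(self.wih, inputs))
        hidden2_outputs = self.activation_function(np.dot(self.whh, hidden1_outputs))
        final_outputs = self.activation_function(np.dot(self.who, hidden2_outputs))

        # backpropagate the error through each layer in turn
        output_errors = targets - final_outputs
        hidden2_errors = np.dot(self.who.T, output_errors)
        hidden1_errors = np.dot(self.whh.T, hidden2_errors)

        # gradient-descent update, one per weight matrix
        self.who += self.lr * np.dot(output_errors * final_outputs * (1.0 - final_outputs),
                                     hidden2_outputs.T)
        self.whh += self.lr * np.dot(hidden2_errors * hidden2_outputs * (1.0 - hidden2_outputs),
                                     hidden1_outputs.T)
        self.wih += self.lr * np.dot(hidden1_errors * hidden1_outputs * (1.0 - hidden1_outputs),
                                     inputs.T)

    def query(self, inputs_list):
        inputs = np.array(inputs_list, ndmin=2).T
        hidden1_outputs = self.activation_function(np.dot(self.wih, inputs))
        hidden2_outputs = self.activation_function(np.dot(self.whh, hidden1_outputs))
        return self.activation_function(np.dot(self.who, hidden2_outputs))
```

Instantiating `neuralNetwork4(784, 100, 100, 10, 0.1)` and training it the same way as the three-layer class should reproduce the setup scored above.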
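For completeness, here is a minimal sketch of the training and scoring loop behind "prediction score" numbers like those above, following the book's approach: MNIST distributed as CSV rows of `label, 784 pixel values`, inputs rescaled from 0–255 to 0.01–1.0, one-hot targets of 0.01/0.99, and accuracy measured as the fraction of test digits whose largest network output matches the label. The file names `mnist_train.csv`/`mnist_test.csv` and the single training pass are assumptions.

```python
import numpy as np

# assumed file locations; each row is: label, then 784 comma-separated pixel values
TRAIN_CSV = "mnist_train.csv"
TEST_CSV = "mnist_test.csv"

n = neuralNetwork(784, 100, 10, learningRate=0.1)  # class defined above

# train: rescale pixels to 0.01..1.0 and build 0.01/0.99 one-hot targets
with open(TRAIN_CSV) as f:
    for record in f:
        values = record.split(',')
        inputs = (np.asarray(values[1:], dtype=float) / 255.0 * 0.99) + 0.01
        targets = np.zeros(10) + 0.01
        targets[int(values[0])] = 0.99
        n.train(inputs, targets)

# score: fraction of test records whose highest output matches the label
scorecard = []
with open(TEST_CSV) as f:
    for record in f:
        values = record.split(',')
        correct_label = int(values[0])
        inputs = (np.asarray(values[1:], dtype=float) / 255.0 * 0.99) + 0.01
        predicted = int(np.argmax(n.query(inputs)))
        scorecard.append(1 if predicted == correct_label else 0)

print("performance =", sum(scorecard) / len(scorecard))  # ~0.95 for the 3-layer net above
```

Running this five times with fresh random weights and averaging the printed performance gives numbers directly comparable to the five-run scores reported at the top of the post.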