Three-Layer vs. Four-Layer BP Neural Networks, Based on Make Your Own Neural Network (《Python神经网络编程》)

This post compares three-layer and four-layer BP (backpropagation) neural networks on the MNIST dataset. Over five runs, the three-layer network averaged a prediction score of 0.94968, while the four-layer network averaged 0.90892: adding a hidden layer not only failed to improve learning, it noticeably lowered the score.
  1. Three-layer neural network
    1. Node counts: 784*100*10
    2. Learning rate: 0.1
    3. Prediction scores (five runs)
      1.  0.9512
      2.  0.9497
      3.  0.9506
      4.  0.9505
      5.  0.9464
    4. Average prediction score: 0.94968
  2. Four-layer neural network
    1. Node counts: 784*100*100*10
    2. Learning rate: 0.1
    3. Prediction scores (five runs)
      1.  0.9095
      2.  0.9142
      3.  0.9033
      4.  0.9130
      5.  0.9046
    4. Average prediction score: 0.90892
  3. Conclusion: for this simple network applied to the MNIST dataset, adding a hidden layer did not improve learning; the four-layer network scored roughly four percentage points lower, likely because, with sigmoid activations and a single training pass, the error signal weakens as it propagates back through the extra layer.
  4. Code (based on Tariq Rashid's Make Your Own Neural Network; sketches of the training driver and of the four-layer variant follow the listing)
    1. Three-layer neural network
      # python notebook for Make Your Own Neural Network
      # code for 3-layer neural network, and code for learning the MNIST dataset
      # 20190603
      
      import numpy as np
      import matplotlib.pyplot as plt
      # scipy.special for the sigmoid function expit()
      import scipy.special as special
      # ensure the plots are inside this notebook, not an external window
      %matplotlib inline
      
      # neural network class definition
      class neuralNetwork(object):
          
          # initialise the neural network
          def __init__(self, inputNodes, hiddenNodes, outputNodes, learningRate=0.5):
              # set number of nodes in each input, hidden, output layer
              self.iNodes = inputNodes
              self.hNodes = hiddenNodes
              self.oNodes = outputNodes
              # link weight matrices, wih and who
              # weights inside the arrays are w_i_j, where link is from node i to node j in 
              # the next layer
              # w11 w21
              # w12 w22 etc
            # pow(x, y) returns x to the power y, so the weights are drawn from a normal
            # distribution with standard deviation 1/sqrt(number of incoming links)
              self.wih = np.random.normal(0.0, pow(self.hNodes, -0.5), (self.hNodes, self.iNodes))
              self.who = np.random.normal(0.0, pow(self.oNodes, -0.5), (self.oNodes, self.hNodes))
              # learning rate
              self.lr = learningRate
              # activation function is the sigmoid function
            # lambda x: special.expit(x) takes x and returns special.expit(x)
              self.activation_function = lambda x: special.expit(x)
              pass
          
          # train the neural network
          def train(self, inputs_list, targets_list):
              # convert inputs list to 2d array
              inputs = np.array(inputs_list, ndmin=2).T
              targets = np.array(targets_list, ndmin=2).T
              
              # calculate signals into hidden layer
              hidden_inputs = np.dot(self.wih, inputs)
              # calculate the signals emerging from hidden layer
              hidden_outputs = self.activation_function(hidden_inputs)
              
              # calculate signals into final output layer
              final_inputs = np.dot(self.who, hidden_outputs)
              # calculate signals emerging from final output layer
              final_outputs = self.activation_function(final_inputs)
              
              # error is the (targets - final_outputs)
              output_errors = (targets - final_outputs)
              
              # hidden layer error is the output_errors, split by weights, recombined at 
              # hidden nodes
              hidden_errors = np.dot(self.who.T, output_errors)
              
              # update the weights for the links between the hidden and output layers
              self.who += self.lr * np.dot((output_errors * final_outputs * (1.0 - final_outputs)), np.transpose(hidden_outputs))
              
              # update the weights for the links between the input and hidden layers
              self.wih += self.lr * np.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), np.transpose(inputs))
              pass
          
          # query the neural network
          def query(self, inputs_list):
              # convert inputs list to 2d array
              inputs = np.array(inputs_list, ndmin=2).T
              # calculate signals into hidden layer
              hidden_inputs = np.dot(self.wih, inputs)
              # calculate the signals emerging from hidden layer
              hidden_outputs = self.activation_function(hidden_inputs)
              # calculate signals into final output layer
              final_inputs = np.dot(self.who, hidden_outputs)
              # calculate signals emerging from final output layer
              final_outputs = self.activation_function(final_inputs)
              return final_outputs
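      
      To reproduce the scores above, a training/scoring driver is needed; the one below is a minimal sketch in the book's style. The file names mnist_train.csv / mnist_test.csv and the single training epoch are assumptions, not details taken from the original post.
      
      # training/scoring driver (minimal sketch in the book's style;
      # file names and the single epoch are assumptions)
      input_nodes, hidden_nodes, output_nodes = 784, 100, 10
      learning_rate = 0.1
      n = neuralNetwork(input_nodes, hidden_nodes, output_nodes, learning_rate)
      
      # load the MNIST training data CSV file into a list
      with open("mnist_train.csv", "r") as f:
          training_data_list = f.readlines()
      
      # train: scale pixels from 0-255 into 0.01-1.00 and build 0.01/0.99 targets
      for record in training_data_list:
          all_values = record.split(',')
          inputs = (np.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01
          targets = np.zeros(output_nodes) + 0.01
          targets[int(all_values[0])] = 0.99  # the label is the first field
          n.train(inputs, targets)
      
      # test: the score is the fraction of correctly classified digits
      with open("mnist_test.csv", "r") as f:
          test_data_list = f.readlines()
      
      scorecard = []
      for record in test_data_list:
          all_values = record.split(',')
          inputs = (np.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01
          label = np.argmax(n.query(inputs))
          scorecard.append(1 if label == int(all_values[0]) else 0)
      
      print("performance =", sum(scorecard) / len(scorecard))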
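    2. Four-layer neural network: a minimal sketch, assuming the same sigmoid activations and per-layer update rule as the class above; the class name fourLayerNetwork and the weight-matrix names (wih, wh1h2, who) are illustrative, not taken from the original listing.
      # four-layer (two hidden layers) variant, node counts 784*100*100*10
      class fourLayerNetwork(object):
          
          def __init__(self, inputNodes, hidden1Nodes, hidden2Nodes, outputNodes, learningRate=0.1):
              # one extra weight matrix links the two hidden layers
              self.wih = np.random.normal(0.0, pow(hidden1Nodes, -0.5), (hidden1Nodes, inputNodes))
              self.wh1h2 = np.random.normal(0.0, pow(hidden2Nodes, -0.5), (hidden2Nodes, hidden1Nodes))
              self.who = np.random.normal(0.0, pow(outputNodes, -0.5), (outputNodes, hidden2Nodes))
              self.lr = learningRate
              self.activation_function = lambda x: special.expit(x)
          
          def train(self, inputs_list, targets_list):
              inputs = np.array(inputs_list, ndmin=2).T
              targets = np.array(targets_list, ndmin=2).T
              # forward pass through both hidden layers
              h1_outputs = self.activation_function(np.dot(self.wih, inputs))
              h2_outputs = self.activation_function(np.dot(self.wh1h2, h1_outputs))
              final_outputs = self.activation_function(np.dot(self.who, h2_outputs))
              # back-propagate the error one layer at a time
              output_errors = targets - final_outputs
              h2_errors = np.dot(self.who.T, output_errors)
              h1_errors = np.dot(self.wh1h2.T, h2_errors)
              # same gradient-descent update as the three-layer network, applied per layer
              self.who += self.lr * np.dot(output_errors * final_outputs * (1.0 - final_outputs), h2_outputs.T)
              self.wh1h2 += self.lr * np.dot(h2_errors * h2_outputs * (1.0 - h2_outputs), h1_outputs.T)
              self.wih += self.lr * np.dot(h1_errors * h1_outputs * (1.0 - h1_outputs), inputs.T)
          
          def query(self, inputs_list):
              inputs = np.array(inputs_list, ndmin=2).T
              h1_outputs = self.activation_function(np.dot(self.wih, inputs))
              h2_outputs = self.activation_function(np.dot(self.wh1h2, h1_outputs))
              return self.activation_function(np.dot(self.who, h2_outputs))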