Three-layer Neural Networks

1. Abstract
In this paper, we briefly introduce the origin of neural networks, the three-layer neural network, and related concepts such as the backpropagation algorithm.

2. Introduction
To begin with, we provide a brief background on neural networks. The three-layer neural network developed from the perceptron[1]. The perceptron traces back to the work of Frank Rosenblatt, an American scholar; a New York Times article titled "Electronic 'Brain' Teaches Itself", published in 1957, reported on it.

Secondly, we briefly describe the perceptron, the most fundamental building block of neural networks. A perceptron takes several input signals, multiplies each by a weight, sums them, and produces an output signal. The process is very simple, but it pioneered neural networks. However, the perceptron can only represent linear models and is unable to solve the XOR problem. To overcome this limitation, a nonlinear activation function is introduced, which leads to the three-layer neural network. Details on the three-layer neural network are discussed in later sections.
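The weighted-sum-and-threshold behavior described above can be sketched as follows. This is a minimal illustration (not code from any of the cited papers); the weights for AND are hand-picked, and the point is that no single weight/bias choice reproduces XOR:

```python
import numpy as np

def perceptron(x, w, b):
    """A perceptron: weighted sum of the inputs passed through a step function."""
    return 1 if np.dot(w, x) + b > 0 else 0

# AND is linearly separable, so one perceptron can represent it
# (hand-picked weights: fires only when both inputs are 1).
w_and, b_and = np.array([1.0, 1.0]), -1.5
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([perceptron(np.array(x), w_and, b_and) for x in inputs])  # [0, 0, 0, 1]
# XOR's targets (0, 1, 1, 0) are not linearly separable, so no
# single (w, b) can make this function reproduce them.
```

This is exactly the limitation that motivates adding a hidden layer with a nonlinear activation.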

3. Theory
This essay mainly explains the origin of the three-layer neural network, backward propagation[2], and related theory. Like the perceptron, a three-layer NN takes many input signals and multiplies them by weights; unlike the perceptron, it has a hidden layer with an activation function in the middle and produces several output signals. Models use activation functions such as sigmoid, tanh, ReLU, and Leaky ReLU. This forms a three-layer NN that can process nonlinear problems.

Werbos's paper discusses aspects of the backpropagation (BP) algorithm that is used to train the three-layer NN model. Training includes forward propagation and backward propagation[3]. First, forward propagation computes the network's output. If the obtained output does not agree with the expected value, backward propagation adjusts the weights according to the error. This process repeats until the results are acceptable.
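The forward/backward loop described above can be sketched end to end on the XOR problem from the introduction. This is a hedged illustration, not the algorithm as presented in Werbos's dissertation: it assumes a squared-error loss, sigmoid activations, plain gradient descent, and hand-picked sizes (4 hidden units) and learning rate:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# Assumed architecture: 2 inputs -> 4 hidden -> 1 output.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 1.0  # learning rate (assumption for this toy problem)

for epoch in range(5000):
    # Forward propagation: compute the network's outputs.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # Backward propagation: gradients of the squared error,
    # using sigmoid'(z) = s * (1 - s).
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    # Adjust weights in proportion to the error signal.
    W2 -= lr * H.T @ dY
    b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH
    b1 -= lr * dH.sum(axis=0)

print(np.mean((Y - T) ** 2))  # error shrinks as training proceeds
```

After enough iterations the outputs approach the XOR targets, which a single perceptron cannot achieve.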

4. Conclusion
To sum up, the three-layer NN and BP promoted the development of neural networks and laid the foundation for subsequent theory. As a result, they have found practical application.

5. References
[1] Rosenblatt F. The perceptron: a probabilistic model for information storage and organization in the brain[J]. Psychological Review. 1958, 65(6): 386-408.
[2] Werbos, P. (1974) Beyond regression: New tools for prediction and analysis in the behavioral sciences. Ph.D. Dissertation, Harvard University.
[3] Rumelhart and McClelland proposed error back-propagation for BP networks in 1985.
