1.brief introduction
- It is one of the classification algorithms.
- It is the basis for neural networks.
2.linear boundary
- Find a line that separates the data.
- The points over the line are accepted (blue), while those under the line are rejected (red).
- Our goal is to have y-hat resemble y as closely as possible.
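This boundary idea can be sketched in code. A minimal illustration, assuming the line w1*x1 + w2*x2 + b = 0 and made-up weights (not values from the notes):

```python
# A minimal sketch of a 2-D linear boundary, assuming the line
# w1*x1 + w2*x2 + b = 0. The weights below are illustrative.

def predict(x1, x2, w1=1.0, w2=1.0, b=-1.0):
    score = w1 * x1 + w2 * x2 + b
    return 1 if score >= 0 else 0  # 1 = accepted ("blue"), 0 = rejected ("red")

print(predict(1.0, 1.0))  # point over the line: prints 1
print(predict(0.0, 0.0))  # point under the line: prints 0
```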
3.in higher dimension
linear boundary in 3 dimensions
linear boundary in n dimensions
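In n dimensions the same idea gives a hyperplane W·x + b = 0. A sketch with illustrative weights and points:

```python
# Sketch of the n-dimensional boundary W.x + b = 0 (a hyperplane).
# The weights and points below are illustrative.

def predict_nd(weights, x, bias):
    score = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if score >= 0 else 0

# 3-D example: the plane x1 + x2 + x3 - 2 = 0
print(predict_nd([1.0, 1.0, 1.0], [1.0, 1.0, 1.0], -2.0))  # over the plane: 1
print(predict_nd([1.0, 1.0, 1.0], [0.0, 0.0, 0.0], -2.0))  # under the plane: 0
```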
4.perceptron
- It's the building block of neural networks. It's just an encoding of our equation into a small graph.
The step function (one of the possible step functions).
Now the perceptron can be built as follows.
It can be simplified as follows.
- There are two equivalent ways to draw a perceptron:
The one on the left has a bias unit coming from an input node with a value of one, and the one on the right has the bias inside the node.
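Both drawings compute the same thing: a weighted sum fed through the step function. A sketch of the two forms (function names are mine, weights illustrative):

```python
# Sketch of a perceptron: a weighted sum fed through a step function.
# The two forms below match the two drawings: bias as a weight on a
# constant input of 1, and bias inside the node.

def step(t):
    return 1 if t >= 0 else 0

def perceptron_bias_inside(x, w, b):
    return step(sum(wi * xi for wi, xi in zip(w, x)) + b)

def perceptron_bias_as_input(x, w):
    # w[0] multiplies the constant input 1, playing the role of b
    return step(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))

# the two forms agree when w[0] equals the separate bias
x = [2.0, 3.0]
print(perceptron_bias_inside(x, [1.0, -1.0], -0.5))    # step(-1.5) -> 0
print(perceptron_bias_as_input(x, [-0.5, 1.0, -1.0]))  # same result: 0
```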
5.AND Perceptron
Some logical operators can be represented as single perceptrons, such as AND, OR and NOT. XOR cannot, because it is not linearly separable; it requires combining several perceptrons into a small multi-layer network.
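One illustrative choice of weights (not the only one) realizing these gates as single perceptrons, with XOR built by composing them:

```python
# Illustrative weight choices realizing AND, OR and NOT as single
# perceptrons. XOR is not linearly separable, so it is composed
# from the other gates instead.

def step(t):
    return 1 if t >= 0 else 0

def AND(a, b):
    return step(a + b - 1.5)

def OR(a, b):
    return step(a + b - 0.5)

def NOT(a):
    return step(0.5 - a)

def XOR(a, b):
    # no single perceptron computes XOR; compose the gates above
    return AND(OR(a, b), NOT(AND(a, b)))

print([AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 0, 0, 1]
print([XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```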
6.what the computer does
It starts at a random place by picking a random linear equation, then moves the line step by step to classify the points better.
Here is a way to move the line.
- If a red point over the line is misclassified, we can adjust the line like this.
But we don't want to move the line drastically, since points that were correctly classified might accidentally become misclassified. We introduce a learning rate so the line makes only a small move toward the wrong red point.
- If a blue point under the line is misclassified, we can adjust the line like this.
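The two updates above can be sketched as one function. Assuming a point (p, q), label 1 for blue and 0 for red, and an illustrative learning rate:

```python
# Sketch of the single-point nudge described above: subtract the
# point's coordinates (scaled by the learning rate) for a red point
# wrongly over the line, add them for a blue point wrongly under it.

def nudge(w1, w2, b, p, q, label, lr=0.1):
    pred = 1 if w1 * p + w2 * q + b >= 0 else 0
    if pred == label:
        return w1, w2, b                         # already correct: no move
    if label == 0:                               # red point over the line
        return w1 - lr * p, w2 - lr * q, b - lr
    return w1 + lr * p, w2 + lr * q, b + lr      # blue point under the line

# a red point at (1, 1) currently classified as blue
print(nudge(1.0, 1.0, 0.0, 1.0, 1.0, label=0))  # roughly (0.9, 0.9, -0.1)
```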
7.pseudocode for the perceptron algorithm
Repeat the step until there are no errors, until the number of errors is small enough, or simply run the step a fixed number of times (say, a thousand) and stop.
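The pseudocode might look like this in Python; the random starting line, learning rate, and epoch cap are illustrative choices:

```python
import random

# Sketch of the full perceptron algorithm: pick a random starting
# line, nudge the weights for each misclassified point, and stop
# early once an epoch produces no errors (or after a fixed number
# of epochs).

def perceptron(points, labels, lr=0.1, epochs=1000):
    w1, w2, b = random.random(), random.random(), random.random()
    for _ in range(epochs):
        errors = 0
        for (p, q), y in zip(points, labels):
            pred = 1 if w1 * p + w2 * q + b >= 0 else 0
            if pred == y:
                continue
            errors += 1
            if y == 0:                # red point wrongly over the line
                w1, w2, b = w1 - lr * p, w2 - lr * q, b - lr
            else:                     # blue point wrongly under the line
                w1, w2, b = w1 + lr * p, w2 + lr * q, b + lr
        if errors == 0:               # converged: nothing misclassified
            break
    return w1, w2, b

# e.g. learn the OR boundary from four labeled points
random.seed(0)
pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
w1, w2, b = perceptron(pts, [0, 1, 1, 1])
print([1 if w1 * p + w2 * q + b >= 0 else 0 for p, q in pts])  # [0, 1, 1, 1]
```

For linearly separable data like this, the perceptron convergence theorem guarantees the loop reaches zero errors after finitely many updates.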