Neural Network - Part 3 - Cost Function
Reference: Andrew Ng, “Machine Learning”
In order to fit the parameters, we need to minimize the cost function. How do we get this cost function? It resembles the cost function in logistic regression, but with some differences. To understand the cost function of a neural network in depth, we also need to understand the backpropagation algorithm.
- Cost Function
- Backpropagation Algorithm
Cost Function
- Two types of classification
There are two types of classification: binary classification and multi-class classification. They produce different outputs, as shown below.
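The difference in outputs can be sketched in a few lines of Python (the variable names here are illustrative, not from the course): a binary classifier has a single output unit whose label is a scalar 0 or 1, while a multi-class classifier with $K$ classes has $K$ output units and a one-hot label vector.

```python
import numpy as np

# Binary classification: one output unit, label is a scalar 0 or 1
y_binary = 1

# Multi-class classification (illustrative K = 4): K output units,
# label is a one-hot vector with a 1 at the class index
K = 4
class_index = 2
y_multi = np.zeros(K)
y_multi[class_index] = 1.0   # one-hot encoding of class 2
```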
- Cost function
The cost function of a neural network is similar to the cost function of logistic regression. The cost function below covers all layers of the network; $h_\theta(x)$ denotes the final output, and $\theta$ denotes one big matrix of parameters. We do not include the bias units' parameters, i.e. $\theta_{i0}$, in the regularization term of the cost function.
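Written out, the regularized cost function from the course (reconstructed here, since the formula itself does not appear above) is, for $m$ training examples, $K$ output units, and $L$ layers with $s_l$ units in layer $l$:

```latex
J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K}
  \left[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k
       + \big(1 - y_k^{(i)}\big) \log\Big(1 - \big(h_\Theta(x^{(i)})\big)_k\Big) \right]
  + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \big(\Theta_{ji}^{(l)}\big)^2
```

The first double sum is the logistic-regression cross-entropy applied to each of the $K$ output units; the second term regularizes every weight except the bias parameters $\Theta_{j0}^{(l)}$.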
Backpropagation Algorithm
In order to fit the parameters, we need to minimize the cost function $J(\theta)$. Thus, we need to compute $J(\theta)$ and the partial derivative terms $\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\theta)$.
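Computing $J(\theta)$ itself only requires forward propagation; it is the partial derivatives that backpropagation supplies. A minimal sketch of the cost computation in NumPy (function and variable names are my own, and the bias column is assumed to be the first column of each weight matrix):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(thetas, X, Y, lam):
    """Regularized cost J(theta) for a feedforward network.

    thetas : list of weight matrices, one per layer transition
    X      : (m, n) input matrix;  Y : (m, K) one-hot labels
    lam    : regularization strength (bias parameters theta_i0 not penalized)
    """
    m = X.shape[0]
    a = X
    for theta in thetas:
        a = np.hstack([np.ones((m, 1)), a])   # prepend the bias unit
        a = sigmoid(a @ theta.T)              # activation of the next layer
    h = a                                     # (m, K) final output h_theta(x)
    # cross-entropy summed over all examples and output units
    cost = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
    # regularization: skip the first column, which holds the bias parameters
    reg = lam / (2 * m) * sum(np.sum(t[:, 1:] ** 2) for t in thetas)
    return cost + reg
```

With all weights set to zero, every output is $\sigma(0) = 0.5$, so for one example with $K = 2$ the cost is $2\log 2 \approx 1.386$, which is a quick sanity check for the implementation.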