Basics of Neural Network Programming - Logistic Regression

This article contains the notes I took while studying the course "Neural Networks & Deep Learning" taught by Andrew Ng. It covers logistic regression, which is the basis of neural network programming. I'm sharing it in the hope that it helps.


Logistic regression is a learning algorithm used in supervised learning problems where the output labels 𝑦 are all either zero or one, i.e., for binary classification problems.

Given an input feature vector x, which may correspond to an image that you want to recognize as either a cat picture or not a cat picture, the algorithm evaluates the probability that a cat is in the image. Mathematically:

\hat{y}=P(y=1|x), where 0\leq \hat{y}\leq 1

That is, given x, we want to know \hat{y}, the probability that y=1.
The parameters used in Logistic regression are:
• The input features vector: x \in \mathbb{R}^{n_{x}}, where n_{x} is the number of features
• The training label: y\in \{0,1\}
• The weights: w\in \mathbb{R}^{n_{x}}, where n_{x} is the number of features
• The bias: b\in \mathbb{R}
• The output: \hat{y}=\sigma (w^{T}x+b)
• The sigmoid function: \sigma (z)=\frac{1}{1+e^{-z}}, where z=w^{T}x+b
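The definitions above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the course; the variable names and the example values of w, b, and x are made up here.

```python
import numpy as np

def sigmoid(z):
    """Map any real z into (0, 1): sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, x):
    """Logistic-regression forward pass: y_hat = sigma(w^T x + b)."""
    z = np.dot(w, x) + b
    return sigmoid(z)

# Hypothetical example with n_x = 3 features
x = np.array([0.5, -1.2, 3.0])   # input feature vector
w = np.array([0.1, 0.4, -0.2])   # weights
b = 0.5                          # bias
y_hat = predict(w, b, x)         # a probability in (0, 1), here ≈ 0.37
```

Whatever the values of w, b, and x, the output y_hat always lands strictly between 0 and 1, which is what lets us interpret it as P(y=1|x).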

Figure-1: The sigmoid function


(w^{T}x+b) is a linear function (like ax+b), but since we are looking for a probability, which must lie in [0,1], the sigmoid function is applied. It is bounded between 0 and 1, as shown in Figure-1.
Some observations from the graph:
• If 𝑧 is a large positive number, then \sigma (z)\approx 1
• If 𝑧 is a large negative number, then \sigma (z)\approx 0
• If 𝑧=0, then \sigma (z)= 0.5
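These three observations are easy to check numerically; the sketch below just evaluates the sigmoid formula at a large positive value, a large negative value, and zero (the specific test values ±10 are an arbitrary choice here).

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(10.0))   # ≈ 0.99995  -> large positive z gives sigma(z) ≈ 1
print(sigmoid(-10.0))  # ≈ 0.000045 -> large negative z gives sigma(z) ≈ 0
print(sigmoid(0.0))    # = 0.5      -> z = 0 gives exactly 0.5
```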
