LMS algorithm

Fitting the parameters by gradient descent

LMS stands for “least mean squares”

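For reference, the hypothesis, the least-squares cost function, and the per-example LMS (Widrow-Hoff) update rule are usually written as:

$$h_\theta(x) = \sum_{j=0}^{n} \theta_j x_j = \theta^T x, \qquad J(\theta) = \frac{1}{2} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

$$\theta_j := \theta_j + \alpha \left( y^{(i)} - h_\theta(x^{(i)}) \right) x_j^{(i)}$$

where α is the learning rate; the update moves θ in the direction of steepest decrease of J(θ).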

batch gradient descent

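In its standard form, each batch gradient descent step uses every training example:

$$\text{repeat until convergence:} \quad \theta_j := \theta_j + \alpha \sum_{i=1}^{m} \left( y^{(i)} - h_\theta(x^{(i)}) \right) x_j^{(i)} \quad \text{(for every } j\text{)}$$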

stochastic gradient descent

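Stochastic (incremental) gradient descent instead sweeps through the training set and updates the parameters on each example individually:

$$\text{loop over } i = 1, \dots, m: \quad \theta_j := \theta_j + \alpha \left( y^{(i)} - h_\theta(x^{(i)}) \right) x_j^{(i)} \quad \text{(for every } j\text{)}$$

A minimal NumPy sketch of the two variants (the function names, the learning rate `alpha`, and the iteration counts are illustrative choices, not part of the original post):

```python
import numpy as np

# X is the (m, n) design matrix with a leading column of ones; y is the (m,) target vector.

def batch_gradient_descent(X, y, alpha=0.01, iters=1000):
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        # One step uses the summed error over all m examples.
        theta += alpha * X.T @ (y - X @ theta)
    return theta

def stochastic_gradient_descent(X, y, alpha=0.01, epochs=50):
    theta = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            # One step uses a single example (x_i, y_i).
            theta += alpha * (y_i - x_i @ theta) * x_i
    return theta
```

On large training sets, stochastic gradient descent often gets close to the minimum much faster than batch gradient descent, since it starts making progress after seeing a single example rather than after a full pass over the data.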

Solving for the parameters directly:

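To solve for θ in closed form, set the gradient of J(θ) to zero. With X the design matrix whose rows are the training inputs and $\vec{y}$ the vector of targets, the standard derivation gives the normal equations and their solution:

$$\nabla_\theta J(\theta) = X^T X \theta - X^T \vec{y} = 0 \quad \Longrightarrow \quad \theta = (X^T X)^{-1} X^T \vec{y}$$

In practice one would solve the linear system rather than form the inverse explicitly, e.g. `np.linalg.solve(X.T @ X, X.T @ y)` or `np.linalg.lstsq(X, y, rcond=None)`.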

Probabilistic interpretation

Assume that the targets and inputs are related via y^(i) = θ^T x^(i) + ε^(i), where the error terms ε^(i) are distributed IID (independently and identically distributed) according to a Gaussian distribution (also called a Normal distribution) with mean zero and some variance σ^2.

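Under this assumption, the density of ε^(i) and the implied conditional distribution of y^(i) given x^(i) take the standard Gaussian form:

$$p(\epsilon^{(i)}) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left( -\frac{(\epsilon^{(i)})^2}{2\sigma^2} \right), \qquad p(y^{(i)} \mid x^{(i)}; \theta) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left( -\frac{(y^{(i)} - \theta^T x^{(i)})^2}{2\sigma^2} \right)$$

i.e. $y^{(i)} \mid x^{(i)}; \theta \sim \mathcal{N}(\theta^T x^{(i)}, \sigma^2)$.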

likelihood function

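Since the examples are assumed independent, the likelihood of θ is the product of the individual densities, and taking logs gives the usual simplification:

$$L(\theta) = \prod_{i=1}^{m} p(y^{(i)} \mid x^{(i)}; \theta), \qquad \ell(\theta) = \log L(\theta) = m \log \frac{1}{\sqrt{2\pi}\,\sigma} - \frac{1}{\sigma^2} \cdot \frac{1}{2} \sum_{i=1}^{m} \left( y^{(i)} - \theta^T x^{(i)} \right)^2$$

Maximizing ℓ(θ) therefore amounts to minimizing $\frac{1}{2} \sum_{i=1}^{m} \left( y^{(i)} - \theta^T x^{(i)} \right)^2$, which is exactly the least-squares cost J(θ).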

To summarize: Under the previous probabilistic assumptions on the data, least-squares regression corresponds to finding the maximum likelihood estimate of θ. This is thus one set of assumptions under which least-squares regression can be justified as a very natural method that's just doing maximum likelihood estimation. (Note however that the probabilistic assumptions are by no means necessary for least-squares to be a perfectly good and rational procedure, and there may be, and indeed there are, other natural assumptions that can also be used to justify it.)

Note also that, in our previous discussion, our final choice of θ did not depend on what σ^2 was, and indeed we'd have arrived at the same result even if σ^2 were unknown. We will use this fact again later, when we talk about the exponential family and generalized linear models.

Locally weighted linear regression

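In locally weighted linear regression, a prediction at a query point x is made by fitting θ to a weighted least-squares objective, where the weights fall off with distance from x (τ is the bandwidth parameter controlling how quickly they decay):

$$\min_\theta \; \sum_{i=1}^{m} w^{(i)} \left( y^{(i)} - \theta^T x^{(i)} \right)^2, \qquad w^{(i)} = \exp\!\left( -\frac{(x^{(i)} - x)^2}{2\tau^2} \right)$$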

Locally weighted linear regression is the first example we’re seeing of a non-parametric algorithm. The (unweighted) linear regression algorithm that we saw earlier is known as a parametric learning algorithm, because it has a fixed, finite number of parameters, which are fit to the data. Once we’ve fit the θ and stored them away, we no longer need to keep the training data around to make future predictions. In contrast, to make predictions using locally weighted linear regression, we need to keep the entire training set around. The term “non-parametric” (roughly) refers to the fact that the amount of stuff we need to keep in order to represent the hypothesis h grows linearly with the size of the training set.
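As a rough sketch of the prediction step (the function name, the bandwidth default, and the variable names here are illustrative, not from the original post), each query point requires solving its own small weighted least-squares problem against the full training set, which is exactly why the training data must be kept around:

```python
import numpy as np

def lwr_predict(X, y, x_query, tau=0.5):
    """Predict the target at x_query with locally weighted linear regression.

    X: (m, n) design matrix with a leading column of ones, y: (m,) targets,
    x_query: (n,) query point, tau: bandwidth of the weighting kernel.
    """
    # Weights decay exponentially with squared distance from the query point.
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))
    W = np.diag(w)
    # Solve the weighted normal equations: (X^T W X) theta = X^T W y.
    theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return x_query @ theta
```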

Reposted from: https://www.cnblogs.com/gghost/p/3286755.html
