网易公开课 (NetEase Open Courses) Notes

Class 2: Gradient Descent

For $n\times n$ matrices $A$, $B$, $C$:

  $\mathrm{tr}(AB)=\mathrm{tr}(BA)$

  $\mathrm{tr}(ABC)=\mathrm{tr}(CAB)=\mathrm{tr}(BCA)$

  $\mathrm{tr}(A)=\mathrm{tr}(A^T)$

$\mathrm{tr}(\cdot)$ denotes the trace of a matrix: the sum of its diagonal elements.
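These identities are easy to sanity-check numerically with random matrices. The sketch below is a minimal illustration (not from the original notes), assuming NumPy is available:

```python
import numpy as np

# Numerical check of the trace identities above with random n x n matrices.
rng = np.random.default_rng(0)
n = 4
A, B, C = (rng.standard_normal((n, n)) for _ in range(3))

# tr(AB) = tr(BA)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# tr(ABC) = tr(CAB) = tr(BCA)  (cyclic permutations only)
assert np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B))
assert np.isclose(np.trace(A @ B @ C), np.trace(B @ C @ A))

# tr(A) = tr(A^T)
assert np.isclose(np.trace(A), np.trace(A.T))

print("all trace identities hold numerically")
```

Note that only cyclic permutations are allowed: $\mathrm{tr}(ABC)$ generally differs from $\mathrm{tr}(ACB)$.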

 

For $A\in\mathbb{R}^{m\times n}$ and $f:\mathbb{R}^{m\times n}\to\mathbb{R}$:

$\nabla_A f(A)=\left[\dfrac{\partial f(A)}{\partial A_{ij}}\right]_{m\times n}$

 

$\nabla_A\,\mathrm{tr}(ABA^TC)=CAB+C^TAB^T$
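This gradient formula can be verified against finite differences of $f(A)=\mathrm{tr}(ABA^TC)$. The following sketch (an illustrative check I added, with made-up shapes) compares the closed form with a central-difference estimate of each $\partial f/\partial A_{ij}$:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((m, m))

def f(A):
    # f(A) = tr(A B A^T C)
    return np.trace(A @ B @ A.T @ C)

# Closed-form gradient from the identity above.
closed_form = C @ A @ B + C.T @ A @ B.T

# Central finite differences, one entry A_ij at a time.
eps = 1e-6
numeric = np.zeros_like(A)
for i in range(m):
    for j in range(n):
        E = np.zeros_like(A)
        E[i, j] = eps
        numeric[i, j] = (f(A + E) - f(A - E)) / (2 * eps)

assert np.allclose(closed_form, numeric, atol=1e-5)
print("gradient formula matches finite differences")
```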

 

 

 

Least-squares closed-form solution

 

We use $X\theta$ to predict $y$, where

$X=\begin{bmatrix} 1 & x_{11} & x_{12} & \cdots & x_{1n}\\ 1 & x_{21} & x_{22} & \cdots & x_{2n}\\ \vdots & \vdots & \vdots & & \vdots\\ 1 & x_{m1} & x_{m2} & \cdots & x_{mn} \end{bmatrix}$

$m$ is the number of observations and $n$ is the number of features.

$\theta=[\theta_0, \theta_1, \theta_2, \ldots, \theta_n]^T$ is the parameter vector.

To obtain the least-squares solution, we solve the normal equation:

$X^TX\theta=X^Ty$

$\theta=(X^TX)^{-1}X^Ty$
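A minimal sketch of this closed-form solution, assuming NumPy; the data and shapes below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 50, 3                       # m observations, n features

# Design matrix X with a leading column of 1s for the intercept theta_0.
X = np.hstack([np.ones((m, 1)), rng.standard_normal((m, n))])
true_theta = np.array([1.0, 2.0, -3.0, 0.5])
y = X @ true_theta + 0.1 * rng.standard_normal(m)

# Normal equation: theta = (X^T X)^{-1} X^T y
theta = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's least-squares solver.
theta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(theta, theta_lstsq)
print(theta)
```

Solving the linear system $X^TX\theta=X^Ty$ with `np.linalg.solve` is numerically preferable to forming the explicit inverse $(X^TX)^{-1}$.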

 

 

 

Reposted from: https://www.cnblogs.com/jsquare/p/3589519.html
