Notes: Coursera Machine Learning

These are my notes for the Machine Learning course on Coursera. I will keep updating this post.


------------------------------ the not-so-fancy divider -----------------------------------------

* A classmate shared a link that includes the videos, slides (PPT), and PDFs for this course.

* Since the slides cover everything I would have written up here, I will just post the link instead of rewriting it.


* https://class.coursera.org/ml-005/lecture

----------------------------------------------------------------------------------------------


1. Gradient descent


- Beware of local optima: depending on where it starts, gradient descent can converge to a local minimum rather than the global one.
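A minimal sketch of batch gradient descent for linear regression (my own illustration in NumPy, not course code; the course itself uses Octave). `alpha` is the learning rate, and `X` is assumed to already contain a leading column of ones for the intercept:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1500):
    """Batch gradient descent for linear regression.

    X : (m, n) design matrix, first column all ones (intercept term)
    y : (m,) target vector
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        grad = X.T @ (X @ theta - y) / m  # gradient of the squared-error cost
        theta = theta - alpha * grad      # simultaneous update of every theta_j
    return theta
```

For a non-convex cost, different initializations of `theta` can end in different local minima; the squared-error cost of linear regression happens to be convex, so there this worry goes away.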





2. Linear Algebra

This part is relatively easy.

scalar multiplication

identity matrix
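A quick NumPy illustration of the two terms above (my own example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(3 * A)        # scalar multiplication: each entry is multiplied by 3
I = np.eye(2)       # the 2x2 identity matrix
print(A @ I)        # multiplying by the identity leaves A unchanged
```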



3. Multivariate Linear Regression

The key idea is to express the hypothesis and the data with vectors and matrices.
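Concretely, with the convention x_0 = 1 the multivariate hypothesis collapses into one inner product, h_theta(x) = theta^T x; a small sketch (my own numbers):

```python
import numpy as np

theta = np.array([1.0, 0.5, -2.0])  # [theta_0, theta_1, theta_2]
x = np.array([1.0, 3.0, 4.0])       # x_0 = 1 is the intercept feature

h = theta @ x                       # h_theta(x) = theta^T x
print(h)                            # 1.0 + 0.5*3.0 - 2.0*4.0 = -5.5
```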

Normalization

Get every feature into approximately a -1 <= x <= 1 range (feature scaling).
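A minimal sketch of mean normalization (my own helper, not course code): subtract each feature's mean, then divide by its standard deviation (the course also allows dividing by the range):

```python
import numpy as np

def feature_normalize(X):
    """Scale each column of X so its values fall roughly within [-1, 1]."""
    mu = X.mean(axis=0)                  # per-feature mean
    sigma = X.std(axis=0)                # per-feature standard deviation
    return (X - mu) / sigma, mu, sigma   # keep mu/sigma to rescale new inputs
```

Returning `mu` and `sigma` matters: any later example we predict on must be scaled with the same training statistics.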





About the learning rate:
If gradient descent is not working, try a smaller learning rate.

For a sufficiently small learning rate, the cost function should decrease on every iteration; but if the learning rate is too small, gradient descent can be slow to converge.
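A practical way to apply this advice is to record J(theta) after every iteration and check that it keeps decreasing; a sketch (my own code, same linear-regression setup as the earlier gradient descent sketch):

```python
import numpy as np

def cost_history(X, y, alpha, num_iters):
    """Track the cost J(theta) over gradient descent iterations."""
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(num_iters):
        theta = theta - alpha * X.T @ (X @ theta - y) / m
        history.append(((X @ theta - y) ** 2).sum() / (2 * m))  # J(theta)
    return history  # should decrease each step if alpha is small enough
```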

Choice of features is an art.



4. Logistic regression

The reason to use logistic regression for classification is the range of its output: the hypothesis always lies between 0 and 1, so it can be read as a probability.



The main part is the sigmoid function; "sigmoid function" and "logistic function" are two names for the same thing.
Objective: fit theta to the data.
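The sigmoid is one line of code; a sketch of the logistic hypothesis (my own code):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = g(theta^T x), read as P(y = 1 | x; theta)."""
    return sigmoid(theta @ x)
```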





The cost function for logistic regression is quite different from the one for linear regression.

The trick used in constructing this cost function comes up very often.
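For reference, the cost in question is the standard logistic regression cost (restated here, since the slide that carried it is gone), with the y = 1 and y = 0 cases folded into a single expression:

$$ J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y^{(i)} \log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \Big] $$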




When dealing with large datasets, the advanced optimization algorithms introduced here (conjugate gradient, BFGS, and L-BFGS) are much faster than plain gradient descent.
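In the course this is done by handing a cost-plus-gradient function to Octave's `fminunc`; a sketch of the same idea in Python, assuming SciPy's `minimize` as the stand-in optimizer:

```python
import numpy as np
from scipy.optimize import minimize

def cost_and_grad(theta, X, y):
    """Logistic regression cost J(theta) and its gradient, for the optimizer."""
    m = len(y)
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# res = minimize(cost_and_grad, np.zeros(X.shape[1]), args=(X, y),
#                jac=True, method="L-BFGS-B")
# theta_opt = res.x   # no learning rate to tune; the optimizer picks step sizes
```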



Multiclass Classification: one-vs-all. Train one binary classifier per class, then predict the class whose classifier is most confident.
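A sketch of the prediction side of one-vs-all (my own code; it assumes `thetas` holds one already-trained parameter vector per class):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_vs_all_predict(thetas, x):
    """Run every per-class classifier on x and pick the most confident one."""
    scores = [sigmoid(theta @ x) for theta in thetas]
    return int(np.argmax(scores))   # index of the predicted class
```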








5. Regularization


Regularization can help to reduce overfitting.








In a real task it is hard to judge which features are useful, so we shrink all of the parameters except theta_0 (in practice, whether or not theta_0 is regularized makes little difference).
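The regularized cost for linear regression makes this precise (restated here since the slide is gone); note the penalty sum starts at j = 1, so theta_0 is left alone:

$$ J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)^2 + \lambda\sum_{j=1}^{n}\theta_j^2\right] $$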





Regularized Logistic Regression
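A sketch of one regularized gradient descent step for logistic regression (my own code; `lam` stands for the regularization parameter lambda):

```python
import numpy as np

def regularized_step(theta, X, y, alpha, lam):
    """One gradient step with L2 regularization; theta[0] is never penalized."""
    m = len(y)
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    grad = X.T @ (h - y) / m
    grad[1:] += (lam / m) * theta[1:]   # shrink every theta except theta_0
    return theta - alpha * grad
```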





6. Neural Networks


Finally, neural networks! With the deep learning boom, NNs are back in the spotlight. Even in these lectures, recorded in 2011, Andrew Ng made clear how much he valued neural networks and mentioned that he was doing research in the area; Google Brain and his later work and achievements speak for themselves.





This part is about vectorization: forward propagation through each layer can be computed as a single matrix operation instead of neuron by neuron.
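A sketch of that vectorized computation for a three-layer network (my own code; the weight matrices `Theta1` and `Theta2` are assumed to be given):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, Theta1, Theta2):
    """Vectorized forward propagation: one matrix product per layer."""
    a1 = np.concatenate(([1.0], x))                      # input plus bias unit
    a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))   # hidden layer plus bias
    return sigmoid(Theta2 @ a2)                          # output layer, h_theta(x)
```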








Neural networks learn their own features: the activations of the hidden layer act as new, learned features that feed the output layer.



It is a clever trick to use hidden layers to implement complex computations.
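The classic example from this lecture is XNOR assembled from hand-picked AND, NOR, and OR units; a sketch in code (the weight values follow the slide as I remember them):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def xnor(x1, x2):
    """Two-layer network computing XNOR from AND, NOR, and OR units."""
    a_and = sigmoid(-30 + 20 * x1 + 20 * x2)  # fires only when x1 AND x2
    a_nor = sigmoid(10 - 20 * x1 - 20 * x2)   # fires only when neither is 1
    return sigmoid(-10 + 20 * a_and + 20 * a_nor)  # OR of the two hidden units

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", round(float(xnor(x1, x2))))
```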



