This video is quite entertaining; it covers the history of deep learning.

Given that my knowledge of machine learning and statistics is lacking, I won't translate it for now. I have just written down the English transcript, which may contain errors; it is quite fun. I also read a Chinese article and found that what the video records are essentially the milestones in the development of deep learning.


First, a note on the difference between deep learning and neural networks: deep learning uses neural networks with many hidden layers, and to cope with the difficulty of training such deep networks, it adopts a layer-by-layer training approach.


The first wave: shallow learning

In the 1980s, the back-propagation algorithm was invented, bringing new hope to machine learning. Artificial neural networks, also called multi-layer perceptrons, emerged at this time; in practice these were shallow models containing only a single hidden layer.

In the 1990s, a variety of shallow machine-learning models were proposed, such as Support Vector Machines (SVM), Boosting, and maximum-entropy methods (e.g., Logistic Regression, LR). Structurally, these models can be viewed as having one hidden layer of nodes (SVM, Boosting) or no hidden layer at all (LR). They achieved great success in both theoretical analysis and applications. By contrast, because of the difficulty of theoretical analysis, and because training them required considerable experience and many tricks, shallow artificial neural networks fell relatively quiet during this period.


The second wave: deep learning

In 2006, Geoffrey Hinton, a professor at the University of Toronto and a leading figure in machine learning, together with his student Ruslan Salakhutdinov, published a paper in the top journal Science that launched the wave of deep learning in both academia and industry. The paper carried two main messages: (1) artificial neural networks with many hidden layers have excellent feature-learning ability, and the learned features give a more fundamental characterization of the data, which benefits visualization and classification; (2) the difficulty of training deep neural networks can be effectively overcome by layer-wise pre-training, which in this paper is achieved through unsupervised learning.
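To make the idea concrete, here is a minimal sketch of greedy layer-wise pre-training with RBMs trained by CD-1 (contrastive divergence), in the spirit of the approach described above. The layer sizes, learning rate, epoch count, and toy data are illustrative assumptions, not the paper's settings.

```python
# Sketch: greedy layer-wise unsupervised pre-training with RBMs (CD-1).
# All hyperparameters and the toy data below are illustrative assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=50, rng=None):
    """Train one RBM with CD-1; return (weights, hidden activations)."""
    rng = rng or np.random.default_rng(0)
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_h = np.zeros(n_hidden)
    b_v = np.zeros(n_visible)
    for _ in range(epochs):
        # Positive phase: hidden probabilities given the data.
        h_prob = sigmoid(data @ W + b_h)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: a single Gibbs step (the "CD-1" shortcut),
        # which approximates the gradient of the log partition function.
        v_recon = sigmoid(h_sample @ W.T + b_v)
        h_recon = sigmoid(v_recon @ W + b_h)
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
        b_h += lr * (h_prob - h_recon).mean(axis=0)
        b_v += lr * (data - v_recon).mean(axis=0)
    return W, sigmoid(data @ W + b_h)

# Greedy layer-wise pre-training: each RBM's hidden activations become
# the "visible" data for the next RBM in the stack.
rng = np.random.default_rng(0)
x = (rng.random((200, 32)) < 0.3).astype(float)  # toy binary data
layer_sizes = [16, 8]
activations, weights = x, []
for n_hidden in layer_sizes:
    W, activations = train_rbm(activations, n_hidden, rng=rng)
    weights.append(W)
print([w.shape for w in weights])  # [(32, 16), (16, 8)]
```

After this unsupervised stage, the stacked weights would typically be used to initialize a deep network that is then fine-tuned with back-propagation.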


Put simply, what follows is the timeline of deep learning's development as told in the video.

1983.

--- One researcher

Deep belief nets: they are more efficient than kernels. I am going to use them again and again.

Geoff Hinton is the leader; follow the leader. But what are the origins of this incredibly successful story?


1983.

---Girl
Hello, Professor Hinton. Please tell us about your research.

---Hinton
Something incredible happened: I know how the brain works. Wow. It is not about man-made machines, but about how the brain efficiently estimates the gradient of the partition function.

---Girl
Did you make progress with the man-made machine?

---Hinton
I have found something better: the back-propagation algorithm. The brain must deploy this trick to compute gradients.

---Girl
Hello, Professor Hinton. Did you confirm that the brain does back-propagation?

---Hinton
I decided not to answer that question.


1993.

---Girl
Hello, Professor Hinton. Did you confirm that the brain does back-propagation?

---Hinton
I tried not to answer the question. But I can finally tell you how the brain works: it is a graphical model, trained thanks to variational approximation.


2000.

---Hinton
I know how the brain works.

---Girl
Of course, it is variational.


---Hinton
Not at all. It uses contrastive divergence to approximate the gradient of the partition function.


2006.

---Hinton
This time I have got it: deep learning.

---Girl
I guess that's how the brain works?

---Hinton
Yes, but it is unsupervised pre-training of multiple layers.


2010.

---Girl
I don't think you can discover how the brain works.

---Hinton
And I never thought that Michael Jordan could play basketball alone. I tell you, I am 100% sure that the brain is using a matrix transformation.

---Girl
Whatever. Aren't you supposed to organize the deep learning workshop?

---Hinton
Are you crazy? They are going to make me tell stupid jokes at the closing banquet.

---Girl
I see. But how do you envision the future of deep learning?

---Hinton
Live long and prosper.

Deep learning, advancing the dream of artificial intelligence: http://www.csdn.net/article/2013-05-29/2815479

Deep Learning Video: http://www.youtube.com/watch?v=mlXzufEk-2E
