
Original: Deep Learning by Andrew Ng --- Sparse coding

Introduction to sparse coding: sparse coding is an unsupervised learning method that finds an "overcomplete" set of basis vectors to represent sample data more efficiently. The goal of the algorithm is to find a set of basis vectors $\phi_i$ such that an input vector $x$ can be expressed as a linear combination of them: $x = \sum_{i=1}^k a_i \phi_i$ …

2015-04-18 16:41:29 1442
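The linear-combination view in the excerpt above can be sketched in numpy. This is not code from the post; the dimensions, the active indices, and the variable names are all illustrative.

```python
import numpy as np

# Hypothetical sizes: 64-dimensional inputs, k = 128 basis vectors
# (overcomplete, since k > 64).
rng = np.random.default_rng(0)
phi = rng.standard_normal((64, 128))   # column i is the basis vector phi_i

a = np.zeros(128)                      # sparse coefficients: mostly zero
active = [3, 17, 42]
a[active] = rng.standard_normal(3)

x = phi @ a                            # x = sum_i a_i * phi_i as one matrix-vector product
```

Because most coefficients are zero, only the few active basis vectors contribute to the reconstruction.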

Original: Deep learning by Andrew Ng --- Linear Decoder

Sparse Autoencoder Recap: Because we used a sigmoid activation function for f(z(3)), we needed to constrain or scale the inputs to be in the range [0,1], since the sigmoid function outputs numbers in the range (0,1) …

2015-04-10 16:31:37 1326
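The output constraint described above, and the linear decoder's way around it, can be illustrated in a few lines of numpy; the pre-activation values are made up.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z3 = np.array([-3.0, 0.5, 4.0])   # hypothetical output-layer pre-activations z(3)

a3_sigmoid = sigmoid(z3)          # sigmoid decoder: every output squashed into (0,1)
a3_linear = z3                    # linear decoder: a(3) = z(3), outputs unconstrained
```

With the linear decoder the targets no longer need to be scaled into [0,1], which is the point the excerpt is building toward.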

Original: Deep Learning by Andrew Ng --- stacked autoencoder

When should we use fine-tuning? It is typically used only if you have a large labeled training set; in this setting, fine-tuning can significantly improve the performance of your classifier. However, if …

2015-04-08 20:05:04 2688 1

Original: Deep Learning by Andrew Ng --- self-taught

Rough workflow of this UFLDL exercise: run self-taught feature extraction (stroke features) on the digit images labeled 5–9 to obtain the feature parameters opttheta; use opttheta to obtain a(2), which represents the labeled input data; train and test the logistic regression model (with softmax …

2015-04-06 21:17:29 1192

Original: python --- multithreading with the threading module

The join function: join() waits for the thread it is called on to finish before executing the code that follows. For example:

    import threading
    from time import ctime, sleep

    loops = [2, 4]

    def loop(nloop, nsec):
        print 'start loop:', nloop, 'at:', ctime()
        sleep(nsec)
        print 'loop:', nloop, 'done at:', ctime()
    …

2015-04-05 19:31:35 859
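A Python 3 version of the same join() pattern, as a minimal sketch rather than the post's code; the sleep durations and the `results` list are arbitrary choices made here to show the ordering guarantee.

```python
import threading
from time import sleep

results = []

def loop(nloop, nsec):
    sleep(nsec)
    results.append(nloop)   # record that this loop finished

threads = [threading.Thread(target=loop, args=(i, 0.05 * (i + 1)))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()                # block here until this thread has finished

# Only after both joins is `results` guaranteed to hold both ids.
```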

Original: Deep Learning by Andrew Ng --- Softmax regression

This is the UFLDL programming exercise. Softmax regression has an unusual property: it has a "redundant" parameter set. The cost function after adding weight decay:

$J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)}=j\}\log\frac{e^{\theta_j^T x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^T x^{(i)}}}\right] + \frac{\lambda}{2}\sum_{i=1}^{k}\sum_{j=0}^{n}\theta_{ij}^2$ …

2015-04-04 15:52:16 1696
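The cost function above can be sketched in numpy. This is an illustrative re-implementation, not the exercise's own code; the max-subtraction line is the usual numerical-stability trick, which is legitimate precisely because of the redundant parameterization the excerpt mentions.

```python
import numpy as np

def softmax_cost(theta, X, y, lam):
    """Softmax cost J(theta) with weight decay.

    theta: (k, n) weights, X: (m, n) inputs, y: (m,) labels in 0..k-1.
    """
    m = X.shape[0]
    scores = X @ theta.T                                 # (m, k): theta_j^T x^(i)
    scores = scores - scores.max(axis=1, keepdims=True)  # shift by a constant per row
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    data_term = -log_probs[np.arange(m), y].mean()       # averaged negative log-likelihood
    decay_term = 0.5 * lam * np.sum(theta ** 2)          # (lambda/2) * sum theta_ij^2
    return data_term + decay_term

# Sanity check: with theta = 0 every class is equally likely,
# so the data term is log(k).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 4))
y = rng.integers(0, 3, size=10)
cost0 = softmax_cost(np.zeros((3, 4)), X, y, lam=0.0)
```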

Original: Deep Learning by Andrew Ng --- PCA and whitening

This is the UFLDL programming exercise; see the official site for the full tutorial. PCA finds the principal direction and the secondary direction in 2-dimensional examples; then $\tilde{x}^{(i)} = x_{\mathrm{rot},1}^{(i)} = u_1^T x^{(i)} \in \mathbb{R}$ …

2015-04-02 21:00:32 1071
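The rotation and 1-D reduction above can be sketched in numpy on made-up correlated 2-D data; this is an illustration of the formula, not the exercise's solution code.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 2)) @ np.array([[2.0, 0.3],
                                              [0.3, 0.5]])  # correlated 2-D data
X = X - X.mean(axis=0)                   # zero-mean, as PCA assumes

sigma = (X.T @ X) / X.shape[0]           # covariance matrix
eigvals, U = np.linalg.eigh(sigma)       # eigh returns eigenvalues in ascending order
U = U[:, ::-1]                           # reorder so u_1 is the principal direction

X_rot = X @ U                            # x_rot = U^T x, applied to every row
x_tilde = X_rot[:, 0]                    # the 1-D reduction u_1^T x
```

The first rotated coordinate captures the most variance; keeping only it is exactly the $\tilde{x}^{(i)} = u_1^T x^{(i)}$ step from the excerpt.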
