Machine Learning
Author: GarfieldEr007
Geoffrey E. Hinton's Deep Learning Code: Training a Deep Autoencoder or a Classifier on MNIST Digits
Training a deep autoencoder or a classifier on MNIST digits. Code provided by Ruslan Salakhutdinov and Geoff Hinton. Permission is granted for anyone to copy, use, modify, or distribute this program… (Repost · 2016-01-15 15:30:27 · 3971 views · 0 comments)
Word2Vec: A Deep Text Representation Model
Introduction: Word2vec is an efficient tool open-sourced by Google in mid-2013 for representing words as real-valued vectors. Drawing on ideas from deep learning, it can, through training, reduce the processing of text content to vector operations in a K-dimensional vector space, where similarity in the vector space can be used to represent semantic similarity of the text. The word vectors output by Word2vec can be used for many NLP-related tasks, such as clustering, finding synonyms, part-of-speech analysis, and so on. Taking another view and treating words as features, Word2vec… (Repost · 2016-03-15 18:51:50 · 1249 views · 0 comments)
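As a toy illustration of the post's central idea — that semantic similarity between words can be measured as similarity between their vectors in a K-dimensional space — here is a minimal cosine-similarity sketch in plain Python. The three-dimensional "word vectors" are hypothetical, not real Word2vec output.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 3-dimensional "word vectors" for illustration only.
vec_king = [0.9, 0.8, 0.1]
vec_queen = [0.85, 0.82, 0.15]
vec_apple = [0.1, 0.2, 0.95]

sim_royal = cosine_similarity(vec_king, vec_queen)  # semantically close pair
sim_fruit = cosine_similarity(vec_king, vec_apple)  # semantically distant pair
```

With real Word2vec vectors, nearest neighbors under this measure tend to be synonyms or related words, which is what makes the clustering and synonym-finding applications above work.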
Open Online Course Resources — For CS/AI/Math
Ta-da! Check out this site: http://www.class-central.com/ — a directory of the course catalogs of the major online course platforms (English subtitles and exercises are a big plus ^^). It is more authentic than NetEase Cloud Classroom; you can also check Course Graph now, relax for a few minutes when tired of studying, and there is Zhejiang University's Mathematics in Computer Science. Stanford's Coursera - http://www.coursera.org/ Edx - https://w… (Repost · 2016-03-15 18:54:09 · 2500 views · 0 comments)
Neural Networks for Machine Learning: Programming Assignment 1: The Perceptron Learning Algorithm
Programming Assignment 1: The perceptron learning algorithm. Help Center. Warning: The hard deadline has passed. You can attempt it, but you will not get credit for it. You are welcome to try it as… (Original · 2016-02-17 11:18:08 · 4010 views · 0 comments)
Neural Networks for Machine Learning: Programming Assignment 2: Learning Word Representations
(Original · 2016-02-18 09:29:08 · 3936 views · 0 comments)
Neural Networks for Machine Learning: Lecture 8 Quiz
(Original · 2016-02-18 09:30:04 · 2105 views · 0 comments)
Neural Networks for Machine Learning: Programming Assignment 3: Optimization and Generalization
(Original · 2016-02-18 09:30:32 · 3029 views · 0 comments)
Neural Networks for Machine Learning: Lecture 4 Quiz
(Original · 2016-02-18 09:28:50 · 3067 views · 0 comments)
Neural Networks for Machine Learning: Lecture 5 Quiz
(Original · 2016-02-18 09:29:00 · 2595 views · 0 comments)
Neural Networks for Machine Learning: Lecture 6 Quiz
(Original · 2016-02-18 09:29:32 · 1797 views · 0 comments)
Neural Networks for Machine Learning: Lecture 7 Quiz
(Original · 2016-02-18 09:29:15 · 3637 views · 0 comments)
Neural Networks for Machine Learning: Lecture 9 Quiz
(Original · 2016-02-18 09:32:22 · 1786 views · 0 comments)
Neural Networks for Machine Learning: Lecture 2 Quiz
(Original · 2016-02-17 11:17:38 · 4700 views · 3 comments)
Neural Networks for Machine Learning: Lecture 3 Quiz
(Original · 2016-02-17 11:18:13 · 2653 views · 0 comments)
Neural Networks for Machine Learning: Lecture 1 Quiz
(Original · 2016-02-17 11:17:28 · 3022 views · 0 comments)
Neural Networks for Machine Learning: Lecture 13 Quiz
(Original · 2016-02-18 09:42:28 · 4851 views · 0 comments)
Neural Networks for Machine Learning: Lecture 10 Quiz
(Original · 2016-02-18 09:42:41 · 1763 views · 0 comments)
Neural Networks for Machine Learning: Lecture 11 Quiz
(Original · 2016-02-18 09:42:57 · 2458 views · 0 comments)
Neural Networks for Machine Learning: Programming Assignment 4: Restricted Boltzmann Machines
(Original · 2016-02-18 09:43:23 · 3483 views · 0 comments)
Neural Networks for Machine Learning: Lecture 12 Quiz
(Original · 2016-02-18 09:43:53 · 1719 views · 0 comments)
Neural Networks for Machine Learning: Lecture 14 Quiz
(Original · 2016-02-18 09:44:48 · 1423 views · 0 comments)
Neural Networks for Machine Learning: Lecture 15 Quiz
(Original · 2016-02-18 09:44:57 · 1702 views · 0 comments)
Columbia University Coursera Course Natural Language Processing: Quiz 2: covers material from weeks 3 and 4
(Original · 2016-02-18 09:45:44 · 3445 views · 0 comments)
Columbia University Coursera Course Natural Language Processing: Quiz 1: covers material from weeks 1 and 2
(Original · 2016-02-18 09:46:14 · 2149 views · 1 comment)
Logistic Regression: Preparing the Fundamentals
0. Preface: One project in this semester's Pattern Recognition course is handwritten digit recognition; another is recognizing a website's CAPTCHAs (no small amount of pressure!). Bread must be eaten one bite at a time, so let's first try the classic pattern-recognition problem of handwritten digit recognition. This series of posts follows the deep learning tutorial and records implementations using the following three methods: Logistic Regression - using Theano… (Repost · 2016-03-30 12:36:32 · 5276 views · 0 comments)
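As a hedged sketch of the first of the three methods listed — logistic regression, here in plain Python rather than the post's Theano version — with a hypothetical 1-D toy dataset standing in for the digit images:

```python
import math

def sigmoid(z):
    """Logistic function mapping a real score to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Toy, linearly separable data: points above 2.5 are labeled 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

# Batch gradient ascent on the log-likelihood of the logistic model.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    grad_w = sum((y - sigmoid(w * x + b)) * x for x, y in zip(xs, ys))
    grad_b = sum(y - sigmoid(w * x + b) for x, y in zip(xs, ys))
    w += lr * grad_w
    b += lr * grad_b

predictions = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
```

For real MNIST-scale data one would vectorize this with a matrix library and add regularization; the update rule is the same.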
A Discussion of Distance and Similarity Measures in Machine Learning
In machine learning and data mining, we often need to quantify the difference between individuals in order to evaluate their similarity and category. The most common cases are correlation analysis in data analysis, and the classification and clustering algorithms of data mining, such as K-nearest neighbors (KNN) and K-means. Different measures can be adopted depending on the characteristics of the data. In general, a distance function d(x, y) needs to satisfy the following criteria: 1) d(x, x) = 0 // the distance to itself… (Repost · 2016-03-30 12:37:22 · 1183 views · 0 comments)
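The criteria quoted above can be checked directly against common distance functions. A minimal sketch, with hypothetical 2-D points, of three measures such surveys typically cover:

```python
import math

def euclidean(x, y):
    """Straight-line (L2) distance."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def manhattan(x, y):
    """City-block (L1) distance."""
    return sum(abs(a - b) for a, b in zip(x, y))

def cosine_sim(x, y):
    """Cosine similarity: 1 for parallel vectors, 0 for orthogonal."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return dot / (nx * ny)

p, q = [1.0, 2.0], [4.0, 6.0]
# Criterion 1: d(x, x) = 0 (the distance to itself is zero).
assert euclidean(p, p) == 0.0
# Symmetry: d(x, y) = d(y, x).
assert euclidean(p, q) == euclidean(q, p)
```

Note that cosine similarity is a similarity, not a distance: it grows with closeness, so algorithms like KNN use 1 − cos(x, y) when a distance is required.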
稀疏表示介绍(上)
声明 之前虽然听过压缩感知和稀疏表示,实际上昨天才正式着手开始了解,纯属新手,如有错误,敬请指出,共同进步。主要学习资料是 Coursera 上 Duke 大学的公开课——Image and video processing, by Pro.Guillermo Sapiro 第 9 课。由于对图像处理的了解也来自与该课程,没正经儿看过几本图像方面的书籍,有些术语只能转载 2016-03-30 12:37:25 · 2282 阅读 · 0 评论 -
机器学习中导数最优化方法(基础篇)
1. 前言熟悉机器学习的童鞋都知道,优化方法是其中一个非常重要的话题,最常见的情形就是利用目标函数的导数通过多次迭代来求解无约束最优化问题。实现简单,coding 方便,是训练模型的必备利器之一。这篇博客主要总结一下使用导数的最优化方法的几个基本方法,梳理梳理相关的数学知识,本人也是一边写一边学,如有问题,欢迎指正,共同学习,一起进步。 2. 几个数学概念1) 梯度(一阶导数)转载 2016-03-30 12:37:26 · 2007 阅读 · 0 评论 -
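A minimal sketch of the simplest first-derivative method of this family, gradient descent, on the toy unconstrained objective f(x) = (x − 3)², whose derivative is f'(x) = 2(x − 3):

```python
def grad_descent(f_prime, x0, lr=0.1, steps=100):
    """Minimize an objective by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * f_prime(x)  # move opposite the derivative's direction
    return x

# f(x) = (x - 3)^2 has its minimum at x = 3.
x_min = grad_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

The learning rate `lr` here is a hand-picked constant; the more advanced methods such posts go on to cover (Newton's method, conjugate gradient) mainly differ in how this step is chosen.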
支持向量机SVM 简要推导过程
SVM 是一块很大的内容,网上有写得非常精彩的博客。这篇博客目的不是详细阐述每一个理论和细节,而在于在不丢失重要推导步骤的条件下从宏观上把握 SVM 的思路。 1. 问题由来SVM (支持向量机) 的主要思想是找到几何间隔最大的超平面对数据进行正确划分,与一般的线性分类器相比,这样的超平面理论上对未知的新实例具有更好的分类能力。公式表示如下: : 所有点中最小的几何间隔转载 2016-03-30 12:38:17 · 3663 阅读 · 1 评论 -
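The margin-maximization idea can be approximated in code by minimizing the hinge loss with subgradient descent. This is a toy stand-in for the post's derivation, not the derivation itself; labels are ±1 and the 1-D data points are hypothetical:

```python
# Toy, linearly separable 1-D data with +1/-1 labels.
xs = [-3.0, -2.0, -1.0, 1.0, 2.0, 3.0]
ys = [-1, -1, -1, 1, 1, 1]

w, b = 0.0, 0.0
lr, lam = 0.05, 0.01  # step size and L2 regularization strength
for _ in range(500):
    for x, y in zip(xs, ys):
        margin = y * (w * x + b)
        if margin < 1:
            # Point inside the margin: hinge loss is active, push it out.
            w += lr * (y * x - lam * w)
            b += lr * y
        else:
            # Correctly classified with margin: only the regularizer acts,
            # shrinking w (i.e., widening the geometric margin 1/|w|).
            w -= lr * lam * w

predictions = [1 if w * x + b >= 0 else -1 for x in xs]
```

The regularization term is what encodes margin maximization: minimizing |w| subject to margins ≥ 1 is exactly the primal problem the post derives.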
An Introduction to Sparse Representation (Parts 2 and 3)
(Repost · 2016-03-30 12:39:49 · 1719 views · 0 comments)
Implementing a Backpropagation Artificial Neural Network
Why implement a BP neural network? "What I cannot create, I do not understand" — Richard Feynman, February 1988. Seven steps to implement a BP neural network: choose the neural network structure; randomly initialize the weights; implement forward propagation; implement the cost function $J(\Theta)$; implement the backpropagation algorithm and compute the partial derivatives $\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta)$… (Repost · 2016-03-07 12:21:13 · 2436 views · 0 comments)
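The steps above can be sketched as a tiny 2-2-1 sigmoid network trained by gradient descent on XOR. This is a minimal illustration under those steps, not the post's implementation:

```python
import math, random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# Steps 1-2: choose the structure (2 inputs, 2 hidden, 1 output) and
# randomly initialize the weights; index 2 of each row is the bias weight.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden
W2 = [random.uniform(-1, 1) for _ in range(3)]                      # output

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def forward(x):
    # Step 3: forward propagation.
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W1]
    o = sig(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    return h, o

def cost():
    # Step 4: mean squared-error cost J(Theta) over the training set.
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

lr = 0.5
initial_cost = cost()
for _ in range(5000):
    for x, y in data:
        h, o = forward(x)
        # Step 5: backpropagation — the delta terms are the partial
        # derivatives of the per-example cost with respect to each weight.
        delta_o = (o - y) * o * (1 - o)
        for j in range(2):
            delta_h = delta_o * W2[j] * h[j] * (1 - h[j])
            for i in range(2):
                W1[j][i] -= lr * delta_h * x[i]
            W1[j][2] -= lr * delta_h
        for j in range(2):
            W2[j] -= lr * delta_o * h[j]
        W2[2] -= lr * delta_o

final_cost = cost()
```

The remaining steps in such tutorials — gradient checking against numerical derivatives, then minimizing J(Θ) with an off-the-shelf optimizer — slot in around this same forward/backward core.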
A Roundup of Machine Learning, Deep Learning, and Data Mining Resources
Deep Learning: the two UFLDL tutorials (nothing more to say — absolutely good introductory tutorials by Ng, logically clear and with exercises): Part 1; the two UFLDL tutorials: Part 2; the Bengio group's deep learning tutorial, which uses the Theano library and mainly covers the RBM series — Python users can refer to it, very good; the deeplearning.net homepage, which includes… (Repost · 2016-03-30 19:32:08 · 5516 views · 2 comments)
Machine Learning & Data Mining Notes 16 (A Quick Review of Machine Learning Algorithm Ideas for Common Interviews)
Preface: When looking for a job (in the IT industry), besides common software development roles, machine learning positions are also an option. Many computer science graduate students come into contact with this field; if your research direction is machine learning/data mining and you are genuinely interested in it, you can consider such positions. After all, until machine intelligence reaches human level, machine learning remains an important tool, and as technology develops, demand for this kind of talent will surely keep growing. Looking across IT job postings, machine learning positions are still quite… (Repost · 2016-03-30 19:33:41 · 1138 views · 0 comments)
Machine Learning & Data Mining Notes 14 (A Simple Understanding of GMM-HMM Speech Recognition)
To get a macro-level understanding of how GMM-HMM is applied to speech recognition, I spent some time reading part of the HTK source code (using HTK for simple isolated-word recognition) and finally formed a rough picture of the algorithm, which is what I had hoped for. I have to say that there are too few accessible tutorials on speech recognition online — formulas fly everywhere, and few explain the concrete details, which admittedly requires hands-on experience. The points below give a macro impression (taking isolated-word recognition as the example): 1. Each word's pronunciation corresponds to one HMM model… (Repost · 2016-03-30 19:34:54 · 3066 views · 0 comments)
Hidden Markov Models (HMM) and Their Extensions
Please credit the source when reposting (http://blog.csdn.net/xinzhangyanxiang/article/details/8522078). When studying probability, everyone surely learned about Markov models; I found them very interesting at the time. Later, after reading in The Beauty of Mathematics about applying hidden Markov models to natural language processing, and seeing that HMMs have so many applications with such good results, I found it even more remarkable, so I studied them in depth; this post is my summary. Markov processes… (Repost · 2016-03-31 12:57:18 · 1397 views · 0 comments)
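A minimal sketch of the classic HMM computation behind such applications, Viterbi decoding of the most likely hidden-state path; the two-state weather model here is illustrative, not from the post:

```python
# Hypothetical HMM: hidden weather states, observed activities.
states = ["Rainy", "Sunny"]
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

def viterbi(obs):
    """Return (probability, path) of the best hidden-state sequence."""
    # V[t][s] = (prob of best path ending in state s at time t, that path).
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        V.append({s: max((V[-1][prev][0] * trans_p[prev][s] * emit_p[s][o],
                          V[-1][prev][1] + [s]) for prev in states)
                  for s in states})
    return max(V[-1].values())

prob, path = viterbi(["walk", "shop", "clean"])
```

Dynamic programming keeps this linear in the sequence length, rather than enumerating all state paths exponentially.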
Generative Learning, Gaussian Discriminant Analysis, Naive Bayes — Stanford ML Open Course Notes 5
Please credit: http://blog.csdn.net/xinzhangyanxiang/article/details/9285001. For a PDF of notes 1-5 in this series, click here. This post contains my notes on the fifth video of the Stanford ML open course, mainly covering generative learning algorithms, Gaussian discriminant analysis (GDA), and naive Bayes… (Repost · 2016-03-31 12:59:12 · 1070 views · 0 comments)
Newton's Method, the Exponential Family, Generalized Linear Models — Stanford ML Open Course Notes 4
Please credit: http://blog.csdn.net/xinzhangyanxiang/article/details/9207047. I have recently been watching Ng's machine learning open course; Ng teaches in a step-by-step way, and I feel I have improved a lot. The series has 20 videos in total, and after finishing each one I record some notes, including derivations of the formulas and the examples given during the lecture. As Ng says, you only understand a formula thoroughly after deriving it yourself; I find that summarizing it myself and posting it to my blog achieves the same effect… (Repost · 2016-03-31 13:00:14 · 1196 views · 0 comments)
Linear Regression, Gradient Descent, the Normal Equations — Stanford ML Open Course Notes 1-2
Please credit: http://blog.csdn.net/xinzhangyanxiang/article/details/9101621. (Repost · 2016-03-31 13:01:41 · 1566 views · 1 comment)
Locally Weighted Regression, Logistic Regression, the Perceptron Algorithm — Stanford ML Open Course Notes 3
Please credit: http://blog.csdn.net/xinzhangyanxiang/article/details/9113681. (Repost · 2016-03-31 13:02:41 · 1202 views · 0 comments)
The Algorithms Behind Weibo
Introduction: Weibo is a social application that many people use. People who browse Weibo every day carry out a few operations: posting, reposting, replying, reading, following, and @-mentioning. The first four act on short posts, while following and @ act on relationships between users: following someone means you become their fan and they become your friend, and @-ing someone means you want them to see your post. Weibo is regarded as "we-media", a channel for ordinary people to share "news" related to themselves. Recently, some people… (Repost · 2016-03-31 13:04:17 · 1084 views · 0 comments)