machine learning
gdtop818
linear regression(1)-multiple features
PS. Used for taking notes. Summary: h_θ(x) = θ^T x. Multiple features, gradient (variables): x_j^(i) = the value of feature j in the i-th training example; x^(i) = the column vector of all the features of the i-th training example. Original · 2017-04-03 00:35:27 · 371 reads · 0 comments
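The summary formula above, h_θ(x) = θ^T x, is a single dot product once a leading feature x_0 = 1 is added for the intercept. A minimal numpy sketch (the feature values and θ are invented for illustration):

```python
import numpy as np

# Design matrix: 3 training examples, 2 features each,
# with a leading column of ones for the intercept term x0 = 1.
X = np.array([[1.0, 2.0, 3.0],
              [1.0, 4.0, 5.0],
              [1.0, 6.0, 7.0]])
theta = np.array([0.5, 1.0, -0.5])  # [theta0, theta1, theta2]

# Vectorized hypothesis: h_theta(x) = theta^T x for every example at once.
predictions = X @ theta
print(predictions)  # -> [1. 2. 3.], one prediction per training example
```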
[coursera/SequenceModels/week2]Operations on word vectors - Debiasing[assignment]
Operations on word vectors. Welcome to your first assignment of this week! Because word embeddings are very computationally expensive to train, most ML practitioners will load a pre-trained set of embeddin... Original · 2018-02-18 13:39:37 · 908 reads · 0 comments
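The word-vector operations in that assignment build on cosine similarity. A minimal sketch (the three toy 3-dimensional vectors are invented; the assignment uses pre-trained 50-dimensional GloVe vectors):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy "embeddings" for illustration only.
king = np.array([1.0, 2.0, 3.0])
queen = np.array([1.1, 2.1, 2.9])
apple = np.array([-3.0, 0.5, 0.1])

print(cosine_similarity(king, queen))  # close to 1: similar directions
print(cosine_similarity(king, apple))  # much lower: dissimilar words
```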
[coursera/SequenceModels/week2]Emojify![assignment]
This assignment has some errors related to emoji. Emojify! Welcome to the second assignment of Week 2. You are going to use word vector representations to build an Emojifier. Have you ever wanted to make your ... Original · 2018-02-18 13:59:34 · 1408 reads · 0 comments
[coursera/SequenceModels/week3]Sequence models & Attention mechanism (summary&question)
3.1 Various sequence-to-sequence architectures. 3.1.1 Basic Models. 3.1.2 Picking the most likely sentence: conditional probability; pick the most likely sentence; greedy search (not useful). 3.1.3 Beam Search: example... Original · 2018-02-18 17:21:58 · 7050 reads · 3 comments
[coursera/SequenceModels/week3]Neural machine translation with attention[assignment]
Neural Machine Translation. Welcome to your first programming assignment for this week! You will build a Neural Machine Translation (NMT) model to translate human-readable dates ("25th of June, 2009") in... Original · 2018-02-18 17:31:08 · 4000 reads · 0 comments
[coursera/SequenceModels/week3]Trigger Word Detection[assignment]
Trigger Word Detection. Welcome to the final programming assignment of this specialization! In this week's videos, you learned about applying deep learning to speech recognition. In this assignment, you ... Original · 2018-02-18 18:02:37 · 5104 reads · 2 comments
[MLReview] Reference: Machine Learning Study Materials
With interviews coming up and too much to learn, I am writing some summaries. Reference books: Li Hang, Statistical Learning Methods; Zhou Zhihua, Machine Learning; Peter Harrington, Machine Learning in Action; Yoshua Bengio, Deep Learning; Sebastian Raschka, Python Machine Learning 2nd. Reference courses: Andrew Ng - Machine Learning; 呂忠津 - Special ... Original · 2018-04-23 20:22:32 · 659 reads · 0 comments
GAN: Generative Adversarial Network Code Implementation
I gave a talk and made slides, so I am posting them here. For a more complete introduction, follow my column on Generative Adversarial Networks. Companion post: [A Beginner's Guide to GANs] (3) GAN engineering practice and basic code. In [1]: import tensorflow as tf; from tensorflow.examples.tutorials.mnist import input_data; i... Reposted · 2018-05-25 05:39:25 · 3191 reads · 4 comments
[MLReview] Decision Tree Code Implementation
A decision tree, in short, partitions the data according to its features, building a tree, and then uses that tree to make predictions on new data. In essence, it induces a set of classification rules from the dataset. Since a decision tree splits on features, which feature should be chosen for each split? This is where the concept of entropy comes in. Characteristics of the decision tree algorithm. Advantages: low computational complexity, output that is easy to interpret, tolerance of missing data... Original · 2018-04-29 17:49:32 · 1015 reads · 2 comments
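The entropy used to choose splits is H(D) = -Σ p_k · log2(p_k) over the class proportions p_k. A minimal sketch (the toy label lists are invented):

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# A pure node has entropy 0; a 50/50 split has the maximum, 1 bit.
pure = entropy(["yes", "yes", "yes"])
mixed = entropy(["yes", "no", "yes", "no"])
print(pure, mixed)
```

A split's quality is then measured by how much it lowers this entropy (the information gain).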
[MLReview] Naive Bayes Code Implementation
Naive Bayes Original · 2018-06-05 02:50:13 · 1287 reads · 2 comments
[MLReview] Logistic Regression Code Implementation
"邏輯回歸" is the Taiwanese term for logistic regression, which I rather like. Preface: logistic regression is closely related to the maximum entropy model; both belong to the class of log-linear models. This post explains the principle of logistic regression and its code implementation, with a bonus on quasi-Newton methods at the end; from now on, each post will also cover some of the math methods it uses. Conditional probability distribution: $$P(Y=1|x)=\frac{\exp(w\cdot x+b)}{1+\exp(w\cdot x+b)}$$ $$P(Y=0|x)=\frac{1}{1+\exp(w\cdot... Original · 2018-06-05 03:46:15 · 714 reads · 0 comments
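The conditional probability P(Y=1|x) above is the sigmoid of w·x+b, and the two probabilities sum to 1. A minimal numpy sketch (w, b, and x are made-up values):

```python
import numpy as np

def p_y1(x, w, b):
    """P(Y=1|x) = exp(w.x+b) / (1 + exp(w.x+b)), i.e. the sigmoid of w.x+b."""
    z = np.dot(w, x) + b
    return np.exp(z) / (1.0 + np.exp(z))

w = np.array([0.5, -0.25])
b = 0.1
x = np.array([2.0, 1.0])

p1 = p_y1(x, w, b)
p0 = 1.0 - p1   # P(Y=0|x) = 1 / (1 + exp(w.x+b))
print(p1, p0)   # the two conditional probabilities always sum to 1
```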
Time Series Analysis of Blog Traffic
Using my blog's visit counts as raw data, recorded over 100 days, for some analysis. 13th June: 22567; 14th June: 22719; 16th June: 22892; 19th June: 23059; 20th June: 23138; 3rd July: 24333; 29th July: 27189; 31st July: 27330; 10th Aug: 28446; 20th Aug: 2... Original · 2018-06-13 15:48:01 · 331 reads · 0 comments
A Collection of Introductory Deep Learning Papers
My writing is bad. This is my term paper for the CS5312 deep learning course. Section II: List and highlights of the papers I have studied. In this section, I separate the papers into 3 parts: NN networks, alg... Original · 2018-08-12 00:07:44 · 2245 reads · 0 comments
[Python Deep Learning] (1) The Mathematical Building Blocks of Neural Networks
These are my study notes for Deep Learning with Python. Chapter 2: The mathematical building blocks of neural networks. 2.1 A first look at a neural network (complete code). 2.2 Data representations for neural networks: 2.2.1 Scalars (0D tensors); 2.2.2 Vectors (1D tensors); 2.2.3 Matrices (2D tensors); 2.2.4 3D tensors and higher-dimensional tensors; 2.2.5 Key attributes; 2.2.6 Manipulating tensors in Numpy; 2.2.7 The notion of data batches; 2.2.8... Original · 2018-12-04 14:14:47 · 2555 reads · 1 comment
[coursera/SequenceModels/week1]Improvise a Jazz Solo with an LSTM Network - v1[assignment]
Improvise a Jazz Solo with an LSTM Network. Welcome to your final programming assignment of this week! In this notebook, you will implement a model that uses an LSTM to generate music. You will even be ... Original · 2018-02-15 17:45:23 · 1396 reads · 0 comments
[coursera/SequenceModels/week1]Character level language model - Dinosaurus land[assignment]
Character level language model - Dinosaurus land. Welcome to Dinosaurus Island! 65 million years ago, dinosaurs existed, and in this assignment they are back. You are in charge of a special task. Leadin... Original · 2018-02-15 16:55:32 · 1268 reads · 0 comments
[coursera/SequenceModels/week1]Building a Recurrent Neural Network - Step by Step - v3[assignment]
Building your Recurrent Neural Network - Step by Step. Welcome to Course 5's first assignment! In this assignment, you will implement your first Recurrent Neural Network in numpy. Recurrent Neural Networ... Original · 2018-02-15 16:02:41 · 7706 reads · 3 comments
linear regression(2)-Gradient Descent for Multiple Variables
Gradient Descent For Multiple Variables. The gradient descent equation itself is generally the same form; we just have to repeat it for our 'n' features. Repeat until convergence: { θ_0 := θ_0 − α(1/m)∑... Original · 2017-04-04 16:27:52 · 415 reads · 0 comments
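The truncated rule above is θ_j := θ_j − α·(1/m)·Σ_i (h_θ(x^(i)) − y^(i))·x_j^(i), applied simultaneously for all j, and it vectorizes to a single line. A sketch on invented data:

```python
import numpy as np

# Invented toy data generated from y = 1 + 2*x,
# with a leading ones column so theta0 acts as the intercept.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
m = len(y)

theta = np.zeros(2)
alpha = 0.1
for _ in range(5000):
    # Simultaneous update of every theta_j:
    # theta_j := theta_j - alpha * (1/m) * sum_i (h(x_i) - y_i) * x_ij
    theta -= alpha * (1.0 / m) * (X @ theta - y) @ X
print(theta)  # converges toward [1, 2]
```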
linear regression(3)-Gradient Descent in Practice I/II(Feature Scaling/Learning Rate)
Gradient Descent in Practice I - Feature Scaling. Goal: speed up gradient descent by having each of our input values in roughly the same range. x_i := (x_i − μ_i)/s_i, where μ_i is the average of all the val... Original · 2017-04-04 16:53:46 · 354 reads · 0 comments
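Mean normalization as in the excerpt, x_i := (x_i − μ_i)/s_i, where s_i can be the feature's range or its standard deviation. A sketch with an invented feature column:

```python
import numpy as np

# One invented feature column, e.g. house sizes in square feet.
x = np.array([2104.0, 1600.0, 2400.0, 1416.0, 3000.0])

mu = x.mean()   # mu_i: the average of the feature values
s = x.std()     # s_i: here the standard deviation (the range also works)
x_scaled = (x - mu) / s

print(x_scaled.mean())  # ~0 after mean normalization
print(x_scaled.std())   # 1 when s is the standard deviation
```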
[coursera/ImprovingDL/week1]Practical aspects of Deep Learning(summary&question)
1.1 Setting up your Machine Learning Application. Train/Dev/Test sets: the split depends on the data (e.g. 98/1/1 for large datasets). Worst case: train set error 50% (high bias, underfitting); dev set error 50% (high variance, overfit... Original · 2018-01-31 11:23:43 · 365 reads · 0 comments
[coursera/dl&nn/week2]Basics of Neural Network programming(2.2 py & Vectorization)
2.2 Python and Vectorization. 2.2.1 Vectorization: if we use a loop to iterate over the training data, it takes a long time on a large amount of data. Vectorization is the secret of why numpy has pri... Original · 2018-01-26 14:23:32 · 313 reads · 0 comments
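The vectorization point in the excerpt is easy to see by timing a dot product both ways (the array size is arbitrary):

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Explicit Python loop over every element.
t0 = time.time()
dot_loop = 0.0
for i in range(n):
    dot_loop += a[i] * b[i]
loop_ms = (time.time() - t0) * 1000

# Vectorized numpy call doing the same computation.
t0 = time.time()
dot_vec = np.dot(a, b)
vec_ms = (time.time() - t0) * 1000

print(f"loop: {loop_ms:.1f} ms, vectorized: {vec_ms:.1f} ms")
# Both give the same result; the vectorized version is far faster.
```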
[coursera/dl&nn/week2]Basics of Neural Network programming(2.1 Logistic Regression as a NN)
Logistic regression can be regarded as a small neural network. This part is very easy, and it is a review of logistic regression. If you have any other question, you can first FINISH machine le... Original · 2018-01-26 13:06:29 · 327 reads · 0 comments
[coursera/dl&nn/week2]Basics of Neural Network programming(quiz)
This blog helps me review the course on Coursera. Wrong answers: 3. reshape to a column vector; 9. "*" means the elementwise product, ".dot" means the matrix multiplication operation. 1. Question 1 Original · 2018-01-26 20:19:32 · 723 reads · 0 comments
[coursera/dl&nn/week3]Shallow Neural Network(summary&question)
I recommend an app named "Grammarly". 3.1 Neural Network Overview. For each iteration (n times): z = np.dot(w.T, x) + b; a = 1/(1 + np.exp(-z)); dz = a - y; dw = 1/m * np.dot(x, dz.T); db = 1/m * np.sum(dz); update... Original · 2018-01-27 12:47:00 · 855 reads · 0 comments
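The flattened loop in the excerpt is the vectorized logistic-regression training step from the course. A runnable sketch on invented data (the data-generating rule and hyperparameters are made up):

```python
import numpy as np

# Invented toy data: x has shape (n_features, m_examples), y shape (1, m).
np.random.seed(0)
m = 200
x = np.random.randn(2, m)
y = (x[0:1, :] + x[1:2, :] > 0).astype(float)  # a linearly separable rule

w = np.zeros((2, 1))
b = 0.0
alpha = 0.5
for iteration in range(1000):
    z = np.dot(w.T, x) + b            # linear part, shape (1, m)
    a = 1.0 / (1.0 + np.exp(-z))      # sigmoid activation
    dz = a - y                        # gradient of the loss w.r.t. z
    dw = 1.0 / m * np.dot(x, dz.T)    # gradient for the weights
    db = 1.0 / m * np.sum(dz)         # gradient for the bias
    w -= alpha * dw                   # update step
    b -= alpha * db

accuracy = np.mean((a > 0.5) == y)
print(accuracy)  # high on this separable toy problem
```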
[FreeGPU] A Detailed Guide to Colab: Never Worry About Affording a GPU or Renting a Server Again
Colaboratory = Jupyter + GPU. Following these steps, you can enjoy GPU service for FREE. 1. Register a Google account. 2. Log in to Google Drive. 3. Create a new folder (name it whatever you like... Original · 2018-01-27 21:20:11 · 7076 reads · 4 comments
[coursera/ImprovingDL/week2]Optimization algorithms(summary&question)
Summary of <ADAM: A METHOD FOR STOCHASTIC OPTIMIZATION>. 2.1 Mini-batch gradient. Batch size m (batch gradient descent): each iteration takes too long. Batch size 1 (SGD): loses the speed-up from vectorizat... Original · 2018-02-04 12:14:56 · 743 reads · 0 comments
[coursera/dl&nn/week4]Deep Neural Network(summary&question)
Deep learning is experimentation built on hyperparameters. I strongly encourage you to take a sheet of paper and write down the forward and backward propagation. You need to review how to compute the derivative of m... Original · 2018-01-28 15:31:35 · 450 reads · 0 comments
[coursera/ImprovingDL/week3]Hyperparameter tuning, Batch Normalization(summary&question)
The videos for this week are easy. 3.1 Hyperparameter tuning: try random values; select a zone on a log scale; one model vs. parallel models depends on the computation power. 3.2 Batch Normalization: normalize the hidd... Original · 2018-02-04 19:12:08 · 242 reads · 0 comments
[coursera/ConvolutionalNeuralNetworks/week1]Foundations of cnn(summary&question)
Convolutional Neural Networks. 1.1 Computer Vision: why to learn CV networks for large images. 1.2 Edge Detection Example. Python: conv_forward; TensorFlow: tf.nn.conv2d; Keras: Conv2D. Here is another convolutional com... Original · 2018-02-09 21:01:07 · 3271 reads · 0 comments
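The edge-detection example in 1.2 is a plain 2D cross-correlation with a vertical-edge filter. A numpy sketch of the core of a conv_forward-style function (valid padding, stride 1; the 6x6 image values are invented in the style of the lecture example):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation with stride 1 (the core of conv_forward)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# 6x6 image: bright left half, dark right half -> one vertical edge.
image = np.hstack([np.full((6, 3), 10.0), np.zeros((6, 3))])
# The 3x3 vertical-edge filter from the lectures.
kernel = np.array([[1.0, 0.0, -1.0]] * 3)

out = conv2d(image, kernel)
print(out)  # nonzero only in the middle columns, where the edge sits
```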
[coursera/StructuringMLProjects/week1&2]ML Strategy1(summary&question)
Welcome to the third course, which I think has the most helpful videos in this specialization! Week 1: ML Strategy (1). 1.1 Introduction: prepare carefully for our projects. Training set... Original · 2018-02-07 17:20:00 · 683 reads · 0 comments
linear regression(4)-normal equation compared with gradient descent
Normal Equation. Gradient descent gives one way of minimizing J. Here is another way of doing so: θ = (X^T X)^{-1} X^T y. There is no need to do feature scaling with the normal equation. The f... Original · 2017-04-04 18:13:44 · 320 reads · 0 comments
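The normal equation θ = (XᵀX)⁻¹Xᵀy solves for θ in one shot, with no learning rate and no iteration. A numpy sketch on invented noise-free data (np.linalg.pinv handles the case where XᵀX is not invertible):

```python
import numpy as np

# Invented data generated exactly from y = 1 + 2*x1 + 3*x2.
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 2.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 3.0, 1.0]])  # leading ones column for the intercept
y = X @ np.array([1.0, 2.0, 3.0])

# theta = (X^T X)^{-1} X^T y, computed directly.
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)  # recovers [1, 2, 3]
```

For a small number of features this is fast; gradient descent wins when n is large, since inverting XᵀX costs roughly O(n³).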