Machine Learning
Average article quality score: 77
jiongjiongai
Machine Learning Concepts
Reference: Machine Learning Concepts, and Zhou Zhihua's "watermelon book" 《机器学习》 (Machine Learning). Original 2017-12-22 15:39:13 · 320 views · 0 comments
Face Recognition
Face Verification vs Face Recognition — Face Verification: input is an image and a name/ID, output answers "Is the image the person with this given ID?"; Face Recognition: Ima... Original 2018-04-15 19:39:30 · 408 views · 0 comments
Neural Style Transfer
Concept: Content C + Style S = Generated image G. What do deep ConvNets learn? More abstract features in deeper layers. Cost function: $\operatorname{loss}(G; C, S) = \alpha\,\operatorname{loss}_{\text{content}}(C, G) + \beta\,\operatorname{loss}_{\text{style}}(S, G)$... Original 2018-04-16 00:04:20 · 262 views · 0 comments
Recurrent Neural Networks
Examples of sequence data: speech recognition, music generation, sentiment classification, DNA sequence analysis, machine translation, video activity recognition, named entity recognition. Notation... Original 2018-04-16 06:28:37 · 436 views · 0 comments
Exponentially Weighted Averages
Exponentially weighted averages: $v_t = \beta v_{t-1} + (1 - \beta)\theta_t = \beta\left[\beta v_{t-2} + (1 - \beta)\theta_{t-1}\right] + (1 - \beta)\theta_t = \dots$ Original 2018-04-11 00:09:48 · 510 views · 0 comments
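The recurrence in this preview can be sketched in a few lines of Python (a minimal illustration; the function name and sample data are my own, not from the post):

```python
def ewa(thetas, beta):
    """Exponentially weighted average: v_t = beta * v_{t-1} + (1 - beta) * theta_t."""
    v = 0.0
    vs = []
    for theta in thetas:
        v = beta * v + (1 - beta) * theta
        vs.append(v)
    return vs

# With beta = 0.9, each v_t averages over roughly the last 1/(1 - beta) = 10 values.
# For a constant input of 1.0, v_t = 1 - beta^t, so it approaches 1.0 from below.
print(ewa([1.0, 1.0, 1.0], 0.9))
```

The bias toward 0 at early steps (v_1 = 0.1 here, not 1.0) is what bias correction, dividing by $1 - \beta^t$, fixes.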
Momentum, RMSprop and Adam
Gradient descent with momentum: compute an exponentially weighted average of the gradients, and use that average to update the weights. Algorithm, on iteration $t$: compute $\mathrm{d}W$ and $\mathrm{d}b$... Original 2018-04-11 02:03:54 · 460 views · 0 comments
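The update described in this preview can be written as a short sketch (the helper name, hyperparameter values, and toy quadratic loss are my own assumptions, not from the post):

```python
def momentum_step(w, dw, v, beta=0.9, lr=0.01):
    """One gradient-descent-with-momentum update for a scalar parameter w.

    v is the exponentially weighted average of past gradients:
        v = beta * v + (1 - beta) * dw
        w = w - lr * v
    """
    v = beta * v + (1 - beta) * dw
    w = w - lr * v
    return w, v

# Toy loss L(w) = w^2, so dL/dw = 2w (a hypothetical example).
w, v = 5.0, 0.0
for _ in range(200):
    w, v = momentum_step(w, 2 * w, v)
print(round(w, 4))  # w has moved close to the minimum at 0
```

The same update is applied per-layer to both $W$ (with $\mathrm{d}W$) and $b$ (with $\mathrm{d}b$), each keeping its own velocity term.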
Shallow Neural Network Week 3
Single sample. Symbols: $X = \begin{pmatrix} x_1 \\ \vdots \\ x_{n_x} \end{pmatrix}, Y = \begin{pmatrix} y_1 \\ \vdots \\ y_{n_y} \end{pmatrix}$... Original 2018-04-04 05:30:25 · 177 views · 0 comments
Softmax Function
Sigmoid function: $\operatorname{sigmoid}(z) = \dfrac{1}{1 + e^{-z}}$. Softmax function: $\operatorname{softmax}(z_i; Z) = \dfrac{e^{z_i}}{\sum_{j=1}^{n} e^{z_j}},\ 1 \le i \le n$... Original 2018-04-11 22:13:26 · 282 views · 1 comment
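The softmax formula can be sketched directly; the max-shift for numerical stability is my addition (standard practice, but not in the preview):

```python
import math

def softmax(z):
    """softmax(z_i) = exp(z_i) / sum_j exp(z_j).

    Subtracting max(z) first does not change the result (it cancels in the
    ratio) but prevents overflow for large inputs.
    """
    m = max(z)
    exps = [math.exp(zi - m) for zi in z]
    s = sum(exps)
    return [e / s for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print(probs)  # three probabilities summing to 1, largest for z = 3.0
```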
Derivative Formulas in Machine Learning
Derivative formulas in machine learning. Derivatives of the loss function: let $\operatorname{loss}(X)$ be the loss for a single sample $X$, and $A = g(Z) = \begin{pmatrix} g(z_1) \\ \vdots \\ g(z_n) \end{pmatrix}$... Original 2018-04-18 12:30:10 · 832 views · 2 comments
Bias and Variance with Mismatched Distributions
Bias and Variance with Mismatched Distributions. Original 2018-04-12 22:08:00 · 220 views · 0 comments
Learning from Multiple Tasks
Where transfer learning from A to B makes sense: tasks A and B have the same input X; you have a lot more data for A than for B; low-level features from A could be helpful for learning B. Where multi-task... Original 2018-04-12 23:37:42 · 326 views · 0 comments
Convolutional Neural Networks
Padding. Output dimension: $n + 2p - f + 1$. Padding types: valid, $p = 0$; same, $n + 2p - f + 1 = n \Rightarrow p = \dfrac{f - 1}{2}$. Str... Original 2018-04-13 01:34:11 · 264 views · 0 comments
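The output-size formula is easy to check numerically. A small sketch (function names are mine; the preview's formula is the stride-1 case of the version below):

```python
def conv_output_size(n, f, p=0, stride=1):
    """Output dimension of a convolution: floor((n + 2p - f) / stride) + 1."""
    return (n + 2 * p - f) // stride + 1

def same_padding(f):
    """'Same' padding solves n + 2p - f + 1 = n, i.e. p = (f - 1) / 2 (odd f)."""
    return (f - 1) // 2

print(conv_output_size(6, 3))                   # valid padding: 6 - 3 + 1 = 4
print(conv_output_size(6, 3, same_padding(3)))  # same padding keeps the size: 6
```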
Nesterov Momentum
x_ahead = x + mu * v
# evaluate dx_ahead (the gradient at x_ahead instead of at x)
v = mu * v - learning_rate * dx_ahead
x += v
=>
x_prev = x
v_prev = v
x_ahead = x_prev + mu * v_prev
v = mu * v_... Original 2018-08-09 08:19:11 · 1184 views · 0 comments
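The fragment above (from the CS231n notes) can be completed into a runnable sketch; the wrapper function, gradient callback, and toy quadratic are my own assumptions:

```python
def nesterov_step(x, v, grad, mu=0.9, learning_rate=0.1):
    """One Nesterov-momentum update, evaluating the gradient at the look-ahead point."""
    x_ahead = x + mu * v
    dx_ahead = grad(x_ahead)  # gradient at x_ahead instead of at x
    v = mu * v - learning_rate * dx_ahead
    x = x + v
    return x, v

# Hypothetical objective f(x) = x^2 with gradient 2x.
x, v = 5.0, 0.0
for _ in range(100):
    x, v = nesterov_step(x, v, lambda t: 2 * t)
print(abs(x) < 1e-3)  # True: x has converged near the minimum at 0
```

The "=>" rewrite in the note expresses the same update in terms of the previous `x`, which is convenient when the framework only lets you update parameters in place.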
CS231n Note
CS231n Note. Concepts: image classification, object detection, action classification, image captioning, semantic segmentation, perceptual... Original 2018-08-04 20:56:43 · 347 views · 0 comments
Deep Learning Notes: Chapter 1 Introduction
Preface: I recently started reading the book Deep Learning, which gave me a motive to take notes as I read: notes that let a reader follow the core content of the book smoothly, or at least grasp its overall structure, are well worth having. Since the English original is ultimately the more idiomatic wording, the notes are all excerpts from the book's original text. Readers with suggestions or comments are welcome to leave a message. Thanks! Deep Learning Chapter 1 Introduction Concept Des... Original 2018-08-18 20:16:48 · 440 views · 0 comments
Object Detection
Concepts (name / description / $y$): object classification, at most one object, $y = \begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix}$; object localization, at most... Original 2018-04-15 19:17:33 · 212 views · 0 comments
KL Divergence (Relative Entropy)
KL divergence (relative entropy). Original 2017-12-28 19:07:02 · 308 views · 0 comments
Properties of Entropy
Properties of entropy. Original 2017-12-29 11:35:23 · 1864 views · 0 comments
Derivation of the Entropy of a Discrete Random Variable
Derivation of the entropy of a discrete random variable. Original 2017-12-29 16:54:46 · 4646 views · 1 comment
Common Functions for Probability Distributions in Deep Learning
Functions: the logistic sigmoid function, $\sigma(x) = \dfrac{1}{1 + e^{-x}}$; the softplus function, $\zeta(x) = \log(1 + e^x)$; the positive part function, $x^+ = \max(0, x)$; the negative part... Translated 2018-01-20 03:06:53 · 492 views · 0 comments
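These functions take only a few lines each; a minimal sketch (function names are my own):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: sigma(x) = 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    """Softplus: zeta(x) = log(1 + e^x), a smooth version of the positive part."""
    return math.log1p(math.exp(x))

def positive_part(x):
    """x^+ = max(0, x)."""
    return max(0.0, x)

# softplus closely tracks the positive part for inputs far from 0:
print(softplus(10.0), positive_part(10.0))
# and sigma(x) + sigma(-x) = 1:
print(sigmoid(2.0) + sigmoid(-2.0))
```

A useful identity tying them together: $\zeta'(x) = \sigma(x)$, i.e. the sigmoid is the derivative of softplus.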
Notes on 《深度学习》 (Deep Learning)
Notes on Deep Learning. Derivation of equation 3.52 in section 3.14: since $P(c \mid a, b) = \dfrac{P(a, b, c)}{P(a, b)} = \dfrac{P(a, c \mid b)\,P(b)}{P(a \mid b)\,P(b)} = \dots$ Original 2018-01-20 16:09:56 · 187 views · 0 comments
Backpropagation Algorithm 的梯度
Loss function: $J(\theta) = -\dfrac{1}{m}\sum\limits_{i=1}^{m}\sum\limits_{k=1}^{K}\left[y_k^{(i)}\ln\left(h_\theta(X^{(i)})_k\right) + \left(1 - y_k^{(i)}\right)\ln\left(1 - h_\theta(X^{(i)})_k\right)\right]$... Original 2018-03-15 23:49:12 · 184 views · 2 comments
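This cost can be evaluated numerically; a minimal sketch where the function name and the toy labels/predictions are my own assumptions:

```python
import math

def cross_entropy_cost(Y, H):
    """J = -(1/m) * sum_i sum_k [ y_k ln(h_k) + (1 - y_k) ln(1 - h_k) ].

    Y: list of one-hot label vectors; H: list of predicted probability vectors.
    """
    m = len(Y)
    total = 0.0
    for y, h in zip(Y, H):
        for yk, hk in zip(y, h):
            total += yk * math.log(hk) + (1 - yk) * math.log(1 - hk)
    return -total / m

Y = [[1, 0], [0, 1]]
H = [[0.9, 0.1], [0.2, 0.8]]
print(cross_entropy_cost(Y, H))  # small positive cost; 0 only for perfect predictions
```

Backpropagation computes the gradient of exactly this $J(\theta)$ with respect to every layer's weights.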
Reason of Random Initialization - Neural Networks
Symmetry problem: if, for any layer $l$ of the network, all parameters $\omega_{i,j}^{l}$ of that layer have the same initial value, then on every gradient-descent iteration: $\begin{cases}\omega_{1,j}^{l-1} = \omega_{2,j}^{l-1}, & 0 \le j \le s_{l-1}, \\ \omega_{i,1}^{l} = \omega_{i,2}^{l}, & 1 \le i \le s_{l+1},\end{cases} \quad 2 \le l \le L-1$... Original 2018-03-18 14:04:45 · 189 views · 0 comments
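The standard fix is small random initialization, which this sketch illustrates (function name and scale are my own assumptions):

```python
import random

def init_weights(rows, cols, scale=0.01):
    """Break symmetry by drawing each weight independently at random.

    If every weight in a layer started at the same value, all units in the
    layer would compute the same output and receive the same gradient, so
    they would stay identical through every gradient-descent iteration.
    """
    return [[random.gauss(0.0, scale) for _ in range(cols)] for _ in range(rows)]

random.seed(0)
W = init_weights(3, 4)
# The rows are all different, so the three units can learn different features.
print(len(set(map(tuple, W))) == len(W))  # True
```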
Cost Function of Support Vector Machine
The functions $f, g$ in logistic regression: $f(x) = \ln(1 + e^x),\ x \in \mathbb{R}$, and $g(x) = f(-x)$. Properties of $f, g$: $f'(x) = \dfrac{e^x}{1 + e^x} > 0,\ x \in \mathbb{R}$... Original 2018-03-18 19:05:27 · 218 views · 0 comments
Cost function of Logistic Regression and Neural Network
Logistic / sigmoid function: $g(x) = \dfrac{1}{1 + e^{-x}} = \dfrac{e^x}{1 + e^x}$. Cost function of logistic regression: $h_\theta(X) = f(X^\intercal \theta) = P(y = 1 \mid X; \theta)$... Original 2018-03-12 22:25:59 · 294 views · 0 comments
Support Vector Machine's Large Margin
SVM cost function: $J(\theta) = C\sum\limits_{i=1}^{m}\left[y_i\operatorname{cost}_1(W^\intercal X_i + \theta_0) + (1 - y_i)\operatorname{cost}_0(W^\intercal X_i + \theta_0)\right] + \sum\limits_{j=1}^{n}\dfrac{\lambda}{2}\theta_j^2$... Original 2018-03-19 12:20:55 · 189 views · 0 comments
Activation function in Neural Network
Logistic / sigmoid function: $g(x) = \dfrac{1}{1 + e^{-x}} = \dfrac{e^x}{1 + e^x}$, $g(-x) = \dfrac{1}{1 + e^x} = \dfrac{e^{-x}}{1 + e^{-x}}$... Original 2018-03-30 19:47:41 · 283 views · 0 comments
Code to download files from google drive to colab
Code:
def download_from_google_drive(file_name_prefix):
    # 1. Authenticate and create the PyDrive client.
    auth.authenticate_user()
    gauth = GoogleAuth()
    gauth.credentials = GoogleCredentials.... Original 2018-03-26 15:06:02 · 599 views · 0 comments