Supervised Learning
Column · Average article quality score: 93
大眼呆萌君
A three-month sprint for a Python beginner and algorithm novice — Aza Aza Fighting!!!
Seeking depth, and the connection and collision of ideas, rather than completeness of writing. Thanks to the many peers whose detailed posts and answers I drew on; the sea of learning truly has no shore.
- **Deep Learning: Related Concepts**
  Epoch: one epoch is when the ENTIRE dataset is passed forward and backward through the neural network exactly ONCE [1]. Iteration: the number of batches needed to complete one epoch [1]...
  Original · 2019-12-19 04:13:05 · 170 views · 0 comments
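The epoch/iteration relationship in the snippet reduces to a one-line computation; a minimal sketch (the dataset and batch sizes are illustrative values, not from the post):

```python
import math

# Number of iterations (batches) needed to complete one epoch,
# i.e. one full forward + backward pass over the entire dataset.
# The sizes below are made-up illustrative values.
dataset_size = 2000   # total training examples
batch_size = 64       # examples per batch

iterations_per_epoch = math.ceil(dataset_size / batch_size)
print(iterations_per_epoch)  # 32 (the last batch holds the remaining 16 examples)
```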
- **principal component analysis**
  Derivation (method of Lagrange multipliers). First step: find $\bm\alpha'_k \bm x$ that maximises $\text{var}(\bm\alpha'_k \bm x)$; choose the normalisation constrai...
  Original · 2020-01-15 00:47:58 · 140 views · 0 comments
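The first step quoted above is the standard Lagrange-multiplier argument; a condensed sketch, with $\bm\Sigma$ denoting the covariance matrix of $\bm x$ (notation assumed from the usual textbook treatment, not taken from the post itself):

```latex
\max_{\bm\alpha_k}\ \text{var}(\bm\alpha'_k \bm x) = \bm\alpha'_k \bm\Sigma \bm\alpha_k
\quad \text{subject to} \quad \bm\alpha'_k \bm\alpha_k = 1
```

Forming $L = \bm\alpha'_k \bm\Sigma \bm\alpha_k - \lambda(\bm\alpha'_k \bm\alpha_k - 1)$ and setting the gradient to zero gives $\bm\Sigma \bm\alpha_k = \lambda \bm\alpha_k$: each $\bm\alpha_k$ is an eigenvector of $\bm\Sigma$, the variance attained equals the eigenvalue $\lambda$, and the maximiser is the eigenvector with the largest available eigenvalue.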
- **Multi-task Learning**
  Multi-task learning and its definition · Linear MTL · Regularisers for linear MTL (quadratic regulariser, structured sparsity) · Clustered MTL · Further topics (transferring to new tasks)
  Original · 2016-11-30 22:11:19 · 584 views · 0 comments
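The "quadratic regulariser" entry in that outline refers to objectives of the general form below — a generic sketch, not necessarily the exact formulation in the post — where task $t$ has weight vector $\bm w_t$ and $n_t$ examples:

```latex
\min_{\bm w_1, \dots, \bm w_T}\ \sum_{t=1}^{T} \sum_{i=1}^{n_t}
  \ell\big(y_{ti},\, \bm w_t^\top \bm x_{ti}\big) + \lambda\, \Omega(\bm W),
\qquad
\Omega(\bm W) = \sum_{t=1}^{T} \lVert \bm w_t \rVert_2^2 \ \ \text{(quadratic)}
```

Structured-sparsity and clustered variants replace $\Omega$ with penalties that couple the tasks, e.g. group norms across the columns of $\bm W$ or distances of each $\bm w_t$ from a cluster centre.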