deep-learning
蚊子爱牛牛
Waiting and hoping
Deep Learning: Deep Feedforward Neural Networks (1)
Deep feedforward networks, also called feedforward neural networks or multilayer perceptrons (MLPs). Original · 2017-10-18 17:36:09 · 868 views · 0 comments
Deep Learning: Regularization (6)
Semi-Supervised Learning: In the paradigm of semi-supervised learning, both unlabeled examples from P(x) and labeled examples from P(x, y) are used to estimate P(y | x) or to predict y from x. Original · 2017-10-24 09:26:25 · 341 views · 0 comments
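The excerpt stops at the definition, so here is a hedged sketch of one concrete semi-supervised scheme, self-training; the toy data and every name below are assumptions of mine, not anything from the post:

```python
# Hedged sketch: self-training, one way to combine unlabeled draws from P(x)
# with labeled pairs from P(x, y) when estimating P(y | x). Toy data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y_true = (X[:, 0] + X[:, 1] > 0).astype(int)

y_partial = y_true.copy()
y_partial[50:] = -1  # scikit-learn convention: -1 marks unlabeled examples

model = SelfTrainingClassifier(LogisticRegression())
model.fit(X, y_partial)        # trains on labeled and unlabeled data together
print(model.score(X, y_true))  # accuracy against the withheld full labels
```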
Deep Learning: Regularization (7)
Multi-Task Learning. Original · 2017-10-24 09:59:33 · 390 views · 0 comments
Deep Learning: Regularization (8)
Early Stopping. Original · 2017-10-24 14:33:38 · 404 views · 0 comments
Deep Learning: Regularization (9)
Parameter Tying and Parameter Sharing. Original · 2017-10-24 15:16:07 · 479 views · 0 comments
Deep Learning: Regularization (10)
Sparse Representations. Original · 2017-10-24 21:26:15 · 365 views · 0 comments
Deep Learning: Regularization (11)
Bagging and Other Ensemble Methods: Bagging (short for bootstrap aggregating) is a technique for reducing generalization error by combining several models. Original · 2017-10-25 10:07:28 · 621 views · 0 comments
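Since the snippet already names the mechanism, a minimal sketch may help; the toy regression data and the choice of decision trees are mine, not from the post:

```python
# Minimal bagging sketch: fit one model per bootstrap resample of the
# training set, then average the individual predictions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=100)

ensemble = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))  # sample with replacement
    ensemble.append(DecisionTreeRegressor().fit(X[idx], y[idx]))

X_test = np.linspace(-3, 3, 50).reshape(-1, 1)
prediction = np.mean([m.predict(X_test) for m in ensemble], axis=0)
```

Averaging helps because the bootstrap resamples decorrelate the members' errors, so errors that differ across models partially cancel.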
Deep Learning: Regularization (12)
Dropout. Original · 2017-10-25 17:04:42 · 398 views · 0 comments
Deep Learning: Regularization (13)
Adversarial Training. Original · 2017-10-26 11:55:19 · 376 views · 0 comments
Deep Learning: Regularization (14)
Tangent Distance, Tangent Prop, and Manifold Tangent Classifier: Many machine learning algorithms aim to overcome the curse of dimensionality by assuming that the data lies near a low-dimensional manifold. Original · 2017-10-26 14:48:48 · 502 views · 0 comments
Deep Learning: Regularization (5)
Noise Robustness: The idea behind dataset augmentation motivates applying noise to the inputs as an augmentation strategy. Original · 2017-10-23 23:59:58 · 446 views · 0 comments
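One way to realize that strategy (the helper below, including its name and parameters, is a hypothetical sketch of mine) is to draw fresh Gaussian noise for the inputs of every minibatch, so the model never sees exactly the same example twice:

```python
# Hedged sketch: Gaussian input noise as a dataset augmentation strategy.
import numpy as np

def noisy_batches(X, y, batch_size=32, sigma=0.1, seed=None):
    """Yield shuffled minibatches whose inputs carry fresh Gaussian noise."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        yield X[idx] + rng.normal(scale=sigma, size=X[idx].shape), y[idx]
```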
Deep Learning: Regularization (4)
Dataset Augmentation: The best way to make a machine learning model generalize better is to train it on more data. Original · 2017-10-23 23:00:50 · 374 views · 0 comments
Deep Learning: Deep Feedforward Neural Networks (2)
Output Units. Original · 2017-10-18 21:12:53 · 896 views · 0 comments
Deep Learning: Optimization for Training Deep Models (1)
How Learning Differs from Pure Optimization. Original · 2017-10-26 16:44:47 · 743 views · 0 comments
Deep Learning: Deep Feedforward Neural Networks (3)
Hidden Units: So far we have focused our discussion on design choices for neural networks that are common to most parametric machine learning models trained with gradient-based optimization. Original · 2017-10-19 21:18:48 · 759 views · 0 comments
Deep Learning: Deep Feedforward Neural Networks (4)
Architecture Design. Original · 2017-10-19 23:09:50 · 469 views · 0 comments
Deep Learning: Deep Feedforward Neural Networks (5)
Back-Propagation and Other Differentiation Algorithms. Original · 2017-10-20 16:51:18 · 511 views · 0 comments
Deep Learning: Regularization (1)
Many strategies used in machine learning are explicitly designed to reduce the test error, possibly at the expense of increased training error. These strategies are known collectively as regularization. Original · 2017-10-23 16:04:23 · 694 views · 0 comments
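The canonical instance of this trade-off, which the excerpt defines but does not write out, is a parameter norm penalty added to the training objective; with L2 weight decay:

```latex
\tilde{J}(\theta; X, y) = J(\theta; X, y) + \alpha\,\Omega(\theta),
\qquad \Omega(\theta) = \tfrac{1}{2}\lVert w \rVert_2^2
```

Larger α accepts more training error in exchange for smaller weights, which often means lower test error.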
Deep Learning: Regularization (2)
We can minimize a function subject to constraints by constructing a generalized Lagrange function, consisting of the original objective function plus a set of penalties. Original · 2017-10-23 16:59:43 · 431 views · 0 comments
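Written out (a standard reconstruction of the form the sentence describes): to minimize J(θ) subject to Ω(θ) ≤ k, construct

```latex
\mathcal{L}(\theta, \alpha; X, y) = J(\theta; X, y) + \alpha\,\bigl(\Omega(\theta) - k\bigr),
\qquad
\theta^{*} = \arg\min_{\theta}\,\max_{\alpha \ge 0}\,\mathcal{L}(\theta, \alpha; X, y)
```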
Deep Learning: Regularization (3)
Regularization and Under-Constrained Problems: Many linear models in machine learning, including linear regression and PCA, depend on inverting the matrix X⊤X. Original · 2017-10-23 17:37:20 · 359 views · 0 comments
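For example, the closed-form least-squares solution requires (X⊤X)⁻¹ and fails when X⊤X is singular, e.g. when there are more features than examples; weight decay changes the matrix being inverted (a standard result, added here because the excerpt cuts off):

```latex
w = \bigl(X^{\top}X + \alpha I\bigr)^{-1} X^{\top} y
```

For any α > 0 the matrix X⊤X + αI is positive definite, hence always invertible.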
Deep Learning: Optimization for Training Deep Models (2)
Optimization for Training Deep Models. Original · 2017-10-31 19:11:00 · 431 views · 0 comments
Deep Learning: Optimization for Training Deep Models (0)
Optimization for Training Deep Models. Original · 2017-10-26 15:15:13 · 455 views · 0 comments