Deep Learning - 李宏毅
ccc-Tips for Deep Learning-李宏毅(8)
Recipe of Deep Learning, Good Results on Training Data, Good Results on Testing Data, Why does Dropout work? Original · 2023-02-15 14:32:21 · 305 views · 0 comments
ccc-Backpropagation-李宏毅(7)
Notation, Backpropagation, Forward pass, Backward pass, Summary. Original · 2023-02-14 19:54:33 · 289 views · 0 comments
ccc-Brief Introduction of Deep Learning-李宏毅(6)
Three Steps for Deep Learning, Fully Connected Feedforward Network, Matrix Operation, Output Layer as Multi-Class Classifier, Example Application. Original · 2023-02-13 14:04:23 · 484 views · 0 comments
ccc-Logistic Regression-李宏毅(5)
Step 1: Function Set, Step 2: Goodness of a Function, Step 3: Find the best function, Why not Logistic Regression + Square Error, Discriminative vs. Generative, Multi-class Classification (3 Classes), Limitation of Logistic Regression, Deep Learning! Original · 2023-02-11 23:24:19 · 485 views · 0 comments
ccc-Classification-李宏毅(4)
Classification Concepts, Example Application, How to do Classification, Why not Regression, Probability from Class - Feature, Probability from Class, How's the results?, Modifying the Model, Three Steps, Probability Distribution. Original · 2023-02-11 20:57:34 · 650 views · 0 comments
ccc-New Optimizers for Deep Learning-Chung Ming Chien(3)
Different Optimizers, Optimizer Applications, Adam vs SGDM, Towards Improving Adam, Towards Improving SGDM, RAdam vs SWATS, k steps forward, 1 step back, Can we look into the future?, Do you really know your optimizer?, Something helps optimization…, Learned & Advice. Original · 2023-02-10 00:21:24 · 384 views · 1 comment
ccc-Gradient Descent-李宏毅(2)
Tuning your learning rates, Stochastic Gradient Descent, Feature Scaling, Gradient Descent: Theory, More Limitations of Gradient Descent. Original · 2023-02-08 22:57:39 · 231 views · 0 comments
ccc-Regression-李宏毅(1)
First Attempt at Pokémon Prediction, Discussion of Model Results, Second Attempt at Pokémon Prediction, Investigating the Sources of Error, Notes on Handling Bias/Variance. Original · 2023-02-07 20:30:38 · 600 views · 0 comments