AI Paper Reading Notes
Author: qq_30362711
DEEP NETWORKS FROM THE PRINCIPLE OF RATE REDUCTION (2020): Reading Notes
We contend that all key features and structures of modern deep (convolution) neural networks can be naturally derived from optimizing a principled objective, namely the rate reduction recently proposed by Yu et al. (2020), Learning Diverse… Posted 2021-05-19 17:00:27 · 154 views · 0 comments
ResNet Initial Weights Method: Study Notes
1 Topic: study and understand the weight-initialization method used in ResNet. The related paper is [13] Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. 1.1 [13]: Glorot and Bengio [7] proposed… Posted 2020-12-24 16:28:44 · 249 views · 1 comment
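The initialization rule from [13] can be sketched in a few lines: for a layer followed by a ReLU, weights are drawn from a zero-mean Gaussian with variance 2/n_in. This is a minimal NumPy illustration; the function name `he_normal` and the fixed seed are my choices, not from the paper:

```python
import numpy as np

def he_normal(fan_in, fan_out, rng=None):
    """He initialization for ReLU layers, per [13]:
    W ~ N(0, 2 / fan_in), so activation variance is preserved
    through the ReLU across layers."""
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = he_normal(512, 256)
print(W.std())  # empirically close to sqrt(2/512) ≈ 0.0625
```

Contrast with Glorot/Xavier initialization (variance 2/(n_in + n_out)), which [13] argues does not account for the ReLU halving the activation variance.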
Learning Deep Architectures for AI
1. Introduction: an image's pixels are lower-level features, while its content is highly abstract; for a machine to recognize images the way humans do, lower-level representations must be abstracted, layer by layer, into higher-level ones. 1.1: Deep learning realizes this idea and improves accuracy. 1.2-1.3: Ability to learn complex, highly-varying functions, i.e., with a number of… Posted 2018-09-12 14:12:56 · 589 views · 0 comments
A Tutorial on Energy-Based Learning (Energy-Based Models): Study Notes
1. Introduction: here Y is the label and X the input. An energy-based model seeks a function that assigns lower energy to configurations closer to the observed data; given an input, the best model is the one for which the true value of Y has the minimum energy. Such a model supports prediction, ranking, detection, and conditional probability density estimation. The figure above shows the Gibbs distribution, which can be used to build a probabilistic model from the energies. 2. Energy-Based Train… Posted 2018-09-18 16:03:00 · 5903 views · 3 comments
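The Gibbs construction mentioned in the tutorial, turning energies E(Y, X) over candidate labels into a probability distribution, can be sketched as follows. This is a minimal NumPy illustration; the function name and the default inverse temperature beta=1 are my choices, not from the tutorial:

```python
import numpy as np

def gibbs(energies, beta=1.0):
    """Gibbs distribution over candidate labels:
    P(y) = exp(-beta * E(y)) / sum_y' exp(-beta * E(y')).
    Lower energy means higher probability."""
    e = np.asarray(energies, dtype=float)
    z = np.exp(-beta * (e - e.min()))  # shift by min for numerical stability
    return z / z.sum()

# Three candidate labels; the lowest-energy one gets the highest probability.
p = gibbs([1.0, 0.2, 3.0])
print(p.argmax())  # index 1, the minimum-energy candidate
```

Note that prediction alone (take the argmin of the energy) never needs the normalizing sum; the Gibbs form is only required when calibrated conditional probabilities are wanted.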
SPECTRAL NORMALIZATION FOR GENERATIVE ADVERSARIAL NETWORKS: Reading Notes
1. A number of works (Uehara et al., 2016; Qi, 2017; Gulrajani et al., 2017) advocate the importance of Lipschitz continuity in assuring the boundedness of statistics. 1.1 Uehara et al., 2016: Genera… Posted 2019-07-05 15:19:49 · 408 views · 0 comments
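The core operation of the paper, dividing each weight matrix by its spectral norm (largest singular value) so the layer is 1-Lipschitz, can be sketched with power iteration. The paper amortizes this to a single iteration per training step; the standalone sketch below runs many iterations for an accurate estimate, and the names are illustrative:

```python
import numpy as np

def spectral_norm(W, n_iter=1000, rng=None):
    """Estimate the largest singular value of W by power iteration
    on W^T W, alternating left/right singular-vector updates."""
    rng = rng or np.random.default_rng(0)
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    return u @ W @ v  # sigma ≈ u^T W v

W = np.random.default_rng(1).normal(size=(4, 3))
sigma = spectral_norm(W)
W_sn = W / sigma  # spectrally normalized: largest singular value ≈ 1
```

Keeping every layer's spectral norm at 1 bounds the Lipschitz constant of the whole discriminator, which is exactly the boundedness property the cited works advocate.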
Deep Layer Aggregation: Implementation Notes
The author doesn't spell this out clearly: he should have written explicitly what T1(x) equals, since listing a sequence without its first term is a problem, and he never states whether the n above must be greater than the m below. Still, I roughly got the idea. Posted 2019-08-27 17:35:40 · 483 views · 0 comments