Deep Learning
Average article quality score: 91
Dreamcreationman
"Science's humble self-examination and self-negation have not diminished its glory; rather, they have given it everlasting strength."
Few-shot Object Detection via Feature Reweighting (Reading Notes)
Zeng H, Song X, Chen G, et al. Learning Scene Attribute for Scene Recognition[J]. IEEE Transactions on Multimedia, 2019, 22(6): 1519-1530. Contents: What Is Scene Recognition · Motivation · Methodology · Scene Attribute Modeling · Multi-feature Context Relation Modeling · Experimental Results · My Comment · Reference. What is scene recognition: scene recognition (scene classifi… Original · 2021-08-11 12:54:40 · 405 views · 0 comments
Mixer (Reading Notes)
Tolstikhin I, Houlsby N, Kolesnikov A, et al. MLP-Mixer: An all-MLP Architecture for Vision[J]. arXiv preprint arXiv:2105.01601, 2021. It has been a while since I last read papers; I spent some time preparing for exams… honestly, I was just a bit lazy, haha. Back from the May Day holiday I saw this paper on arXiv; the abstract looked interesting, and opening the PDF I was struck by its style, which brought to mind Vision Transforme… Original · 2021-05-21 22:37:31 · 300 views · 2 comments
TIM (Reading Notes)
Boudiaf M, Masud Z I, Rony J, et al. Transductive information maximization for few-shot learning[J]. arXiv preprint arXiv:2008.11297, 2020. I went out yesterday afternoon, so today I am catching up. This paper is a NeurIPS 2020 work. At first glance, another TIM? Tencent's? No, haha: Transductive Information Maximization F… Original · 2021-01-15 14:00:52 · 532 views · 0 comments
DIM (Reading Notes)
Hjelm R D, Fedorov A, Lavoie-Marchildon S, et al. Learning deep representations by mutual information estimation and maximization[J]. arXiv preprint arXiv:1808.06670, 2018. I was fairly busy for a while (busy grinding Honor of Kings to King rank). I was also running experiments, but the idea failed validation; no notes means I had not been reading papers closely, so there were no updates. Then I thought I could not just… Original · 2021-01-12 17:27:04 · 632 views · 2 comments
TAML (Reading Notes)
Jamal M A, Qi G J. Task agnostic meta-learning for few-shot learning[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2019: 11719-11727. From the name alone, TAML is clearly meta-learning… the name looks a lot like MAML, haha, except MAML is Model Agnostic while this TAM… Original · 2020-12-23 00:32:46 · 954 views · 0 comments
Prototypical Net (Reading Notes)
Snell J, Swersky K, Zemel R. Prototypical networks for few-shot learning[C]//Advances in neural information processing systems. 2017: 4077-4087. Main idea: the previous post on Relation Network noted that, in few-shot learning under the N-way K-shot setting, when K is greater than 1… Original · 2020-12-18 20:58:57 · 624 views · 2 comments
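The prototypical-network idea this note covers (Snell et al., 2017) can be sketched in a few lines: each class prototype is the mean of that class's support embeddings, and a query is assigned to the nearest prototype. The function names below are illustrative, not from the post:

```python
import numpy as np

def prototypes(support_embeddings, support_labels, n_way):
    # Class prototype = mean embedding of that class's K support examples.
    return np.stack([support_embeddings[support_labels == c].mean(axis=0)
                     for c in range(n_way)])

def classify(query_embedding, protos):
    # Assign the query to the nearest prototype (squared Euclidean distance).
    d = ((protos - query_embedding) ** 2).sum(axis=1)
    return int(np.argmin(d))

# Toy 2-way 2-shot episode with 2-D "embeddings"
emb = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
labels = np.array([0, 0, 1, 1])
protos = prototypes(emb, labels, n_way=2)
pred = classify(np.array([0.2, 0.3]), protos)
```

Averaging over the K shots is exactly why K greater than 1 is handled naturally here, which is the contrast with Relation Network the note alludes to.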
Relation Net (Reading Notes)
Sung F, Yang Y, Zhang L, et al. Learning to compare: Relation network for few-shot learning[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018: 1199-1208. Rambling: this paper is called Relation Network. Waking up this morning and checking the group chat, the classmate presenting at today's group meeting… Original · 2020-12-18 18:28:17 · 533 views · 1 comment
Matching Net (Reading Notes)
Vinyals O, Blundell C, Lillicrap T, et al. Matching networks for one shot learning[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. 2016: 3637-3645. Main idea: this paper tackles the FSL problem, but it predates MAML and comes from Goog… Original · 2020-12-17 20:40:25 · 757 views · 0 comments
Reviews On Few Shot Learning
Contents: Introduction · Background · Definition · Goal · Application in CV · Methodology · Data · Model · Multi-Task Learning · Metrics Learning · NTM Based Learning · Algorithm · Future Work. Introduction, Background: after AlexNet's outstanding results on ImageNet in 2012, data-driven deep learning gradually grew into the largest and most prosperous branch of modern machine learning; in fact, looking closely… Original · 2020-12-13 23:02:52 · 207 views · 0 comments
Siamese Neural Networks (Reading Notes)
Title: Siamese Neural Networks for One-shot Image Recognition. Citation: Koch G, Zemel R, Salakhutdinov R. Siamese neural networks for one-shot image recognition[C]//ICML deep learning workshop. 2015, 2. Background: lately I have been doing some Few-Shot Learning… Original · 2020-12-09 10:08:21 · 415 views · 0 comments
Vision Transformer (Reading Notes)
Title: An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. Status: submitted to the International Conference on Learning Representations, under review. GitHub: https://github.com/emla2805/vision-transformer. Let me tell this paper's fortune… Original · 2020-12-09 10:05:30 · 989 views · 2 comments
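The paper's title, "An Image is Worth 16x16 Words", refers to splitting an image into non-overlapping 16x16 patches and treating each flattened patch as a token for a standard Transformer. A minimal sketch of that patching step, assuming a NumPy array image (the function name is illustrative):

```python
import numpy as np

def image_to_patches(img, patch=16):
    # Split an (H, W, C) image into non-overlapping patch x patch blocks,
    # flattening each block into one token vector.
    h, w, c = img.shape
    rows, cols = h // patch, w // patch
    tokens = [img[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch].reshape(-1)
              for i in range(rows) for j in range(cols)]
    return np.stack(tokens)  # shape: (num_patches, patch * patch * C)

# A 224x224 RGB image yields 14 x 14 = 196 tokens of dimension 16*16*3 = 768.
tokens = image_to_patches(np.zeros((224, 224, 3)))
```

In the full model these token vectors are then linearly projected and fed, with position embeddings, into a Transformer encoder.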
Mutual Guidance (Reading Notes)
Zhang H, Fromont E, Lefevre S, et al. Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection[J]. arXiv preprint arXiv:2009.14085, 2020. Background: current object detection is dominated by two-stage detectors in the R-CNN family and one-stage detectors represented by YOLO; most of these detectors rely on sliding windows or selective… Original · 2020-12-09 10:03:43 · 385 views · 1 comment
Common Loss Functions
@(cs231n) Contents: Loss Function Definition · 0-1 Loss (zero-one loss) · Absolute Loss · Log Loss · Square Loss · Exponential Loss · SVM Loss (Hinge Loss) · Perceptron Loss · Cross-entropy Loss · Modified Huber Loss · References. Loss function definition: 0-1 loss function (… Original · 2020-12-09 09:56:34 · 2421 views · 0 comments
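A few of the losses listed in that post's table of contents can be sketched directly; this is a minimal illustration of the standard formulas (hinge, cross-entropy, square loss), not code from the post:

```python
import numpy as np

def hinge_loss(y, score):
    # SVM hinge loss for labels y in {-1, +1}: max(0, 1 - y * f(x)).
    return np.maximum(0.0, 1.0 - y * score)

def cross_entropy(p_true, p_pred, eps=1e-12):
    # Cross-entropy between a target distribution (e.g. one-hot) and
    # predicted class probabilities; eps guards against log(0).
    return -np.sum(p_true * np.log(p_pred + eps))

def square_loss(y, pred):
    # Squared error loss: (y - f(x))^2.
    return (y - pred) ** 2

# A confidently correct margin incurs zero hinge loss; a uniform
# prediction over two classes costs log(2) in cross-entropy.
h = hinge_loss(1, 2.0)
ce = cross_entropy(np.array([0.0, 1.0]), np.array([0.5, 0.5]))
```

Each takes a true label/target and a model output and returns a non-negative penalty, which is the common shape shared by all the losses the post enumerates.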