A fix for the error "Microsoft Visual C++ 14.0 or greater is required. Get it with 'Microsoft C++ Build Tools'" — how to resolve this Windows build error.
Towards Open Vocabulary Object Detection without Human-provided Bounding Boxes (2021 CVPR) ---- paper reading notes. Covers the abstract; the introduction (how is detection without human-provided boxes achieved, and how are the pseudo bounding-box labels generated?); and related work on generating pseudo box labels and on open-vocabulary object detection with pseudo labels.
What are RPN and ROIAlign? The RPN (Region Proposal Network) is used to generate region proposals: foreground boxes / candidate regions (Faster R-CNN uses them directly as detection boxes).
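As a concrete companion to that definition, here is a minimal sketch of the second half of the pairing: pooling RPN proposals into fixed-size features with torchvision's `roi_align`. The feature-map shape, the boxes, and the backbone stride of 8 are illustrative assumptions, not values from the post.

```python
import torch
from torchvision.ops import roi_align

# Feature map from a backbone: (batch, channels, H, W) -- shapes are illustrative.
features = torch.randn(1, 256, 50, 50)

# Two RPN proposals in (x1, y1, x2, y2) image coordinates (hypothetical boxes).
proposals = [torch.tensor([[ 40.,  40., 200., 160.],
                           [120.,  80., 360., 300.]])]

# ROIAlign pools each arbitrarily sized proposal into a fixed 7x7 grid.
# spatial_scale maps image coordinates onto the downsampled feature map
# (a backbone stride of 8 is assumed here, so scale = 1/8).
pooled = roi_align(features, proposals, output_size=(7, 7),
                   spatial_scale=1.0 / 8, sampling_ratio=2, aligned=True)

print(pooled.shape)  # torch.Size([2, 256, 7, 7])
```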
Open-Vocabulary Object Detection Using Captions (2021 CVPR) ---- paper walkthrough. Links to the paper and code. Covers the abstract; the introduction (the idea and design, the line of thought and approach, and the differences between OVD, ZSD, and WSD); and related work on ZSD, WSD, object detection using mixed supervision, and visual grounding.
IoU-aware Single-stage Object Detector for Accurate Localization ---- paper reading notes. Links to the paper and code. Covers the abstract (what problem exists, and how it is solved); a summary of the introduction (the problem and the authors' solution); related work; and the method: the IoU-aware single-stage object detector, its training, and its inference. The experiments are not written up; see the original paper.
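The inference idea is compact enough to sketch: alongside each box's classification score the detector predicts the IoU between that box and the ground truth, and fuses the two into the ranking score used for NMS, S = p^alpha * IoU^(1-alpha). A minimal sketch; the value of alpha and the example numbers are illustrative.

```python
import torch

def iou_aware_score(cls_score: torch.Tensor, pred_iou: torch.Tensor,
                    alpha: float = 0.5) -> torch.Tensor:
    """Fuse classification confidence with the predicted localization IoU.

    Final detection confidence S = cls_score**alpha * pred_iou**(1 - alpha),
    so well-classified but poorly localized boxes are down-ranked before NMS.
    """
    return cls_score.pow(alpha) * pred_iou.pow(1.0 - alpha)

# Hypothetical per-anchor outputs: sigmoid classification scores and IoU predictions.
cls_score = torch.tensor([0.95, 0.90])
pred_iou  = torch.tensor([0.40, 0.85])  # the second box is better localized
print(iou_aware_score(cls_score, pred_iou))  # the better-localized box now ranks first
```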
Variational Information Distillation ---- paper reading notes. Sections: main contributions; VID; algorithm formulation; code (the key part, for ease of understanding). The paper proposes an information-theoretic framework that casts knowledge transfer as maximizing the mutual information between the teacher and student networks. Main contributions: variational information distillation (VID), a principled knowledge-transfer framework that maximizes the mutual information between the two networks via variational information maximization; a demonstration that VID generalizes several existing knowledge-transfer methods; and experiments in which the framework empirically outperforms state-of-the-art knowledge-transfer methods, including transfer between (heterogeneous) DNNs on the same or different datasets.
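To make the variational bound concrete, here is a minimal PyTorch sketch of a VID-style loss, assuming the common instantiation: a Gaussian over the teacher's features whose mean is regressed from the student's features and whose per-channel variance is learned. All layer sizes are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIDLoss(nn.Module):
    """Variational lower bound on I(teacher; student), sketched.

    Minimizing the Gaussian negative log-likelihood of the teacher feature t
    under q(t | s) maximizes the variational bound on the mutual information.
    """
    def __init__(self, s_dim: int, t_dim: int):
        super().__init__()
        self.mean = nn.Linear(s_dim, t_dim)              # mu(s)
        self.log_var = nn.Parameter(torch.zeros(t_dim))  # learned per-channel variance

    def forward(self, s: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        mu = self.mean(s)
        var = F.softplus(self.log_var) + 1e-6            # keep the variance positive
        nll = 0.5 * torch.log(var) + (t - mu) ** 2 / (2 * var)
        return nll.sum(dim=1).mean()                     # -E[log q(t|s)] up to a constant

# Hypothetical pooled features: student is 64-d, teacher is 128-d.
loss = VIDLoss(64, 128)(torch.randn(8, 64), torch.randn(8, 128))
```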
Similarity-Preserving Knowledge Distillation (2019 ICCV) ---- paper reading notes. Abstract: during training, semantically similar inputs tend to elicit similar activation patterns in a network. Similarity-preserving knowledge distillation guides the training of the student network so that input pairs which produce similar (dissimilar) activations in the teacher also produce similar (dissimilar) activations in the student. Unlike previous distillation methods, the student is not required to mimic the teacher's representation space, only to preserve the pairwise similarities within its own representation space.
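The pairwise-similarity idea translates almost line for line into code: build a batch-wise similarity matrix for teacher and student and penalize their difference. A minimal sketch with illustrative tensor shapes (the teacher may be wider than the student, since only the b x b similarity matrices are compared).

```python
import torch
import torch.nn.functional as F

def sp_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """Similarity-preserving loss: match batch-wise pairwise-similarity matrices.

    f_s, f_t: activation maps of shape (b, c, h, w); channel counts may differ,
    because only the b x b similarity structure is compared.
    """
    b = f_s.size(0)
    G_s = F.normalize(f_s.view(b, -1) @ f_s.view(b, -1).t())  # row-normalized b x b
    G_t = F.normalize(f_t.view(b, -1) @ f_t.view(b, -1).t())
    return (G_t - G_s).pow(2).sum() / (b * b)

# Illustrative shapes: the teacher (64 channels) is wider than the student (32).
print(sp_loss(torch.randn(8, 32, 4, 4), torch.randn(8, 64, 4, 4)))
```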
Probabilistic Knowledge Transfer for Deep Representation Learning (2018) ---- paper notes. Covers the abstract; the introduction (problems remaining in follow-up work, the proposed method, its advantages, and the contributions); related work; probabilistic knowledge transfer; and the experimental evaluation. Abstract: 1. the distilled knowledge is the mutual information between the learned representation and the labels.
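The estimator PKT is known for models each sample's similarities to the rest of the batch as a conditional probability distribution and matches student to teacher with a KL divergence. A minimal sketch under that reading, using cosine similarities as the kernel; the feature dimensions are illustrative.

```python
import torch
import torch.nn.functional as F

def pkt_loss(f_s: torch.Tensor, f_t: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """Probabilistic knowledge transfer, sketched: turn each row of pairwise
    cosine similarities into a probability distribution, then take the KL
    divergence from the student's distributions to the teacher's."""
    s = F.normalize(f_s, dim=1)
    t = F.normalize(f_t, dim=1)
    # Cosine similarities mapped to [0, 1], then row-normalized into distributions.
    P_s = ((s @ s.t()) + 1) / 2
    P_t = ((t @ t.t()) + 1) / 2
    P_s = P_s / P_s.sum(dim=1, keepdim=True)
    P_t = P_t / P_t.sum(dim=1, keepdim=True)
    return (P_t * torch.log((P_t + eps) / (P_s + eps))).sum(dim=1).mean()

# Hypothetical embeddings: student 64-d, teacher 128-d (dimensions may differ).
print(pkt_loss(torch.randn(8, 64), torch.randn(8, 128)))
```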
Self-supervised Knowledge Distillation using Singular Value Decomposition (2018 ECCV) ---- reading notes. Covers the abstract; the introduction; related works (knowledge distillation; SVD and RBF; training mechanism); and the method: the proposed distillation module and truncated SVD. Abstract: proposes a new knowledge-distillation method based on singular value decomposition.
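As background for the "Truncated SVD" section, here is a minimal sketch of compressing a feature map to its top-k singular vectors, the low-rank building block distilled in place of raw features. The shapes and rank are illustrative assumptions, and this is only the SVD step, not the paper's full distillation module.

```python
import torch

def truncated_svd_features(fmap: torch.Tensor, k: int = 4):
    """Keep the top-k singular triplets of a (channels, h*w) feature matrix.

    Shapes are illustrative; the rank-k factors are a far smaller carrier of
    the feature map's dominant structure than the raw activations."""
    U, S, Vh = torch.linalg.svd(fmap, full_matrices=False)
    return U[:, :k], S[:k], Vh[:k]

fmap = torch.randn(64, 16 * 16)              # channels x spatial positions
U, S, Vh = truncated_svd_features(fmap, k=4)
approx = U @ torch.diag(S) @ Vh              # rank-4 reconstruction
print((fmap - approx).norm() / fmap.norm())  # relative approximation error
```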
Like What You Like: Knowledge Distill via Neuron Selectivity Transfer (2017) ------ paper reading notes. Covers a preface; the abstract; the introduction; related works; background (notations; Maximum Mean Discrepancy (MMD); visualization results); and Neuron Selectivity Transfer itself (motivation: "What is wrong …").
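A minimal sketch of the NST objective under the second-order polynomial kernel (one of the kernels discussed in the paper), where each channel's L2-normalized spatial activation map is one "neuron selectivity" sample; the tensor shapes are illustrative.

```python
import torch
import torch.nn.functional as F

def nst_poly_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """NST with the polynomial kernel k(x, y) = (x^T y)^2.

    MMD^2 between the teacher's and student's sets of channel activation maps
    reduces, under this kernel, to squared Gram-matrix terms.
    f_s: (b, c_s, h, w), f_t: (b, c_t, h, w); spatial sizes must match,
    channel counts may differ (MMD compares sample sets, not dimensions)."""
    s = F.normalize(f_s.flatten(2), dim=2)   # (b, c_s, h*w), one sample per channel
    t = F.normalize(f_t.flatten(2), dim=2)   # (b, c_t, h*w)
    G_ss = torch.bmm(s, s.transpose(1, 2))   # pairwise dot products, student-student
    G_tt = torch.bmm(t, t.transpose(1, 2))   # teacher-teacher
    G_st = torch.bmm(s, t.transpose(1, 2))   # student-teacher
    return G_ss.pow(2).mean() + G_tt.pow(2).mean() - 2 * G_st.pow(2).mean()

print(nst_poly_loss(torch.randn(4, 32, 8, 8), torch.randn(4, 64, 8, 8)))
```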
A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning (2017 CVPR) ---- paper notes. Covers the abstract; the introduction and contributions; related work; and the method: the proposed distilled knowledge, its mathematical expression, and the loss function.
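The distilled knowledge here is the FSP ("flow of solution procedure") matrix: the spatially averaged inner products between the channels of two layers' feature maps, matched between teacher and student with an L2 loss. A minimal sketch, with illustrative shapes and random tensors standing in for real features.

```python
import torch

def fsp_matrix(f1: torch.Tensor, f2: torch.Tensor) -> torch.Tensor:
    """FSP matrix between two layers' features.

    f1: (b, c1, h, w) and f2: (b, c2, h, w) at the same spatial resolution.
    Returns (b, c1, c2): spatially averaged channel-to-channel inner products."""
    b, c1, h, w = f1.shape
    c2 = f2.size(1)
    return torch.bmm(f1.view(b, c1, h * w),
                     f2.view(b, c2, h * w).transpose(1, 2)) / (h * w)

# Random stand-ins for teacher and student features at two layers.
G_t = fsp_matrix(torch.randn(4, 32, 8, 8), torch.randn(4, 64, 8, 8))
G_s = fsp_matrix(torch.randn(4, 32, 8, 8), torch.randn(4, 64, 8, 8))
loss = (G_t - G_s).pow(2).mean()  # match the student's flow to the teacher's
```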
Correlation Congruence for Knowledge Distillation (2019 ICCV) ---- reading notes. Covers the abstract; the introduction; related work; and CCKD: background and notations; knowledge distillation; correlation congruence; generalized kernel-based correlation; and the strategy for the mini-batch sampler.
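A minimal sketch of the correlation-congruence idea, using a plain Gaussian RBF kernel over normalized embeddings to build the instance-correlation matrices (the paper approximates such a kernel with a Taylor series; the gamma value and shapes here are illustrative).

```python
import torch
import torch.nn.functional as F

def cc_loss(f_s: torch.Tensor, f_t: torch.Tensor, gamma: float = 0.4) -> torch.Tensor:
    """Correlation congruence, sketched: build a b x b instance-correlation
    matrix per network with an RBF kernel, then match student to teacher."""
    def corr(f: torch.Tensor) -> torch.Tensor:
        f = F.normalize(f, dim=1)
        d2 = torch.cdist(f, f).pow(2)   # pairwise squared distances
        return torch.exp(-gamma * d2)   # b x b correlation matrix
    b = f_s.size(0)
    return (corr(f_t) - corr(f_s)).pow(2).sum() / (b * b)

# Hypothetical embeddings; only the batch-wise correlation structure is compared.
print(cc_loss(torch.randn(8, 64), torch.randn(8, 128)))
```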
Positive/negative sample imbalance ----> Focal Loss (notes). Simply put, a positive sample is, among the anchors generated by the grid cell in which a ground-truth box's center point falls, the anchor with the highest IoU against that ground truth. A negative sample is a predicted box whose maximum IoU over all ground-truth boxes is below ignore_thres. (The bracketed note in the figure in the original post means that an anchor can be both a positive and a negative sample.) Focal Loss, simply put, balances two imbalances: positive vs. negative samples, and hard vs. easy examples.
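A minimal sketch of the binary focal loss itself, FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t), with the alpha = 0.25, gamma = 2 defaults from the RetinaNet paper; the toy anchor counts are illustrative.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits: torch.Tensor, targets: torch.Tensor,
               alpha: float = 0.25, gamma: float = 2.0) -> torch.Tensor:
    """Binary focal loss: gamma down-weights easy examples, alpha re-balances
    positives against negatives. 0.25 / 2.0 are the RetinaNet defaults."""
    p = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = p * targets + (1 - p) * (1 - targets)          # prob of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t).pow(gamma) * ce).mean()

# Mostly-negative anchors, as in dense detection: focal loss keeps the few
# positives from being drowned out by the flood of easy negatives.
logits  = torch.randn(1000)
targets = torch.zeros(1000)
targets[:5] = 1.0
print(focal_loss(logits, targets))
```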