Contents
Are Labels Necessary for Neural Architecture Search
Self-supervised Learning: Generative or Contrastive(2020)
Big Self-Supervised Models are Strong Semi-Supervised Learners (2020)
Region-of-interest guided Supervoxel Inpainting for Self-supervision (MICCAI2020)
Unsupervised Learning of Visual Features by Contrasting Cluster Assignments(2020)
Are Labels Necessary for Neural Architecture Search
Author: Chenxi Liu, ..., Kaiming He (Facebook AI)
Findings:
- The architecture rankings produced with and without labels are highly correlated, i.e. NAS and UnNAS results agree closely.
- Using unlabeled images from a large dataset may be a more promising approach: labels are not necessary, and the search can be run unsupervised.
Reference: https://zhuanlan.zhihu.com/p/118161558
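The first finding ("highly correlated rankings") can be made concrete with a rank-correlation check. A minimal sketch, using made-up accuracy numbers for a handful of hypothetical architectures: rank each architecture under the supervised metric and under a label-free pretext metric, then compute Spearman's rank correlation between the two orderings.

```python
def ranks(xs):
    """Return the rank (0 = smallest) of each element of xs."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the two rank vectors."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

# Hypothetical per-architecture scores (not from the paper):
supervised_acc = [72.1, 74.3, 70.8, 75.0, 73.2]   # labeled search proxy
pretext_acc    = [55.4, 58.1, 54.0, 58.9, 57.0]   # unlabeled (UnNAS-style) proxy
rho = spearman(supervised_acc, pretext_acc)       # close to 1.0 when rankings agree
```

A high rho (near 1.0) is what the paper reports: the label-free proxy orders architectures almost the same way the supervised metric does. With ties in the scores a tie-aware ranking (as in `scipy.stats.spearmanr`) would be needed; the sketch assumes distinct values.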
Self-supervised Learning: Generative or Contrastive(2020)
Author: Xiao Liu (Tsinghua University)
Key: surveys recent progress in self-supervised learning, covering GAN-based generative methods and contrastive methods built mainly on instance discrimination.
Note:
- Mutual information is only loosely related to the success of several MI-based methods; the sampling strategies and architecture design may count more.
- There is an essential gap between pre-training and downstream tasks (an earlier paper also discussed this gap). A possible remedy: automatically design pre-training tasks for a specific downstream task.
- The process of selecting pretext tasks seems too heuristic and tricky, with no patterns to follow.
- For MI-based me
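The contrastive / MI-based methods discussed above typically optimize an InfoNCE-style objective: pull a query embedding toward its positive (an augmented view of the same instance) and push it away from negatives. A minimal sketch (plain Python on small vectors, temperature and similarity choices are illustrative assumptions, not from the survey):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(query, positive, negatives, temperature=0.1):
    """InfoNCE loss: -log softmax of the positive's similarity
    against the positive plus all negatives (numerically stable)."""
    sims = [cosine(query, positive)] + [cosine(query, n) for n in negatives]
    logits = [s / temperature for s in sims]
    m = max(logits)
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_z - logits[0]  # loss is small when the positive dominates

# Loss drops as the positive aligns with the query:
aligned  = info_nce([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0], [-1.0, 0.0]])
opposed  = info_nce([1.0, 0.0], [0.0, 1.0], [[1.0, 0.0], [-1.0, 0.0]])
```

The survey's point that "sampling strategies may count more than the MI bound" shows up directly here: the loss value depends heavily on which negatives are drawn and on the temperature, not just on the MI interpretation of the objective.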