Incremental Learning
focus_clam
words and sentences
mingle (mix, combine) the features of the current task with the features of all previous tasks; knowledge recitation; omit (leave out) the encoder and map an arbitrary sample ID to the corresponding feature map directly... A minimal sketch of that last idea follows. Original · 2021-01-04
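One way to picture "omit the encoder and map an arbitrary sample ID to the corresponding feature map directly" is a learnable lookup table. This is a minimal sketch under that assumption; the class name `FeatureRecall` and all sizes are invented for illustration:

```python
import torch
import torch.nn as nn

class FeatureRecall(nn.Module):
    """Maps an arbitrary sample ID directly to a stored feature vector, so the
    encoder can be omitted at replay time. Name and sizes are illustrative."""
    def __init__(self, num_samples, feat_dim):
        super().__init__()
        self.table = nn.Embedding(num_samples, feat_dim)  # one vector per sample

    def forward(self, sample_ids):
        return self.table(sample_ids)

recall = FeatureRecall(num_samples=1000, feat_dim=512)
feats = recall(torch.tensor([3, 7, 42]))   # (3, 512), no encoder forward pass
```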
Continual Learning - Towards reusable network components by learning compatible representations - arXiv 2020
Abstract: this paper proposes a first step towards compatible, and hence reusable, network components. A network is split into two components: a feature extractor and a target task head. Finally validated on three applications: unsupervised domain adaptation, transferring classifiers, ... A sketch of the split follows. Original · 2020-08-21
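A minimal sketch of the two-component split the abstract describes, using a standard torchvision backbone; the backbone choice and head sizes are placeholders, not the paper's setup:

```python
import torch.nn as nn
import torchvision.models as models

# Split a network into the two components named in the abstract:
# a feature extractor and a target task head.
backbone = models.resnet18(weights=None)
feat_dim = backbone.fc.in_features      # 512 for resnet18
backbone.fc = nn.Identity()             # keep only the feature extractor

head_task_a = nn.Linear(feat_dim, 10)   # one head per target task
head_task_b = nn.Linear(feat_dim, 100)

# If two extractors learn *compatible* representations, a head trained on
# one of them can be reused with the other without retraining.
```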
Continual Learning - Continual Unsupervised Representation Learning - NeurIPS 2019
Abstract: unsupervised continual learning (learning representations without any knowledge about task identity). Introduction: the usual gap-pointing setup: "however, most of these techniques have focused on a sequence of tasks in which both the identity of the task (task label) and b..." Original · 2020-07-22
Incremental Learning - Maintaining Discrimination and Fairness in Class Incremental Learning - CVPR 2020
Abstract: knowledge distillation; a major cause of catastrophic forgetting is that the weights in the last fully connected layer are highly biased in class-incremental learning. Introduction / Conclusion: maintain the discrimination via knowledge distillation and maintain the fairness via a... A sketch of the distillation term follows. Original · 2020-07-14
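The preview cuts off before naming the fairness mechanism, but the distillation side is standard. A minimal sketch of a temperature-softened distillation loss over old-class logits, assuming a frozen copy of the previous model supplies `old_logits`:

```python
import torch.nn.functional as F

def distillation_loss(new_logits_old_classes, old_logits, T=2.0):
    """Temperature-softened knowledge distillation: keep the new model's
    predictions on the old classes close to the frozen previous model's.
    A generic KD term; the paper's fairness mechanism is not shown here."""
    old_probs = F.softmax(old_logits / T, dim=1)
    new_log_probs = F.log_softmax(new_logits_old_classes / T, dim=1)
    return -(old_probs * new_log_probs).sum(dim=1).mean() * T * T

# total_loss = cross_entropy_on_new_data + lambda_kd * distillation_loss(...)
```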
Incremental Learning - Incremental Learning in Online Scenario - CVPR 2020
Abstract: two problems: 1) catastrophic forgetting; 2) as new observations of old classes arrive sequentially over time, the distribution may change in unforeseen ways, making performance degrade dramatically on future data, which is referred to as concept drift. A new online learning s... Original · 2020-07-14
Continual Learning - Neural Topic Modeling with Continual Lifelong Learning - ICML 2020
Abstract: continual learning + unsupervised topic modeling. Related: "Lifelong machine learning for natural language processing" (EMNLP 2016); "Topic modeling using topics from many domains, lifelong learning and big data" (ICML 2014). Difficulty: data sparsity (in a small collection ... Original · 2020-07-14
Continual Learning - Optimal Continual Learning has Perfect Memory and is NP-HARD - ICML 2020
Abstract: the main finding is that such optimal continual learning algorithms generally solve an NP-hard problem and will require a perfect memory to do so. Introduction: methods are grouped into three categories: regularization-based, replay-based, and Bayesian / variationally Bayesian; a further option is to learn a separate set of parameters per task; e.g. ... Original · 2020-07-14
Continual Learning - Automatic Recall Machines: Internal Replay, Continual Learning and the Brain - arXiv 2020-06
Abstract: replay-based methods; presents a method where the auxiliary samples are generated on the fly (the starting point being to reduce memory overhead), with inspiration from neuroscience added to strengthen the motivation. Introduction: the ability to learn from sequential or non-stationary data (humans compared with neural networks); discusses the replay family of methods. The goal of this work, Aut... A hedged sketch of on-the-fly sample generation follows. Original · 2020-07-06
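The paper's exact objective is not in the preview; as a stand-in, here is a generic input-optimization sketch that synthesizes replay samples from the frozen network itself, matching the "generated on the fly, no stored buffer" idea:

```python
import torch
import torch.nn.functional as F

def recall_samples(model, target_class, shape=(8, 3, 32, 32), steps=100, lr=0.1):
    """Synthesize replay inputs from the network itself: start from noise and
    optimize the inputs until the frozen model assigns them to an old class.
    Generic input optimization, not the paper's exact objective."""
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)         # only the inputs are optimized
    x = torch.randn(shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    target = torch.full((shape[0],), target_class, dtype=torch.long)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(model(x), target).backward()
        opt.step()
    return x.detach()                   # auxiliary samples, no stored buffer
```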
Continual Learning - "Selfless Sequential Learning" - ICLR 2019
Abstract: sequential learning = lifelong learning = incremental learning = continual learning; looks at the scenario with fixed model capacity, where the learning process should account for future tasks to be added and thus leave enough capacity for them (not selfish... A sketch of a capacity-reserving regularizer follows. Original · 2020-07-01
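The note only says the learner should leave capacity for future tasks; one simple way to realize that is a sparsity penalty on hidden activations. This L1 term is a stand-in for the paper's actual neural-inhibition regularizer:

```python
def capacity_penalty(activations, strength=1e-3):
    """L1 sparsity on hidden activations: push most neurons towards zero on
    current tasks so free capacity remains for future ones. A simple stand-in
    for the paper's local-inhibition regularizer."""
    return strength * activations.abs().mean()

# loss = task_loss + capacity_penalty(hidden_activations)
```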
Incremental Learning - "Insights from the Future for Continual Learning" - arXiv 2020-06
Abstract: proposes a new continual learning scenario, prescient continual learning (the model is tested not only on past and current classes but must also account for future classes). Inspired by zero-shot learning, proposes the Ghost Model: a generative model in representation space plus careful tuning of the loss function. Introduction: the future classes (no training samples); the authors argue this setting requires the model to know the class... A hedged sketch of ghost features follows. Original · 2020-06-29
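"A generative model in representation space" is vague in the preview; purely as an illustration, a per-class Gaussian over features could play that role for the unseen "ghost" classes. The class name and all dimensions below are assumed, not from the paper:

```python
import torch
import torch.nn as nn

class GhostFeatures(nn.Module):
    """One Gaussian per future class in representation space; sampling yields
    'ghost' features for classes with no training samples. Name and sizes
    are invented for this sketch."""
    def __init__(self, num_future_classes, feat_dim):
        super().__init__()
        self.mu = nn.Parameter(torch.randn(num_future_classes, feat_dim))
        self.log_sigma = nn.Parameter(torch.zeros(num_future_classes, feat_dim))

    def sample(self, class_idx, n):
        mu = self.mu[class_idx]
        sigma = self.log_sigma[class_idx].exp()
        return mu + sigma * torch.randn(n, mu.shape[-1])

# Ghost features can be mixed into classifier training so the model reserves
# room in representation space for classes it has not yet seen.
ghosts = GhostFeatures(num_future_classes=5, feat_dim=512)
fake_feats = ghosts.sample(class_idx=0, n=16)   # (16, 512) ghost features
```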