Machine Learning
focus_clam
Incremental Learning——Maintaining Discrimination and Fairness in Class Incremental Learning——CVPR2020
Abstract: knowledge distillation; a major cause of catastrophic forgetting is that the weights in the last fully connected layer are highly biased in class-incremental learning. Introduction / Conclusion: maintain the discrimination via knowledge distillation and maintain the fairness via a… Posted 2020-07-14 16:07:55
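The truncated conclusion points at correcting the bias in the final classifier. A minimal sketch of one such rebalancing idea, rescaling the new-class weight vectors so their average norm matches the old classes' (function and variable names here are illustrative, not the paper's code):

```python
import numpy as np

def align_new_class_weights(w_old, w_new):
    """Rescale new-class weight rows so their mean L2 norm matches
    the old-class rows. Sketch of one bias-correction idea for the
    last fully connected layer; not the paper's exact procedure."""
    mean_old = np.linalg.norm(w_old, axis=1).mean()
    mean_new = np.linalg.norm(w_new, axis=1).mean()
    return w_new * (mean_old / mean_new)
```

After rescaling, logits for new classes are no longer systematically larger than those for old classes, which is the "fairness" side of the title.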
Incremental Learning——Incremental Learning in Online Scenario——CVPR2020
Abstract: two problems: 1) catastrophic forgetting; 2) as new observations of old classes arrive sequentially over time, the distribution may change in unforeseen ways, making performance degrade dramatically on future data, which is referred to as concept drift. A new online learning s… Posted 2020-07-14 15:29:13
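Knowledge distillation from the previous model is the usual tool for the forgetting half of this problem; a generic sketch of the soft-target distillation loss (the temperature `T` and all names follow the standard formulation, not necessarily this paper's exact loss):

```python
import numpy as np

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between temperature-softened teacher outputs
    (old model) and student outputs (current model), averaged over
    the batch. Generic knowledge-distillation sketch."""
    def softmax(z):
        e = np.exp(z / T - (z / T).max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    p_teacher = softmax(teacher_logits)        # soft targets
    log_p_student = np.log(softmax(student_logits))
    return -(p_teacher * log_p_student).sum(axis=1).mean()
```

The loss is minimized when the student reproduces the teacher's softened distribution, anchoring old-class behavior while new data is learned.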
Continual Learning——Neural Topic Modeling with Continual Lifelong Learning——ICML2020
Abstract: continual learning + unsupervised topic modeling. See "Lifelong machine learning for natural language processing," EMNLP 2016, and "Topic modeling using topics from many domains, lifelong learning and big data," ICML 2014. Difficulty: data sparsity (in a small collection … Posted 2020-07-14 15:26:59
Continual Learning——Optimal Continual Learning has Perfect Memory and is NP-HARD——ICML2020
Abstract: the main finding is that such optimal continual learning algorithms generally solve an NP-HARD problem and will require a perfect memory to do so. Introduction: approaches fall into three categories: regularization-based, replay-based, and Bayesian / variationally Bayesian; another line learns a separate set of parameters for each task; e.g. … Posted 2020-07-14 11:20:12
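Of the three families listed, replay-based methods are the most concrete to illustrate; a minimal reservoir-sampling memory buffer (an illustrative sketch of the family, not an algorithm from the paper):

```python
import random

class ReplayBuffer:
    """Fixed-capacity memory filled by reservoir sampling, so every
    example in the stream has an equal chance of being retained."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.n_seen = 0

    def add(self, example):
        self.n_seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            # keep the new example with probability capacity / n_seen
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        """Draw up to k stored examples for rehearsal."""
        return random.sample(self.data, min(k, len(self.data)))
```

Rehearsing sampled examples alongside new-task data is the core of the replay-based family; the paper's result concerns how much such memory an optimal learner needs in general.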
Notable Journals and Conferences in Machine Learning
Drawing on my own experience and reading on Zhihu, here is a rough list of journals and conferences in machine learning; the note after each ":" briefly describes its character. Journals: 1. JMLR 2. MLJ 3. PAMI 4. TNN 5. Neural Computation 6. PR 7. PRL 8. Neurocomputing 9. … Conferences: 1. NIPS: top-tier theory conference 2. ICML: top-tier general conference 3. COLT: top-tier … Posted 2017-03-18 20:13:01