Abstract
The neuroscience theory of complementary learning systems (CLS): a fast learner acquires task-specific information (supervised), while a slow learner acquires general-purpose representations (self-supervised).
Introduction
The main focus of this study is exploring how the CLS theory can motivate a general continual learning framework with a better trade-off between alleviating catastrophic forgetting and facilitating knowledge transfer.
A single backbone models both the hippocampus and the neocortex; in Figure 1, the slow learner of this universal framework is trained with self-supervised learning on samples drawn from the episodic memory.
The focus is on online continual learning, as opposed to traditional batch continual learning.
Method
Online continual learning setting.
Distinguish the task-aware setting, where the task identifier is given and only the corresponding classifier head makes the prediction, from the task-free setting, where no task identifier is provided.
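The difference between the two prediction settings can be sketched as follows. This is an illustrative sketch, not the paper's code: the `predict` helper and the `classes_per_task` layout are assumptions, showing how a task identifier restricts the argmax to one classifier head.

```python
import numpy as np

def predict(logits, task_id=None, classes_per_task=5):
    """Task-aware: restrict the argmax to the classifier head of the
    given task. Task-free: argmax over all classes seen so far."""
    logits = np.asarray(logits, dtype=float)
    if task_id is not None:
        # mask out every class outside this task's head
        masked = np.full_like(logits, -np.inf)
        lo = task_id * classes_per_task
        masked[lo:lo + classes_per_task] = logits[lo:lo + classes_per_task]
        logits = masked
    return int(np.argmax(logits))
```

With the task identifier, a confident wrong-task logit cannot dominate the prediction, which is why task-aware accuracy is usually much higher than task-free accuracy on the same model.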
What is episodic memory?
What is supervised knowledge consolidation?
DualNet uses the same default episodic-memory configuration as prior methods.
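A common default for episodic memory in online continual learning is a fixed-size buffer filled by reservoir sampling, so every streamed example has an equal chance of being retained. A minimal sketch (the class name and interface are my own, not from the paper):

```python
import random

class EpisodicMemory:
    """Fixed-size replay buffer filled by reservoir sampling:
    after n_seen examples, each one is kept with probability
    capacity / n_seen, suitable for a single-pass online stream."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.n_seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # replace a random slot with probability capacity / n_seen
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        k = min(k, len(self.buffer))
        return self.rng.sample(self.buffer, k)
```

The slow learner would then draw its self-supervised mini-batches via `sample`, independently of the incoming labeled stream.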
Slow learner -> the self-supervised configuration; only the slow learner is used here.
How are existing self-supervised methods categorized, and what are their respective pros and cons?
What is the momentum update in self-supervised learning?
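For reference, the momentum update (as popularized by MoCo/BYOL) maintains a "target" encoder as an exponential moving average of the online encoder's parameters, rather than updating it by gradient descent. A minimal sketch over flat parameter lists:

```python
def momentum_update(online_params, target_params, m=0.99):
    """EMA update of the momentum (target) encoder:
    theta_target <- m * theta_target + (1 - m) * theta_online.
    A large m makes the target evolve slowly and smoothly."""
    return [m * t + (1.0 - m) * o
            for o, t in zip(online_params, target_params)]
```

The slowly moving target provides stable representations for the online network to match, which prevents the trivial collapse that plain Siamese training can suffer.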
Meta-learning for the slow learner->better generalization
The slow learner's training procedure integrates existing self-supervised methods rather than proposing a new one.
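As one concrete example of an SSL objective such a slow learner could integrate, here is a NumPy sketch of the Barlow Twins loss (decorrelating two augmented views' embeddings by pushing their cross-correlation matrix toward identity). This is my illustrative implementation, not the paper's code, and the choice of Barlow Twins here is an example rather than a claim about the paper:

```python
import numpy as np

def barlow_twins_loss(z1, z2, lam=5e-3):
    """Barlow Twins-style objective on two views' embeddings.
    z1, z2: (batch, dim) arrays. On-diagonal terms of the
    cross-correlation matrix are pushed to 1 (invariance),
    off-diagonal terms to 0 (redundancy reduction)."""
    # standardize each embedding dimension across the batch
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-8)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-8)
    n = z1.shape[0]
    c = z1.T @ z2 / n                         # cross-correlation matrix
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag
```

Because the loss needs no negative pairs or large batches, it pairs naturally with small replay buffers in online continual learning.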
How does the fast learner exploit information from the slow learner? This part also extends the related methods of [42, 43].
How could the transformation-coefficients module be improved?
During the fast learner's training, how do the fast and slow learners interact?
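One way to picture the interaction, in the spirit of the feature-wise modulation used in [42, 43]: the fast learner emits per-channel transformation coefficients that rescale and shift the slow learner's feature maps. The function below is a hypothetical sketch of that idea (names and shapes are my assumptions, not the paper's API):

```python
import numpy as np

def adapt_features(slow_feats, gamma, beta):
    """Channel-wise modulation of the slow learner's features.
    slow_feats: (C, H, W) feature map from the slow network.
    gamma, beta: (C,) transformation coefficients produced by
    the fast learner for the current supervised task."""
    return gamma[:, None, None] * slow_feats + beta[:, None, None]
```

During supervised training only the coefficients (and the fast learner's own weights) adapt quickly to the task, while the slow representation changes only through its self-supervised objective.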
Experiments
Benchmarks constructed: split mini-ImageNet and CORE50.
How are the three evaluation metrics (ACC, FM, LA) defined?
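The three metrics are commonly computed from an accuracy matrix `A` where `A[i, j]` is the accuracy on task `j` after finishing training on task `i`. The sketch below uses the standard definitions from the continual-learning literature; the paper's exact formulas may differ slightly:

```python
import numpy as np

def cl_metrics(A):
    """ACC: average accuracy over all tasks after the final task.
    FM (forgetting measure): average drop from each earlier task's
        best-ever accuracy to its final accuracy.
    LA (learning accuracy): average accuracy on each task measured
        right after that task is learned.
    A: (T, T) matrix, A[i, j] = accuracy on task j after task i."""
    A = np.asarray(A, dtype=float)
    T = A.shape[0]
    acc = A[-1].mean()
    fm = (np.mean([A[:, j].max() - A[-1, j] for j in range(T - 1)])
          if T > 1 else 0.0)
    la = np.mean([A[j, j] for j in range(T)])
    return acc, fm, la
```

Higher ACC and LA are better; lower FM is better, since FM directly quantifies catastrophic forgetting.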
Experiments are run under both the task-aware and task-free settings.
What do the ablation studies tell us? 4.2 reports state-of-the-art results; 4.3 compares different SSL choices (loss and optimizer) for the slow learner; 4.4 varies the training time given to the slow learner; 4.5 measures the fast learner's contribution to performance.
4.6 evaluates a semi-supervised continual learning setting.