Abstract
The main finding is that optimal continual learning algorithms must in general solve an NP-hard problem and require perfect memory of the past to do so.
Introduction
Methods are grouped into three families: regularization-based, replay-based, and Bayesian/variational-Bayesian; a further option is to learn a separate set of parameters per task. For how to evaluate the performance of a continual learning algorithm, see "Three scenarios for continual learning", arXiv 2019.04.
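A minimal sketch of the regularization-based family mentioned above: train on the new task while penalizing drift of parameters that mattered for earlier tasks. The quadratic, importance-weighted penalty here is an EWC-style assumption for illustration, not a method taken from this paper; all function and variable names are hypothetical.

```python
import numpy as np

def regularized_cl_loss(task_loss, params, old_params, importance, lam=1.0):
    """Regularization-based CL objective: loss on the current task plus
    a quadratic penalty anchoring parameters important to past tasks."""
    penalty = np.sum(importance * (params - old_params) ** 2)
    return task_loss(params) + lam * penalty

# Toy usage: current task prefers params near 1.0, old task's optimum was 0.0.
task_loss = lambda p: np.sum((p - 1.0) ** 2)
params = np.array([0.5])
old_params = np.array([0.0])
importance = np.array([2.0])
print(regularized_cl_loss(task_loss, params, old_params, importance))  # 0.75
```

The importance weights (a diagonal Fisher approximation in EWC) decide how strongly each parameter is pinned to its old value, which is exactly the kind of imperfect memory the paper argues is insufficient for optimality.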
Optimal CL algorithms would have to solve an NP-hard problem and perfectly memorize the past.
Conclusion
This is the first generic theoretical study of the CL problem.
Key points:
No open-source code; well-written and worth a careful read, and the theoretical part is instructive; a summary-of-findings style paper with some theoretical derivation.
The paper mainly discusses regularization-based methods.
P: problems solvable in polynomial time.
NP: problems whose proposed solutions can be verified in polynomial time.
NP-hard: a problem S such that every NP problem can be reduced to S in polynomial time (a reduction transforms an NP problem into S so that solving S also answers the original problem; hence solving S would solve every NP problem). Intuitively, S is at least as hard as every problem in NP.
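The NP definition above hinges on verification rather than solving: finding a satisfying assignment for SAT is believed to be hard, but checking a proposed assignment (a certificate) takes polynomial time. A minimal illustration using SAT, the canonical NP-complete problem (an illustrative example, not code from the paper):

```python
def verify_sat(clauses, assignment):
    """Polynomial-time certificate check for SAT, the defining property of NP.
    Each clause is a list of signed literals: literal k means variable |k|
    must be True if k > 0 and False if k < 0."""
    def lit_true(lit):
        return assignment[abs(lit)] == (lit > 0)
    # The formula is satisfied iff every clause has at least one true literal.
    return all(any(lit_true(l) for l in clause) for clause in clauses)

# Formula: (x1 or not x2) and (x2 or x3)
clauses = [[1, -2], [2, 3]]
print(verify_sat(clauses, {1: True, 2: False, 3: True}))    # True
print(verify_sat(clauses, {1: False, 2: False, 3: False}))  # False
```

The check runs in time linear in the formula size; what no known algorithm does in polynomial time is produce such an assignment for arbitrary formulas, which is what makes SAT (and, per this paper, optimal CL) hard.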