Meta-Learning Paper Collection

MAML

  • Black-box (memory-model) based

  • Optimization (gradient) based (parametric)
    learn to initialize

    • Finn C, Abbeel P, Levine S. Model-agnostic meta-learning for fast adaptation of deep networks[C]//Proceedings of the 34th International Conference on Machine Learning-Volume 70. JMLR. org, 2017: 1126-1135. (The original MAML paper; applied to classification, regression, and several other problems.)
    • Nichol A, Achiam J, Schulman J. On first-order meta-learning algorithms[J]. arXiv preprint arXiv:1803.02999, 2018. (MAML involves second-order gradient computations; this paper proposes first-order variants, including FO-MAML and Reptile, along with some theoretical analysis.)
    • Rajeswaran A, Finn C, Kakade S M, et al. Meta-learning with implicit gradients[C]//Advances in Neural Information Processing Systems. 2019: 113-124. (Proposes implicit MAML, whose inner-loop meta-gradient depends only on the final iterate, not on the optimization path, reducing memory usage.)
    • Antoniou A, Edwards H, Storkey A. How to train your maml[J], ICLR 2019. (Argues that MAML training is unstable and proposes a strengthened version of MAML, with various tricks targeting the identified problems.)
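To make the "learn to initialize" recipe above concrete, here is a minimal, hypothetical sketch of one-step MAML on a toy family of 1-D regression tasks y = a·x, using a scalar model y = w·x so that both the inner gradient and the second-order outer gradient have closed forms. The task distribution and all hyperparameters are illustrative, not taken from any of the papers above.

```python
import numpy as np

rng = np.random.default_rng(0)

def maml(meta_iters=2000, alpha=0.05, beta=0.02):
    # Meta-initialization for the scalar model y = w * x.
    w = 0.0
    slopes = [1.0, 3.0]  # toy task distribution: each task is y = a * x
    for _ in range(meta_iters):
        a = slopes[rng.integers(len(slopes))]
        x_tr = rng.normal(size=20)  # support (train) inputs
        x_te = rng.normal(size=20)  # query (test) inputs
        m_tr, m_te = np.mean(x_tr ** 2), np.mean(x_te ** 2)
        # Inner loop: one exact gradient step on the support loss
        # L_tr(w) = mean((w*x - a*x)^2), so dL_tr/dw = 2*(w - a)*m_tr.
        w_adapt = w - alpha * 2 * (w - a) * m_tr
        # Outer loop: gradient of the query loss w.r.t. the INITIAL w.
        # The chain-rule factor (1 - 2*alpha*m_tr) is the second-order
        # term that first-order approximations drop.
        meta_grad = 2 * (w_adapt - a) * m_te * (1 - 2 * alpha * m_tr)
        w -= beta * meta_grad
    return w

w_star = maml()
print(w_star)  # settles near the mean task slope
```

Dropping the `(1 - 2 * alpha * m_tr)` factor in `meta_grad` yields exactly the first-order approximation (FO-MAML) discussed by Nichol et al. above.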
  • Non-parametric (metric-based)
    learn to compare
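"Learn to compare" methods classify a query point by its distance to class representatives in an embedding space. A minimal sketch in the spirit of Prototypical Networks, but with raw features standing in for a learned embedding (the function name and data below are illustrative):

```python
import numpy as np

def prototype_classify(support_x, support_y, query_x):
    """Assign each query point to the class whose prototype
    (mean of that class's support examples) is nearest."""
    classes = np.unique(support_y)
    protos = np.stack([support_x[support_y == c].mean(axis=0) for c in classes])
    # Squared Euclidean distance from every query to every prototype.
    dists = ((query_x[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]

# A 2-way, 2-shot episode with two well-separated clusters.
support_x = np.array([[0.0, 0.0], [0.1, -0.1], [5.0, 5.0], [4.9, 5.1]])
support_y = np.array([0, 0, 1, 1])
query_x = np.array([[0.2, 0.1], [4.8, 5.0]])
print(prototype_classify(support_x, support_y, query_x))  # [0 1]
```

In the metric-based papers, meta-training learns the embedding so that this simple nearest-prototype comparison generalizes to novel classes.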

  • Hybrid
    Bayesian
    MAML finds an initialization from which a model can learn quickly from a small amount of data.

    • Yoon J, Kim T, Dia O, et al. Bayesian model-agnostic meta-learning[C]//Advances in Neural Information Processing Systems. 2018: 7332-7342. (Proposes Bayesian MAML, applying the MAML idea to Bayesian posterior estimation.)
    • Finn C, Xu K, Levine S. Probabilistic model-agnostic meta-learning[C]//Advances in Neural Information Processing Systems. 2018: 9516-9527. (Learning across tasks can conflict; this paper proposes a meta-learning method that samples tasks from a probability distribution, generalizing MAML.)
  • Classic papers

    • Fallah A, Mokhtari A, Ozdaglar A. On the Convergence Theory of Gradient-Based Model-Agnostic Meta-Learning Algorithms[J]. arXiv preprint arXiv:1908.10400, 2019. (Analyzes the complexity of MAML using the standard toolbox for convergence analysis of gradient-descent algorithms; quite systematic and worth reading.)
    • Ji K, Yang J, Liang Y. Multi-Step Model-Agnostic Meta-Learning: Convergence and Improved Algorithms[J]. arXiv preprint arXiv:2002.07836, 2020. (In the same vein as the previous paper; analyzes the convergence of multi-step MAML on nonconvex problems.)
    • Raghu A, Raghu M, Bengio S, et al. Rapid learning or feature reuse? towards understanding the effectiveness of maml[J]. arXiv preprint arXiv:1909.09157, 2019. (As the title suggests.)
    • Xu R, Chen L, Karbasi A. Meta Learning in the Continuous Time Limit[J]. arXiv preprint arXiv:2006.10921, 2020. (Analyzes MAML from an ODE perspective and proposes BIMAML, making use of gradient flow.)
  • Recent papers (reverse chronological)

    • To be added...

    • Wang H, Sun R, Li B. Global convergence and induced kernels of gradient-based meta-learning with neural nets[J]. arXiv preprint arXiv:2006.14606, 2020. (Examines whether MAML with DNNs has global convergence guarantees, and how MAML can be applied quickly to DNNs; worth drawing on.)

    • Lekkala K, Itti L. Attentive Feature Reuse for Multi Task Meta learning[J]. arXiv preprint arXiv:2006.07438, 2020. (Applies attention mechanisms to simultaneous learning of multiple tasks.)

    • Frikha A, Krompaß D, Köpken H G, et al. Few-Shot One-Class Classification via Meta-Learning[J]. arXiv preprint arXiv:2007.04146, 2020. (Building on MAML, proposes a pretraining method for one-class classification (OCC) and theoretically analyzes why the proposed algorithm suits this problem better than MAML; the analysis is worth borrowing.)

    • Raj P, Namboodiri V P, Behera L. Learning to Switch CNNs with Model Agnostic Meta Learning for Fine Precision Visual Servoing[J]. arXiv preprint arXiv:2007.04645, 2020. (Application paper; applies CNNs to fine-precision visual servoing?)

    • Zheng Y, Xiang J, Su K, et al. BI-MAML: Balanced Incremental Approach for Meta Learning[J]. arXiv preprint arXiv:2006.07412, 2020. (Proposes an incremental MAML algorithm whose update rule adapts to new tasks without forgetting old ones; strong experimental results.)

    • Wijekoon A, Wiratunga N. Learning-to-Learn Personalised Human Activity Recognition Models[J]. arXiv preprint arXiv:2006.07472, 2020. (Application paper; applies MAML to human activity recognition.)

    • Ma N, Bu J, Yang J, et al. Few-Shot Graph Classification with Model Agnostic Meta-Learning[J]. arXiv preprint arXiv:2003.08246, 2020. (Application paper; applies MAML to GCNs.)

    • Rajasegaran J, Khan S, Hayat M, et al. iTAML: An Incremental Task-Agnostic Meta-learning Approach[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020: 13588-13597. (Very similar to MAML; the algorithmic procedure is nearly identical, yet the paper claims a distinction, which seems odd. To read later...)

    • Kang J, Liu R, Li L, et al. Domain-Invariant Speaker Vector Projection by Model-Agnostic Meta-Learning[J]. arXiv preprint arXiv:2005.11900, 2020. (Applies MAML to speech recognition.)

    • Lingxiao Wang, Qi Cai, Zhuoran Yang, Zhaoran Wang. On the Global Optimality of Model-Agnostic Meta-Learning, ICML, 2020. (Proposes a meta-RL algorithm for RL problems and analyzes the error bound on the distance between ϵ-stationary points and the optimal set.)

    • Igor Molybog, Javad Lavaei. Global Convergence of MAML for LQR, https://arxiv.org/abs/2006.00453, 2020. (Analyzes the global convergence of MAML on single-task problems; only a few simple cases are proved, so further convergence analyses may be worth exploring...)

    • Liu Z, Zhang R, Song Y, et al. When does MAML Work the Best? An Empirical Study on Model-Agnostic Meta-Learning in NLP Applications[J]. arXiv preprint arXiv:2005.11700, 2020. (Surveys the applicability of MAML to NLP problems; since MAML's initialization results are highly random, this line of work is interesting, though the study here is quite rough.)

    • Yap P C, Ritter H, Barber D. Bayesian Online Meta-Learning with Laplace Approximation[J]. arXiv preprint arXiv:2005.00146, 2020. (Uses a Bayesian online meta-learning framework to overcome MAML's catastrophic forgetting problem.)
