Meta-Learning
Average article quality score: 93
pinn山里娃
For discussion, add WeChat: worker_harder
Meta-Learning Paper Collection
MAML; black-box (memory-model) based; optimization (gradient) based (parametric); learn-to-initialize. Finn C, Abbeel P, Levine S. Model-agnostic meta-learning for fast adaptation of deep networks[C]//Proceedings of the 34th International Conference on Machine Learning… (Original · 2022-01-13 09:24:36 · 710 views · 0 comments)
An Introduction to Meta-Learning (Reproducing the MAML and Reptile Algorithms)
An introduction to meta-learning and to commonly used meta-learning algorithms. (Original · 2020-08-26 14:25:40 · 4222 views · 3 comments)
A Detailed Explanation of the MAML Algorithm
MAML and model pre-training: to further distinguish the two, MAML cares about how well the adapted parameters $\hat{\theta}^{n}$ perform after training; as the figure shows, the loss drops quickly on both task 1 and task 2. Model pre-training, by contrast, seeks the parameters that perform best across all tasks directly. The two are like fresh graduates: model pre-training goes straight to work and earns a salary, which is best for the present, while MAML… (Original · 2020-12-18 10:49:09 · 8247 views · 0 comments)
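The distinction above, that MAML optimizes the loss of the parameters *after* one adaptation step rather than the loss of the shared parameters themselves, can be sketched on a toy scalar model. The linear model y = w·x, the task family, and all learning rates below are illustrative assumptions, not the post's actual setup; for this model the second-order term can be written analytically:

```python
import numpy as np

def maml_step(w, tasks, lr_inner=0.01, lr_outer=0.001):
    """One MAML meta-update for a scalar linear model y = w * x (MSE loss).

    Unlike plain pre-training, the meta-gradient is taken through the
    inner adaptation step, so it optimizes post-adaptation performance.
    """
    meta_grad = 0.0
    for x, y in tasks:
        g = np.mean(2 * (w * x - y) * x)             # task gradient at w
        w_adapted = w - lr_inner * g                 # inner (adaptation) step
        g_adapted = np.mean(2 * (w_adapted * x - y) * x)
        dw_adapted_dw = 1.0 - lr_inner * np.mean(2 * x * x)  # chain-rule term
        meta_grad += g_adapted * dw_adapted_dw       # second-order MAML grad
    return w - lr_outer * meta_grad / len(tasks)
```

Iterating `maml_step` over a set of tasks drives down the average loss measured after each task's own adaptation step, which is exactly the quantity MAML cares about.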
A Detailed Explanation of the MAML Paper
Proposes MAML, a model-agnostic meta-learning algorithm that applies to any model trained by gradient descent and to any task. (Original · 2020-08-27 11:18:57 · 964 views · 0 comments)
A Detailed Explanation of the Reptile Algorithm
A detailed explanation of the Reptile meta-learning algorithm. (Original · 2020-06-01 22:56:29 · 1087 views · 0 comments)
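Reptile's update is simpler than MAML's: it needs no second-order terms at all. A minimal sketch on a toy scalar linear model follows; the model, the task family, and all step sizes are illustrative assumptions, not the article's setup:

```python
import numpy as np

def reptile_step(w, task, inner_steps=5, lr_inner=0.02, lr_meta=0.1):
    """One Reptile meta-update for a scalar linear model y = w * x.

    Run plain SGD on one sampled task, then move the meta-parameters
    a fraction of the way toward the task-adapted weights.
    """
    x, y = task
    w_task = w
    for _ in range(inner_steps):
        g = np.mean(2 * (w_task * x - y) * x)   # MSE gradient on this task
        w_task -= lr_inner * g
    return w + lr_meta * (w_task - w)           # interpolate toward w_task
```

Cycling this step over a family of tasks pulls the initialization toward a point from which every task in the family is reachable in a few inner steps.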
MAML Meta-Learning Paper Collection
Contents: PINN acceleration; defining the problem and building the engineering framework; network-architecture choices; uncertainty; miscellaneous; hyperparameters and meta-learning; losses and optimization algorithms; transfer learning; MAML; black-box (memory-model) based; optimization (gradient) based (parametric); non-parametric (metric based); hybrid; frequency analysis. PINN acceleration: Physics-informed neural networks: A deep learning… (Original · 2020-10-20 09:43:03 · 394 views · 0 comments)
A novel meta-learning initialization method for physics-informed neural networks
Physics-informed neural networks (PINN) can be viewed as general-purpose PDE solvers, but they have two drawbacks: 1. training on a specific problem is slow; 2. when real samples are scarce, good results cannot be obtained even after many iterations. To overcome these two drawbacks, this report extends PINN by applying the Reptile meta-learning algorithm to it, proposing two algorithms for two kinds of prior information. In our method, by exploiting prior information, pre-training enables faster convergence on new problems with few or even no real samples. Finally, the method is demonstrated on 1-D and 2-D Poisson-equation examples… (Original · 2022-01-13 09:22:44 · 797 views · 3 comments)
A novel meta-learning initialization method for physics-informed neural networks
Abstract: Physics-informed neural networks (PINN) can be viewed as general-purpose PDE solvers, but they have two drawbacks: 1. training on a specific problem is slow; 2. when real samples are scarce, good results cannot be obtained even after many iterations. To overcome these two drawbacks, this report extends PINN by applying the Reptile meta-learning algorithm to it, proposing two algorithms for two kinds of prior information. In our method, by exploiting prior information, pre-training enables faster convergence on new problems with few or even no real samples. Finally, the method is demonstrated on 1-D and 2-D Poisson-equation examples… (Original · 2022-01-13 09:19:38 · 522 views · 0 comments)
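The core idea of the report, Reptile pre-training of an initialization over a family of related PDE problems, can be sketched on a deliberately tiny example. Here each "task" is a 1-D Poisson problem −u″ = a·sin(x) with a one-parameter ansatz u(x) = c·sin(x), so the physics (collocation) loss has an analytic gradient; the ansatz, the task family, and all step sizes are illustrative assumptions, not the report's actual networks or algorithms:

```python
import numpy as np

def residual_grad(c, a, xs):
    """Gradient in c of the collocation loss for -u'' = a*sin(x)
    under the ansatz u(x) = c*sin(x): loss = mean(((c - a)*sin(x))^2)."""
    return np.mean(2.0 * (c - a) * np.sin(xs) ** 2)

def reptile_pretrain(a_tasks, xs, meta_iters=300, inner_steps=10,
                     lr_inner=0.1, lr_meta=0.5):
    """Reptile pre-training over a family of Poisson problems indexed by a."""
    c = 0.0
    for i in range(meta_iters):
        a = a_tasks[i % len(a_tasks)]        # pick a PDE instance (one task)
        c_task = c
        for _ in range(inner_steps):         # inner physics-loss descent
            c_task -= lr_inner * residual_grad(c_task, a, xs)
        c += lr_meta * (c_task - c)          # move the init toward c_task
    return c
```

Starting adaptation on a new value of `a` from the pre-trained `c`, rather than from scratch, then needs far fewer physics-loss steps, which is the effect the report demonstrates on its Poisson examples.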
Meta-Learning with Implicit Gradients
Paper info. Title: Meta-Learning with Implicit Gradients; year: 2019. Sections: paper link, code, background, supplementary content, motivation. Motivation: Recent work has studied how meta-learning algorithms can acquire such a capability by learning to efficiently learn a range of tasks, thereby enabling learning of a new… (Original · 2022-01-13 09:11:24 · 421 views · 0 comments)
Online Meta-Learning Explained
Paper info. Title: Online Meta-Learning; authors: Chelsea Finn*¹, Aravind Rajeswaran*², Sham Kakade², Sergey Levine¹ (¹University of California, Berkeley; ²University of Washington). Correspondence: Chelsea Finn cbfinn@cs.stanford.edu, Aravind Rajeswaran aravraj@cs… (Original · 2020-10-28 16:43:18 · 1344 views · 1 comment)
Rapid learning or feature reuse? Toward understanding the effectiveness of MAML
Paper info. Title: Rapid learning or feature reuse? Toward understanding the effectiveness of MAML; authors and affiliations: Aniruddh Raghu (MIT), Maithra Raghu (Cornell University & Google Brain), Samy Bengio (Google Brain), Oriol Vinyals (DeepMind); venue: ICLR; year: 2020. Paper link. (Original · 2020-12-18 10:45:36 · 1497 views · 5 comments)
Learning and Meta-Learning of Stochastic Advection-Diffusion-Reaction Systems from Sparse Measurements
Paper info. Title: Learning and Meta-Learning of Stochastic Advection-Diffusion-Reaction Systems from Sparse Measurements; authors: Xiaoli Chen (a,b), Jinqiao Duan (c), George Em Karniadakis (b,d); venue: computational physics; affiliations: a) Center for Mathematical Sciences & S… (Original · 2020-12-18 10:46:44 · 580 views · 0 comments)