- Quick notes; for the source video, see Chelsea Finn's talk at Stanford.
- Download the detailed [pdf].
The long tail of data
Models learn well in data-rich regions, but how do we learn in small-sample regions (e.g., extreme driving conditions)?
Basic idea: explicitly learn priors from previous experience that lead to efficient downstream learning; in other words, learn to learn to solve tasks.
Basic taxonomy
- black-box adaptation
- optimization-based inference
- non-parametric methods
- Bayesian meta-learning
Problem Formulation
Supervised learning
a. Model performance depends heavily on large amounts of labeled data.
b. For some tasks, labeled data is limited.
Meta-learning problem
Can we exploit additional data $D_{meta\text{-}train}$?
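As a minimal sketch of this setup (the names `Task`, `support`, and `query` are illustrative, not from the talk), the meta-train data $D_{meta\text{-}train}$ can be organized as a collection of tasks, each carrying its own small support set and query set:

```python
# Sketch of the meta-train dataset structure: a list of tasks, each with
# its own few-shot support (train) set and query (inner test) set.
from dataclasses import dataclass, field
from typing import List, Tuple

Example = Tuple[list, int]  # (features, label)

@dataclass
class Task:
    support: List[Example] = field(default_factory=list)  # few-shot training set
    query: List[Example] = field(default_factory=list)    # inner test set

# D_{meta-train}: a collection of tasks drawn from some task distribution
meta_train: List[Task] = [
    Task(support=[([0.1], 0), ([0.9], 1)], query=[([0.2], 0)]),
    Task(support=[([0.3], 0), ([0.7], 1)], query=[([0.8], 1)]),
]
```

A meta-learner trains across all tasks in `meta_train`, then adapts to a new task's support set at meta-test time.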
Differences between meta-learning and transfer learning?
Meta-learning, transfer learning, multi-task learning, and semi-supervised learning have similar problem settings.
Common ground: they all incorporate additional data from outside the task that nevertheless helps learn the task effectively.
Difference: meta-learning deals with a setting where you still have to do some amount of adaptation on your new task.
Transfer learning can, to some extent, be viewed as a kind of meta-learning.
Multi-task learning: learn model parameters $\theta^*$ such that the model can immediately solve multiple kinds of tasks; this can be viewed as zero-shot meta-learning.
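The multi-task objective can be written as a single optimization over all tasks (a standard formulation, not verbatim from the slides):

$$
\theta^* = \arg\min_{\theta} \sum_{i=1}^{T} \mathcal{L}_i\left(\theta, \mathcal{D}_i\right)
$$

where $\mathcal{L}_i$ is the loss of task $i$ on its data $\mathcal{D}_i$. A single $\theta^*$ must serve all $T$ tasks at once with no per-task adaptation, which is why this reads as zero-shot meta-learning.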
How to train?
Some terminology
- training set: the few-shot set for a particular task; also called the support set
- test set: the data used for evaluation (the inner test, not the meta-test); also called the query set
- meta-training: the whole set of tasks, composed of multiple tasks, each with its own training and test sets
- meta-testing: adapting to the training set of a new task
- e.g., 5-shot 5-way means 25 data points: 5 classes, each with 5 example data points
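The N-way K-shot sampling above can be sketched as follows (a toy example; `sample_episode` and the dataset layout are assumptions, not from the talk):

```python
import random

def sample_episode(dataset, n_way=5, k_shot=5):
    """Sample an N-way K-shot support set: N classes, K examples per class.

    `dataset` is assumed to map class label -> list of examples.
    """
    classes = random.sample(sorted(dataset), n_way)
    return [(x, c) for c in classes for x in random.sample(dataset[c], k_shot)]

# Toy dataset: 10 classes with 20 dummy examples each
toy = {c: [f"img_{c}_{i}" for i in range(20)] for c in range(10)}
episode = sample_episode(toy, n_way=5, k_shot=5)
print(len(episode))  # 25 data points: 5 classes x 5 shots each
```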