Paper Reading
呆博士实验室
Hi everyone, I'm 呆博士, a grad student happily immersed in research on interpretable machine learning.
Contrastive Learning Study Notes
My own notes from studying contrastive learning; have a look if you're interested. Original · 2021-04-08 15:46:43 · 595 views · 0 comments
【Paper Reading Notes】GAN Memory with No Forgetting
A close reading of a NeurIPS 2020 paper from the ECE department at Duke University on lifelong learning (also called continual learning). The paper proposes a GAN that learns without forgetting and then compresses it; its thorough analysis of each model parameter makes it worth reading. The extra compression of the convolutional (Conv) layers is a highlight of the paper; the original network has 52.2M parameters. ... Original · 2020-12-10 17:32:24 · 889 views · 1 comment
【Paper Reading Notes】NeurIPS 2020 Paper List, Part 2
Online Multitask Learning with Long-Term Memory; Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies; Adaptive Graph Convolutional Recurrent Network for Traffic Forecasting; On Reward-Free Reinforcement Learning with Linear Function A... Original · 2020-12-09 10:12:13 · 5906 views · 0 comments
【Paper Reading Notes】NeurIPS 2020 Paper List, Part 1
A graph similarity for deep learning; An Unsupervised Information-Theoretic Perceptual Quality Metric; Self-Supervised MultiModal Versatile Networks; Benchmarking Deep Inverse Models over time, and the Neural-Adjoint method; Off-Policy Evaluation and Learning... Original · 2020-12-09 10:09:04 · 27426 views · 4 comments
【Paper Reading Notes】NeurIPS 2020 Lifelong/Continual Learning Paper List
GAN Memory with No Forgetting (read; write-up pending); Calibrating CNNs for Lifelong Learning (read; write-up pending); Improved Schemes for Episodic Memory-based Lifelong Learning (read; write-up pending); Lifelong Policy Gradient Learning of Factored Policies for Faster Training Without Forgetting (reinforcement-learning related; not yet fully understood); Cont... Original · 2020-12-08 15:03:02 · 1223 views · 0 comments