Attention Papers in ICLR 2017-2019

Keywords: attention mechanism
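As context for the list below, here is a minimal pure-Python sketch of the scaled dot-product attention operation that many of these papers build on or modify. This is an illustrative implementation only; the function names (`softmax`, `dot`, `attention`) are mine, not from any listed paper:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    # Inner product of two equal-length vectors.
    return sum(x * y for x, y in zip(a, b))

def attention(queries, keys, values):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V,
    # computed row by row over the list of query vectors.
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[i] for w, v in zip(weights, values))
                    for i in range(len(values[0]))])
    return out
```

For example, with two identical keys the attention weights are uniform, so the output is the average of the two value vectors. The papers below vary this template: replacing the dot product with convolutions, restricting it to graph neighborhoods, making it monotonic, and so on.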



2019


Pay Less Attention with Lightweight and Dynamic Convolutions 

Felix Wu, Angela Fan, Alexei Baevski, Yann Dauphin, Michael Auli

28 Sep 2018 (modified: 21 Feb 2019) ICLR 2019 Conference Blind Submission

 

How Powerful are Graph Neural Networks? 

Keyulu Xu*, Weihua Hu*, Jure Leskovec, Stefanie Jegelka

28 Sep 2018 (modified: 23 Feb 2019) ICLR 2019 Conference Blind Submission

 

Posterior Attention Models for Sequence to Sequence Learning 

Shiv Shankar, Sunita Sarawagi

28 Sep 2018 (modified: 16 Mar 2019) ICLR 2019 Conference Blind Submission

 

Attention, Learn to Solve Routing Problems! 

Wouter Kool, Herke van Hoof, Max Welling

28 Sep 2018 (modified: 07 Feb 2019) ICLR 2019 Conference Blind Submission

 

Residual Non-local Attention Networks for Image Restoration 

Yulun Zhang, Kunpeng Li, Kai Li, Bineng Zhong, Yun Fu

28 Sep 2018 (modified: 21 Feb 2019) ICLR 2019 Conference Blind Submission

 

Marginalized Average Attentional Network For Weakly-Supervised Learning 

Yuan Yuan, Yueming Lyu, Xi Shen, Ivor W. Tsang, Dit-Yan Yeung

28 Sep 2018 (modified: 27 Feb 2019) ICLR 2019 Conference Blind Submission

 

Coarse-grain Fine-grain Coattention Network for Multi-evidence Question Answering 

Victor Zhong, Caiming Xiong, Nitish Shirish Keskar, Richard Socher

28 Sep 2018 (modified: 26 Dec 2018) ICLR 2019 Conference Blind Submission

 

 

Delta: Deep Learning Transfer Using Feature Map With Attention For Convolutional Networks 

Xingjian Li, Haoyi Xiong, Hanchao Wang, Yuxuan Rao, Liping Liu, Jun Huan

28 Sep 2018 (modified: 26 Jan 2019) ICLR 2019 Conference Blind Submission

 

Hyperbolic Attention Networks 

Caglar Gulcehre, Misha Denil, Mateusz Malinowski, Ali Razavi, Razvan Pascanu, Karl Moritz Hermann, Peter Battaglia, Victor Bapst, David Raposo, Adam Santoro, Nando de Freitas

28 Sep 2018 (modified: 04 May 2019) ICLR 2019 Conference Blind Submission

 


2018


 

Graph Attention Networks 

Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, Yoshua Bengio

16 Feb 2018 (modified: 23 Feb 2018) ICLR 2018 Conference Blind Submission

 

QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension 

Adams Wei Yu, David Dohan, Minh-Thang Luong, Rui Zhao, Kai Chen, Mohammad Norouzi, Quoc V. Le

16 Feb 2018 (modified: 24 Apr 2018) ICLR 2018 Conference Blind Submission

 

Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling 

Tao Shen, Tianyi Zhou, Guodong Long, Jing Jiang, Chengqi Zhang

16 Feb 2018 (modified: 23 Feb 2018) ICLR 2018 Conference Blind Submission

 

Learn to Pay Attention 

Saumya Jetley, Nicholas A. Lord, Namhoon Lee, Philip H. S. Torr

16 Feb 2018 (modified: 24 Feb 2018) ICLR 2018 Conference Blind Submission

 

Monotonic Chunkwise Attention 

Chung-Cheng Chiu*, Colin Raffel*

16 Feb 2018 (modified: 23 Feb 2018) ICLR 2018 Conference Blind Submission

 

DCN+: Mixed Objective And Deep Residual Coattention for Question Answering 

Caiming Xiong, Victor Zhong, Richard Socher

16 Feb 2018 (modified: 23 Feb 2018) ICLR 2018 Conference Blind Submission

 

FusionNet: Fusing via Fully-aware Attention with Application to Machine Comprehension 

Hsin-Yuan Huang, Chenguang Zhu, Yelong Shen, Weizhu Chen

16 Feb 2018 (modified: 16 Feb 2018) ICLR 2018 Conference Blind Submission

 

Compositional Attention Networks for Machine Reasoning 

Drew A. Hudson, Christopher D. Manning

16 Feb 2018 (modified: 23 Feb 2018) ICLR 2018 Conference Blind Submission

 


2017


Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer 

Sergey Zagoruyko, Nikos Komodakis

06 Nov 2016 (modified: 13 Feb 2017) ICLR 2017 conference submission

 

Deep Biaffine Attention for Neural Dependency Parsing 

Timothy Dozat, Christopher D. Manning

05 Nov 2016 (modified: 04 Mar 2017) ICLR 2017 conference submission

 

Bidirectional Attention Flow for Machine Comprehension 

Minjoon Seo, Aniruddha Kembhavi, Ali Farhadi, Hannaneh Hajishirzi

05 Nov 2016 (modified: 04 Mar 2017) ICLR 2017 conference submission

 

Dynamic Coattention Networks For Question Answering 

Caiming Xiong, Victor Zhong, Richard Socher

05 Nov 2016 (modified: 15 Feb 2017) ICLR 2017 conference submission

 

Structured Attention Networks 

Yoon Kim, Carl Denton, Luong Hoang, Alexander M. Rush

05 Nov 2016 (modified: 17 Feb 2017) ICLR 2017 conference submission

 

Frustratingly Short Attention Spans in Neural Language Modeling 

Michał Daniluk, Tim Rocktäschel, Johannes Welbl, Sebastian Riedel

05 Nov 2016 (modified: 19 Feb 2017) ICLR 2017 conference submission

 

Recurrent Mixture Density Network for Spatiotemporal Visual Attention 

Loris Bazzani, Hugo Larochelle, Lorenzo Torresani

04 Nov 2016 (modified: 11 Feb 2017) ICLR 2017 conference submission
