A Reading List of the Most Important Papers and Publications in Machine Translation (Tsinghua University NLP Group)

Reposted from: http://blog.sina.com.cn/s/blog_56eb62d30102y694.html
References: http://ju.outofmemory.cn/entry/326011 (a roundup of accepted ACL 2017 papers)
https://www.leiphone.com/news/201708/3bt3QcwNF3o1o3aA.html (an in-depth, long-form review of 11 accepted EMNLP 2017 papers)

Prior Knowledge Integration: Word/Phrase Constraints

  • Philip Arthur, Graham Neubig, and Satoshi Nakamura. 2016. Incorporating Discrete Translation Lexicons into Neural Machine Translation. In Proceedings of EMNLP 2016. (Discrete translation lexicons.)
  • Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu and Maosong Sun. 2017. Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization. In Proceedings of ACL 2017. (Proposes a general framework for integrating prior knowledge into NMT via posterior regularization: prior knowledge is represented as features of a log-linear model and used to guide the training of the NMT model; see the schematic objective after this list.)
  • Chris Hokamp and Qun Liu. 2017. Lexically Constrained Decoding for Sequence Generation Using Grid Beam Search. In Proceedings of ACL 2017. (Proposes Grid Beam Search (GBS), an extension of conventional beam search that can enforce pre-specified lexical constraints during decoding; experiments show that GBS brings large improvements in translation quality. See the decoding sketch after this list.)
  • Zichao Yang, Zhiting Hu, Yuntian Deng, Chris Dyer, and Alex Smola. 2017. Neural Machine Translation with Recurrent Attention Modeling. In Proceedings of EACL 2017.
  • Rongxiang Weng, Shujian Huang, Zaixiang Zheng, Xinyu Dai, and Jiajun Chen. 2017. Neural Machine Translation with Word Predictions. In Proceedings of EMNLP 2017. (Introduces a word-prediction mechanism that also helps reduce the target vocabulary: whereas the translation objective generates an ordered word sequence, the word predictor is trained to predict the target words y1…yn without regard to word order. See the loss sketch after this list.)
  • Yang Feng, Shiyue Zhang, Andi Zhang, Dong Wang, and Andrew Abel. 2017. Memory-augmented Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Leonard Dahlmann, Evgeny Matusov, Pavel Petrushkov, and Shahram Khadivi. 2017. Neural Machine Translation Leveraging Phrase-based Models in a Hybrid Search. In Proceedings of EMNLP 2017.
  • Xing Wang, Zhaopeng Tu, Deyi Xiong, and Min Zhang. 2017. Translating Phrases in Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Baosong Yang, Derek F. Wong, Tong Xiao, Lidia S. Chao, and Jingbo Zhu. 2017. Towards Bidirectional Hierarchical Representations for Attention-based Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Po-Sen Huang, Chong Wang, Sitao Huang, Dengyong Zhou, and Li Deng. 2018. Towards Neural Phrase-based Machine Translation. In Proceedings of ICLR 2018.
  • Toan Nguyen and David Chiang. 2018. Improving Lexical Choice in Neural Machine Translation. In Proceedings of NAACL 2018.
  • Huadong Chen, Shujian Huang, David Chiang, Xinyu Dai, and Jiajun Chen. 2018. Combining Character and Word Information in Neural Machine Translation Using a Multi-Level Attention. In Proceedings of NAACL 2018.
  • Matt Post and David Vilar. 2018. Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation. In Proceedings of NAACL 2018.
  • Jingyi Zhang, Masao Utiyama, Eiichro Sumita, Graham Neubig, and Satoshi Nakamura. 2018. Guiding Neural Machine Translation with Retrieved Translation Pieces. In Proceedings of NAACL 2018.
  • Eva Hasler, Adrià de Gispert, Gonzalo Iglesias, and Bill Byrne. 2018. Neural Machine Translation Decoding with Terminology Constraints. In Proceedings of NAACL 2018.
  • Nima Pourdamghani, Marjan Ghazvininejad, and Kevin Knight. 2018. Using Word Vectors to Improve Word Alignments for Low Resource Machine Translation. In Proceedings of NAACL 2018.
  • Shuming Ma, Xu SUN, Yizhong Wang, and Junyang Lin. 2018. Bag-of-Words as Target for Neural Machine Translation. In Proceedings of ACL 2018.
  • Mingxuan Wang, Jun Xie, Zhixing Tan, Jinsong Su, Deyi Xiong, and Chao Bian. 2018. Neural Machine Translation with Decoding-History Enhanced Attention. In Proceedings of COLING 2018.
  • Arata Ugawa, Akihiro Tamura, Takashi Ninomiya, Hiroya Takamura, and Manabu Okumura. 2018. Neural Machine Translation Incorporating Named Entity. In Proceedings of COLING 2018.
  • Longyue Wang, Zhaopeng Tu, Andy Way, and Qun Liu. 2018. Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism. In Proceedings of EMNLP 2018.
  • Qian Cao and Deyi Xiong. 2018. Encoding Gated Translation Memory into Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Chengyue Gong, Di He, Xu Tan, Tao Qin, Liwei Wang, and Tie-Yan Liu. 2018. FRAGE: Frequency-Agnostic Word Representation. In Proceedings of NeurIPS 2018.
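
For the posterior-regularization entry (Zhang et al. 2017) above, the general shape of the framework can be written schematically as below. This is only a sketch of the standard posterior-regularization form; the exact feature functions, weights, and training procedure are those defined in the paper.

```latex
% Schematic posterior-regularization objective (general form, not the paper's
% exact loss): the "desired" distribution Q encodes prior knowledge as
% log-linear features \phi, and a KL term pulls the NMT model P towards Q
% during training; \lambda_1, \lambda_2 are interpolation weights.
J(\theta, \gamma) = \sum_{\langle x, y \rangle}
  \Big[ \lambda_1 \log P(y \mid x; \theta)
        - \lambda_2 \, \mathrm{KL}\big( Q(y \mid x; \gamma) \,\|\, P(y \mid x; \theta) \big) \Big],
\qquad Q(y \mid x; \gamma) \propto \exp\big( \gamma \cdot \phi(x, y) \big)
```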
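
To make the Grid Beam Search entry (Hokamp and Liu 2017) above concrete, here is a deliberately simplified, self-contained sketch of lexically constrained beam search. It is not the authors' implementation: the model interface `next_token_logprobs` is a hypothetical stand-in for an NMT decoder step, and constraints are assumed to be single tokens.

```python
# Minimal sketch of lexically constrained decoding in the spirit of
# Grid Beam Search (Hokamp & Liu, 2017). NOT the authors' code: the model
# interface and single-token constraints are simplifying assumptions.
from dataclasses import dataclass


@dataclass
class Hyp:
    tokens: tuple                  # generated token ids so far
    score: float = 0.0             # cumulative log-probability
    met: frozenset = frozenset()   # indices of constraints already produced


def grid_beam_search(next_token_logprobs, constraints, eos, beam_size=4, max_len=20):
    """next_token_logprobs(tokens) -> {token_id: logprob}  (assumed model API).
    constraints: list of single token ids that must all appear in the output."""
    C = len(constraints)
    # grid[c] is the beam of hypotheses that have already produced c constraints
    grid = {c: [] for c in range(C + 1)}
    grid[0] = [Hyp(tokens=())]
    finished = []

    for _ in range(max_len):
        new_grid = {c: [] for c in range(C + 1)}
        for c, beam in grid.items():
            for hyp in beam:
                if hyp.tokens and hyp.tokens[-1] == eos:
                    continue  # closed hypothesis; only kept if all constraints are met
                logprobs = next_token_logprobs(list(hyp.tokens))
                # (a) free generation: stay in the same grid row
                best = sorted(logprobs.items(), key=lambda kv: -kv[1])[:beam_size]
                for tok, lp in best:
                    new_grid[c].append(Hyp(hyp.tokens + (tok,), hyp.score + lp, hyp.met))
                # (b) force an unmet constraint token: move one row up the grid
                for i, tok in enumerate(constraints):
                    if i not in hyp.met:
                        lp = logprobs.get(tok, -1e9)
                        new_grid[c + 1].append(
                            Hyp(hyp.tokens + (tok,), hyp.score + lp, hyp.met | {i}))
        # prune every grid row back to the beam size
        grid = {c: sorted(b, key=lambda h: -h.score)[:beam_size]
                for c, b in new_grid.items()}
        # only hypotheses in the top row (all constraints met) may finish
        finished.extend(h for h in grid[C] if h.tokens and h.tokens[-1] == eos)

    candidates = finished or grid[C]
    return max(candidates, key=lambda h: h.score) if candidates else None
```

Because hypotheses only compete inside the grid row with the same number of satisfied constraints, a partially constrained hypothesis is never pruned away by unconstrained ones, which is the core idea of GBS.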
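
For the word-prediction entry (Weng et al. 2017) above, and in the same spirit as "Bag-of-Words as Target" (Ma et al. 2018) later in this list, below is a minimal sketch of an order-free word-prediction loss. The sentence representation, projection layer, and shapes are illustrative assumptions rather than the papers' exact architectures; in training it would be added to the usual per-position cross-entropy.

```python
# Minimal sketch of an order-free (bag-of-words) word-prediction objective:
# besides the usual per-position cross-entropy, the model also predicts the
# *set* of target words, ignoring their order. Shapes and the summary vector
# are illustrative assumptions, not the papers' architectures.
import torch
import torch.nn as nn
import torch.nn.functional as F


def bag_of_words_loss(sent_repr, target_ids, proj, pad_id=0):
    """sent_repr:  [batch, hidden]  sentence summary vector (assumed given)
    target_ids: [batch, tgt_len] gold target token ids
    proj:       nn.Linear(hidden, vocab) scoring word presence"""
    logits = proj(sent_repr)               # [batch, vocab]
    bow = torch.zeros_like(logits)
    bow.scatter_(1, target_ids, 1.0)       # multi-hot bag-of-words target
    bow[:, pad_id] = 0.0                   # do not ask the model to predict padding
    return F.binary_cross_entropy_with_logits(logits, bow)


if __name__ == "__main__":
    proj = nn.Linear(8, 20)                            # hypothetical hidden=8, vocab=20
    summary = torch.randn(2, 8)                        # [batch, hidden]
    tgt = torch.tensor([[3, 5, 5, 0], [7, 2, 9, 1]])   # toy references
    print(bag_of_words_loss(summary, tgt, proj).item())
```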

Syntactic/Semantic Constraints

  • Trevor Cohn, Cong Duy Vu Hoang, Ekaterina Vymolova, Kaisheng Yao, Chris Dyer, and Gholamreza Haffari. 2016. Incorporating Structural Alignment Biases into an Attentional Neural Translation Model. In Proceedings of NAACL 2016.
  • Yong Cheng, Shiqi Shen, Zhongjun He, Wei He, Hua Wu, Maosong Sun, and Yang Liu. 2016. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation. In Proceedings of IJCAI 2016.
  • Akiko Eriguchi, Kazuma Hashimoto, and Yoshimasa Tsuruoka. 2016. Tree-to-Sequence Attentional Neural Machine Translation. In Proceedings of ACL 2016.
  • Junhui Li, Deyi Xiong, Zhaopeng Tu, Muhua Zhu, Min Zhang, and Guodong Zhou. 2017. Modeling Source Syntax for Neural Machine Translation. In Proceedings of ACL 2017.
  • Shuangzhi Wu, Dongdong Zhang, Nan Yang, Mu Li, and Ming Zhou. 2017. Sequence-to-Dependency Neural Machine Translation. In Proceedings of ACL 2017.
  • Jinchao Zhang, Mingxuan Wang, Qun Liu, and Jie Zhou. 2017. Incorporating Word Reordering Knowledge into Attention-based Neural Machine Translation. In Proceedings of ACL 2017.
  • Huadong Chen, Shujian Huang, David Chiang, and Jiajun Chen. 2017. Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder. In Proceedings of ACL 2017.
  • Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho. 2017. Learning to Parse and Translate Improves Neural Machine Translation. In Proceedings of ACL 2017.
  • Roee Aharoni and Yoav Goldberg. 2017. Towards String-To-Tree Neural Machine Translation. In Proceedings of ACL 2017.
  • Kazuma Hashimoto and Yoshimasa Tsuruoka. 2017. Neural Machine Translation with Source-Side Latent Graph Parsing. In Proceedings of EMNLP 2017.
  • Joost Bastings, Ivan Titov, Wilker Aziz, Diego Marcheggiani, and Khalil Simaan. 2017. Graph Convolutional Encoders for Syntax-aware Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Kehai Chen, Rui Wang, Masao Utiyama, Lemao Liu, Akihiro Tamura, Eiichiro Sumita, and Tiejun Zhao. 2017. Neural Machine Translation with Source Dependency Representation. In Proceedings of EMNLP 2017.
  • Peyman Passban, Qun Liu, and Andy Way. 2018. Improving Character-Based Decoding Using Target-Side Morphological Information for Neural Machine Translation. In Proceedings of NAACL 2018.
  • Diego Marcheggiani, Joost Bastings, and Ivan Titov. 2018. Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks. In Proceedings of NAACL 2018.
  • Chunpeng Ma, Akihiro Tamura, Masao Utiyama, Tiejun Zhao, and Eiichiro Sumita. 2018. Forest-Based Neural Machine Translation. In Proceedings of ACL 2018.
  • Shaohui Kuang, Junhui Li, António Branco, Weihua Luo, and Deyi Xiong. 2018. Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings. In Proceedings of ACL 2018.
  • Duygu Ataman and Marcello Federico. 2018. Compositional Representation of Morphologically-Rich Input for Neural Machine Translation. In Proceedings of ACL 2018.
  • Danielle Saunders, Felix Stahlberg, Adrià de Gispert, and Bill Byrne. 2018. Multi-representation ensembles and delayed SGD updates improve syntax-based NMT. In Proceedings of ACL 2018.
  • Wen Zhang, Jiawei Hu, Yang Feng, and Qun Liu. 2018. Refining Source Representations with Relation Networks for Neural Machine Translation. In Proceedings of COLING 2018.
  • Poorya Zaremoodi and Gholamreza Haffari. 2018. Incorporating Syntactic Uncertainty in Neural Machine Translation with a Forest-to-Sequence Model. In Proceedings of COLING 2018.
  • Hao Zhang, Axel Ng, and Richard Sproat. 2018. Fast and Accurate Reordering with ITG Transition RNN. In Proceedings of COLING 2018.
  • Jetic Gū, Hassan S. Shavarani, and Anoop Sarkar. 2018. Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing. In Proceedings of EMNLP 2018.
  • Anna Currey and Kenneth Heafield. 2018. Multi-Source Syntactic Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Xinyi Wang, Hieu Pham, Pengcheng Yin, and Graham Neubig. 2018. A Tree-based Decoder for Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Eliyahu Kiperwasser and Miguel Ballesteros. 2018. Scheduled Multi-Task Learning: From Syntax to Translation. Transactions of the Association for Computational Linguistics.

Coverage Constraints

  • Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu, and Hang Li. 2016. Modeling Coverage for Neural Machine Translation. In Proceedings of ACL 2016.
  • Yonghui Wu, Mike Schuster, Zhifeng Chen, Quoc V. Le, Mohammad Norouzi, Wolfgang Macherey, Maxim Krikun, Yuan Cao, Qin Gao, Klaus Macherey, Jeff Klingner, Apurva Shah, Melvin Johnson, Xiaobing Liu, Łukasz Kaiser, Stephan Gouws, Yoshikiyo Kato, Taku Kudo, Hideto Kazawa, Keith Stevens, George Kurian, Nishant Patil, Wei Wang, Cliff Young, Jason Smith, Jason Riesa, Alex Rudnick, Oriol Vinyals, Greg Corrado, Macduff Hughes, and Jeffrey Dean. 2016. Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine Translation. arXiv:1609.08144.
  • Haitao Mi, Baskaran Sankaran, Zhiguo Wang, and Abe Ittycheriah. 2016. Coverage Embedding Models for Neural Machine Translation. In Proceedings of EMNLP 2016.
  • Yanyang Li, Tong Xiao, Yinqiao Li, Qiang Wang, Changming Xu, and Jingbo Zhu. 2018. A Simple and Effective Approach to Coverage-Aware Neural Machine Translation. In Proceedings of ACL 2018.
  • Zaixiang Zheng, Hao Zhou, Shujian Huang, Lili Mou, Xinyu Dai, Jiajun Chen, and Zhaopeng Tu. 2018. Modeling Past and Future for Neural Machine Translation. Transactions of the Association for Computational Linguistics.

Document-Level Translation

  • Longyue Wang, Zhaopeng Tu, Andy Way, and Qun Liu. 2017. Exploiting Cross-Sentence Context for Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, and Hang Li. 2017. Context Gates for Neural Machine Translation. Transactions of the Association for Computational Linguistics.
  • Rachel Bawden, Rico Sennrich, Alexandra Birch, and Barry Haddow. 2018. Evaluating Discourse Phenomena in Neural Machine Translation. In Proceedings of NAACL 2018.
  • Elena Voita, Pavel Serdyukov, Rico Sennrich, and Ivan Titov. 2018. Context-Aware Neural Machine Translation Learns Anaphora Resolution. In Proceedings of ACL 2018.
  • Sameen Maruf and Gholamreza Haffari. 2018. Document Context Neural Machine Translation with Memory Networks. In Proceedings of ACL 2018.
  • Modeling Coherence for Neural Machine Translation with Dynamic and Topic Caches. In Proceedings of COLING 2018.
  • Jiacheng Zhang, Huanbo Luan, Maosong Sun, Feifei Zhai, Jingfang Xu, Min Zhang and Yang Liu. 2018. Improving the Transformer Translation Model with Document-Level Context. In Proceedings of EMNLP 2018.
  • Samuel Läubli, Rico Sennrich, and Martin Volk. 2018. Has Machine Translation Achieved Human Parity? A Case for Document-level Evaluation. In Proceedings of EMNLP 2018.
  • Lesly Miculicich, Dhananjay Ram, Nikolaos Pappas, and James Henderson. 2018. Document-Level Neural Machine Translation with Hierarchical Attention Networks. In Proceedings of EMNLP 2018.
  • Zhaopeng Tu, Yang Liu, Shumin Shi, and Tong Zhang. 2018. Learning to Remember Translation History with a Continuous Cache. Transactions of the Association for Computational Linguistics.

Translation Robustness

  • Yonatan Belinkov and Yonatan Bisk. 2018. Synthetic and Natural Noise Both Break Neural Machine Translation. In Proceedings of ICLR 2018.
  • Zhengli Zhao, Dheeru Dua, and Sameer Singh. 2018. Generating Natural Adversarial Examples. In Proceedings of ICLR 2018.
  • Yong Cheng, Zhaopeng Tu, Fandong Meng, Junjie Zhai, and Yang Liu. 2018. Towards Robust Neural Machine Translation. In Proceedings of ACL 2018.
  • Marco Tulio Ribeiro, Sameer Singh, and Carlos Guestrin. 2018. Semantically Equivalent Adversarial Rules for Debugging NLP models. In Proceedings of ACL 2018.
  • Javid Ebrahimi, Daniel Lowd, and Dejing Dou. 2018. On Adversarial Examples for Character-Level Neural Machine Translation. In Proceedings of COLING 2018.
  • Shaohui Kuang and Deyi Xiong. 2018. Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model. In Proceedings of COLING 2018.
  • Paul Michel and Graham Neubig. 2018. MTNT: A Testbed for Machine Translation of Noisy Text. In Proceedings of EMNLP 2018.

Translation Visualization and Interpretability

  • Felix Hill, Kyunghyun Cho, Sebastien Jean, Coline Devin, and Yoshua Bengio. 2015. Embedding Word Similarity with Neural Machine Translation. In Proceedings of ICLR 2015.
  • Yanzhuo Ding, Yang Liu, Huanbo Luan and Maosong Sun. 2017. Visualizing and Understanding Neural Machine Translation. In Proceedings of ACL 2017.
  • Yonatan Belinkov, Nadir Durrani, Fahim Dalvi, Hassan Sajjad, and James Glass. 2017. What do Neural Machine Translation Models Learn about Morphology?. In Proceedings of ACL 2017.
  • Ella Rabinovich, Noam Ordan, and Shuly Wintner. 2017. Found in Translation: Reconstructing Phylogenetic Language Trees from Translations. In Proceedings of ACL 2017.
  • Rico Sennrich. 2017. How Grammatical is Character-level Neural Machine Translation? Assessing MT Quality with Contrastive Translation Pairs. In Proceedings of EACL 2017.
  • Adam Poliak, Yonatan Belinkov, James Glass, and Benjamin Van Durme. 2018. On the Evaluation of Semantic Phenomena in Neural Machine Translation Using Natural Language Inference. In Proceedings of NAACL 2018.
  • Arianna Bisazza and Clara Tump. 2018. The Lazy Encoder: A Fine-Grained Analysis of the Role of Morphology in Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Lijun Wu, Xu Tan, Di He, Fei Tian, Tao Qin, Jianhuang Lai, and Tie-Yan Liu. 2018. Beyond Error Propagation in Neural Machine Translation: Characteristics of Language Also Matter. In Proceedings of EMNLP 2018.
  • Hendrik Strobelt, Sebastian Gehrmann, Michael Behrisch, Adam Perer, Hanspeter Pfister, and Alexander M. Rush. 2018. Seq2Seq-Vis: A Visual Debugging Tool for Sequence-to-Sequence Models. In Proceedings of VAST 2018 and Proceedings of EMNLP-BlackBox 2018.
  • Alessandro Raganato and Jorg Tiedemann. 2018. An Analysis of Encoder Representations in Transformer-Based Machine Translation. In Proceedings of EMNLP-BlackBox 2018.
  • Felix Stahlberg, Danielle Saunders, and Bill Byrne. 2018. An Operation Sequence Model for Explainable Neural Machine Translation. In Proceedings of EMNLP-BlackBox 2018.
  • Anthony Bau, Yonatan Belinkov, Hassan Sajjad, Nadir Durrani, Fahim Dalvi, and James Glass. 2019. Identifying and Controlling Important Neurons in Neural Machine Translation. In Proceedings of ICLR 2019.

Translation Fairness and Diversity

  • Hayahide Yamagishi, Shin Kanouchi, Takayuki Sato, and Mamoru Komachi. 2016. Controlling the Voice of a Sentence in Japanese-to-English Neural Machine Translation. In Proceedings of the 3rd Workshop on Asian Translation.
  • Rico Sennrich, Barry Haddow and Alexandra Birch. 2016. Controlling Politeness in Neural Machine Translation via Side Constraints. In Proceedings of NAACL 2016.
  • Xing Niu, Marianna Martindale, and Marine Carpuat. 2017. A Study of Style in Machine Translation: Controlling the Formality of Machine Translation Output. In Proceedings of EMNLP 2017.
  • Ella Rabinovich, Raj Nath Patel, Shachar Mirkin, Lucia Specia, and Shuly Wintner. 2017. Personalized Machine Translation: Preserving Original Author Traits. In Proceedings of EACL 2017.
  • Myle Ott, Michael Auli, David Grangier, and Marc’Aurelio Ranzato. 2018. Analyzing Uncertainty in Neural Machine Translation. In Proceedings of ICML 2018.
  • Paul Michel and Graham Neubig. 2018. Extreme Adaptation for Personalized Neural Machine Translation. In Proceedings of ACL 2018.
  • Philip Schulz, Wilker Aziz, and Trevor Cohn. 2018. A Stochastic Decoder for Neural Machine Translation. In Proceedings of ACL 2018.
  • Eva Vanmassenhove, Christian Hardmeier, and Andy Way. 2018. Getting Gender Right in Neural Machine Translation. In Proceedings of EMNLP 2018.

Translation Efficiency

  • Abigail See, Minh-Thang Luong, and Christopher D. Manning. 2016. Compression of Neural Machine Translation Models via Pruning. In Proceedings of CoNLL 2016.
  • Yusuke Oda, Philip Arthur, Graham Neubig, Koichiro Yoshino, and Satoshi Nakamura. 2017. Neural Machine Translation via Binary Code Prediction. In Proceedings of ACL 2017.
  • Xing Shi and Kevin Knight. 2017. Speeding Up Neural Machine Translation Decoding by Shrinking Run-time Vocabulary. In Proceedings of ACL 2017.
  • Xiaowei Zhang, Wei Chen, Feng Wang, Shuang Xu, and Bo Xu. 2017. Towards Compact and Fast Neural Machine Translation Using a Combined Method. In Proceedings of EMNLP 2017.
  • Felix Stahlberg and Bill Byrne. 2017. Unfolding and Shrinking Neural Machine Translation Ensembles. In Proceedings of EMNLP 2017.
  • Jacob Devlin. 2017. Sharp Models on Dull Hardware: Fast and Accurate Neural Machine Translation Decoding on the CPU. In Proceedings of EMNLP 2017.
  • Dakun Zhang, Jungi Kim, Josep Crego, and Jean Senellart. 2017. Boosting Neural Machine Translation. In Proceedings of IJCNLP 2017.
  • Gonzalo Iglesias, William Tambellini, Adrià de Gispert, Eva Hasler, and Bill Byrne. 2018. Accelerating NMT Batched Beam Decoding with LMBR Posteriors for Deployment. In Proceedings of NAACL 2018.
  • Jerry Quinn and Miguel Ballesteros. 2018. Pieces of Eight: 8-bit Neural Machine Translation. In Proceedings of NAACL 2018.
  • Matt Post and David Vilar. 2018. Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation. In Proceedings of NAACL 2018.
  • Biao Zhang, Deyi Xiong, and Jinsong Su. 2018. Accelerating Neural Transformer via an Average Attention Network. In Proceedings of ACL 2018.
  • Rui Wang, Masao Utiyama, and Eiichiro Sumita. 2018. Dynamic Sentence Sampling for Efficient Training of Neural Machine Translation. In Proceedings of ACL 2018.
  • Myle Ott, Sergey Edunov, David Grangier, and Michael Auli. 2018. Scaling Neural Machine Translation. In Proceedings of the Third Conference on Machine Translation: Research Papers.
  • Joern Wuebker, Patrick Simianer, and John DeNero. 2018. Compact Personalized Models for Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Wen Zhang, Liang Huang, Yang Feng, Lei Shen, and Qun Liu. 2018. Speeding Up Neural Machine Translation Decoding by Cube Pruning. In Proceedings of EMNLP 2018.
  • Zhisong Zhang, Rui Wang, Masao Utiyama, Eiichiro Sumita, and Hai Zhao. 2018. Exploring Recombination for Efficient Decoding of Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Nikolay Bogoychev, Kenneth Heafield, Alham Fikri Aji, and Marcin Junczys-Dowmunt. 2018. Accelerating Asynchronous Stochastic Gradient Descent for Neural Machine Translation. In Proceedings of EMNLP 2018.

Pre-Training

  • Bryan McCann, James Bradbury, Caiming Xiong, and Richard Socher. 2017. Learned in Translation: Contextualized Word Vectors. In Proceedings of NIPS 2017.
  • Ye Qi, Devendra Sachan, Matthieu Felix, Sarguna Padmanabhan, and Graham Neubig. 2018. When and Why Are Pre-Trained Word Embeddings Useful for Neural Machine Translation?. In Proceedings of NAACL 2018.
  • Matthew Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, and Luke Zettlemoyer. 2018. Deep Contextualized Word Representations. In Proceedings of NAACL 2018.
  • Jeremy Howard and Sebastian Ruder. 2018. Universal Language Model Fine-tuning for Text Classification. In Proceedings of ACL 2018.
  • Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever. 2018. Improving Language Understanding by Generative Pre-Training. Technical Report, OpenAI.
  • Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805.

Speech Translation and Simultaneous Translation

  • Matt Post, Gaurav Kumar, Adam Lopez, Damianos Karakos, Chris Callison-Burch and Sanjeev Khudanpur. 2013. Improved Speech-to-Text Translation with the Fisher and Callhome Spanish–English Speech Translation Corpus. In Proceedings of IWSLT 2013.
  • Gaurav Kumar, Matt Post, Daniel Povey and Sanjeev Khudanpur. 2014. Some insights from translating conversational telephone speech. In Proceedings of ICASSP 2014.
  • Long Duong, Antonios Anastasopoulos, David Chiang, Steven Bird, and Trevor Cohn. 2016. An Attentional Model for Speech Translation without Transcription. In Proceedings of NAACL 2016.
  • Antonios Anastasopoulos, David Chiang, and Long Duong. 2016. An Unsupervised Probability Model for Speech-to-translation Alignment of Low-resource Languages. In Proceedings of EMNLP 2016.
  • Ron J. Weiss, Jan Chorowski, Navdeep Jaitly, Yonghui Wu and Zhifeng Chen. 2017. Sequence-to-sequence models can directly translate foreign speech. In Proceedings of Interspeech 2017.
  • Jiatao Gu, Graham Neubig, Kyunghyun Cho, and Victor O.K. Li. 2017. Learning to Translate in Real-time with Neural Machine Translation. In Proceedings of EACL 2017.
  • Sameer Bansal, Herman Kamper, Adam Lopez, and Sharon Goldwater. 2017. Towards speech-to-text translation without speech recognition. In Proceedings of EACL 2017.
  • Jiatao Gu, James Bradbury, Caiming Xiong, Victor O.K. Li, and Richard Socher. 2018. Non-Autoregressive Neural Machine Translation. In Proceedings of ICLR 2018.
  • Antonios Anastasopoulos and David Chiang. 2018. Tied Multitask Learning for Neural Speech Translation. In Proceedings of NAACL 2018.
  • Fahim Dalvi, Nadir Durrani, Hassan Sajjad, and Stephan Vogel. 2018. Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation. In Proceedings of NAACL 2018.
  • Craig Stewart, Nikolai Vogler, Junjie Hu, Jordan Boyd-Graber, and Graham Neubig. 2018. Automatic Estimation of Simultaneous Interpreter Performance. In Proceedings of ACL 2018.
  • Florian Dessloch, Thanh-Le Ha, Markus Müller, Jan Niehues, Thai Son Nguyen, Ngoc-Quan Pham, Elizabeth Salesky, Matthias Sperber, Sebastian Stüker, Thomas Zenkel, and Alexander Waibel. 2018. KIT Lecture Translator: Multilingual Speech Translation with One-Shot Learning. In Proceedings of COLING 2018.
  • Chunqi Wang, Ji Zhang, and Haiqing Chen. 2018. Semi-Autoregressive Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Jindřich Libovický and Jindřich Helcl. 2018. End-to-End Non-Autoregressive Neural Machine Translation with Connectionist Temporal Classification. In Proceedings of EMNLP 2018.
  • Ashkan Alinejad, Maryam Siahbani, and Anoop Sarkar. 2018. Prediction Improves Simultaneous Neural Machine Translation. In Proceedings of EMNLP 2018.
  • Mingbo Ma, Liang Huang, Hao Xiong, Kaibo Liu, Chuanqiang Zhang, Zhongjun He, Hairong Liu, Xing Li, and Haifeng Wang. 2018. STACL: Simultaneous Translation with Integrated Anticipation and Controllable Latency. arXiv:1810.08398.

Multimodal Translation

  • Lucia Specia, Stella Frank, Khalil Sima’an, and Desmond Elliott. 2016. A Shared Task on Multimodal Machine Translation and Crosslingual Image Description. In Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers.
  • Sergio Rodríguez Guasch, Marta R. Costa-jussà. 2016. WMT 2016 Multimodal Translation System Description based on Bidirectional Recurrent Neural Networks with Double-Embeddings. In Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers.
  • Po-Yao Huang, Frederick Liu, Sz-Rung Shiang, Jean Oh, and Chris Dyer. 2016. Attention-based Multimodal Neural Machine Translation. In Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers.
  • Iacer Calixto, Desmond Elliott, and Stella Frank. 2016. DCU-UvA Multimodal MT System Report. In Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers.
  • Desmond Elliott, Stella Frank, Loïc Barrault, Fethi Bougares, and Lucia Specia. 2017. Findings of the Second Shared Task on Multimodal Machine Translation and Multilingual Image Description. In Proceedings of the Second Conference on Machine Translation.
  • Iacer Calixto, Qun Liu, and Nick Campbell. 2017. Doubly-Attentive Decoder for Multi-modal Neural Machine Translation. In Proceedings of ACL 2017.
  • Jean-Benoit Delbrouck and Stéphane Dupont. 2017. An empirical study on the effectiveness of images in Multimodal Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Iacer Calixto and Qun Liu. 2017. Incorporating Global Visual Features into Attention-based Neural Machine Translation. In Proceedings of EMNLP 2017.
  • Jason Lee, Kyunghyun Cho, Jason Weston, and Douwe Kiela. 2018. Emergent Translation in Multi-Agent Communication. In Proceedings of ICLR 2018.
  • Yun Chen, Yang Liu, and Victor O. K. Li. 2018. Zero-Resource Neural Machine Translation with Multi-Agent Communication Game. In Proceedings of AAAI 2018.
  • Loïc Barrault, Fethi Bougares, Lucia Specia, Chiraag Lala, Desmond Elliott, and Stella Frank. 2018. Findings of the Third Shared Task on Multimodal Machine Translation. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers.
  • John Hewitt, Daphne Ippolito, Brendan Callahan, Reno Kriz, Derry Tanti Wijaya, and Chris Callison-Burch. 2018. Learning Translations via Images with a Massively Multilingual Image Dataset. In Proceedings of ACL 2018.
  • Mingyang Zhou, Runxiang Cheng, Yong Jae Lee, and Zhou Yu. 2018. A Visual Attention Grounding Neural Model for Multimodal Machine Translation. In Proceedings of EMNLP 2018.
  • Desmond Elliott. 2018. Adversarial Evaluation of Multimodal Machine Translation. In Proceedings of EMNLP 2018.

Domain Adaptation

  • Chenhui Chu, Raj Dabre, and Sadao Kurohashi. 2017. An Empirical Comparison of Domain Adaptation Methods for Neural Machine Translation. In Proceedings of ACL 2017.
  • Rui Wang, Andrew Finch, Masao Utiyama, and Eiichiro Sumita. 2017. Sentence Embedding for Neural Machine Translation Domain Adaptation. In Proceedings of ACL 2017.
  • Boxing Chen, Colin Cherry, George Foster, and Samuel Larkin. 2017. Cost Weighting for Neural Machine Translation Domain Adaptation. In Proceedings of the First Workshop on Neural Machine Translation.
  • Rui Wang, Masao Utiyama, Lemao Liu, Kehai Chen, and Eiichiro Sumita. 2017. Instance Weighting for Neural Machine Translation Domain Adaptation. In Proceedings of EMNLP 2017.
  • Antonio Valerio Miceli Barone, Barry Haddow, Ulrich Germann, and Rico Sennrich. 2017. Regularization techniques for fine-tuning in neural machine translation. In Proceedings of EMNLP 2017.
  • David Vilar. 2018. Learning Hidden Unit Contribution for Adapting Neural Machine Translation Models. In Proceedings of NAACL 2018.
  • Shiqi Zhang and Deyi Xiong. 2018. Sentence Weighting for Neural Machine Translation Domain Adaptation. In Proceedings of COLING 2018.
  • Chenhui Chu and Rui Wang. 2018. A Survey of Domain Adaptation for Neural Machine Translation. In Proceedings of COLING 2018.
  • Jiali Zeng, Jinsong Su, Huating Wen, Yang Liu, Jun Xie, Yongjing Yin, and Jianqiang Zhao. 2018. Multi-Domain Neural Machine Translation with Word-Level Domain Context Discrimination. In Proceedings of EMNLP 2018.
  • Graham Neubig and Junjie Hu. 2018. Rapid Adaptation of Neural Machine Translation to New Languages. In Proceedings of EMNLP 2018.

Translation Quality Estimation

  • Hyun Kim and Jong-Hyeok Lee. 2016. A Recurrent Neural Networks Approach for Estimating the Quality of Machine Translation Output. In Proceedings of NAACL 2016.
  • Hyun Kim, Jong-Hyeok Lee, and Seung-Hoon Na. 2017. Predictor-Estimator using Multilevel Task Learning with Stack Propagation for Neural Quality Estimation. In Proceedings of WMT 2017.
  • Osman Baskaya, Eray Yildiz, Doruk Tunaoglu, Mustafa Tolga Eren, and A. Seza Doğruöz. 2017. Integrating Meaning into Quality Evaluation of Machine Translation. In Proceedings of EACL 2017.
  • Yvette Graham, Qingsong Ma, Timothy Baldwin, Qun Liu, Carla Parra, and Carolina Scarton. 2017. Improving Evaluation of Document-level Machine Translation Quality Estimation. In Proceedings of EACL 2017.
  • Pierre Isabelle, Colin Cherry, and George Foster. 2017. A Challenge Set Approach to Evaluating Machine Translation. In Proceedings of EMNLP 2017.
  • André F.T. Martins, Marcin Junczys-Dowmunt, Fabio N. Kepler, Ramón Astudillo, Chris Hokamp, and Roman Grundkiewicz. 2017. Pushing the Limits of Translation Quality Estimation. Transactions of the Association for Computational Linguistics.
  • Maoxi Li, Qingyu Xiang, Zhiming Chen, and Mingwen Wang. 2018. A Unified Neural Network for Quality Estimation of Machine Translation. IEICE Transactions on Information and Systems.
  • Lucia Specia, Frédéric Blain, Varvara Logacheva, Ramón F. Astudillo, and André Martins. 2018. Findings of the WMT 2018 Shared Task on Quality Estimation. In Proceedings of WMT 2018.
  • Craig Stewart, Nikolai Vogler, Junjie Hu, Jordan Boyd-Graber, and Graham Neubig. 2018. Automatic Estimation of Simultaneous Interpreter Performance. In Proceedings of ACL 2018.
  • Julia Ive, Frédéric Blain, and Lucia Specia. 2018. deepQuest: A Framework for Neural-based Quality Estimation. In Proceedings of COLING 2018.
  • Kai Fan, Jiayi Wang, Bo Li, Fengming Zhou, Boxing Chen, and Luo Si. 2019. “Bilingual Expert” Can Find Translation Errors. In Proceedings of AAAI 2019.

Automatic Post-Editing

  • Santanu Pal, Sudip Kumar Naskar, Mihaela Vela, and Josef van Genabith. 2016. A neural network based approach to automatic post-editing. In Proceedings of ACL 2016.
  • Marcin Junczys-Dowmunt and Roman Grundkiewicz. 2016. Log-linear Combinations of Monolingual and Bilingual Neural Machine Translation Models for Automatic Post-Editing. In Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers.
  • Santanu Pal, Sudip Kumar Naskar, Mihaela Vela, Qun Liu, and Josef van Genabith. 2017. Neural Automatic Post-Editing Using Prior Alignment and Reranking. In Proceedings of EACL 2017.
  • Rajen Chatterjee, Gebremedhen Gebremelak, Matteo Negri, and Marco Turchi. 2017. Online Automatic Post-editing for MT in a Multi-Domain Translation Environment. In Proceedings of EACL 2017.
  • David Grangier and Michael Auli. 2018. QuickEdit: Editing Text & Translations by Crossing Words Out. In Proceedings of NAACL 2018.
  • Thuy-Trang Vu and Gholamreza Haffari. 2018. Automatic Post-Editing of Machine Translation: A Neural Programmer-Interpreter Approach. In Proceedings of EMNLP 2018.

Word Translation and Bilingual Lexicon Induction

  • Meng Zhang, Yang Liu, Huanbo Luan, Maosong Sun, Tatsuya Izuha, and Jie Hao. 2016. Building Earth Mover’s Distance on Bilingual Word Embeddings for Machine Translation. In Proceedings of AAAI 2016.
  • Meng Zhang, Yang Liu, Huanbo Luan, Yiqun Liu, and Maosong Sun. 2016. Inducing Bilingual Lexica From Non-Parallel Data With Earth Mover’s Distance Regularization. In Proceedings of COLING 2016.
  • Meng Zhang, Haoruo Peng, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. Bilingual Lexicon Induction from Non-Parallel Data with Minimal Supervision. In Proceedings of AAAI 2017.
  • Ann Irvine and Chris Callison-Burch. 2017. A Comprehensive Analysis of Bilingual Lexicon Induction. Computational Linguistics.
  • Meng Zhang, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. Adversarial Training for Unsupervised Bilingual Lexicon Induction. In Proceedings of ACL 2017.
  • Geert Heyman, Ivan Vulić, and Marie-Francine Moens. 2017. Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations. In Proceedings of EACL 2017.
  • Bradley Hauer, Garrett Nicolai, and Grzegorz Kondrak. 2017. Bootstrapping Unsupervised Bilingual Lexicon Induction. In Proceedings of EACL 2017.
  • Yunsu Kim, Julian Schamper, and Hermann Ney. 2017. Unsupervised Training for Large Vocabulary Translation Using Sparse Lexicon and Word Classes. In Proceedings of EACL 2017.
  • Derry Tanti Wijaya, Brendan Callahan, John Hewitt, Jie Gao, Xiao Ling, Marianna Apidianaki, and Chris Callison-Burch. 2017. Learning Translations via Matrix Completion. In Proceedings of EMNLP 2017.
  • Meng Zhang, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. Earth Mover’s Distance Minimization for Unsupervised Bilingual Lexicon Induction. In Proceedings of EMNLP 2017.
  • Ndapandula Nakashole and Raphael Flauger. 2017. Knowledge Distillation for Bilingual Dictionary Induction. In Proceedings of EMNLP 2017.
  • Guillaume Lample, Alexis Conneau, Marc’Aurelio Ranzato, Ludovic Denoyer, and Hervé Jégou. 2018. Word translation without parallel data. In Proceedings of ICLR 2018.
  • Ndapa Nakashole and Raphael Flauger. 2018. Characterizing Departures from Linearity in Word Translation. In Proceedings of ACL 2018.
  • Anders Søgaard, Sebastian Ruder, and Ivan Vulić. 2018. On the Limitations of Unsupervised Bilingual Dictionary Induction. In Proceedings of ACL 2018.
  • Parker Riley and Daniel Gildea. 2018. Orthographic Features for Bilingual Lexicon Induction. In Proceedings of ACL 2018.
  • Amir Hazem and Emmanuel Morin. 2018. Leveraging Meta-Embeddings for Bilingual Lexicon Extraction from Specialized Comparable Corpora. In Proceedings of COLING 2018.
  • Sebastian Ruder, Ryan Cotterell, Yova Kementchedjhieva, and Anders Søgaard. 2018. A Discriminative Latent-Variable Model for Bilingual Lexicon Induction. In Proceedings of EMNLP 2018.
  • Zi-Yi Dou, Zhi-Hao Zhou, and Shujian Huang. 2018. Unsupervised Bilingual Lexicon Induction via Latent Variable Models. In Proceedings of EMNLP 2018.
  • Armand Joulin, Piotr Bojanowski, Tomas Mikolov, Hervé Jégou, and Edouard Grave. 2018. Loss in Translation: Learning Bilingual Word Mapping with a Retrieval Criterion. In Proceedings of EMNLP 2018.

Poetry Translation
