Back-translation (back-translations)

Reference papers:

Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. Edinburgh Neural Machine Translation Systems for WMT 16. arXiv preprint arXiv:1606.02891.

Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016a. Improving Neural Machine Translation Models with Monolingual Data. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL 2016), Berlin, Germany.

 

It is Sennrich again. In the WMT16 system paper he observes: "We found that during decoding, the model would occasionally assign a high probability to words based on the target context alone, ignoring the source sentence."

The concrete remedy: "we experiment with training separate models that produce the target text from right-to-left (r2l), and re-scoring the n-best lists that are produced by the main (left-to-right) models with these r2l models. Since the right-to-left model will see a complementary target context at each time step, we expect that the averaged probabilities will be more robust."

In other words, the n-best candidate lists produced by the main left-to-right model are re-scored with the reverse-direction (right-to-left) models, as sketched below.
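A minimal sketch of this re-scoring step, assuming two hypothetical scoring functions `score_l2r` and `score_r2l` that each return a sentence-level log-probability for a candidate translation (the 0.5/0.5 interpolation below simply mimics averaging the two directions and is an illustrative choice, not the paper's exact setup):

```python
# Hypothetical n-best re-scoring with a right-to-left model (illustrative only).
# `score_l2r(source, target_tokens)` and `score_r2l(source, target_tokens)` are
# assumed stand-ins for the two trained models, each returning a sentence-level
# log-probability.

def rescore_nbest(source, nbest, score_l2r, score_r2l, weight=0.5):
    """Re-rank candidate translations by averaging l2r and r2l log-probs."""
    rescored = []
    for candidate in nbest:
        lp_l2r = score_l2r(source, candidate)                  # usual decoding order
        lp_r2l = score_r2l(source, list(reversed(candidate)))  # r2l model sees the target reversed
        rescored.append((weight * lp_l2r + (1.0 - weight) * lp_r2l, candidate))
    # best averaged score first
    rescored.sort(key=lambda pair: pair[0], reverse=True)
    return [candidate for _, candidate in rescored]
```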
 

When doing back-translation, noise is added artificially to make the model more robust to noise; a common technique is dropout.

The authors take a simple, blunt approach: "In our English↔Romanian experiments, we drop out full words (both on the source and target side) with a probability of 0.1. For all other layers, the dropout probability is set to 0.2."
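A minimal sketch of the word-level dropout described above (an illustrative implementation, not the paper's code): each full word on the source or target side is dropped independently with probability p.

```python
import random

def word_dropout(tokens, p=0.1, rng=random):
    """Return a copy of `tokens` with each word dropped with probability p."""
    kept = [tok for tok in tokens if rng.random() >= p]
    # keep at least one token so the sentence never becomes empty
    return kept if kept else [rng.choice(tokens)]

# Example: applied to source and target token lists before training
print(word_dropout("the quick brown fox jumps over the lazy dog".split(), p=0.1))
```

The 0.2 setting quoted above would be the usual dropout applied to the network layers themselves, which is not shown here.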
