Probabilistic Language Modeling (Part 3) --- A Survey of Training Toolkits

Traditional Methods

1) BerkeleyLM is written in Java; its authors report performance comparable to KenLM, with a smaller memory footprint than SRILM.

https://github.com/adampauls/berkeleylm

2) MITLM (the MIT Language Modeling Toolkit) is known for its effective optimization of smoothing parameters.

https://code.google.com/p/mitlm/ or https://github.com/mitlm/mitlm

3) SRILM (the SRI Language Modeling Toolkit) is a long-established and widely used toolkit developed by SRI International (formerly the Stanford Research Institute), written in C++.

http://www.speech.sri.com/projects/srilm/

Maximum entropy (MaxEnt) language models are also supported in the SRILM toolkit:

https://phon.ioc.ee/dokuwiki/doku.php?id=people:tanel:srilm-me.en

4) IRSTLM (the IRST Language Modeling Toolkit), developed by the FBK-IRST lab in Trento, Italy, targets larger-scale training data and is integrated into Moses (a popular open-source statistical machine translation decoder). To train on large corpora efficiently, IRSTLM partitions the dictionary, trains sub-models in blocks, and then merges them quickly. Training proceeds in five steps:
a) collect a frequency-annotated vocabulary from the training corpus;
b) partition the vocabulary into several sub-vocabularies, balanced by word frequency;
c) for each sub-vocabulary, count the n-grams that begin with one of its words;
d) build a sub-language-model from each set of counts produced in step (c);
e) merge all the sub-models into the final language model.

http://hlt-mt.fbk.eu/technologies/irstlm or https://github.com/irstlm-team/irstlm
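The split-dictionary scheme above can be sketched in plain Python. This is an illustrative toy under stated assumptions, not IRSTLM's actual implementation; all function names here are made up:

```python
from collections import Counter

def ngrams(tokens, n):
    """All n-grams (as tuples) in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def split_vocab(freqs, k):
    """Step (b): greedily partition the vocabulary into k frequency-balanced blocks."""
    blocks, loads = [set() for _ in range(k)], [0] * k
    for word, f in freqs.most_common():   # heaviest words first
        i = loads.index(min(loads))       # assign to the currently lightest block
        blocks[i].add(word)
        loads[i] += f
    return blocks

def train_split(corpus, n=2, k=2):
    tokens = corpus.split()
    freqs = Counter(tokens)                  # step (a): vocabulary with counts
    blocks = split_vocab(freqs, k)           # step (b): balanced partition
    sub_counts = []
    for block in blocks:                     # steps (c)-(d): per-block n-gram counts
        c = Counter(g for g in ngrams(tokens, n) if g[0] in block)
        sub_counts.append(c)
    merged = Counter()                       # step (e): merge the sub-models
    for c in sub_counts:
        merged.update(c)
    return merged

counts = train_split("the cat sat on the mat the cat ran", n=2, k=2)
```

Because the blocks partition the vocabulary, every n-gram is counted by exactly one sub-model, so the merge is a simple union of counts.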

5) KenLM (Kenneth Heafield's language model toolkit) is distinguished by its speed and low memory usage; its author reports better performance than SRILM. It supports training on large data on a single machine and provides both C++ and Python interfaces.

http://kheafield.com/code/kenlm/ or https://github.com/kpu/kenlm

6) Bigfatlm provides Hadoop-based training of Kneser-Ney language models; written in Java.

https://github.com/jhclark/bigfatlm

7) Kylm (the Kyoto Language Modeling Toolkit), written in Java; can output models in WFST format for use with WFST decoders.

http://www.phontron.com/kylm/  or  https://github.com/neubig/kylm

8) OpenGrm, a language modeling toolkit for use with OpenFst; builds and modifies n-gram language models encoded as weighted finite-state transducers (FSTs).

http://opengrm.org/
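The traditional toolkits above can all write their models in the standard ARPA text format (log10 probabilities, tab-separated fields, optional back-off weights). A minimal pure-Python reader for that format might look like this; it is an illustrative sketch only, and real toolkits additionally handle escaping, compression, and back-off computation:

```python
def read_arpa(lines):
    """Parse an ARPA-format n-gram model into {order: {ngram_tuple: (logprob, backoff)}}."""
    model, order = {}, None
    for line in lines:
        line = line.strip()
        if not line or line == "\\data\\" or line.startswith("ngram "):
            continue                                  # skip header and count lines
        if line == "\\end\\":
            break
        if line.startswith("\\") and line.endswith("-grams:"):
            order = int(line[1:line.index("-")])      # e.g. "\2-grams:" -> 2
            model[order] = {}
            continue
        fields = line.split("\t")                     # logprob, n-gram, [backoff]
        logprob = float(fields[0])
        words = tuple(fields[1].split())
        backoff = float(fields[2]) if len(fields) > 2 else 0.0
        model[order][words] = (logprob, backoff)
    return model

# A tiny hand-written ARPA fragment (the values are made up):
arpa = """\\data\\
ngram 1=2
ngram 2=1

\\1-grams:
-0.5\t<s>\t-0.3
-0.7\tcat

\\2-grams:
-0.2\t<s> cat

\\end\\
""".splitlines()

model = read_arpa(arpa)
```

This shared on-disk format is what lets models trained with one toolkit (e.g. SRILM) be queried or compressed with another (e.g. KenLM).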

Deep Learning

1) RNNLM (Recurrent Neural Network Language Model Toolkit)

http://rnnlm.org/ or http://www.fit.vutbr.cz/~imikolov/rnnlm/

2) BRNNLM (Bayesian Recurrent Neural Network Language Model)

http://chien.cm.nctu.edu.tw/bayesian-recurrent-neural-network-for-language-modeling

3) RWTHLM (the RWTH Aachen University Neural Network Language Modeling Toolkit; includes feedforward, recurrent, and long short-term memory neural networks)

http://www-i6.informatik.rwth-aachen.de/web/Software/rwthlm.php

4) Character-Aware Neural Language Models: employs a convolutional neural network (CNN) over characters, whose output feeds into a long short-term memory (LSTM) recurrent neural network language model (RNN-LM)

https://github.com/yoonkim/lstm-char-cnn

