A model that assigns probabilities to sequences of words is called a language model. Among language models, the n-gram is the simplest. An n-gram is a sequence of N words:
- N=2: a 2-gram (bigram)
- N=3: a 3-gram (trigram)
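As a quick illustration of the definition above, here is a minimal sketch of extracting the n-grams from a tokenized sentence (the helper name `ngrams` is our own, not from any library):

```python
def ngrams(tokens, n):
    """Return all n-grams (as tuples) in a token sequence, left to right."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "its water is so transparent".split()
print(ngrams(tokens, 2))
# bigrams: ('its', 'water'), ('water', 'is'), ('is', 'so'), ('so', 'transparent')
```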
A simple example
Suppose the task is to compute $P(w \vert h)$: the probability of a word $w$ given a history $h$. Let $h = its\ water\ is\ so\ transparent\ that$, and suppose we want the probability that the next word is *the*:

$$P(the \vert its\ water\ is\ so\ transparent\ that)$$
One feasible approach is: in a very large corpus, count the occurrences of $its\ water\ is\ so\ transparent\ that$, then count the occurrences of $its\ water\ is\ so\ transparent\ that\ the$, and divide the latter by the former:

$$P(the \vert its\ water\ is\ so\ transparent\ that) = \frac{C(its\ water\ is\ so\ transparent\ that\ the)}{C(its\ water\ is\ so\ transparent\ that)}$$
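The count-and-divide estimate above can be sketched directly in code. The toy two-sentence corpus and the helper `count` are our own inventions for illustration, not a real dataset:

```python
# Toy corpus: the history occurs twice, followed once by "the".
corpus = ("its water is so transparent that the fish are visible . "
          "its water is so transparent that you can see the bottom .")

def count(text, phrase):
    """Count occurrences of a word sequence in a whitespace-tokenized text."""
    tokens = text.split()
    words = phrase.split()
    return sum(tokens[i:i + len(words)] == words
               for i in range(len(tokens) - len(words) + 1))

h = "its water is so transparent that"
p = count(corpus, h + " the") / count(corpus, h)
print(p)  # 1 of the 2 occurrences of h is followed by "the" -> 0.5
```

With a real corpus the same ratio of counts gives the maximum-likelihood estimate of the conditional probability.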
This approach works in many cases, but it still runs into the following problem:
- Some word sequences never occur in the corpus, so their count is 0.
A similar problem arises when we want the joint probability of an entire sequence, e.g. $P(its\ wat$