Computing LM perplexity with TensorFlow
1. Perplexity is the standard metric for evaluating a language model: for a sentence of N words, PPL = P(sentence)^(-1/N), so a lower perplexity means the model assigns higher probability to the sentence.
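The formula above can be computed directly from the per-word probabilities a model assigns. A minimal sketch (the `perplexity` helper is illustrative, not from the original), evaluating P(sentence)^(-1/N) in log space for numerical stability:

```python
import math

def perplexity(word_probs):
    # PPL = P(sentence)^(-1/N) = exp(-(1/N) * sum(log p_i))
    n = len(word_probs)
    log_p = sum(math.log(p) for p in word_probs)
    return math.exp(-log_p / n)

# A model that assigns probability 0.25 to every word has perplexity ~4,
# i.e. it is as uncertain as a uniform choice over 4 words.
print(perplexity([0.25, 0.25, 0.25]))
```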
2. How to compute it with TensorFlow's RNN model
See the API documentation for tf.contrib.legacy_seq2seq.sequence_loss_by_example: it returns a list of length N, where N is the number of sentences (the batch size), and each entry is that sentence's log-perplexity.
tf.contrib.legacy_seq2seq.sequence_loss_by_example(
logits,
targets,
weights,
average_across_timesteps=True,
softmax_loss_function=None,
name=None
)
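To make the signature concrete, here is a NumPy sketch of what the function computes per example with `average_across_timesteps=True` and the default softmax loss: a per-sentence weighted average cross-entropy, which is exactly the log-perplexity. This is an illustrative reimplementation, not the TensorFlow source.

```python
import numpy as np

def sequence_loss_by_example(logits, targets, weights):
    """Per-sentence weighted cross-entropy (log-perplexity), NumPy sketch.

    logits:  list of [batch, vocab] float arrays, one per timestep
    targets: list of [batch] int arrays, one per timestep
    weights: list of [batch] float arrays, one per timestep
    Returns a [batch] array, mirroring average_across_timesteps=True.
    """
    batch = logits[0].shape[0]
    total, weight_sum = np.zeros(batch), np.zeros(batch)
    for step_logits, step_targets, step_weights in zip(logits, targets, weights):
        # Log-softmax with max-subtraction for numerical stability.
        shifted = step_logits - step_logits.max(axis=1, keepdims=True)
        log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
        # Cross-entropy of the target word at this timestep.
        xent = -log_probs[np.arange(batch), step_targets]
        total += step_weights * xent
        weight_sum += step_weights
    return total / weight_sum
```

With uniform logits over a vocabulary of size V, every sentence's loss is log(V), matching the intuition that an uninformative model has perplexity V.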
# Compute the log-perplexity of this batch; returns a shape=[batch] tensor,
# one log-perplexity per sentence. (The targets and weights arguments below
# follow the standard usage pattern and are assumed; the original snippet
# was truncated after the logits argument.)
loss = legacy_seq2seq.sequence_loss_by_example(
    [self.logits],
    [tf.reshape(self.targets, [-1])],
    [tf.ones_like(tf.reshape(self.targets, [-1]), dtype=tf.float32)])
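Since the returned values are log-perplexities, exponentiating recovers each sentence's perplexity, and a corpus-level number is usually obtained by weighting each sentence's loss by its word count before exponentiating. A sketch with made-up illustrative numbers:

```python
import math

# Hypothetical per-sentence log-perplexities and sentence lengths (in words).
log_ppls = [1.2, 0.9, 1.5]
lengths = [10, 7, 12]

# Per-sentence perplexity is exp(log-perplexity).
per_sentence = [math.exp(lp) for lp in log_ppls]

# Corpus perplexity: average the total log-loss over the total word count,
# then exponentiate.
corpus_ppl = math.exp(
    sum(lp * n for lp, n in zip(log_ppls, lengths)) / sum(lengths))
```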