Text Model Roundup (continuously updated)

Traditional Models

  • TF-IDF: term frequency–inverse document frequency
  • LDA: Latent Dirichlet Allocation, JMLR, 2003
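To illustrate the TF-IDF weighting above, here is a minimal pure-Python sketch; the `tf_idf` helper is hypothetical and uses the common log(N / df) variant of IDF:

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF for a list of tokenized documents.

    tf(t, d) = count of t in d / length of d
    idf(t)   = log(N / df(t)), df(t) = number of docs containing t
    """
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        length = len(doc)
        weights.append({t: (c / length) * math.log(n / df[t]) for t, c in tf.items()})
    return weights

docs = [["the", "cat", "sat"], ["the", "dog", "ran"], ["the", "cat", "ran"]]
w = tf_idf(docs)
```

Note that a term appearing in every document (here "the") gets IDF log(N/N) = 0, so it contributes no weight, which is exactly the stop-word-damping effect TF-IDF is valued for.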

Word Embeddings

  • Word2vec: Efficient Estimation of Word Representations in Vector Space, ICLR, 2013
  • GloVe: GloVe: Global Vectors for Word Representation, EMNLP, 2014
  • FastText: Bag of Tricks for Efficient Text Classification, EACL, 2017
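The skip-gram variant of word2vec trains on (center word, context word) pairs drawn from a sliding window over the corpus; a minimal sketch of that pair extraction, with a hypothetical `skipgram_pairs` helper:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in skip-gram word2vec."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["a", "b", "c"], window=1)
# [('a', 'b'), ('b', 'a'), ('b', 'c'), ('c', 'b')]
```

These pairs are then fed to a shallow network that predicts the context word from the center word's embedding.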

Sentence Embeddings

  • STV: Skip-Thought Vectors, NIPS, 2015 (same idea as word2vec, lifted to the sentence level)
  • QTV (Quick-Thought Vectors): An Efficient Framework for Learning Sentence Representations, ICLR, 2018 (recasts the STV objective as a classification task)
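A common baseline that learned sentence encoders such as STV/QTV are compared against is simply averaging the word vectors of a sentence; a minimal sketch, where the `average_embedding` helper and its toy `vectors` lookup are hypothetical:

```python
def average_embedding(tokens, vectors):
    """Sentence embedding as the mean of its known word vectors."""
    dims = len(next(iter(vectors.values())))
    found = [vectors[t] for t in tokens if t in vectors]
    if not found:
        return [0.0] * dims  # no known words: zero vector
    return [sum(v[d] for v in found) / len(found) for d in range(dims)]

vectors = {"a": [1.0, 0.0], "b": [0.0, 1.0]}
sent = average_embedding(["a", "b"], vectors)
```

Averaging discards word order, which is precisely the information the encoder-based methods above try to keep.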

RNN

  • RNN: Finding structure in time, Cognitive science, 1990
  • LSTM: Long short-term memory, Neural computation, 1997
  • GRU: Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling, NIPS Workshop, 2014
  • SRU: Simple Recurrent Units for Highly Parallelizable Recurrence, EMNLP, 2018
  • IndRNN: Independently Recurrent Neural Network (IndRNN): Building a Longer and Deeper RNN, CVPR, 2018
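All of the recurrent cells above elaborate on the same basic recurrence h_t = f(x_t, h_{t-1}); a minimal pure-Python sketch of the vanilla (Elman) step, with a hypothetical `rnn_step` helper and toy weights:

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One Elman RNN step: h' = tanh(W_xh @ x + W_hh @ h + b)."""
    hidden = len(h)
    return [
        math.tanh(
            sum(W_xh[i][k] * x[k] for k in range(len(x)))   # input projection
            + sum(W_hh[i][k] * h[k] for k in range(hidden))  # recurrent projection
            + b[i]
        )
        for i in range(hidden)
    ]

h1 = rnn_step(x=[1.0], h=[0.0], W_xh=[[1.0]], W_hh=[[0.0]], b=[0.0])
```

LSTM and GRU replace the single tanh update with gated updates so that gradients survive over long sequences, while SRU and IndRNN restructure the recurrence for parallelism.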

Deep Model

  • Text CNN: Convolutional Neural Networks for Sentence Classification, EMNLP, 2014
  • Char-CNN: Character-level Convolutional Networks for Text Classification, NIPS, 2015
  • C-LSTM: A C-LSTM Neural Network for Text Classification, arXiv, 2015
  • Text RNN: Recurrent Neural Network for Text Classification with Multi-Task Learning, IJCAI, 2016
  • Text CNN 2: A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification, IJCNLP, 2017
  • Attention: Attention Is All You Need, NIPS, 2017
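The core operation of the Transformer in "Attention Is All You Need" is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V; a minimal pure-Python sketch (single head, no masking; the `attention` helper is hypothetical):

```python
import math

def attention(Q, K, V):
    """Scaled dot-product attention over lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        m = max(scores)                       # shift for numerical stability
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]       # softmax over keys
        # output = weighted average of the value rows
        out.append([sum(w * v[d] for w, v in zip(weights, V)) for d in range(len(V[0]))])
    return out

out = attention(Q=[[1.0, 0.0]], K=[[0.0, 0.0], [0.0, 0.0]], V=[[1.0, 0.0], [3.0, 0.0]])
```

With identical keys the softmax is uniform and the output is just the mean of the value rows, which makes the mechanism easy to sanity-check.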
  • 0
    点赞
  • 0
    收藏
    觉得还不错? 一键收藏
  • 0
    评论
评论
添加红包

请填写红包祝福语或标题

红包个数最小为10个

红包金额最低5元

当前余额3.43前往充值 >
需支付:10.00
成就一亿技术人!
领取后你会自动成为博主和红包主的粉丝 规则
hope_wisdom
发出的红包
实付
使用余额支付
点击重新获取
扫码支付
钱包余额 0

抵扣说明:

1.余额是钱包充值的虚拟货币,按照1:1的比例进行支付金额的抵扣。
2.余额无法直接购买下载,可以购买VIP、付费专栏及课程。

余额充值