Paper Reading Notes

How to Read a Paper

My suggestion would be to pick one thing, a single book or a single library, and read it cover to cover or work through all of its tutorials. Pick one and stick to it; once you master it, pick another and repeat. Let's get into it.

How do I learn machine learning? (Quora)

Word Vectors

Master's Thesis

  • 基于神经网络的词和文档语义向量表示方法研究 (Research on Neural-Network-Based Semantic Vector Representations of Words and Documents) [pdf]

Old

  • A neural probabilistic language model (2003), Y Bengio [pdf]
  • Word representations: a simple and general method for semi-supervised learning (2010), J. Turian et al. [pdf]
  • From frequency to meaning: vector space models of semantics (2010) Turney et al. [pdf]
  • Natural Language Processing (almost) from Scratch (2011), R Collobert et al. [pdf]
  • A Probabilistic Model for Semantic Word Vectors (2010), A. Maas and A. Ng [pdf]

2012

  • Improving word representations via global context and multiple word prototypes (2012), Richard Socher et al. [pdf][code]

2013

  • Distributed representations of words and phrases and their compositionality (2013), T. Mikolov et al. [pdf][python code][c code]
  • Efficient estimation of word representations in vector space (2013), T. Mikolov et al. [pdf]
  • Chinese Parsing Exploiting Characters (2013), M Zhang et al. [pdf][c# code]
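
Two of the 2013 papers above (Mikolov et al.) introduce word2vec's skip-gram model with negative sampling. As a rough illustration of the training objective only, not the papers' actual C implementation, one negative-sampling SGD step can be sketched in numpy (vocabulary size, dimensions, and word ids below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 50, 8

# Separate "input" (center word) and "output" (context word) embeddings.
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, neg_samples, lr=0.05):
    """One SGD step of skip-gram with negative sampling: raise
    sigma(v_c . u_o) for the true context word o and lower it for
    k randomly drawn negative words."""
    v = W_in[center]
    ids = np.concatenate(([context], neg_samples))
    labels = np.zeros(len(ids))
    labels[0] = 1.0                       # first id is the positive pair
    u = W_out[ids]                        # (k+1, dim)
    scores = sigmoid(u @ v)
    g = scores - labels                   # gradient w.r.t. the logits
    W_out[ids] -= lr * np.outer(g, v)
    W_in[center] -= lr * (g @ u)
    # negative log-likelihood of this (center, context) pair
    return -np.log(scores[0] + 1e-10) - np.log(1 - scores[1:] + 1e-10).sum()

losses = [sgns_step(3, 7, rng.integers(0, vocab_size, size=5))
          for _ in range(200)]
```

Repeating the same (center, context) pair drives the loss down; a real implementation streams pairs from a corpus and draws negatives from a smoothed unigram distribution.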

2014

  • Glove: Global vectors for word representation (2014), J. Pennington et al. [pdf] [c code][python code]
  • Distributed representations of sentences and documents (2014), Q. Le and T. Mikolov [pdf]

2015

  • Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space (2015), Arvind Neelakantan et al. [pdf][code]
  • Joint Learning of Character and Word Embeddings (2015), Xinxiong Chen et al. [pdf][code]
  • Topical Word Embeddings (2015), Y Liu et al. [pdf][code]

2016

  • Inside Out: Two Jointly Predictive Models for Word Representations and Phrase Representations (2016), Fei Sun et al. [pdf][code]
  • How to Generate a Good Word Embedding? (2016), S Lai et al. [pdf][code]
  • Improve Chinese Word Embeddings by Exploiting Internal Structure (2016), J Xu et al. [pdf][code]

TODO

  • Learning Context-Specific Word/Character Embeddings (2017), Xiaoqing Zheng et al. [pdf]
  • Learning Sentiment-Specific Word Embedding (2014), Tang D et al. [pdf]
  • Context-specific and multi-prototype character representations (2016) X Zheng et al. [pdf]
  • Character-based parsing with convolutional neural network (2015), Xiaoqing Zheng et al. [pdf]
  • Learning word representation considering proximity and ambiguity (2014), Lin Qiu et al. [pdf]
  • A Unified Model for Word Sense Representation and Disambiguation (2014), X Chen et al. [pdf]

Deep Learning

A curated list of the most cited deep learning papers (since 2012)

Overview

  • Neural Networks and Deep Learning (Book, Jan 2017), Michael Nielsen. [html]
  • Deep learning (2015), Y. LeCun, Y. Bengio and G. Hinton [pdf]

Optimization

  • Batch normalization: Accelerating deep network training by reducing internal covariate shift (2015), S. Ioffe and C. Szegedy [pdf]
  • Dropout: A simple way to prevent neural networks from overfitting (2014), N. Srivastava et al. [pdf]
  • Improving neural networks by preventing co-adaptation of feature detectors (2012), G. Hinton et al. [pdf]
  • Understanding the difficulty of training deep feedforward neural networks (2010), X. Glorot and Y. Bengio [pdf]
  • Why does unsupervised pre-training help deep learning? (2010), D. Erhan et al. [pdf]
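
The core trick in the dropout papers above fits in a few lines. This is an illustrative numpy sketch of "inverted" dropout (rescaling at train time rather than test time, as most modern libraries do), not the papers' reference code:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Zero each unit with probability p during training and rescale
    the survivors by 1/(1-p), so the expected activation is unchanged
    and the layer is the identity at test time."""
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p       # True = keep the unit
    return x * mask / (1.0 - p)

x = np.ones((1000, 100))
y = dropout(x, p=0.5)                     # entries are either 0.0 or 2.0
```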

CNN

  • Visualizing and Understanding Convolutional Networks (2014), M. D. Zeiler et al. [pdf]
  • Very deep convolutional networks for large-scale image recognition (2014), K. Simonyan and A. Zisserman [pdf]
  • Maxout networks (2013), I. Goodfellow et al. [pdf]
  • Network in network (2013), M. Lin et al. [pdf]
  • ImageNet classification with deep convolutional neural networks (2012), A. Krizhevsky et al. [pdf]
  • Deep Residual Learning for Image Recognition (2015), K. He et al. [pdf]
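
All the CNN papers above are built on the same primitive: sliding a small filter over the input. A naive numpy sketch of a "valid" 2-D convolution (strictly, cross-correlation, which is what deep learning frameworks compute), for intuition only:

```python
import numpy as np

def conv2d(x, k):
    """Valid cross-correlation of image x with kernel k (no padding,
    stride 1); real frameworks use im2col or FFT instead of loops."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

img = np.arange(16.0).reshape(4, 4)       # values increase left to right
edge = np.array([[1.0, -1.0]])            # horizontal difference filter
feat = conv2d(img, edge)                  # adjacent columns differ by 1,
                                          # so every output entry is -1
```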

SDAE

  • Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion (2010), P. Vincent et al. [pdf]
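
Vincent et al.'s idea is to corrupt the input and train the autoencoder to reconstruct the clean version; stacking such layers gives the SDAE. A single tied-weight denoising layer with masking noise, sketched in numpy with arbitrary sizes (an illustration of the training criterion, not the paper's experimental setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 20, 8
W = rng.normal(scale=0.1, size=(n_in, n_hid))   # tied weights: decoder is W.T
b_h = np.zeros(n_hid)
b_o = np.zeros(n_in)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dae_step(x, corruption=0.3, lr=0.5):
    """Mask 30% of the inputs, then backprop the squared error between
    the reconstruction and the *clean* input x."""
    global W, b_h, b_o
    x_tilde = x * (rng.random(x.shape) >= corruption)   # masking noise
    h = sigmoid(x_tilde @ W + b_h)                      # encode corrupted x
    x_hat = sigmoid(h @ W.T + b_o)                      # decode
    d_out = (x_hat - x) * x_hat * (1 - x_hat)           # output-layer delta
    d_hid = (d_out @ W) * h * (1 - h)                   # hidden-layer delta
    W -= lr * (np.outer(d_out, h) + np.outer(x_tilde, d_hid))
    b_o -= lr * d_out
    b_h -= lr * d_hid
    return np.mean((x_hat - x) ** 2)

x = (rng.random(n_in) > 0.5).astype(float)  # one binary training example
errs = [dae_step(x) for _ in range(300)]    # reconstruction error falls
```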

RNN

  • Long short-term memory (1997), S. Hochreiter and J. Schmidhuber. [pdf]
  • Visualizing and Understanding Recurrent Networks (2015), A. Karpathy et al. [pdf]
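
The 1997 LSTM paper's core contribution is the gated cell state. One step of the now-standard formulation (with a forget gate, which was added after the original paper) can be sketched in numpy; the sizes and stacked-gate layout here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: compute all four gates in a single stacked matmul,
    then split into input (i), forget (f), output (o), candidate (g)."""
    hid = h.shape[0]
    z = W @ x + U @ h + b                      # (4 * hid,)
    i, f, o = (sigmoid(z[k * hid:(k + 1) * hid]) for k in range(3))
    g = np.tanh(z[3 * hid:])
    c_new = f * c + i * g                      # gated update of the memory
    h_new = o * np.tanh(c_new)                 # exposed hidden state
    return h_new, c_new

in_dim, hid = 3, 5
W = rng.normal(size=(4 * hid, in_dim))
U = rng.normal(size=(4 * hid, hid))
b = np.zeros(4 * hid)
h, c = np.zeros(hid), np.zeros(hid)
for t in range(10):                            # run over a random sequence
    h, c = lstm_step(rng.normal(size=in_dim), h, c, W, U, b)
```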

Attention Mechanism

  • A Structured Self-attentive Sentence Embedding (2017), Zhouhan Lin et al. [pdf]
  • Attention Is All You Need (2017), Ashish Vaswani et al. [pdf]
  • Attention-over-Attention Neural Networks for Reading Comprehension (2016), Yiming Cui et al. [pdf]
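
The Transformer paper above defines scaled dot-product attention as softmax(QK^T / sqrt(d_k)) V. A direct single-head numpy transcription of that formula (shapes are arbitrary; the real model adds masking, multiple heads, and learned projections):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # numerically stable
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V: each query gets a convex
    combination of the value vectors."""
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))    # 2 queries of dimension d_k = 4
K = rng.normal(size=(3, 4))    # 3 keys
V = rng.normal(size=(3, 6))    # 3 values of dimension 6
out, w = attention(Q, K, V)    # out: (2, 6); each row of w sums to 1
```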

NLP

  • Stanford University CS224d: Deep Learning for Natural Language Processing [html]
  • Convolutional Sequence to Sequence Learning (2017), Jonas Gehring et al. [pdf]
  • A Hierarchical Neural Autoencoder for Paragraphs and Documents (2015), Jiwei Li et al. [pdf]
  • Effective approaches to attention-based neural machine translation (2015), M. Luong et al. [pdf]
  • Character-Aware Neural Language Models (2015), Yoon Kim et al. [pdf]
  • Semi-supervised Sequence Learning (2015), Andrew M et al. [pdf]
  • Neural machine translation by jointly learning to align and translate (2014), D. Bahdanau et al. [pdf]
  • Sequence to sequence learning with neural networks (2014), I. Sutskever et al. [pdf]
  • Learning phrase representations using RNN encoder-decoder for statistical machine translation (2014), K. Cho et al. [pdf]
  • A convolutional neural network for modeling sentences (2014), N. Kalchbrenner et al. [pdf]
  • Convolutional neural networks for sentence classification (2014), Y. Kim [pdf]
  • Glove: Global vectors for word representation (2014), J. Pennington et al. [pdf]
  • Distributed representations of sentences and documents (2014), Q. Le and T. Mikolov [pdf]
  • Recurrent continuous translation models (2013), Nal Kalchbrenner et al. [pdf]
  • Distributed representations of words and phrases and their compositionality (2013), T. Mikolov et al. [pdf]
  • Efficient estimation of word representations in vector space (2013), T. Mikolov et al. [pdf]
  • Generating sequences with recurrent neural networks (2013), A. Graves. [pdf]
  • Semi-supervised recursive autoencoders for predicting sentiment distributions (2011), R Socher et al. [pdf]
  • Recurrent neural network based language model (2010), T. Mikolov et al. [pdf]
  • A unified architecture for natural language processing: deep neural networks with multitask learning (2008), R Collobert et al. [pdf]

Generative Adversarial Networks

  • Generative Adversarial Networks: An Overview (2017), Antonia Creswell et al. [pdf]

Graph

  • Learning Convolutional Neural Networks for Graphs (2016) Mathias Niepert et al. [pdf]
  • DeepWalk: Online Learning of Social Representations (2014) Bryan Perozzi et al. [pdf]
  • Generalized Graph Regularized Non-negative Matrix Factorization for Data Representation (2011) Y Hao et al. [pdf]

Non-mainstream

  • Very Deep Convolutional Networks for Text Classification (2016), Alexis Conneau et al. [pdf]
  • Weakly-supervised deep learning for customer review sentiment classification (2016) Z Guan et al. [pdf]
  • A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification (2016) Ye Zhang et al. [pdf]
  • Selective Unsupervised Feature Learning with Convolutional Neural Network (S-CNN) (2016) Amir Ghaderi et al. [pdf]
  • Large-Margin Softmax Loss for Convolutional Neural Networks (2016) W Liu et al. [pdf]
  • Recurrent Convolutional Neural Networks for Text Classification (2015), Siwei Lai et al. [pdf]
  • Semi-supervised Convolutional Neural Networks for Text Categorization via Region Embedding (2015) R Johnson et al. [pdf]
  • Learning Document Embeddings by Predicting N-grams for Sentiment Classification of Long Movie Reviews (2015), B Li et al. [pdf]
  • An Analysis of Unsupervised Pre-training in Light of Recent Advances (2014), Tom Le Paine et al. [pdf]
  • Applications of Deep Learning to Sentiment Analysis of Movie Reviews (2014) H Shirani-Mehr. [pdf]
  • Discriminative Unsupervised Feature Learning with Convolutional Neural Networks (2014) Alexey Dosovitskiy et al. [pdf]
  • #TagSpace: Semantic Embeddings from Hashtags (2014), J Weston et al. [pdf]
  • Effective Use of Word Order for Text Categorization with Convolutional Neural Networks (2014) Rie Johnson et al. [pdf]
  • Stacked convolutional auto-encoders for hierarchical feature extraction (2011) J Masci et al. [pdf]
  • Deep supervised t-distributed embedding (2010) R Min et al. [pdf]
  • Deconvolutional networks (2010), MD Zeiler et al. [pdf]
  • Three new graphical models for statistical language modelling (2007), G Hinton et al. [pdf]

Language Model

  • A Novel Neural Topic Model and Its Supervised Extension (2015), Z Cao et al. [pdf]
  • Topic Compositional Neural Language Model (2015), W Wang et al. [pdf]

TODO

  • Deep learning (Book, 2016), Goodfellow et al. [html]
  • Feed Forward and Backward Run in Deep Convolution Neural Network (2017), Pushparaja Murugan. [paper]
  • Gradient-based learning applied to document recognition (1998), Y. LeCun et al. [paper]
  • Recent Advances in Convolutional Neural Networks (2015), J Gu et al. [pdf]
  • A Probabilistic Theory of Deep Learning [pdf]

Recommendation Systems

Overview

  • Recommender systems survey (2013), J Bobadilla et al. [pdf]
  • Deep Learning based Recommender System: A Survey and New Perspectives (2017), S Zhang et al. [pdf]

Collaborative Filtering

  • Probabilistic Matrix Factorization (2007), R Salakhutdinov et al. [pdf]
  • Matrix Factorization Techniques for Recommender Systems (2009) Y Koren et al. [pdf]
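
Koren et al.'s survey describes learning the user and item factors by SGD on the observed entries of the rating matrix. A toy numpy version (the matrix, latent dimension, and hyperparameters here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rating matrix; 0 means "unobserved".
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
n_users, n_items = R.shape
k = 2                                            # latent dimension
P = rng.normal(scale=0.1, size=(n_users, k))     # user factors
Q = rng.normal(scale=0.1, size=(n_items, k))     # item factors

obs = [(u, i) for u in range(n_users)
       for i in range(n_items) if R[u, i] > 0]
lr, reg = 0.05, 0.02
for _ in range(500):                 # SGD on squared error + L2 penalty
    for u, i in obs:
        e = R[u, i] - P[u] @ Q[i]
        P[u] += lr * (e * Q[i] - reg * P[u])
        Q[i] += lr * (e * P[u] - reg * Q[i])

rmse = np.sqrt(np.mean([(R[u, i] - P[u] @ Q[i]) ** 2 for u, i in obs]))
```

Predictions for the unobserved cells are then simply the corresponding entries of P @ Q.T.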

Deep Learning

  • Neural Collaborative Filtering (2017) Xiangnan He et al. [paper]
  • CCCFNet: A Content-Boosted Collaborative Filtering Neural Network for Cross Domain Recommender Systems (2017), Jianxun Lian et al. [pdf]
  • Towards Bayesian Deep Learning: A Survey (2016), Hao Wang et al. [pdf]
  • Collaborative Recurrent Autoencoder: Recommend while Learning to Fill in the Blanks (2016) Hao Wang et al. [pdf]
  • A Novel Recommendation Model Regularized with User Trust and Item Ratings (2016), G Guo et al. [pdf]
  • AutoRec: Autoencoders Meet Collaborative Filtering (2015), Suvash Sedhain et al. [pdf]
  • Collaborative Deep Learning for Recommender Systems (2014) Hao Wang et al. [pdf]
  • Modeling Interestingness with Deep Neural Network (2014), J Gao et al. [pdf]
  • Learning deep structured semantic models for web search using clickthrough data (2013), PS Huang et al. [paper]

Latent Dirichlet Allocation

  • Capturing semantic correlation for item recommendation in tagging systems (2016), C Chen et al. [pdf]
  • Relational Stacked Denoising Autoencoder for Tag Recommendation (2015), H Wang et al. [pdf]
  • Collaborative Topic Regression with Social Regularization for Tag Recommendation (2013), Hao Wang et al. [pdf]

Others

  • BPR: Bayesian Personalized Ranking from Implicit Feedback (2009), Steffen Rendle et al. [pdf]
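
BPR optimizes a pairwise ranking criterion: for each user, an observed item i should score higher than an unobserved item j. One SGD step of BPR-Opt for a matrix-factorization model, sketched in numpy with made-up sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 5, 20, 4
U = rng.normal(scale=0.1, size=(n_users, k))   # user factors
V = rng.normal(scale=0.1, size=(n_items, k))   # item factors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bpr_step(u, i, j, lr=0.05, reg=0.01):
    """Ascend the gradient of log sigma(x_ui - x_uj): push observed
    item i above sampled item j in user u's ranking."""
    x_uij = U[u] @ (V[i] - V[j])
    g = sigmoid(-x_uij)                # derivative of -log sigma(x_uij)
    U[u] += lr * (g * (V[i] - V[j]) - reg * U[u])
    V[i] += lr * (g * U[u] - reg * V[i])
    V[j] += lr * (-g * U[u] - reg * V[j])

# User 0 has interacted with item 3; repeatedly sample random items as j.
for _ in range(300):
    bpr_step(0, 3, rng.integers(0, n_items))

scores = V @ U[0]                      # item 3 now ranks above average
```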

Multi-label

Overview

  • A Review on Multi-Label Learning Algorithms (2014), ML Zhang et al. [pdf]

Others

  • Learning Deep Latent Spaces for Multi-Label Classification (2017), Chih-Kuan Yeh et al. [pdf]
  • Logistic Boosting Regression for Label Distribution Learning (2016), C Xing et al. [pdf]
  • All-in Text: Learning Document, Label, and Word Representations Jointly (2016) Jinseok Nam et al. [pdf]
  • CNN-RNN: A Unified Framework for Multi-label Image Classification (2016), Jiang Wang et al. [pdf]
  • Large-scale multi-label text classification — revisiting neural networks (2013), Jinseok Nam et al. [pdf]
  • Label Distribution Learning (2013), X. Geng. [pdf]
  • Deep Convolutional Ranking for Multilabel Image Annotation (2013), Yunchao Gong et al. [paper]
  • Multilabel Classification with Principal Label Space Transformation (2012), F Tai, H Lin. [pdf]
  • Feature-aware label space dimension reduction for multi-label classification (2012), YN Chen et al. [pdf]
  • WSABIE: Scaling Up To Large Vocabulary Image Annotation (2011), Jason Weston et al. [pdf]
  • Canonical Correlation Analysis: An Overview with Application to Learning Methods (2004), David R. Hardoon et al. [pdf]
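
The simplest baseline the multi-label surveys above compare against is binary relevance: train one independent binary classifier per label. A numpy sketch with synthetic data (one logistic regression per label, all fit at once by full-batch gradient descent; every size here is made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multi-label data: 3 labels, each a linear threshold of features.
X = rng.normal(size=(200, 5))
true_W = rng.normal(size=(5, 3))
Y = (X @ true_W > 0).astype(float)         # (200, 3) binary label matrix

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Binary relevance: independent logistic regressions, one column of W each.
W = np.zeros((5, 3))
for _ in range(500):
    P = sigmoid(X @ W)
    W -= 0.1 * X.T @ (P - Y) / len(X)      # full-batch gradient step

pred = (sigmoid(X @ W) > 0.5).astype(float)
hamming = np.mean(pred != Y)               # fraction of wrong label bits
```

Binary relevance ignores label correlations, which is exactly what methods such as CNN-RNN and label-space transformation try to exploit.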

TODO

  • Recommendation System Handbook

Frameworks

Tensorflow

  • A Tour of TensorFlow (2016), Peter Goldsborough. [pdf]
  • TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems (2016) Google. [pdf]

Other Resources

  • Leading researchers
  • Introduction to machine learning
  • Machine learning resource search engine
  • A Deep Dive into Recurrent Neural Nets
  • Numerical Optimization: Understanding L-BFGS
  • Statistical foundations of machine learning
  • http://nirvacana.com/thoughts/becoming-a-data-scientist/

Books
