Building a Personal Tech Stack

Resources

- Deep Learning
- Natural Language Processing
- Reinforcement Learning
- Machine Learning


Learning Natural Language Processing

Next areas in which to build expertise: Transformer + TL -> knowledge graphs -> DRL

The table below maps each topic to the matching material in each source (cs224n: Stanford's NLP with Deep Learning; slp3: Jurafsky & Martin, *Speech and Language Processing*, 3rd ed. draft; CS11-747: CMU's Neural Networks for NLP).

| Topic | cs224n | slp3 | CS11-747 | others |
| --- | --- | --- | --- | --- |
| NLP basics: math and optimizers | lecture 0 | | | |
| Word Vectors | lecture 1: Introduction and Word Vectors<br>lecture 2: Word Vectors 2 and Word Senses<br>lecture 12: Information from parts of words: Subword Models | chapter 6: Vector Semantics | Distributional Semantics and Word Vectors (1/22/2019) | ruder.io/word-embeddings |
| Neural Networks | lecture 3: Word Window Classification, Neural Networks, and Matrix Calculus<br>lecture 4: Backpropagation and Computation Graphs | chapter 7: Neural Networks and Neural LM | | Neural Networks and Deep Learning |
| RNN and Language Models | lecture 6: Recurrent Neural Networks and Language Models<br>lecture 7: Vanishing Gradients, Fancy RNNs | chapter 9: Sequence Processing with Recurrent Networks | A Simple (?) Exercise: Predicting the Next Word in a Sentence (1/17/2019)<br>Recurrent Networks for Sentence or Language Modeling (1/29/2019) | Recurrent Neural Networks Tutorial, Part 1 – Introduction to RNNs<br>Understanding LSTM Networks<br>The Unreasonable Effectiveness of Recurrent Neural Networks |
| seq2seq + Attention | lecture 8: Machine Translation, Seq2Seq and Attention | chapter 22: Machine Translation | Conditioned Generation (2/5/2019)<br>Attention (2/7/2019) | https://github.com/tensorflow/nmt<br>https://arxiv.org/abs/1703.01619 |
| CNN for Text | lecture 11: ConvNets for NLP | | Convolutional Neural Nets for Text (1/24/2019) | |
| Contextual | lecture 13: contexts of use: Contextual Representations and Pretraining | | Sentence and Contextual Word Representations (2/12/2019) | http://jalammar.github.io/illustrated-bert/ |
| ✨***Transformer***✨ (see the attention sketch after this table) | Transformers and Self-Attention For Generative Models | | | https://jalammar.github.io/illustrated-transformer/<br>http://nlp.seas.harvard.edu/2018/04/03/attention.html |
| Dependency Parsing | lecture 5: Linguistic Structure: Dependency Parsing | chapter 8: Part-of-Speech Tagging | | |
| Structured Prediction Models | | | Search-based Structured Prediction (2/19/2019)<br>Reinforcement Learning (2/21/2019)<br>Structured Prediction with Local Independence Assumptions (2/26/2019) | |
| Advanced Learning Techniques | | | Latent Random Variables (3/5/2019)<br>Adversarial Methods for Text (3/7/2019)<br>Unsupervised and Semi-supervised Learning of Structure (3/28/2019) | |
| ✨Models of Knowledge and Context✨ | Reference in Language and Coreference Resolution | | Models of Dialog (4/2/2019)<br>Document-level Models (4/4/2019)<br>Learning from/for Knowledge Graphs (4/9/2019)<br>Machine Reading w/ Neural Nets (4/16/2019) | |
| Multi-task and Multilingual Learning | Multitask Learning: A general model for NLP? | | Multi-task Multi-lingual Learning Models (4/18/2019)<br>Multimodal Models (4/23/2019) | |
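
Since the stated next focus is the Transformer, and the illustrated-transformer and Annotated Transformer links in the table both center on self-attention, here is a minimal NumPy sketch of scaled dot-product attention, the core operation those resources build on. The function name, shapes, and toy data are assumptions of this sketch, not code taken from any of the linked courses.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> output: (n_q, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum of the value vectors.
    return weights @ V

# Toy usage: 3 queries attending over 4 key/value pairs (hypothetical sizes).
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 16))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 16)
```

Multi-head attention, masking, and the learned Q/K/V projections described in the Annotated Transformer all layer on top of this single operation.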