oxford-deepNLP
-
- L2a Word Level Semantics
- L3 Language Modeling and RNNs I
- L4 Language Modeling and RNNs II
- L5 Text Classification
- L6 RNNs and GPUs
- L7 Conditional Language Modeling
- L8 Conditional Language Modeling with Attention
- L9 Speech Recognition
- L10 Text to Speech
- L11 Question Answering
- L12 Memory Lecture
- L13 Linguistics
L2a Word Level Semantics
(Word2Vec's skip-gram with negative sampling is equivalent to implicitly factorising a PMI matrix built from count-based co-occurrence statistics)
Count-based methods
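A minimal numpy sketch of the count-based view (the toy corpus, window size, and embedding dimensionality are illustrative assumptions): build a word-word co-occurrence matrix, convert it to positive PMI, and factorise it with truncated SVD to get dense vectors. This is the matrix that Levy & Goldberg (2014) showed skip-gram with negative sampling implicitly factorises (up to a shift).

```python
import numpy as np

# Toy corpus and window size (illustrative assumptions).
corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["the", "dog", "sat", "on", "the", "rug"]]
window = 2

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric-window co-occurrence counts.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Positive PMI: max(0, log P(w, c) / (P(w) P(c))).
total = counts.sum()
p_w = counts.sum(axis=1, keepdims=True) / total
p_c = counts.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore"):
    pmi = np.log((counts / total) / (p_w * p_c))
ppmi = np.maximum(pmi, 0.0)

# Truncated SVD of the PPMI matrix yields dense word embeddings.
U, S, _ = np.linalg.svd(ppmi)
dim = 4
embeddings = U[:, :dim] * np.sqrt(S[:dim])
print(embeddings.shape)  # (|V|, dim)
```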
Neural Embedding Models: C&W
Embed all words in a sentence with E, apply a shallow convolution over the embeddings, minimise a hinge loss
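A PyTorch sketch of the C&W ranking objective (sizes, random data, and the scorer are illustrative assumptions; the lecture's shallow convolution is simplified here to a small feed-forward scorer over the concatenated window): score true windows against corrupted windows whose centre word is replaced at random, and minimise a hinge loss.

```python
import torch
import torch.nn as nn

vocab_size, emb_dim, window = 1000, 50, 5  # assumed toy sizes

E = nn.Embedding(vocab_size, emb_dim)  # shared embedding matrix E
# Simplified stand-in for the shallow convolution: MLP over the flattened window.
scorer = nn.Sequential(nn.Linear(window * emb_dim, 64), nn.Tanh(), nn.Linear(64, 1))

def score(word_ids):                        # word_ids: (batch, window)
    return scorer(E(word_ids).flatten(1))   # (batch, 1)

true_windows = torch.randint(0, vocab_size, (32, window))
corrupted = true_windows.clone()
corrupted[:, window // 2] = torch.randint(0, vocab_size, (32,))  # corrupt centre word

# Hinge loss: the true window should score at least margin 1 above the corrupted one.
loss = torch.clamp(1 - score(true_windows) + score(corrupted), min=0).mean()
loss.backward()
```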
Neural Embedding Models: CBOW
Embed context words, add them, minimise negative log likelihood
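A PyTorch sketch of CBOW under the same toy assumptions: embed the context words, add them, and minimise the negative log likelihood of the centre word. A full softmax is used here for clarity; word2vec uses negative sampling or a hierarchical softmax for efficiency.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, emb_dim = 1000, 50
E = nn.Embedding(vocab_size, emb_dim)   # context-word embeddings
out = nn.Linear(emb_dim, vocab_size)    # output projection over the vocabulary

context = torch.randint(0, vocab_size, (32, 4))  # 4 context words per example
target = torch.randint(0, vocab_size, (32,))     # centre word to predict

hidden = E(context).sum(dim=1)                   # add the context embeddings
loss = F.cross_entropy(out(hidden), target)      # NLL of the target word
loss.backward()
```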
Neural Embedding Models: Skip-gram
Embed the target word; the target word predicts its context words
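A PyTorch sketch of skip-gram with the same illustrative sizes: embed the target word and use that single embedding to predict each of its context words, again with a full softmax standing in for negative sampling.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, emb_dim = 1000, 50
E = nn.Embedding(vocab_size, emb_dim)   # target-word embeddings
out = nn.Linear(emb_dim, vocab_size)    # scores every word as a possible context word

target = torch.randint(0, vocab_size, (32,))     # centre words
context = torch.randint(0, vocab_size, (32, 4))  # their context words

logits = out(E(target))                          # (32, vocab_size)
# Each context word is a separate prediction from the same target embedding.
loss = F.cross_entropy(logits.repeat_interleave(4, dim=0), context.flatten())
loss.backward()
```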
Task-based Embedding Learning
Directly train embeddings jointly with the parameters of the network that uses them
The embedding matrix can be learned from scratch, or initialised with pre-learned embeddings (fine-tuning)
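A PyTorch sketch of task-based embedding learning: the embedding matrix is just another parameter of the downstream model (here an assumed toy bag-of-embeddings classifier), trained from scratch or initialised from pre-learned vectors and fine-tuned. `pretrained` is an assumed (vocab_size, emb_dim) tensor standing in for loaded word2vec/GloVe vectors.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, emb_dim, num_classes = 1000, 50, 3
pretrained = torch.randn(vocab_size, emb_dim)  # stand-in for real pre-learned vectors

embedding = nn.Embedding.from_pretrained(pretrained, freeze=False)  # fine-tuning
# embedding = nn.Embedding(vocab_size, emb_dim)                     # ...or from scratch
classifier = nn.Linear(emb_dim, num_classes)

tokens = torch.randint(0, vocab_size, (32, 20))     # batch of token-id sequences
labels = torch.randint(0, num_classes, (32,))

logits = classifier(embedding(tokens).mean(dim=1))  # average embeddings, then classify
loss = F.cross_entropy(logits, labels)
loss.backward()  # gradients flow into both the classifier and the embedding matrix
```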
Applications
- Text categorisation
- Natural language generation (language modeling / conditional language modeling)
- Natural language understanding