Lecture 1 – Introduction and Word Vectors (Winter 2019)
Reading 1: Efficient Estimation of Word Representations in Vector Space
Reading 2: Distributed Representations of Words and Phrases and their Compositionality
1. What is Natural Language Processing (NLP)?
NLP levels
1.1 What's special about human language?
What’s so special about human (natural) language? Human language is a system specifically constructed to convey meaning, and is not produced by a physical manifestation of any kind. In that way, it is very different from vision or any other machine learning task.
Most words are just symbols for an extra-linguistic entity: the word is a signifier that maps to a signified (idea or thing).
For instance, the word "rocket" refers to the concept of a rocket, and by extension can designate an instance of a rocket. There are some exceptions, when we use words and letters for expressive signaling, like in "Whooompaa". On top of this, the symbols of language can be encoded in several modalities: voice, gesture, writing, etc., which are transmitted via continuous signals to the brain, which itself appears to encode things in a continuous manner. (A lot of work in philosophy of language and linguistics has been done to conceptualize human language and distinguish words from their references, meanings, etc. Among others, see works by Wittgenstein, Frege, Russell and Mill.)
1.2 NLP Applications
There are different levels of tasks in NLP, from speech processing to semantic interpretation and discourse processing. The goal of NLP is to design algorithms that allow computers to "understand" natural language in order to perform some task. Example tasks come in varying levels of difficulty:
Easy
• Spell Checking
• Keyword Search
• Finding Synonyms
Medium
• Parsing information from websites, documents, etc.
Hard
• Machine Translation (e.g. Translate Chinese text to English)
• Semantic Analysis (What is the meaning of a query statement?)
• Coreference (e.g. What does "he" or "it" refer to given a document?)
• Question Answering (e.g. Answering Jeopardy questions).
NLP in industry
1.3 How to represent words?
The first and arguably most important common denominator across all NLP tasks is how we represent words as input to any of our models. Much of the earlier NLP work that we will not cover treats words as atomic symbols. To perform well on most NLP tasks, we first need to have some notion of similarity and difference between words. With word vectors, we can quite easily encode this ability in the vectors themselves (using distance measures such as Jaccard, cosine, Euclidean, etc.).
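As a minimal illustration (not from the lecture itself), the sketch below compares made-up low-dimensional word vectors with cosine similarity, one of the measures mentioned above; the words and vector values are hypothetical placeholders.

```python
# Minimal sketch: cosine similarity between toy word vectors.
# The 3-dimensional vectors here are invented purely for illustration;
# real word vectors are learned and have many more dimensions.
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: u.v / (|u| |v|)."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Hypothetical low-dimensional "word vectors"
king  = np.array([0.8, 0.6, 0.1])
queen = np.array([0.7, 0.7, 0.2])
apple = np.array([0.1, 0.2, 0.9])

print(cosine_similarity(king, queen))  # relatively high -> similar words
print(cosine_similarity(king, apple))  # lower -> dissimilar words
```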
1.4 Problems with resources like WordNet
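Problems typically raised about WordNet-style resources include missing nuance (synonym sets only hold in some contexts), missing new senses of words, reliance on human labor to build and update, and no graded notion of word similarity. As a minimal sketch (assuming NLTK and its WordNet corpus are installed via nltk.download('wordnet')), listing the synsets of a word shows the kind of coarse, hand-curated entries involved; the word "good" is just an example.

```python
# Minimal sketch, assuming NLTK and the WordNet corpus are available.
# Print each synonym set (synset) of "good" with its member lemmas,
# illustrating the discrete, human-curated entries discussed above.
from nltk.corpus import wordnet as wn

for synset in wn.synsets("good"):
    lemmas = [lemma.name() for lemma in synset.lemmas()]
    print(synset.name(), "->", ", ".join(lemmas))
```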
2. What's Deep Learning (DL)?
Deep Learning vs Machine Learning
What's Deep Learning (DL)?
On the history of the term "Deep Learning"
Reasons for Exploring Deep Learning
Deep Learning for Speech
Deep Learning for Computer Vision
3. Course Logistics
4. Why is NLP hard?
5. Deep NLP = Deep Learning + NLP
Word meaning as a neural word vector - visualization (2D representation); see the sketch after this list
Word similarities
Representation of NLP levels: Morphology
NLP Tools: Parsing for semantic structure
Representation of NLP levels: Semantics
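Relating to the 2D-visualization and word-similarity items above, here is a minimal sketch (assumed, not from the lecture) of projecting word vectors down to two dimensions with PCA and plotting them so that similar words should land near each other. The vectors and vocabulary below are random placeholders (and scikit-learn and matplotlib are assumed to be installed), so this particular plot will not show real semantic clusters.

```python
# Minimal sketch: project stand-in "word vectors" to 2D with PCA and plot them.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
words = ["king", "queen", "man", "woman", "apple", "orange"]
vectors = rng.normal(size=(len(words), 50))  # placeholder for learned word vectors

# Reduce 50 dimensions to 2 for visualization
coords = PCA(n_components=2).fit_transform(vectors)

plt.scatter(coords[:, 0], coords[:, 1])
for (x, y), word in zip(coords, words):
    plt.annotate(word, (x, y))
plt.title("Word vectors projected to 2D with PCA")
plt.show()
```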