Lecture 6: Syntax, Grammar, and Dependency Parsing
# Note: I didn't fully understand this lecture.
1. Syntactic Structure: Constituency and Dependency
Constituency = phrase structure grammar = context-free grammars (CFGs)
Phrase structure organizes words into nested constituents, e.g. VP -> V PP.
Dependency structure shows which words depend on which other words.
Ambiguity example (PP attachment):
Scientists study whales from space.
"from space" can modify the verb ("study ... from space") or the noun ("whales from space").
A key parsing decision is how we attach various constituents.
2. Dependency Grammar
We first discuss dependency grammar.
Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations called dependencies.
A fake ROOT node is added so that every word is a dependent of exactly one other node.
Dependency Parsing
- Usual constraints:
  - exactly one word is a dependent of ROOT
  - no cycles
  These constraints make the dependencies form a tree.
- Whether arrows can cross: if no arrows cross, the tree is projective; otherwise it is non-projective. Reordering phrases in the sentence can turn a non-projective tree into a projective one.
We can't reconstruct the sentence from the tree alone without additional information (e.g. word order).
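The projectivity condition above can be checked directly: an arc from head h to dependent d is projective iff every word strictly between them is a descendant of h. A minimal sketch, assuming a hypothetical representation where `heads[i]` gives the head index of word i+1 (0 = fake ROOT):

```python
def is_projective(heads):
    """Return True iff the dependency tree has no crossing arcs.

    heads[i] is the head index of word i+1; 0 denotes the fake ROOT.
    """
    n = len(heads)
    arcs = [(heads[i - 1], i) for i in range(1, n + 1)]
    for h, d in arcs:
        lo, hi = min(h, d), max(h, d)
        for k in range(lo + 1, hi):
            # every word strictly between h and d must be a
            # descendant of h, i.e. its head chain must reach h
            a = k
            while a != 0 and a != h:
                a = heads[a - 1]
            if a != h:
                return False  # crossing arc found
    return True

print(is_projective([2, 0, 2]))     # "I ate fish": projective -> True
print(is_projective([2, 0, 1, 2]))  # arcs 1->3 and 2->4 cross -> False
```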
Methods of Dependency Parsing
- Dynamic programming
- Graph algorithms
- Constraint Satisfaction
- Transition-based parsing
3. Research highlight
omitted
4. Transition-based dependency parsing
- A simple form of greedy discriminative dependency parser.
- The parser performs a sequence of bottom-up actions.
Arc-standard transition-based parser
Three actions: shift, left-arc, right-arc
Example: I ate fish
- [root] [I ate fish] -> start
- [root I] [ate fish] -> shift
- [root I ate] [fish] -> shift
- [root ate] [fish] -> left-arc, A += nsubj(ate -> I)
- [root ate fish] [] -> shift
- [root ate] [] -> right-arc, A += obj(ate -> fish)
- [root] [] -> right-arc, A += root(ROOT -> ate)
- finished.
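The trace above can be replayed mechanically. A minimal sketch, where the action sequence is given by hand (a real parser would choose each action with a trained classifier):

```python
def arc_standard(words, actions):
    """Replay arc-standard transitions; return the set of arcs (head, dep)."""
    stack, buffer, arcs = ["ROOT"], list(words), []
    for act in actions:
        if act == "shift":
            stack.append(buffer.pop(0))
        elif act == "left-arc":        # top of stack becomes head of second-from-top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "right-arc":       # second-from-top becomes head of top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

arcs = arc_standard(
    ["I", "ate", "fish"],
    ["shift", "shift", "left-arc", "shift", "right-arc", "right-arc"],
)
print(arcs)  # [('ate', 'I'), ('ate', 'fish'), ('ROOT', 'ate')]
```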
How to choose the action? Train a classifier using features such as the word and POS tag at the top of the stack, the first word of the buffer, etc.
Evaluation:
UAS (unlabeled attachment score) checks only the arrow (head); LAS (labeled attachment score) checks the arrow together with its label.
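Both scores are per-word accuracies over (head, label) predictions. A minimal sketch with illustrative gold/predicted analyses for "I ate fish":

```python
def uas_las(gold, pred):
    """Compute (UAS, LAS) from lists of (head_index, label) per word."""
    n = len(gold)
    head_ok = sum(g[0] == p[0] for g, p in zip(gold, pred))  # head only
    both_ok = sum(g == p for g, p in zip(gold, pred))        # head + label
    return head_ok / n, both_ok / n

gold = [(2, "nsubj"), (0, "root"), (2, "obj")]
pred = [(2, "nsubj"), (0, "root"), (2, "nmod")]  # wrong label on word 3
uas, las = uas_las(gold, pred)
print(uas, las)  # 1.0 0.666...: all heads right, one label wrong
```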
5. Neural dependency parsing
omitted