1. semantic slot filling: CFG
context-free grammar:
it's a rule-based approach
rules are manually written
rule-based approaches usually have high precision but low recall
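A minimal sketch of the rule-based idea, using hypothetical hand-written regex patterns for a flight-booking domain (the domain, slot names, and patterns are illustrative assumptions, not from the lecture). Each rule matches exactly what it was written for, which is why precision is high but recall is low:

```python
import re

# Hand-written rules: one pattern per slot (hypothetical flight domain).
# High precision: a match is almost certainly right.
# Low recall: any phrasing the rules don't anticipate is simply missed.
RULES = {
    "origin": re.compile(r"\bfrom\s+([A-Z][a-z]+)"),
    "destination": re.compile(r"\bto\s+([A-Z][a-z]+)"),
    "date": re.compile(r"\bon\s+(\w+day)\b"),
}

def fill_slots(utterance):
    """Apply each rule and keep the first match per slot."""
    slots = {}
    for slot, pattern in RULES.items():
        match = pattern.search(utterance)
        if match:
            slots[slot] = match.group(1)
    return slots

print(fill_slots("Book a flight from Boston to Denver on Friday"))
# → {'origin': 'Boston', 'destination': 'Denver', 'date': 'Friday'}
```

Note that a paraphrase like "I need to get into Denver" would yield nothing, which is the low-recall failure mode in miniature.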
2. semantic slot filling: CRF
build a machine learning system
need a training corpus with markup (token-level slot labels)
after having training data, do some feature engineering
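A sketch of what the feature engineering step can look like: a hand-crafted feature extractor per token, of the kind a CRF toolkit (e.g. sklearn-crfsuite, which accepts such feature dicts) would consume. The specific features chosen here are illustrative assumptions:

```python
def word2features(sent, i):
    """Hand-crafted features for token i of a tokenized sentence."""
    word = sent[i]
    features = {
        "word.lower": word.lower(),
        "word.istitle": word.istitle(),   # capitalized -> maybe a city name
        "word.isdigit": word.isdigit(),
        "suffix3": word[-3:],
    }
    # Context features: CRFs benefit from looking at neighboring tokens.
    if i > 0:
        features["prev.lower"] = sent[i - 1].lower()
    else:
        features["BOS"] = True  # beginning of sentence
    if i < len(sent) - 1:
        features["next.lower"] = sent[i + 1].lower()
    else:
        features["EOS"] = True  # end of sentence
    return features

sent = ["Show", "flights", "to", "Boston"]
feats = [word2features(sent, i) for i in range(len(sent))]
print(feats[3]["suffix3"], feats[3]["prev.lower"])
```

These per-token feature dicts, paired with the markup labels, are what the CRF is trained on.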
3. semantic slot filling: LSTM
needs a big training corpus
no feature engineering: feed the words in just as one-hot vectors
define the model
training and inference
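The "no feature engineering" point can be made concrete: each word is represented only by its index in the vocabulary, turned into a one-hot vector, and the network learns everything else itself. A minimal sketch (the toy vocabulary is an assumption; in practice the one-hot index feeds an embedding layer followed by the LSTM):

```python
def one_hot(word, vocab):
    """Map a word to a one-hot vector over a fixed vocabulary."""
    vec = [0] * len(vocab)
    vec[vocab[word]] = 1
    return vec

# Toy vocabulary: word -> index.
vocab = {w: i for i, w in enumerate(["show", "flights", "to", "boston"])}

print(one_hot("to", vocab))  # → [0, 0, 1, 0]
```

Contrast this with the CRF pipeline: no hand-designed features at all, just the raw word identity, at the cost of needing much more training data.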
DL vs. traditional NLP
why do we need to study traditional NLP?
① it performs well enough in many tasks. example: sequence labeling
② it keeps us from being blinded by the hype. example: word2vec/distributional semantics
③ can help to further improve DL models. example: word alignment priors in machine translation
why do we need to study DL in NLP?