Stanford online course notes
LinMoson
Stanford NLP4
Lesson 18: We want to understand the meaning of larger phrases, e.g. snowboarder = person on a snowboard. Humans do this by understanding the meaning of each component and then combining them: interpret the meaning of larger text units (entities, descriptive terms, facts, arguments, stories) by semantic composition of smaller elements. (Original post, 2021-02-23)
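The compositional idea in this preview (snowboarder ≈ person + snowboard) can be sketched with toy word vectors. The 3-dimensional embeddings and the simple additive composition below are illustrative assumptions, not the course's actual model:

```python
# Toy sketch of semantic composition: approximate the meaning of a
# phrase by combining the vectors of its component words.
# These 3-d "embeddings" are hand-made toy values, not trained ones.

def compose(*vectors):
    """Element-wise sum of word vectors as a crude phrase vector."""
    return [sum(dims) for dims in zip(*vectors)]

emb = {
    "person":    [0.9, 0.1, 0.0],
    "snowboard": [0.0, 0.2, 0.8],
}

# "snowboarder" ~ compose(person, snowboard)
snowboarder = compose(emb["person"], emb["snowboard"])
print([round(x, 2) for x in snowboarder])  # [0.9, 0.3, 0.8]
```

Real compositional models use learned, non-linear combinations; element-wise addition is only the simplest baseline.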
Stanford NLP3
Lesson 13: Representations for a word. In the early years, supervised neural networks performed worse than feature-based classifiers (e.g. SVMs). Later, unsupervised neural network training caught up with feature classifiers, but took a very long time (about 7 weeks). Adding a few hand-crafted features on top improved accuracy further. Eventually, we could train on a small supervised corpus and find d… (Original post, 2021-02-23)
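The "neural representation plus hand-crafted features" combination mentioned in this preview can be sketched as simple feature concatenation. All names and values below are illustrative assumptions:

```python
# Sketch of combining a pretrained word representation with a few
# hand-crafted features into one classifier input. Values are made up.

def build_features(pretrained_vec, handcrafted):
    """Concatenate a pretrained word vector with hand-crafted features."""
    return list(pretrained_vec) + list(handcrafted)

pretrained = [0.12, -0.40, 0.33]   # e.g. from unsupervised pretraining
manual = [1.0, 0.0]                # e.g. is_capitalized, has_digit
x = build_features(pretrained, manual)
print(len(x))  # 5: a single combined input vector for a classifier
```

This mirrors the historical pattern described above: learned features do most of the work, and a few manual features add a further accuracy boost.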
Stanford NLP2
Lesson 7: Neural Machine Translation. By the chain rule, if the first few factors are all small, the gradient vanishes. Consider the gradient of the loss $J^{(i)}(\theta)$ on step $i$ with respect to the hidden state $h^{(j)}$ at step $j$: $\frac{\partial J^{(i)}(\theta)}{\partial h^{(j)}} = \frac{\partial J^{(i)}(\theta)}{\partial h^{(i)}} \prod_{j < t \le i} \frac{\partial h^{(t)}}{\partial h^{(t-1)}}$ (Original post, 2021-02-23)
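The vanishing-gradient effect described in this preview can be shown numerically: if each Jacobian factor in the chain-rule product has norm below 1, the product shrinks exponentially with the number of steps. The constant factor 0.5 is an assumption for illustration:

```python
# Numeric sketch of vanishing gradients: the chain-rule product of many
# small per-step factors decays exponentially with sequence length.

def gradient_scale(factor, steps):
    """Product of `steps` identical per-step Jacobian-norm factors."""
    scale = 1.0
    for _ in range(steps):
        scale *= factor
    return scale

for steps in (5, 20, 50):
    print(steps, gradient_scale(0.5, steps))
```

After 50 steps the scale is below 1e-15, which is why gradients from distant steps contribute almost nothing to the update.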
Stanford NLP1
Class 1 Intro: We don't know how others interpret our words; the best we can do is get better at guessing how our words affect others, or make them feel something like what we want them to feel. Key point: we want to represent a word's meaning, and we relate meaning to ideas. A first attempt: use the WordNet tool… (Original post, 2020-11-26)
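The WordNet-style approach mentioned in this preview represents meaning as discrete synonym sets. The mini-dictionary below is a made-up toy stand-in (not the real WordNet API), used to illustrate both the lookup and its usual limitations, missing nuance and missing words:

```python
# Toy stand-in for a WordNet-style thesaurus: each word maps to a list
# of synonym sets. The entries here are a tiny made-up example.

synsets = {
    "good": [{"good", "fine", "adept"}, {"good", "beneficial"}],
    "snowboarder": [],  # many words simply have no useful entry
}

def synonyms(word):
    """Union of all synonym sets for a word (empty if unknown)."""
    result = set()
    for s in synsets.get(word, []):
        result |= s
    return result

print(sorted(synonyms("good")))
print(sorted(synonyms("snowboarder")))  # [] -> a key limitation
```

Discrete synonym sets cannot express degrees of similarity, which is one motivation for the dense vector representations covered later in the course.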