Authors: Stanford
Venue: EMNLP 2017
Contribution: releases a corpus for slot-filling (SF) relation classification, the TAC KBP Relation Extraction Dataset (TACRED), with 119,474 examples, distributed through the LDC.
Novelty: combines a position-aware attention mechanism with an LSTM.
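To make the dataset concrete, here is a minimal sketch of what a TACRED-style example looks like and how entity positions are derived from it. The field names (`token`, `subj_start`, etc.) follow the commonly distributed JSON format and should be treated as an assumption, not the official LDC specification; the toy sentence is invented for illustration.

```python
# Illustrative TACRED-style example (field names are an assumption based on
# the commonly distributed JSON format, not the LDC release itself).
example = {
    "relation": "per:city_of_birth",
    "token": ["Obama", "was", "born", "in", "Honolulu", "."],
    "subj_start": 0, "subj_end": 0, "subj_type": "PERSON",
    "obj_start": 4, "obj_end": 4, "obj_type": "CITY",
}

def relative_distances(n, start, end):
    """Signed distance of each token to an entity span (0 inside the span)."""
    return [i - start if i < start else (i - end if i > end else 0)
            for i in range(n)]

n = len(example["token"])
subj_dist = relative_distances(n, example["subj_start"], example["subj_end"])
obj_dist = relative_distances(n, example["obj_start"], example["obj_end"])
# subj_dist == [0, 1, 2, 3, 4, 5]; obj_dist == [-4, -3, -2, -1, 0, 1]
```

These signed distances are exactly the per-token position features that the model's position encoder consumes.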
Problems with existing work:
1. Although modern sequence models such as Long Short-Term Memory (LSTM) networks have gating mechanisms to control the relative influence of each individual word on the final sentence representation (Hochreiter and Schmidhuber, 1997), these controls are not explicitly conditioned on the entire sentence being classified;
2. Most existing work either does not explicitly model the positions of entities (i.e., subject and object) in the sequence, or models the positions only within a local region.
Position encoder: each token i gets signed relative distances to the subject and object entities, which are mapped to position embeddings p_i^s and p_i^o. These, together with the LSTM hidden state h_i and a sentence summary vector q (the final hidden state), produce an attention weight a_i for each token; the sentence representation is the weighted sum z = Σ_i a_i h_i.
Finally, a fully connected layer is applied on top of z, followed by a softmax classifier over the relation labels.
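The attention step above can be sketched in NumPy. This is a minimal illustration of the position-aware attention computation, assuming the additive scoring form u_i = v^T tanh(W_h h_i + W_q q + W_s p_i^s + W_o p_i^o); all names, dimensions, and random values are illustrative, not the authors' released code.

```python
# Minimal NumPy sketch of position-aware attention (illustrative assumptions,
# not the authors' implementation).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def position_aware_attention(h, q, ps, po, Wh, Wq, Ws, Wo, v):
    """h: (T, H) LSTM states; q: (H,) summary vector; ps/po: (T, P) position
    embeddings relative to subject/object. Returns the attended vector z."""
    # u_i = v^T tanh(Wh h_i + Wq q + Ws p_i^s + Wo p_i^o)
    u = np.tanh(h @ Wh.T + q @ Wq.T + ps @ Ws.T + po @ Wo.T) @ v  # (T,)
    a = softmax(u)           # attention weights over the T tokens
    z = a @ h                # (H,) weighted sum of hidden states
    return z, a

# toy dimensions: T tokens, hidden size H, position dim P, attention dim A
T, H, P, A = 5, 8, 4, 6
rng = np.random.default_rng(0)
h = rng.normal(size=(T, H)); q = h[-1]
ps = rng.normal(size=(T, P)); po = rng.normal(size=(T, P))
Wh = rng.normal(size=(A, H)); Wq = rng.normal(size=(A, H))
Ws = rng.normal(size=(A, P)); Wo = rng.normal(size=(A, P))
v = rng.normal(size=A)
z, a = position_aware_attention(h, q, ps, po, Wh, Wq, Ws, Wo, v)
```

In the full model, z would then pass through the fully connected layer and softmax to score each relation label.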