Attention-based LSTM for Aspect-level Sentiment Classification (2020.10.22)


一、Abstract

  • Aspect-level sentiment classification is a fine-grained task in sentiment analysis. Since it provides more complete and in-depth results, aspect-level sentiment analysis has received much attention in recent years.
  • In this paper, we reveal that the sentiment polarity of a sentence is not only determined by the content but is also highly related to the concerned aspect. For instance, in "The appetizers are ok, but the service is slow", for the aspect taste the polarity is positive, while for service the polarity is negative.
  • Therefore, it is worthwhile to explore the connection between an aspect and the content of a sentence.
  • To this end, we propose an Attention-based Long Short-Term Memory Network for aspect-level sentiment classification.
  • The attention mechanism can concentrate on different parts of a sentence when different aspects are taken as input.
  • We experiment on the SemEval 2014 dataset, and the results show that our model achieves state-of-the-art performance on aspect-level sentiment classification.

二、Introduction

三、Related Work

  • 3.1 Sentiment Classification at Aspect-level
    As we mentioned before, aspect-level sentiment classification is a fine-grained classification task.
  • 3.2 Sentiment Classification with Neural Networks
    TD-LSTM and TC-LSTM (Tang et al., 2015a), which took target information into consideration, achieved state-of-the-art performance in target-dependent sentiment classification. TC-LSTM obtains the target vector by averaging the word vectors of the target phrase; however, simply averaging the word embeddings of a target phrase is not sufficient to represent its semantics, which leads to suboptimal performance.

四、Attention-based LSTM with Aspect Embedding (ATAE-LSTM)

  • 4.1 Long Short-term Memory (LSTM)
  • 4.2 LSTM with Aspect Embedding(AE-LSTM)
  1. To make the best use of aspect information, we propose to learn an embedding vector for each aspect.
  • 4.3 Attention-based LSTM (AT-LSTM)

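The aspect-aware attention of AT-LSTM can be sketched as follows. This is a minimal NumPy reconstruction of the paper's formulation: the hidden states are projected together with the (repeated) aspect embedding, a softmax over a learned scoring vector yields the attention weights, and the weighted sentence representation is combined with the last hidden state. The matrix names (W_h, W_v, w, W_p, W_x) follow the paper, but all dimensions and the random initialization in the usage below are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (assumed shapes) of the AT-LSTM attention step.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def at_lstm_attention(H, h_n, v_a, W_h, W_v, w, W_p, W_x):
    """H: (d, N) LSTM hidden states; h_n: (d,) last hidden state;
    v_a: (d_a,) aspect embedding. Returns attention weights and
    the final aspect-specific sentence representation."""
    N = H.shape[1]
    Va = np.tile(v_a[:, None], (1, N))             # repeat aspect embedding for every word
    M = np.tanh(np.vstack([W_h @ H, W_v @ Va]))    # (d + d_a, N) joint projection
    alpha = softmax(w @ M)                         # (N,) attention weights over words
    r = H @ alpha                                  # (d,) attention-weighted representation
    h_star = np.tanh(W_p @ r + W_x @ h_n)          # final sentence representation
    return alpha, h_star

# Usage with random toy parameters:
rng = np.random.default_rng(0)
d, d_a, N = 4, 3, 5
H = rng.standard_normal((d, N)); h_n = H[:, -1]
v_a = rng.standard_normal(d_a)
W_h = rng.standard_normal((d, d)); W_v = rng.standard_normal((d_a, d_a))
w = rng.standard_normal(d + d_a)
W_p = rng.standard_normal((d, d)); W_x = rng.standard_normal((d, d))
alpha, h_star = at_lstm_attention(H, h_n, v_a, W_h, W_v, w, W_p, W_x)
```

Because the aspect embedding enters the projection M, the softmax distributes different weights over the same sentence when a different v_a is supplied, which is exactly the "concentrate on different parts" behavior described in the abstract.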
  • 4.4 Attention-based LSTM with Aspect Embedding (ATAE-LSTM)

  1. In order to better take advantage of aspect information, we append the input aspect embedding to each word input vector. The structure of this model is illustrated in Figure 3. In this way, the output hidden representations (h1, h2, ..., hN) can carry information from the input aspect (va). Therefore, in the following step of computing the attention weights, the interdependence between words and the input aspect can be modeled.

(Figure 3: the architecture of ATAE-LSTM)
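The input construction described above, appending the aspect embedding to every word vector before the LSTM, can be sketched in a few lines. The function name and dimensions here are illustrative assumptions:

```python
# Minimal sketch of the ATAE-LSTM input: the same aspect embedding v_a
# is concatenated to every word embedding before the sequence is fed
# to the LSTM, so every time step carries aspect information.
import numpy as np

def atae_inputs(word_embeddings, v_a):
    """word_embeddings: (N, d), one row per word; v_a: (d_a,) aspect embedding.
    Returns (N, d + d_a): each word vector with the aspect embedding attached."""
    N = word_embeddings.shape[0]
    aspect_part = np.tile(v_a, (N, 1))  # identical aspect row for every word
    return np.concatenate([word_embeddings, aspect_part], axis=1)
```
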

  • 4.5 Model Training
    The objective function (loss function) is the cross-entropy loss.

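The cross-entropy objective mentioned above can be written out as follows; this is a reconstruction from the description (with an assumed L2 regularization term over the parameters θ), where y is the gold sentiment distribution of sentence i and ŷ the predicted one over the j classes:

```latex
loss = -\sum_{i} \sum_{j} y_i^j \log \hat{y}_i^j + \lambda \lVert \theta \rVert^2
```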

  • AT-LSTM: the attention-based LSTM of Section 4.3.
  • AE-LSTM: the LSTM with aspect embedding of Section 4.2.
  • ATAE-LSTM: the combination of both, as described in Section 4.4.

五、Experiment


  • 5.1 Dataset
  • 5.2 Task Definition
  • Aspect-level Classification
  • Aspect-Term-level Classification
  • 5.3 Comparison with baseline methods
    We compare our model with several baselines, including LSTM, TD-LSTM, and TC-LSTM.
  1. LSTM: The standard LSTM cannot capture any aspect information in a sentence, so it must produce the same sentiment polarity for different aspects. Since it cannot exploit aspect information, it is not surprising that this model performs worst.
  2. TD-LSTM: TD-LSTM can improve the performance of the sentiment classifier by treating the aspect as a target. However, since TD-LSTM has no attention mechanism, it cannot "know" which words are important for a given aspect.
  3. TC-LSTM: TC-LSTM extends TD-LSTM by incorporating the target into the sentence representation, yet in Table 2 it performs even worse than TD-LSTM.

  • In our models, we embed aspects into another vector space. The embedding vectors of aspects can be learned well in the process of training.
  • ATAE-LSTM not only addresses the shortcoming of the unconformity between word vectors and aspect embeddings, but can also capture the most important information in response to a given aspect.
  • In addition, ATAE-LSTM can capture the important and different parts of a sentence when given different aspects.

  • 5.4 Qualitative Analysis
  1. It is enlightening to analyze which words decide the sentiment polarity of the sentence given an aspect. We can obtain the attention weight α in Equation 8 and visualize the attention weights accordingly.
  2. Figure 4 shows how attention focuses on words under the influence of a given aspect.
  3. Besides, the attention can detect multiple keywords if more than one keyword is present. In Figure 4 (b), "tasteless" and "too sweet" are both detected.
  • 5.5 Case Study
  1. In sentence (a), "The appetizers are ok, but the service is slow.", there are two aspects, food and service. Our model can discriminate different sentiment polarities for different aspects.
  2. In sentence (b), "I highly recommend it for not just its superb cuisine, but also for its friendly owners and staff.", there is a negation word, "not". Our model obtains the correct polarity and is not misled by this word, which does not express negation here.
  3. In the last instance (c), "The service, however, is a peg or two below the quality of food (horrible bartenders), and the clientele, for the most part, are rowdy, loud-mouthed commuters (this could explain the bad attitudes from the staff) getting loaded for an AC/DC concert or a Knicks game.", the sentence has a long and complicated structure, so existing parsers can hardly obtain a correct parsing tree. Hence, tree-based neural network models can hardly predict the polarity correctly, while our attention-based LSTM, with the help of the attention mechanism and aspect embedding, handles such sentences well.
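The keyword detection discussed in Section 5.4 amounts to reading off which words receive clearly above-uniform attention weight. A toy illustration (the threshold factor and the example weights below are made up, not taken from the paper):

```python
# Toy illustration: treat words whose attention weight clearly exceeds
# the uniform baseline 1/N as the words the model attends to.
def attended_words(tokens, weights, factor=1.5):
    baseline = 1.0 / len(tokens)
    return [t for t, w in zip(tokens, weights) if w > factor * baseline]

tokens = ["the", "food", "is", "tasteless", "and", "too", "sweet"]
weights = [0.01, 0.04, 0.02, 0.40, 0.03, 0.22, 0.28]   # invented weights
print(attended_words(tokens, weights))  # → ['tasteless', 'too', 'sweet']
```

With more than one strongly weighted word, several keywords are surfaced at once, mirroring how "tasteless" and "too sweet" are both detected in Figure 4 (b).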

六、Conclusion and Future Work

  • In this paper, we have proposed attention-based LSTMs for aspect-level sentiment classification.
  • The key idea of these proposals is to learn aspect embeddings and let aspects participate in computing attention weights.
  • Our proposed models can concentrate on different parts of a sentence when different aspects are given, which makes them more competitive for aspect-level classification. Experiments show that our proposed models, AE-LSTM and ATAE-LSTM, obtain superior performance over the baseline models.
  • Though the proposals have shown potential for aspect-level sentiment analysis, different aspects are input separately.
  • As future work, an interesting and possible direction would be to model more than one aspect simultaneously with the attention mechanism.