2020-11-04

Exploiting BERT for End-to-End Aspect-based Sentiment Analysis


Abstract

1. In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g. BERT, on the E2E-ABSA task.
2. Specifically, we build a series of simple yet insightful neural baselines to deal with E2E-ABSA.
3. The experimental results show that even with a simple linear classification layer, our BERT-based architecture can outperform state-of-the-art works.
4. Besides, we also standardize the comparative study by consistently utilizing a hold-out development dataset for model selection, which is largely ignored by previous works.
5. Therefore, our work can serve as a BERT-based benchmark for E2E-ABSA.

1. Introduction

2. Model

• In this paper, we focus on the aspect term-level End-to-End Aspect-Based Sentiment Analysis (E2E-ABSA) problem setting.
• This task can be formulated as a sequence labeling problem.
  • The overall architecture of our model is depicted in Figure 1.

[Figure 1: overall architecture of the BERT-based E2E-ABSA model]

• 2.1 BERT as Embedding Layer
  1) First of all, we pack the input features as H^0 = {e_1, ..., e_T}, where e_t (t ∈ [1, T]) is the combination of the token embedding, position embedding, and segment embedding corresponding to the input token x_t.
  2) The output of summing these three embeddings is then passed through L Transformer layers (a minimal sketch follows this list).
• 2.2 Design of Downstream Model
  1) After obtaining the BERT representations, we design a neural layer, called the E2E-ABSA layer in Figure 1, on top of the BERT embedding layer for solving the task of E2E-ABSA.
  2) We investigate several different designs for the E2E-ABSA layer, namely a linear layer, recurrent neural networks, self-attention networks, and a conditional random fields (CRF) layer (a CRF decoding sketch appears at the end of this section).
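To make §2.1 concrete, here is a minimal sketch of using BERT as the embedding layer via the HuggingFace transformers library. The checkpoint name bert-base-uncased and the example sentence are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of Section 2.1: BERT as the embedding layer.
# Assumes the HuggingFace `transformers` package; `bert-base-uncased`
# is an illustrative checkpoint choice, not necessarily the paper's.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

# BertModel internally sums the token, position, and segment embeddings
# to form H^0, then refines it through L=12 Transformer layers.
sentence = "The fish is fresh but the variety is limited."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = bert(**inputs)

h_L = outputs.last_hidden_state  # H^L, shape (batch, T, dim_h=768)
print(h_L.shape)
```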


• 2.1) We first employ the BERT component with L Transformer layers to calculate the corresponding contextualized representations H^L = {h^L_1, ..., h^L_T} ∈ R^{T×dim_h} for the input tokens, where dim_h denotes the dimension of the representation vectors.
• 2.2) Then, the contextualized representations are fed to the task-specific layers to predict the tag sequence y = {y_1, ..., y_T}. The possible values of the tag y_t are B-{POS,NEG,NEU}, I-{POS,NEG,NEU}, E-{POS,NEG,NEU}, S-{POS,NEG,NEU}, or O, denoting the beginning of an aspect, the inside of an aspect, the end of an aspect, and a single-word aspect (each with positive, negative, or neutral sentiment), as well as tokens outside of any aspect (a sketch of the tag set and the linear head follows this list).
• 3) SAN: One variant is composed of a simple self-attention layer and a residual connection (He et al., 2016), dubbed "SAN". The name "SAN" is the authors' own label for this residual self-attention variant (see the sketch below).
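The simplest E2E-ABSA layer in §2.2 is a linear classifier over H^L. Below is a sketch of the full tag vocabulary and such a head; the class name LinearE2EABSAHead is hypothetical, and the recurrent variant mentioned above would simply replace the projection with, e.g., a GRU over H^L.

```python
import torch
import torch.nn as nn

# Full tag set: {B, I, E, S} x {POS, NEG, NEU} plus O, i.e. 13 tags.
# E.g. in "The fish is fresh", "fish" would be tagged S-POS
# (a single-word aspect with positive sentiment); all other tokens get O.
TAGS = [f"{b}-{s}" for b in "BIES" for s in ("POS", "NEG", "NEU")] + ["O"]
TAG2ID = {tag: i for i, tag in enumerate(TAGS)}

class LinearE2EABSAHead(nn.Module):
    """Hypothetical name: a per-token linear classifier over H^L."""
    def __init__(self, dim_h: int = 768, num_tags: int = len(TAGS)):
        super().__init__()
        self.proj = nn.Linear(dim_h, num_tags)

    def forward(self, h_L: torch.Tensor) -> torch.Tensor:
        # h_L: (batch, T, dim_h) -> per-token tag logits: (batch, T, num_tags)
        return self.proj(h_L)

head = LinearE2EABSAHead()
logits = head(torch.randn(1, 12, 768))  # random stand-in for H^L
pred_ids = logits.argmax(dim=-1)        # predicted tag id for every token
```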

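One possible rendering of the SAN variant as code: a single self-attention layer whose output is added back to its input. The head count and the LayerNorm after the residual addition are my assumptions, not specified in the excerpt.

```python
import torch
import torch.nn as nn

class SANHead(nn.Module):
    """Sketch of 'SAN': one self-attention layer plus a residual connection."""
    def __init__(self, dim_h: int = 768, num_heads: int = 12, num_tags: int = 13):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim_h, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim_h)  # assumption: normalize after the residual
        self.proj = nn.Linear(dim_h, num_tags)

    def forward(self, h_L: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(h_L, h_L, h_L)  # self-attention over the tokens
        h = self.norm(h_L + attn_out)           # residual connection (He et al., 2016)
        return self.proj(h)                     # per-token tag logits
```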

• 4) TFM: Another variant is a Transformer layer (dubbed "TFM"), which has the same architecture as the Transformer encoder layer in BERT (see the sketch below).
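Since TFM is said to share its architecture with BERT's Transformer encoder layer, one way to approximate it is PyTorch's nn.TransformerEncoderLayer. The BERT-base hyperparameters below (768 dims, 12 heads, 3072-dim feed-forward, GELU) are an assumption about the intended configuration.

```python
import torch
import torch.nn as nn

# Sketch of 'TFM': a single Transformer encoder layer with
# BERT-base-like hyperparameters, followed by a tag classifier.
tfm = nn.TransformerEncoderLayer(
    d_model=768, nhead=12, dim_feedforward=3072,
    activation="gelu", batch_first=True,
)
proj = nn.Linear(768, 13)  # 13 = |{B,I,E,S} x {POS,NEG,NEU}| + |{O}|

h_L = torch.randn(1, 12, 768)  # random stand-in for the BERT output H^L
logits = proj(tfm(h_L))        # per-token tag logits: (batch, T, 13)
```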

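The fourth design named in §2.2 is a conditional random fields layer, which scores whole tag sequences instead of individual tokens. Below is a compact sketch of Viterbi decoding written from the standard linear-chain CRF formulation, not from the authors' code; in training, the sentence-level log-likelihood (computed with the forward algorithm) would replace per-token cross-entropy.

```python
import torch

def viterbi_decode(emissions: torch.Tensor, transitions: torch.Tensor) -> list:
    """Sketch of CRF inference for one sentence.

    emissions:   (T, num_tags) per-token tag scores from the layer below.
    transitions: (num_tags, num_tags) learned scores, where
                 transitions[i, j] scores moving from tag i to tag j.
    Returns the highest-scoring tag-id sequence of length T.
    """
    T, num_tags = emissions.shape
    score = emissions[0].clone()  # best score of a path ending in each tag at t=0
    history = []
    for t in range(1, T):
        # total[i, j] = score[i] + transitions[i, j] + emissions[t, j]
        total = score.unsqueeze(1) + transitions + emissions[t].unsqueeze(0)
        score, best_prev = total.max(dim=0)  # best predecessor for each tag j
        history.append(best_prev)
    best_tag = int(score.argmax())  # best final tag, then backtrack
    path = [best_tag]
    for best_prev in reversed(history):
        best_tag = int(best_prev[best_tag])
        path.append(best_tag)
    return path[::-1]
```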

3. Experiment


4. Conclusion

1. In this paper, we investigate the effectiveness of the BERT embedding component on the task of End-to-End Aspect-Based Sentiment Analysis (E2E-ABSA).
2. Specifically, we explore coupling the BERT embedding component with various neural models and conduct extensive experiments on two benchmark datasets.
3. The experimental results demonstrate the superiority of BERT-based models in capturing aspect-based sentiment, as well as their robustness to overfitting.
