A Summary of English Expressions in GCN Papers

This post collects key English expressions from GCN (Graph Convolutional Networks) papers in natural language processing, touching on dependency trees, multi-layer architectures, self-attention, and more. Through excerpts from classic papers, it looks at how GCNs exploit syntactic information, long-range dependencies, and self-loops to improve model performance, compares the strengths and weaknesses of GCNs against models such as attention mechanisms and RNNs, and shows GCN applications in tasks like semantic role labeling and sentiment analysis.

Reading and writing papers at home with the cat: such a good life.

I often joke that my English writing is like an old lady's foot-binding cloth: long-winded and tedious.

This post mainly records some English expressions used in GCN papers, collected here so I can study them slowly.

For each excerpt I give the paper title, plus a few small notes.

------------------------------------------------------- a dividing line where I put on my serious face -------------------------------------------------------

 

Aspect-based Sentiment Classification with Aspect-specific Graph Convolutional Networks 

1. To tackle this problem, we propose to build a Graph Convolutional Network (GCN) over the dependency tree of a sentence to exploit syntactical information and word dependencies. 

Note the use of "over" and "exploit".

2. GCN has a multi-layer architecture, with each layer encoding and updating the representation of nodes in the graph using features of immediate neighbors. 

Note the use of "multi-layer",

as well as the "with + clause" construction.

This sentence pattern comes in handy whenever you need to describe a multi-layer GCN.
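That sentence, "each layer encoding and updating the representation of nodes using features of immediate neighbors," can be sketched in a few lines of NumPy. The function name and shapes below are illustrative assumptions, not from the paper's code:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer (illustrative sketch).
    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) learnable weight matrix."""
    # Row-normalize so each node averages over its immediate neighbors.
    deg = A.sum(axis=1, keepdims=True)
    A_norm = A / np.maximum(deg, 1.0)
    # Aggregate neighbor features, transform, then apply a ReLU.
    return np.maximum(A_norm @ H @ W, 0.0)
```

Stacking k such layers lets each node aggregate information from its k-hop neighborhood, which is exactly why the papers stress the multi-layer architecture.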

3. Furthermore, following the idea of self-looping in Kipf and Welling (2017), each word is manually set adjacent to itself, i.e. the diagonal values of A are all ones. 

Following the idea of …

"the diagonal values of A are all ones": a matrix A whose diagonal entries are all 1.

"set adjacent to itself": setting up a self-connection (self-loop).
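The self-loop trick amounts to adding the identity matrix to the adjacency matrix. A tiny illustration (the adjacency values and variable names are mine, not the paper's):

```python
import numpy as np

# Dependency-tree adjacency for a 3-word sentence (illustrative values).
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])

# "each word is manually set adjacent to itself": add the identity,
# so the diagonal values of the resulting matrix are all ones.
A_tilde = A + np.eye(A.shape[0])
```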

4. Experimental results have indicated that GCN brings benefit to the overall performance by leveraging both syntactical information and long-range word dependencies. 

Bring benefit to

"Leverage" can be translated as "to make use of / exploit".

5. While attention-based models are promising, they are insufficient to capture syntactical dependencies between context words and the aspect within a sentence. 

This points out a flaw of attention-based models: they cannot fully capture the syntactic dependencies within a sentence. The root cause is still the long distance between words, although that is not entirely fair either: self-attention attends over every word in the sentence, so it can probably recover some of the information lost over long distances.

"While" here means "although".

SEMI-SUPERVISED CLASSIFICATION WITH GRAPH CONVOLUTIONAL NETWORKS 

1. Our contributions are two-fold. Firstly, we introduce a simple and well-behaved layer-wise propagation rule for neural network models which operate directly on graphs and show how it can be motivated from a first-order approximation of spectral graph convolutions (Hammond et al., 2011). Secondly, we demonstrate how this form of a graph-based neural network model can be used for fast and scalable semi-supervised classification of nodes in a graph. Experiments on a number of datasets demonstrate that our model compares favorably both in classification accuracy and efficiency (measured in wall-clock time) against state-of-the-art methods for semi-supervised learning. 

This is how the classic GCN paper describes itself.

In essence, GCN is a localized first-order approximation of spectral graph convolution. Another characteristic of GCN is that its model size grows linearly with the number of edges in the graph. In short, GCN can be used to encode local graph structure together with node features.
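For reference, a minimal sketch of the Kipf and Welling layer-wise propagation rule, H' = ReLU(D̂^(-1/2) Â D̂^(-1/2) H W) with Â = A + I, assuming small dense NumPy arrays; a real implementation would use sparse matrices, which is what keeps the cost linear in the number of edges:

```python
import numpy as np

def gcn_propagate(A, H, W):
    """One application of the layer-wise propagation rule (dense sketch)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops: A-hat = A + I
    d = A_hat.sum(axis=1)                     # degrees of A-hat
    D_inv_sqrt = np.diag(d ** -0.5)
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # ReLU activation
```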

2. Semantic role labeling (SRL) can be informally described as the task of discovering who did what to whom.

When formalizing a task definition, we usually write "is formalized as …" or "is defined as a … problem".

"is described as the task of …" also works: described as the task of doing such-and-such.

GRAPH ATTENTION NETWORKS 

1. In its most general formulation, the model allows every node to attend on every other node, dropping all structural information. We inject the graph structure into the mechanism by performing masked attention—we only compute eij for nodes j ∈ Ni, where Ni is some neighborhood of node i in the graph. 

This passage describes GAT's two mechanisms. In one, every node considers the influence of all nodes in the graph; that is the extreme case, which ignores structural information.

In the other, only the nodes within node i's neighborhood are considered.

Note the expressions:

"every node to attend on every other node" conveys the sense of nodes attending to each other.

"Drop all structural information": note especially the use of "drop", where more ordinary words such as "ignore" or "lose" would also fit.

"inject sth into sth by sth": injecting some mechanism or structure into something by some means.

"masked attention": attention computed only over a node's neighborhood.
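A hedged sketch of the masked-attention idea: compute raw scores e_ij for every pair, then mask out every j outside N_i before the softmax, so attention weights are only distributed over neighbors. The function name and the dense formulation are my own illustrative assumptions, not GAT's actual implementation:

```python
import numpy as np

def masked_attention(scores, A):
    """scores: (n, n) raw attention logits e_ij.
    A: (n, n) adjacency matrix with self-loops (every row nonzero)."""
    # Non-neighbors get -inf, so they receive zero weight after softmax.
    masked = np.where(A > 0, scores, -np.inf)
    masked = masked - masked.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(masked)
    # Softmax over j ∈ N_i for each node i.
    return exp / exp.sum(axis=1, keepdims=True)
```

Each row of the result sums to 1, and entries for nodes outside the neighborhood are exactly zero, which is precisely how the graph structure is "injected" into the attention mechanism.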
