[Repost] Granular Computing (粒计算)
Granular Computing (粒计算) http://blog.csdn.net/chl033/article/details/4137964 (1) What granular computing actually is can be seen from the three terms used at home and abroad to describe it: Granularity, Granule, and Granulation. • "Granularity" corresponds to the Chinese term 粒度; this is the word used in the early literature on granularity research, e.g. A....
2019-05-09 15:45:45
2019_Online Meta-Learning.pdf
Online Learning is somewhat like an automatic control system, but not quite: the difference is that Online Learning optimizes for the minimum of an overall loss function, whereas a control system minimizes the deviation of the final output from a desired setpoint.
Online Learning also optimizes an objective function during training (the one marked with a red box in the original post), but unlike other training methods it must solve for the optimum of that objective quickly, ideally in closed form.
Two common approaches are Bayesian Online Learning and Follow The Regularized Leader (FTRL).
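The FTRL side of this can be sketched in a few lines. Below is a minimal per-coordinate FTRL-Proximal learner for online logistic regression; the class name and the hyperparameters `alpha`, `beta`, `l1`, `l2` follow the common convention for this algorithm and are not taken from the paper:

```python
import math

class FTRLProximal:
    """Per-coordinate FTRL-Proximal for online logistic regression (sketch)."""

    def __init__(self, dim, alpha=0.1, beta=1.0, l1=0.0, l2=0.0):
        self.alpha, self.beta, self.l1, self.l2 = alpha, beta, l1, l2
        self.z = [0.0] * dim  # accumulated (adjusted) gradients
        self.n = [0.0] * dim  # accumulated squared gradients

    def weights(self):
        """Closed-form per-coordinate solution of the regularized leader problem."""
        w = []
        for zi, ni in zip(self.z, self.n):
            if abs(zi) <= self.l1:
                w.append(0.0)  # L1 threshold produces exact zeros
            else:
                sign = 1.0 if zi > 0 else -1.0
                w.append(-(zi - sign * self.l1)
                         / ((self.beta + math.sqrt(ni)) / self.alpha + self.l2))
        return w

    def update(self, x, y):
        """One online step: predict on x (label y in {0,1}), then fold in the gradient."""
        w = self.weights()
        dot = max(min(sum(wi * xi for wi, xi in zip(w, x)), 35.0), -35.0)
        p = 1.0 / (1.0 + math.exp(-dot))
        for i, xi in enumerate(x):
            g = (p - y) * xi  # gradient of the log loss
            sigma = (math.sqrt(self.n[i] + g * g) - math.sqrt(self.n[i])) / self.alpha
            self.z[i] += g - sigma * w[i]
            self.n[i] += g * g
        return p
```

Note that `weights()` is a closed-form expression, which is exactly the "fast, ideally analytic" solve the summary above asks for; with `l1 > 0` it also yields sparse weights.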
2020-05-03
node2vec: Scalable Feature Learning for Networks
Borrowing from word2vec, this paper proposes node2vec, which obtains feature representations by maximizing the likelihood of preserving network neighborhoods of nodes in a d-dimensional feature space. The node neighborhoods are generated with second-order random walks.
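The second-order walk is what distinguishes node2vec from a plain random walk: the transition out of the current node depends on the previous node, biased by a return parameter p and an in-out parameter q. A minimal sketch, assuming an adjacency-set graph representation (the function name and arguments are illustrative, not the authors' code):

```python
import random

def node2vec_walk(adj, start, length, p=1.0, q=1.0, rng=random):
    """One second-order biased walk.  adj: dict node -> set of neighbors.
    p: return parameter (bias toward revisiting the previous node),
    q: in-out parameter (q > 1 stays local / BFS-like, q < 1 explores / DFS-like)."""
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = sorted(adj[cur])
        if not nbrs:
            break  # dead end
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))  # first step is unbiased
            continue
        prev = walk[-2]
        weights = []
        for x in nbrs:
            if x == prev:            # distance 0 from the previous node
                weights.append(1.0 / p)
            elif x in adj[prev]:     # distance 1
                weights.append(1.0)
            else:                    # distance 2
                weights.append(1.0 / q)
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk
```

Feeding many such walks into a word2vec-style skip-gram model then yields the node embeddings.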
2020-05-03
KDD2019_A Representation Learning Framework for Property Graphs.pdf
PGE performs neighbor aggregation with a mainstream inductive model. The experiments analyze the method's effectiveness in detail and show, on benchmark applications such as node classification and link prediction over real-world datasets, how PGE obtains better embedding results than state-of-the-art graph embedding methods, validating its performance.
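The note above does not say which inductive model is used; a GraphSAGE-style mean aggregator is the most common choice and illustrates the idea of inductive neighbor aggregation (all names below are illustrative, not from the paper):

```python
import numpy as np

def mean_aggregate(features, adj, W_self, W_neigh):
    """One inductive aggregation layer (GraphSAGE-style mean aggregator):
    each node combines its own feature with the mean of its neighbors' features.
    features: (n, d) array; adj: list of neighbor lists; W_*: (d_out, d) weights."""
    out = []
    for v in range(len(features)):
        nbrs = adj[v]
        neigh = (np.mean([features[u] for u in nbrs], axis=0)
                 if nbrs else np.zeros_like(features[v]))
        h = W_self @ features[v] + W_neigh @ neigh
        out.append(np.maximum(h, 0.0))  # ReLU
    return np.stack(out)
```

Because the layer is a function of local features rather than a per-node lookup table, it generalizes to nodes unseen at training time, which is what "inductive" means here.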
2020-05-03
DeepGCNs: Can GCNs Go as Deep as CNNs?.pdf
Convolutional Neural Networks (CNNs) achieve impressive performance in a wide variety of fields. Their success benefited from a massive boost when very deep CNN models were able to be reliably trained. Despite their merits, CNNs fail to properly address problems with non-Euclidean data. To overcome this challenge, Graph Convolutional Networks (GCNs) build graphs to represent non-Euclidean data, borrow concepts from CNNs, and apply them in training. GCNs show promising results, but they are usually limited to very shallow models due to the vanishing gradient problem.
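The main idea the paper borrows from deep CNNs is the residual connection. A toy dense-matrix sketch, assuming the usual symmetric normalization of the adjacency matrix (this is an illustration, not the authors' implementation):

```python
import numpy as np

def normalize(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A = A + np.eye(len(A))
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def gcn_layer(A_hat, H, W):
    """Vanilla GCN propagation: ReLU(A_hat @ H @ W)."""
    return np.maximum(A_hat @ H @ W, 0.0)

def res_gcn_layer(A_hat, H, W):
    """ResGCN-style block: an identity skip around the convolution, so the
    gradient can flow through the addition even when the conv path saturates."""
    return H + gcn_layer(A_hat, H, W)
```

Stacking `gcn_layer` many times runs into the vanishing-gradient problem mentioned above; the skip connection in `res_gcn_layer` is what lets the stacks go much deeper.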
2020-04-08