Attributed KG | KG + LLM
- Wu, Yanrong, and Zhichun Wang. “Knowledge graph embedding with numeric attributes of entities.” Proceedings of The Third Workshop on Representation Learning for NLP. 2018.
Splits KG triples into relational triples and triples of entity attributes, and designs two component models: a structure embedding model and an attribute embedding model.
The structure embedding model is a translational distance model that preserves the knowledge of entity relations; the attribute embedding model is a regression-based model that preserves the knowledge of entity attributes. Two component models are jointly optimized to get the embeddings of entities, relations, and attributes.
Link prediction performance is examined per relation to check whether it improves.
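The joint objective above can be sketched in a few lines: a TransE-style translational distance term for relational triples plus a regression term for numeric attributes, summed into one loss. This is a minimal NumPy sketch with toy dimensions and made-up data, not the paper's implementation; the weighting factor `alpha` is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy vocabulary: 3 entities, 1 relation, 1 numeric attribute (e.g. "population").
ent = rng.normal(scale=0.1, size=(3, dim))   # entity embeddings
rel = rng.normal(scale=0.1, size=(1, dim))   # relation embeddings
w, b = rng.normal(scale=0.1, size=dim), 0.0  # attribute regression parameters

def struct_loss(h, r, t):
    # Translational-distance score: ||h + r - t||^2 (TransE-style).
    d = ent[h] + rel[r] - ent[t]
    return float(d @ d)

def attr_loss(e, value):
    # Regression-based score: predict the numeric attribute value
    # from the entity embedding and penalize the squared error.
    pred = ent[e] @ w + b
    return float((pred - value) ** 2)

def joint_loss(triples, attrs, alpha=0.5):
    # Joint objective: both terms share the same entity embeddings,
    # so optimizing them together shapes one common embedding space.
    ls = sum(struct_loss(h, r, t) for h, r, t in triples)
    la = sum(attr_loss(e, v) for e, v in attrs)
    return ls + alpha * la

loss = joint_loss([(0, 0, 1)], [(0, 3.2), (1, 1.5)])
```

Minimizing `joint_loss` by gradient descent over `ent`, `rel`, `w`, and `b` would then yield embeddings informed by both relations and attributes.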
- Trisedya, Bayu Distiawan, Jianzhong Qi, and Rui Zhang. “Entity alignment between knowledge graphs using attribute embeddings.” Proceedings of the AAAI conference on artificial intelligence. Vol. 33. No. 01. 2019.
Joint embedding combining a structure embedding and an attribute embedding.
- Yao, Liang, Chengsheng Mao, and Yuan Luo. “KG-BERT: BERT for knowledge graph completion.” arXiv preprint arXiv:1909.03193 (2019).
Equivalent to feeding “head entity [SEP] relation [SEP] tail entity” as a sentence into a pretrained BERT classifier and treating triple plausibility as a classification problem.
Sigmoid is used as the output-layer activation and cross-entropy as the loss function: each output node of the final classification layer is passed through a sigmoid, and the cross-entropy loss is computed between each output node and its corresponding label.
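The input packing and the sigmoid-plus-cross-entropy loss described above can be sketched as follows. This is a schematic, assuming string-level sequence construction (real KG-BERT tokenizes with BERT's tokenizer) and plain NumPy in place of the model's output layer.

```python
import numpy as np

def kgbert_input(head: str, relation: str, tail: str) -> str:
    # Pack a triple's text as one sequence for a BERT classifier:
    # [CLS] head tokens [SEP] relation tokens [SEP] tail tokens [SEP]
    return "[CLS] " + head + " [SEP] " + relation + " [SEP] " + tail + " [SEP]"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce_loss(logits, labels):
    # Apply a sigmoid to each output node, then compute cross-entropy
    # between each activated output and its 0/1 plausibility label.
    p = sigmoid(np.asarray(logits, dtype=float))
    y = np.asarray(labels, dtype=float)
    return float(np.mean(-(y * np.log(p) + (1 - y) * np.log(1 - p))))

seq = kgbert_input("Steve Jobs", "founded", "Apple Inc.")
loss = bce_loss([2.0, -1.0], [1, 0])  # one positive, one corrupted triple
```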
- Runfeng, Xie, et al. “LKPNR: LLM and KG for Personalized News Recommendation Framework.” arXiv preprint arXiv:2308.12028 (2023).
A news recommendation setting combined with an LLM/KG-augmented encoder.
- Trajanoska, Milena, Riste Stojanov, and Dimitar Trajanov. “Enhancing Knowledge Graph Construction Using Large Language Models.” arXiv preprint arXiv:2305.04676 (2023).
Uses ChatGPT to extract relations from collected news articles / uses ChatGPT to generate the KG.
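LLM-based triple extraction of this kind boils down to a prompt that requests structured output plus a parser for the reply. A minimal sketch, assuming a hypothetical prompt template (the paper's exact prompts are not reproduced here) and a simulated model reply instead of a live API call:

```python
import json

# Hypothetical prompt template; an assumption for illustration,
# not the wording used in the paper.
PROMPT = (
    "Extract (subject, relation, object) triples from the article below. "
    "Return a JSON list of objects with keys 'subject', 'relation', 'object'.\n\n"
    "Article: {article}"
)

def build_prompt(article: str) -> str:
    return PROMPT.format(article=article)

def parse_triples(llm_reply: str):
    # Parse the model's JSON reply into (subject, relation, object)
    # tuples ready to be inserted into a knowledge graph.
    return [(d["subject"], d["relation"], d["object"]) for d in json.loads(llm_reply)]

# Simulated model reply, since this sketch makes no API call.
reply = '[{"subject": "OpenAI", "relation": "released", "object": "GPT-4"}]'
triples = parse_triples(reply)
prompt = build_prompt("Some collected news article text.")
```

In practice the reply often needs validation (malformed JSON, hallucinated entities) before the triples are added to the graph.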
- Xie, Xin, et al. “Lambdakg: A library for pre-trained language model-based knowledge graph embeddings.” CoRR (2022).
discrimination methods: X_hr-pair and X_tail
X_hr-pair = [CLS] X_h [SEP] X_r [SEP]
X_tail = [CLS] X_t [SEP]
generation methods:
X = [CLS] X_h [Entity h] [SEP] X_r [SEP] [MASK] [SEP]
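The sequence layouts above can be made concrete with small builder functions. This is a schematic sketch operating on token lists, assuming the entity marker and tokenization details are handled elsewhere; it mirrors the formats, not LambdaKG's actual code.

```python
def hr_pair(head_tokens, rel_tokens):
    # Discrimination input: X_hr-pair = [CLS] X_h [SEP] X_r [SEP]
    return ["[CLS]", *head_tokens, "[SEP]", *rel_tokens, "[SEP]"]

def tail_seq(tail_tokens):
    # Discrimination input: X_tail = [CLS] X_t [SEP]
    return ["[CLS]", *tail_tokens, "[SEP]"]

def gen_seq(head_tokens, head_entity_marker, rel_tokens):
    # Generation input: X = [CLS] X_h [Entity h] [SEP] X_r [SEP] [MASK] [SEP]
    # The model is asked to fill [MASK] with the tail entity.
    return ["[CLS]", *head_tokens, head_entity_marker,
            "[SEP]", *rel_tokens, "[SEP]", "[MASK]", "[SEP]"]

pair = hr_pair(["Steve", "Jobs"], ["founded"])
tail = tail_seq(["Apple", "Inc."])
gen = gen_seq(["Steve", "Jobs"], "[Entity h]", ["founded"])
```

The discrimination formats score a candidate tail against an (h, r) pair, while the generation format lets the language model decode the tail directly at the [MASK] position.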
- Nayyeri, Mojtaba, et al. “Integrating Knowledge Graph embedding and pretrained Language Models in Hypercomplex Spaces.” arXiv preprint arXiv:2208.02743 (2022).