Key Words:
NLP, LLM, Generative Pre-training, KGs, Roadmap, Bidirectional Reasoning
Abstract:
LLMs are black-box models and often fail to capture and access factual knowledge. KGs are structured knowledge models that explicitly store rich factual knowledge. Combining KGs and LLMs falls into three general frameworks:
- KG-enhanced LLMs: inject external knowledge at the pre-training and inference stages; also used for analyzing LLMs and providing interpretability.
- LLM-augmented KGs: KG embedding, KG completion, KG construction, KG-to-text generation, KGQA.
- Synergized LLMs + KGs: LLMs and KGs work together to enhance performance in knowledge representation and reasoning.
Background
Introduction of LLMs
Encoder-only LLMs
Use the encoder to encode the sentence and understand the relationships between words.
Trained to predict the masked words in an input sentence; suited to tasks such as text classification and named entity recognition.
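The masked-word (cloze) objective can be illustrated with a toy sketch. A real encoder-only model like BERT scores every vocabulary word for the `[MASK]` position using a bidirectional Transformer; here the scoring is faked with simple neighbour co-occurrence counts over a tiny made-up corpus (all data and function names below are illustrative assumptions, not from the paper):

```python
# Toy illustration of the masked-word objective of encoder-only LLMs.
# A real model uses a bidirectional Transformer encoder; here we only
# mimic the idea of using context on BOTH sides of the mask.
from collections import Counter

# Made-up mini corpus (assumption, for illustration only).
corpus = [
    "the cat sat on the mat",
    "the cat sat on the sofa",
    "the dog sat on the rug",
    "the dog chased a ball",
]

def predict_mask(sentence: str) -> str:
    """Fill the [MASK] token with the word whose corpus contexts match best."""
    tokens = sentence.split()
    i = tokens.index("[MASK]")
    left = tokens[i - 1] if i > 0 else None
    right = tokens[i + 1] if i + 1 < len(tokens) else None
    scores = Counter()
    for line in corpus:
        words = line.split()
        for j, w in enumerate(words):
            # Score a candidate by how often it occurs with the same
            # left and right neighbours (i.e. bidirectional context).
            if left is not None and j > 0 and words[j - 1] == left:
                scores[w] += 1
            if right is not None and j + 1 < len(words) and words[j + 1] == right:
                scores[w] += 1
    return scores.most_common(1)[0][0]

print(predict_mask("the [MASK] sat on the mat"))  # → cat
```

The point of the sketch is only the training signal: unlike a left-to-right decoder, the prediction here uses words on both sides of the mask, which is why encoder-only models suit understanding tasks like classification and NER.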