
GenTKG: Generative Forecasting on Temporal Knowledge Graph

MCML Authors

Abstract

The rapid advancement of large language models (LLMs) has sparked interest in the temporal knowledge graph (TKG) domain, which carefully designed embedding-based and rule-based models have traditionally dominated. It remains an open question whether pre-trained LLMs can understand structured temporal relational data and replace these models as the foundation for temporal relational forecasting. We therefore bring temporal knowledge forecasting into the generative setting. Challenges arise, however, from the chasm between the complex graph structure of TKGs and the sequential natural-language inputs LLMs can handle, and from the mismatch between the enormous data volume of TKGs and the heavy computational cost of fine-tuning LLMs. To address these challenges, we propose GenTKG, a novel retrieval-augmented generation framework that combines a temporal logical rule-based retrieval strategy with lightweight few-shot parameter-efficient instruction tuning. Extensive experiments show that GenTKG is a simple yet effective, efficient, and generalizable approach that outperforms conventional methods on temporal relational forecasting with extremely limited computation. Our work opens a new frontier for the temporal knowledge graph domain.
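GenTKG's actual retrieval is driven by learned temporal logical rules; as a purely illustrative approximation of the retrieval-augmented generation idea, the sketch below retrieves a small recency-based history window for a query entity and serializes it into a textual prompt whose completion (the missing object) an LLM would generate. All function names and the fact format are hypothetical, not the paper's implementation.

```python
from typing import List, Tuple

# A TKG fact as a quadruple: (subject, relation, object, timestep).
Quad = Tuple[str, str, str, int]

def retrieve_history(facts: List[Quad], subject: str,
                     query_time: int, k: int = 5) -> List[Quad]:
    """Keep facts about the query subject that precede the query time,
    most recent first, truncated to a small few-shot window."""
    history = [f for f in facts if f[0] == subject and f[3] < query_time]
    history.sort(key=lambda f: f[3], reverse=True)
    return history[:k]

def build_prompt(history: List[Quad], subject: str,
                 relation: str, query_time: int) -> str:
    """Serialize retrieved facts chronologically and end with an
    incomplete query line for the LLM to complete with the object."""
    lines = [f"{t}: [{s}, {r}, {o}]" for s, r, o, t in reversed(history)]
    lines.append(f"{query_time}: [{subject}, {relation},")
    return "\n".join(lines)
```

For example, given past facts about an entity `A`, `build_prompt(retrieve_history(facts, "A", 4), "A", "visits", 4)` yields a few chronological history lines followed by the open query line `4: [A, visits,`, which the generative model is asked to complete.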



TGL 2023 @NeurIPS 2023

Workshop Temporal Graph Learning at the 37th Conference on Neural Information Processing Systems. New Orleans, LA, USA, Dec 10-16, 2023.

Authors

R. Liao • X. Jia • Y. Ma • V. Tresp


Research Area

 A3 | Computational Models

BibTeX Key: LJM+23
