ATOM: AdapTive and OptiMized dynamic temporal knowledge graph construction using LLMs
Yassir Lairgi, Ludovic Moncla, Khalid Benabdeslem, Rémy Cazabet, Pierre Cléau
2025-10-29
Summary
This paper introduces a new method, called ATOM, for building and updating knowledge graphs that track information over time. These knowledge graphs are created from regular text, like news articles or reports, and are designed to be more adaptable to changing information than older methods.
What's the problem?
Currently, building knowledge graphs from text has two main issues. First, most methods create a static snapshot of knowledge, meaning they don't easily update as new information appears. Real-world facts change, so this is a big limitation. Second, newer methods that try to avoid needing lots of specific training data often produce inconsistent results – running them multiple times can give you different answers, and they might miss important details.
What's the solution?
ATOM tackles these problems by breaking text down into very small, self-contained pieces of information, called 'atomic' facts. It then builds a knowledge graph from these facts, carefully noting both *when* each piece of information was observed and *when* it was actually valid. Finally, it merges these smaller graphs efficiently in parallel, allowing for quick updates and a more complete picture of evolving knowledge.
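The dual-time idea can be sketched as a tiny data structure: each atomic fact carries both an observation time and a validity time. This is an illustrative sketch only; the names (`AtomicFact`, `t_obs`, `t_valid`) are assumptions for exposition, not ATOM's actual data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AtomicFact:
    """One minimal, self-contained fact with dual-time annotations.

    Hypothetical representation of the paper's dual-time modeling:
    t_obs  -- when the fact was observed (e.g. the article's date)
    t_valid -- when the fact actually holds in the world
    """
    subject: str
    relation: str
    obj: str
    t_obs: str
    t_valid: str

# A fact reported in a 2024 article about an event that began in 2023:
fact = AtomicFact("Alice", "CEO_of", "Acme",
                  t_obs="2024-06-01", t_valid="2023-01-15")
```

Separating the two timestamps is what lets a graph built this way distinguish "we learned this on date X" from "this was true starting on date Y".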
Why it matters?
This research is important because it provides a way to automatically keep knowledge graphs current with new information, and do so reliably and quickly. This is crucial for applications like real-time data analysis, understanding how events unfold over time, and creating systems that can remember and reason about changing facts, all without needing a ton of pre-existing knowledge or constant manual updates.
Abstract
In today's rapidly expanding data landscape, knowledge extraction from unstructured text is vital for real-time analytics, temporal inference, and dynamic memory frameworks. However, traditional static knowledge graph (KG) construction often overlooks the dynamic and time-sensitive nature of real-world data, limiting adaptability to continuous changes. Moreover, recent zero- or few-shot approaches that avoid domain-specific fine-tuning or reliance on prebuilt ontologies often suffer from instability across multiple runs, as well as incomplete coverage of key facts. To address these challenges, we introduce ATOM (AdapTive and OptiMized), a few-shot and scalable approach that builds and continuously updates Temporal Knowledge Graphs (TKGs) from unstructured texts. ATOM splits input documents into minimal, self-contained "atomic" facts, improving extraction exhaustivity and stability. Then, it constructs atomic TKGs from these facts while employing dual-time modeling that distinguishes when information is observed from when it is valid. The resulting atomic TKGs are subsequently merged in parallel. Empirical evaluations demonstrate that ATOM achieves ~18% higher exhaustivity, ~17% better stability, and over 90% latency reduction compared to baseline methods, indicating strong scalability potential for dynamic TKG construction.
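The parallel merge step described in the abstract can be illustrated with a minimal sketch. Here each atomic TKG is modeled simply as a set of timestamped triples, and pairs of graphs are merged concurrently in rounds; this is an assumption-laden toy model, not ATOM's actual merge algorithm.

```python
from concurrent.futures import ThreadPoolExecutor

def merge_pair(pair):
    # Toy merge: union of edge sets; duplicate facts collapse automatically.
    a, b = pair
    return a | b

def parallel_merge(graphs):
    """Pairwise-merge a list of small graphs in parallel rounds
    until a single consolidated graph remains."""
    graphs = list(graphs)
    with ThreadPoolExecutor() as pool:
        while len(graphs) > 1:
            pairs = list(zip(graphs[0::2], graphs[1::2]))
            merged = list(pool.map(merge_pair, pairs))
            if len(graphs) % 2:          # carry an unpaired graph forward
                merged.append(graphs[-1])
            graphs = merged
    return graphs[0] if graphs else set()

# Three atomic TKGs, one containing a duplicate fact:
g1 = {("Alice", "CEO_of", "Acme", "2023")}
g2 = {("Acme", "based_in", "Lyon", "2023")}
g3 = {("Alice", "CEO_of", "Acme", "2023")}
combined = parallel_merge([g1, g2, g3])  # two unique facts remain
```

Merging pairwise in rounds keeps each merge small and independent, which is what makes the step parallelizable and keeps update latency low as new documents arrive.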