JL1-CD: A New Benchmark for Remote Sensing Change Detection and a Robust Multi-Teacher Knowledge Distillation Framework
Ziyuan Liu, Ruifei Zhu, Long Gao, Yuanxiu Zhou, Jingyu Ma, Yuantao Gu
2025-02-24
Summary
This paper introduces a new dataset called JL1-CD and a training approach called multi-teacher knowledge distillation (MTKD), both designed to improve how computers detect changes in satellite images of Earth.
What's the problem?
Scientists have been using AI to spot changes in satellite images, like new buildings or deforestation, but they face two big challenges. First, there aren't many good, freely available sets of very high-resolution (sub-meter) satellite images for training AI. Second, it's hard to make AI that detects changes consistently well across images where the amount of change varies widely, from tiny patches to large areas.
What's the solution?
The researchers created JL1-CD, a new collection of 5,000 pairs of detailed satellite images (512 x 512 pixels each, at 0.5 to 0.75 meters per pixel). They also developed MTKD, a training scheme in which multiple 'teacher' models, each good at a different amount of change, jointly guide a single 'student' model. They tested this on their new JL1-CD dataset and another dataset called SYSU-CD.
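The exact MTKD recipe isn't spelled out in this summary, but a common way to distill from several teachers is to average their softened predictions and train the student to match that average. The sketch below illustrates that generic idea on toy per-pixel change/no-change logits; the function names, the plain averaging of teachers, and the numbers are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def softmax(z, t=1.0):
    # Temperature-scaled softmax over the last axis (higher t = softer).
    s = z / t
    e = np.exp(s - np.max(s, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, t=2.0):
    # Average the teachers' softened distributions, then measure how far
    # the student's softened distribution is from that average (KL divergence).
    teacher_probs = np.mean([softmax(z, t) for z in teacher_logits_list], axis=0)
    student_probs = softmax(student_logits, t)
    kl = np.sum(teacher_probs * (np.log(teacher_probs) - np.log(student_probs)), axis=-1)
    # t**2 rescaling keeps gradient magnitudes comparable across temperatures.
    return float(np.mean(kl)) * t**2

# Toy example: two "pixels", logits for [change, no-change] classes.
student   = np.array([[2.0, 0.5], [0.1, 1.5]])
teacher_a = np.array([[3.0, 0.0], [0.0, 2.0]])
teacher_b = np.array([[2.5, 0.2], [0.3, 1.8]])
loss = multi_teacher_kd_loss(student, [teacher_a, teacher_b])
```

In practice this distillation term would be added to the usual supervised change-detection loss, so the student learns from both the ground-truth labels and the ensemble of teachers.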
Why it matters?
This matters because better change detection in satellite images can help us track things like urban growth, deforestation, or damage from natural disasters more accurately. The new dataset gives scientists better tools to work with, and the MTKD method helps create more reliable AI for this task. This could lead to improved monitoring of our changing planet and better decision-making in areas like urban planning and environmental protection.
Abstract
Deep learning has achieved significant success in the field of remote sensing image change detection (CD), yet two major challenges remain: the scarcity of sub-meter, all-inclusive open-source CD datasets, and the difficulty of achieving consistent and satisfactory detection results across images with varying change areas. To address these issues, we introduce the JL1-CD dataset, which contains 5,000 pairs of 512 x 512 pixel images with a resolution of 0.5 to 0.75 meters. Additionally, we propose a multi-teacher knowledge distillation (MTKD) framework for CD. Experimental results on the JL1-CD and SYSU-CD datasets demonstrate that the MTKD framework significantly improves the performance of CD models with various network architectures and parameter sizes, achieving new state-of-the-art results. The code is available at https://github.com/circleLZY/MTKD-CD.