FSG-Net: Frequency-Spatial Synergistic Gated Network for High-Resolution Remote Sensing Change Detection
Zhongxiang Xie, Shuangxi Miao, Yuhan Jiang, Zhewei Zhang, Jing Yao, Xuecao Li, Jianxi Huang, Pedram Ghamisi
2025-09-19
Summary
This paper introduces a new computer vision model, called FSG-Net, designed to more accurately detect changes in high-resolution satellite or aerial images of the Earth's surface.
What's the problem?
Detecting real changes in images taken at different times is tricky because things that *look* different to a computer aren't always actual changes. For example, moving shadows or seasonal shifts can make the computer think something has changed when it hasn't – these are 'false alarms'. Computers also struggle to combine fine, detailed information from the images with a broader understanding of the scene, which leads to blurry or inaccurate change maps.
What's the solution?
FSG-Net tackles these problems in a few key ways. First, it analyzes the images using 'frequency domain' techniques, which helps it filter out those misleading changes caused by things like lighting or seasons. Then, it focuses on the important parts of the image that *are* likely to represent real changes. Finally, it cleverly combines the big-picture understanding with the fine details to create a clear and precise map of what has changed.
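The first step — filtering out misleading changes in the frequency domain — can be illustrated with a toy example. The sketch below is not the paper's DAWIM; it is a minimal one-level 2-D Haar wavelet transform (the `haar_dwt2` helper and the brightness-shift scenario are assumptions for illustration). It shows why this view helps: a global illumination change between two acquisitions lands almost entirely in the low-frequency sub-band, while the high-frequency sub-bands, where real structural edges live, are essentially untouched.

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2-D Haar wavelet transform.

    Returns the four sub-bands: LL (low-frequency approximation)
    and LH/HL/HH (high-frequency detail in three orientations).
    """
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # vertical average
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # vertical difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

# Two co-registered "images": t2 is t1 under a global brightness shift,
# i.e. a pseudo-change with no real structural difference.
rng = np.random.default_rng(0)
t1 = rng.random((8, 8))
t2 = t1 + 0.3

ll1, lh1, hl1, hh1 = haar_dwt2(t1)
ll2, lh2, hl2, hh2 = haar_dwt2(t2)

print(np.abs(ll2 - ll1).max())  # large: the brightness shift lives here
print(np.abs(hh2 - hh1).max())  # near zero: structural detail is unchanged
```

A module that compares the two images band by band can therefore down-weight low-frequency discrepancies (likely lighting or season) while keeping high-frequency discrepancies (likely real edges of changed objects).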
Why it matters?
Accurately detecting changes in images has many real-world applications, like monitoring deforestation, tracking urban growth, or assessing disaster damage. By improving the accuracy of change detection, FSG-Net can help us better understand and respond to changes happening on our planet. The model also achieves top performance on standard benchmarks compared with existing methods.
Abstract
Change detection from high-resolution remote sensing images stands as a cornerstone of Earth observation applications, yet its efficacy is often compromised by two critical challenges. First, false alarms are prevalent, as models misinterpret radiometric variations caused by temporal shifts (e.g., illumination, season) as genuine changes. Second, a non-negligible semantic gap between deep abstract features and shallow detail-rich features tends to obstruct their effective fusion, culminating in poorly delineated boundaries. To address these issues, we propose the Frequency-Spatial Synergistic Gated Network (FSG-Net), a novel paradigm that aims to systematically disentangle semantic changes from nuisance variations. Specifically, FSG-Net first operates in the frequency domain, where a Discrepancy-Aware Wavelet Interaction Module (DAWIM) adaptively mitigates pseudo-changes by discerningly processing different frequency components. Subsequently, the refined features are enhanced in the spatial domain by a Synergistic Temporal-Spatial Attention Module (STSAM), which amplifies the saliency of genuine change regions. Finally, to bridge the semantic gap, a Lightweight Gated Fusion Unit (LGFU) leverages high-level semantics to selectively gate and integrate crucial details from shallow layers. Comprehensive experiments on the CDD, GZ-CD, and LEVIR-CD benchmarks validate the superiority of FSG-Net, establishing a new state of the art with F1-scores of 94.16%, 89.51%, and 91.27%, respectively. The code will be made available at https://github.com/zxXie-Air/FSG-Net upon publication.
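The gated-fusion idea behind the LGFU can be sketched in miniature. The code below is not the paper's module (its internals are not described here); it is a generic illustration of gated fusion, with `lightweight_gated_fusion` and the toy feature maps being assumptions: a gate computed from the deep, semantic feature map decides, per position, how much shallow detail to let through, so fine boundaries are injected only where the semantics indicate a likely change.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lightweight_gated_fusion(deep, shallow):
    """Illustrative gated fusion (a sketch, not the exact LGFU):
    deep semantics produce a [0, 1] mask that selects which
    shallow, detail-rich responses to pass through."""
    gate = sigmoid(deep)                       # per-position gate from deep features
    return gate * shallow + (1.0 - gate) * deep

# Toy feature maps: deep is strongly positive where change is likely.
deep = np.array([[ 4.0, -4.0],
                 [ 4.0, -4.0]])
shallow = np.array([[0.9, 0.9],
                    [0.1, 0.1]])               # fine boundary detail

fused = lightweight_gated_fusion(deep, shallow)
# Where deep is positive (gate near 1) the shallow detail dominates;
# where deep is negative (gate near 0) the deep value passes through.
print(fused)
```

In a real network the gate would be produced by a learned convolution over the deep features rather than a raw sigmoid, but the selection mechanism is the same: high-level semantics decide where low-level detail is trustworthy.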