Time Travelling Pixels: Bitemporal Features Integration with Foundation Model for Remote Sensing Image Change Detection

Keyan Chen1
Chenyang Liu1
Wenyuan Li2
Zili Liu1,3
Hao Chen3
Haotian Zhang1
Zhengxia Zou1
Zhenwei Shi ✉ 1

Beihang University1
University of Hong Kong2
Shanghai AI Laboratory3

Code [GitHub]
Demo [HuggingFace]
Paper [arXiv]
Cite [BibTeX]



Change detection, a prominent research area in remote sensing, is pivotal for observing and analyzing surface transformations. Despite significant advances achieved through deep learning-based methods, high-precision change detection in spatio-temporally complex remote sensing scenarios remains a substantial challenge. The recent emergence of foundation models, with their powerful universality and generalization capabilities, offers potential solutions. However, bridging the gap between data and tasks remains a significant obstacle. In this paper, we introduce Time Travelling Pixels (TTP), a novel approach that integrates the latent knowledge of the SAM foundation model into change detection. This method effectively addresses the domain shift in general knowledge transfer and the challenge of expressing the homogeneous and heterogeneous characteristics of multi-temporal images. The state-of-the-art results obtained on the LEVIR-CD dataset underscore the efficacy of TTP. Code is available at https://kychen.me/TTP.


We exploit the general segmentation capabilities of SAM to construct a change detection network, TTP. TTP comprises three components: a foundation-model backbone adapted via low-rank fine-tuning; a time-travelling activation gate interposed between the dual-temporal features; and an efficient multi-level decoding head. The structure is depicted in the figure above.
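To make the first two components concrete, here is a minimal, hypothetical sketch of the underlying math: a LoRA-style low-rank weight update (the effective weight is the frozen backbone weight plus a trainable low-rank product), and a simple element-wise gate blending dual-temporal features. Pure-Python matrices are used for clarity; all names and shapes are illustrative assumptions, not the TTP implementation.

```python
# Hypothetical sketch, not the TTP codebase: low-rank (LoRA-style)
# adaptation of a frozen weight, and a simple bitemporal feature gate.

def matmul(a, b):
    """Multiply an (m x k) matrix by a (k x n) matrix (lists of lists)."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

def lora_update(w, b_down, a_up, scale=1.0):
    """Effective weight W' = W + scale * (B @ A).

    w:      frozen (d x d) backbone weight
    b_down: trainable (d x r) matrix, with rank r << d
    a_up:   trainable (r x d) matrix
    """
    delta = matmul(b_down, a_up)
    return [[w[i][j] + scale * delta[i][j] for j in range(len(w[0]))]
            for i in range(len(w))]

def temporal_gate(feat_t1, feat_t2, gate):
    """Blend dual-temporal features element-wise:
    out = gate * f1 + (1 - gate) * f2, with gate values in [0, 1]."""
    return [g * x + (1.0 - g) * y for g, x, y in zip(gate, feat_t1, feat_t2)]

# Tiny example: feature dimension d = 2, rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen backbone weight (identity)
B = [[0.5], [0.5]]             # trainable down-projection (2 x 1)
A = [[1.0, 1.0]]               # trainable up-projection (1 x 2)
W_adapted = lora_update(W, B, A)                       # W + B @ A
fused = temporal_gate([1.0, 2.0], [3.0, 4.0], [0.5, 0.5])
```

Only `B` and `A` would be trained; the backbone weight `W` stays frozen, which is what keeps low-rank fine-tuning cheap relative to full fine-tuning.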

Quantitative Results on LEVIR-CD


Based on a template by Phillip Isola and Richard Zhang.