Title
Constructing a Unified Framework for Multi-Source Remotely Sensed Data Fusion
Abstract
Remotely sensed data fusion, which blends multi-sensor observations to generate synthetic fused data, is regarded as a cost-effective approach to tackling the fixed tradeoff among satellite sensors' spatial, temporal, spectral, and angular characteristics. However, previous studies have mainly focused on the one-to-one fusion mode, and unified fusion studies remain limited. This paper aims to construct a unified framework for multi-source remotely sensed data fusion. Experimental tests using remotely sensed data, including China HJ-1A CCD/HSI, MCD43A1, and MCD43A4, showed that the proposed framework was able to generate synthetic fused data with simultaneously fine spatial, temporal, spectral, and angular resolutions. Specifically, the synthetic fused data can accurately capture temporal changes while integrating spatial details, and can combine multi-angular observation information while preserving spectral fidelity. The unified fusion framework can also be flexibly extended to arbitrary optical satellites and holds potential utility for making full use of available remotely sensed observations.
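The abstract does not detail the fusion algorithm itself, so the following is only a minimal, illustrative Python sketch of the general spatiotemporal blending idea behind such fusion (transferring coarse-scale temporal change onto a fine-resolution image, in the spirit of STARFM-style differencing), not the paper's unified framework. The function name `temporal_change_transfer` and the toy arrays are assumptions introduced here for illustration.

```python
import numpy as np

def temporal_change_transfer(fine_t1, coarse_t1, coarse_t2):
    """Predict a fine-resolution image at time t2 by adding the coarse-scale
    temporal change (coarse_t2 - coarse_t1) to the fine image observed at t1.

    Assumes all inputs are co-registered arrays of identical shape (the coarse
    data already resampled to the fine grid) and share the same reflectance
    units. This is a simplified illustration, not the authors' method.
    """
    fine_t1 = np.asarray(fine_t1, dtype=np.float64)
    coarse_t1 = np.asarray(coarse_t1, dtype=np.float64)
    coarse_t2 = np.asarray(coarse_t2, dtype=np.float64)
    # The coarse-scale change is assumed to hold at the fine scale as well.
    return fine_t1 + (coarse_t2 - coarse_t1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fine_t1 = rng.uniform(0.0, 0.5, size=(6, 6))        # fine image at t1 (e.g. HJ-1A CCD-like)
    coarse_t1 = fine_t1 + rng.normal(0, 0.01, (6, 6))   # coarse proxy at t1 (e.g. MODIS-like)
    coarse_t2 = coarse_t1 + 0.05                         # uniform brightening between t1 and t2
    pred_t2 = temporal_change_transfer(fine_t1, coarse_t1, coarse_t2)
    print(pred_t2.mean() - fine_t1.mean())               # ~0.05: the temporal change is transferred
```

In practice, methods of this family weight the transferred change by spectral and spatial similarity within a moving window; the unweighted version above only conveys the core idea of combining fine spatial detail with coarse temporal coverage.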
Year
2016
DOI
10.1109/IGARSS.2016.7729665
Venue
2016 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS)
Keywords
Unified fusion, remote sensing, spatial-temporal-spectral-angular, China HJ-1A, MODIS
Field
Computer vision, Fidelity, Satellite, Computer science, Remote sensing, Fusion, Sensor fusion, Hyperspectral imaging, Artificial intelligence, Multi-source, Image resolution
DocType
Conference
ISSN
2153-6996
Citations
0
PageRank
0.34
References
7
Authors
3
Name, Order, Citations, PageRank
Bin Chen, 1, 4, 24.93
Bo Huang, 2, 546, 36.79
Bin Xu, 3, 133, 23.23