Title
Fusion of Multispectral Imagery and Spectrometer Data in UAV Remote Sensing
Abstract
High spatial resolution hyperspectral data, often used in precision farming applications, are not available from current satellite sensors and are difficult or expensive to acquire from standard aircraft. Unmanned aerial vehicles (UAVs) are emerging as a lower cost and more flexible alternative for acquiring very high resolution imagery in precision farming. Miniaturized hyperspectral sensors have been developed for UAVs, but the sensors, associated hardware, and data processing software remain cost prohibitive for individual farmers or small remote sensing firms. This study simulated hyperspectral image data by fusing multispectral camera imagery with spectrometer data. We mounted a low-cost, low-weight multispectral camera and spectrometer on a standard UAV and developed procedures for their precise data alignment, followed by fusion of the spectrometer data with the image data to produce estimated spectra for all multispectral camera image pixels. To align the data collected by the two sensors in both the time and space domains, a post-acquisition correlation-based global optimization method was used. Data fusion, to estimate hyperspectral reflectance, was implemented using several methods for comparison. Flight data from two crop sites, one planted with tomatoes and the other with corn and soybeans, were used to evaluate the alignment procedure and the data fusion results. The data alignment procedure yielded a peak R² between the spectrometer and camera data of 0.95 and 0.72 for the two test sites, respectively. The multispectral camera data at these space and time offsets were taken as the best match to a given spectrometer reading and used in modelling to estimate hyperspectral imagery from the multispectral camera pixel data. Of the fusion approaches evaluated, principal component analysis (PCA) based models and Bayesian imputation reached similar accuracy and outperformed simple spline interpolation. Mean absolute error (MAE) between predicted and observed spectra was 17% relative to the mean of the observed spectra, and root mean squared error (RMSE) was 0.028. This approach to deriving estimated hyperspectral image data can be applied simply and at very low cost for crop assessment and monitoring within individual fields.
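The abstract outlines two computational steps: a correlation-based search for the time/space offset that best aligns spectrometer readings with camera pixel values, and a PCA-based regression that estimates a full spectrum from a pixel's multispectral band values. The sketch below is a minimal illustration of both ideas using NumPy and scikit-learn; it is not the authors' implementation, and the synthetic data, array shapes, function names, and the choice of linear regression on PCA scores are assumptions made for demonstration only.

```python
# Hypothetical sketch (not the paper's code): correlation-based offset search plus
# PCA-regression spectral estimation, on synthetic stand-in data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: paired observations of 5 camera bands and 200 spectrometer wavelengths.
n_obs, n_bands, n_wl = 300, 5, 200
camera_bands = rng.random((n_obs, n_bands))
spectra = camera_bands @ rng.random((n_bands, n_wl)) + 0.05 * rng.random((n_obs, n_wl))

def best_time_offset(spec_series, cam_series, max_lag=50):
    """Search lags for the offset that maximizes R^2 between a spectrometer-derived
    signal and the corresponding camera band signal (assumed equal-length 1-D arrays)."""
    best_lag, best_r2 = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = spec_series[lag:], cam_series[:len(cam_series) - lag]
        else:
            a, b = spec_series[:lag], cam_series[-lag:]
        if len(a) < 10:
            continue
        r = np.corrcoef(a, b)[0, 1]
        if r ** 2 > best_r2:
            best_lag, best_r2 = lag, r ** 2
    return best_lag, best_r2

# Example: recover a known 7-sample shift between two otherwise identical signals.
sig = rng.random(500)
lag, r2 = best_time_offset(np.roll(sig, 7), sig)
print(f"estimated lag: {lag}, peak R^2: {r2:.2f}")

# PCA-based fusion: regress the leading principal components of the observed spectra
# on the camera band values, then reconstruct a full spectrum for any pixel.
pca = PCA(n_components=5).fit(spectra)
reg = LinearRegression().fit(camera_bands, pca.transform(spectra))

def estimate_spectrum(band_vector):
    """Estimate a full spectrum from one pixel's vector of multispectral band values."""
    pred_scores = reg.predict(band_vector.reshape(1, -1))
    return pca.inverse_transform(pred_scores).ravel()

est = estimate_spectrum(camera_bands[0])
mae = np.mean(np.abs(est - spectra[0]))
rmse = np.sqrt(np.mean((est - spectra[0]) ** 2))
print(f"MAE: {mae:.3f}, RMSE: {rmse:.3f}")
```

In practice the offset search would run over both time and spatial offsets against georeferenced image data, and the spectra used to fit the model would come from the aligned spectrometer footprints rather than synthetic arrays.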
Year
2017
DOI
10.3390/rs9070696
Venue
REMOTE SENSING
Keywords
UAV, data alignment, data fusion, precision farming, spectrometer, multispectral image
Field
Computer vision, Multispectral image, Remote sensing, Spectrometer, Hyperspectral imaging, Sensor fusion, Pixel, Artificial intelligence, Multispectral pattern recognition, Geology, Image resolution, Data structure alignment
DocType
Journal
Volume
9
Issue
7
ISSN
2072-4292
Citations
3
PageRank
0.42
References
10
Authors
4
Name                Order  Citations  PageRank
Chuiqing Zeng       1      23         3.00
King, D.J.          2      11         2.40
Murray Richardson   3      6          1.15
Bo Shan             4      3          0.75