Title
Fusion of LIDAR data with hyperspectral and high-resolution imagery for automation of DIRSIG scene generation
Abstract
Developing new remote sensing instruments is a costly and time-consuming process. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model gives users the ability to create synthetic images for a proposed sensor before building it. However, to produce synthetic images, DIRSIG requires facetized, three-dimensional models attributed with spectral and texture information, which can themselves be costly and time-consuming to produce. Recent work by Walli has shown that coincident LIDAR data and high-resolution imagery can be registered and used to automatically generate the geometry and texture information needed for a DIRSIG scene. This method, called LIDAR Direct, greatly reduces the time and manpower needed to generate a scene, but still requires user interaction to attribute facets with either library or field-measured spectral information. This paper builds upon that work and presents a method for autonomously generating the geometry, texture, and spectral content for a scene when coincident LIDAR data, high-resolution imagery, and hyperspectral imagery (HSI) of a site are available. The method is then demonstrated on real data.
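As a minimal sketch of the facet attribution step described in the abstract, the Python example below assigns each LIDAR-derived facet the reflectance spectrum of the hyperspectral pixel its centroid falls in, assuming the HSI and facet geometry have already been co-registered. The function name, the geotransform convention, and the synthetic data are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of facet spectral attribution: after the LIDAR-derived
# facets and the HSI are co-registered, each facet receives the spectrum of
# the HSI pixel containing its centroid. Names and data are assumptions.
import numpy as np

def attribute_facets_with_hsi(facet_centroids_xy, hsi_cube, geotransform):
    """Assign each facet the spectrum of the HSI pixel containing its centroid.

    facet_centroids_xy : (N, 2) array of facet centroid map coordinates (x, y)
    hsi_cube           : (rows, cols, bands) hyperspectral reflectance cube
    geotransform       : (x0, dx, y0, dy) mapping pixel indices to map
                         coordinates: x = x0 + col * dx,  y = y0 + row * dy
    Returns an (N, bands) array of per-facet spectra.
    """
    x0, dx, y0, dy = geotransform
    cols = np.round((facet_centroids_xy[:, 0] - x0) / dx).astype(int)
    rows = np.round((facet_centroids_xy[:, 1] - y0) / dy).astype(int)

    # Clip to the image extent so facets just outside the HSI footprint
    # still receive the nearest edge pixel's spectrum.
    rows = np.clip(rows, 0, hsi_cube.shape[0] - 1)
    cols = np.clip(cols, 0, hsi_cube.shape[1] - 1)
    return hsi_cube[rows, cols, :]

if __name__ == "__main__":
    # Synthetic stand-ins: a 100 x 100 pixel, 50-band cube on a 1 m grid.
    rng = np.random.default_rng(0)
    hsi = rng.random((100, 100, 50)).astype(np.float32)
    centroids = rng.uniform(0.0, 100.0, size=(10, 2))
    spectra = attribute_facets_with_hsi(centroids, hsi, (0.0, 1.0, 100.0, -1.0))
    print(spectra.shape)  # (10, 50)

A nearest-pixel lookup is only one plausible choice; an implementation following the paper could instead average all HSI pixels overlapping a facet or use the registered high-resolution imagery to unmix sub-pixel materials.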
Year
2012
DOI
10.1109/AIPR.2012.6528215
Venue
AIPR
DocType
Conference
Keywords
texture information, LIDAR Direct, synthetic image, spectral information, spectral content, DIRSIG scene generation, DIRSIG scene, high-resolution imagery, coincident LIDAR data, real data, time consuming
Citations
0
PageRank
0.34
References
0
Authors
3
Name                 Order   Citations   PageRank
Karl C. Walli        1       1           1.70
Ryan Givens          2       1           1.04
Michael T. Eismann   3       326         19.71