Title
Improved Deep Spectral Convolution Network For Hyperspectral Unmixing With Multinomial Mixture Kernel and Endmember Uncertainty.
Abstract
In this study, we propose a novel framework for hyperspectral unmixing that uses a modular neural network structure and addresses endmember uncertainty in its formulation. We make several key contributions throughout the manuscript. First, to improve the separability of hyperspectral data, we modify deep spectral convolution networks (DSCNs), which leads to more stable and accurate results. Second, we introduce a multinomial mixture kernel with a neural network (NN) that mimics a Gaussian mixture model (GMM) to estimate per-pixel abundances from the low-dimensional representations obtained with the improved DSCN. Moreover, an NN module is incorporated to capture the uncertainty term formulated under the spectral variability assumption. Third, to optimize the coefficients of the multinomial model and the uncertainty term, a Wasserstein GAN is exploited, since it offers several theoretical benefits over expectation maximization. Fourth, all neural network modules are combined into an end-to-end hyperspectral unmixing pipeline that can be optimized with backpropagation using a stochastic gradient-based solver. Experiments are performed on real and synthetic datasets. The results validate that the proposed method achieves state-of-the-art hyperspectral unmixing performance compared to the baseline techniques, particularly on the real datasets.
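The modular pipeline outlined in the abstract can be sketched in code. The following is a minimal, hypothetical PyTorch sketch, not the authors' implementation: it assumes a 1-D spectral convolution encoder standing in for the improved DSCN, a softmax head producing per-pixel mixture weights as a stand-in for the multinomial mixture kernel, a small module predicting an additive per-pixel uncertainty term, and a plain reconstruction loss in place of the Wasserstein GAN objective. All module names, layer sizes, and the learned endmember matrix are illustrative assumptions.

```python
# Hypothetical sketch of the modular unmixing pipeline described in the
# abstract (not the authors' implementation); sizes and names are assumptions.
import torch
import torch.nn as nn


class SpectralEncoder(nn.Module):
    """Stand-in for the improved DSCN: 1-D convolutions over the spectral axis."""
    def __init__(self, num_bands, latent_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, latent_dim), nn.ReLU(),
        )

    def forward(self, x):                      # x: (batch, num_bands)
        return self.net(x.unsqueeze(1))        # -> (batch, latent_dim)


class MixtureAbundanceHead(nn.Module):
    """Mixture-weight head: per-pixel abundances from the latent code."""
    def __init__(self, latent_dim, num_endmembers):
        super().__init__()
        self.fc = nn.Linear(latent_dim, num_endmembers)

    def forward(self, z):
        # Softmax enforces non-negativity and the sum-to-one constraint.
        return torch.softmax(self.fc(z), dim=-1)


class UncertaintyModule(nn.Module):
    """Predicts an additive per-pixel term for endmember/spectral variability."""
    def __init__(self, latent_dim, num_bands):
        super().__init__()
        self.fc = nn.Linear(latent_dim, num_bands)

    def forward(self, z):
        return self.fc(z)


class UnmixingPipeline(nn.Module):
    def __init__(self, num_bands, num_endmembers, latent_dim=32):
        super().__init__()
        self.encoder = SpectralEncoder(num_bands, latent_dim)
        self.abundances = MixtureAbundanceHead(latent_dim, num_endmembers)
        self.uncertainty = UncertaintyModule(latent_dim, num_bands)
        # Endmember signatures, assumed trainable here for illustration.
        self.endmembers = nn.Parameter(torch.rand(num_endmembers, num_bands))

    def forward(self, x):
        z = self.encoder(x)
        a = self.abundances(z)                 # (batch, num_endmembers)
        du = self.uncertainty(z)               # (batch, num_bands)
        # Linear mixture plus the additive uncertainty term.
        recon = a @ self.endmembers + du
        return recon, a, du


if __name__ == "__main__":
    model = UnmixingPipeline(num_bands=200, num_endmembers=4)
    pixels = torch.rand(8, 200)                # 8 synthetic pixels
    recon, abundances, delta = model(pixels)
    loss = torch.mean((recon - pixels) ** 2)   # reconstruction loss stand-in
    loss.backward()                            # end-to-end training via backprop
    print(abundances.sum(dim=-1))              # each row sums to 1
```

Because every module is differentiable, the whole pipeline can be optimized end-to-end with a stochastic gradient-based solver, as described in the abstract; the adversarial (Wasserstein GAN) objective used by the authors is not reproduced here.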
Year
2018
Venue
arXiv: Computer Vision and Pattern Recognition
Field
Endmember, Pattern recognition, Expectation–maximization algorithm, Convolution, Computer science, Modular neural network, Hyperspectral imaging, Artificial intelligence, Backpropagation, Artificial neural network, Mixture model
DocType
Journal
Volume
abs/1808.01104
Citations
1
PageRank
0.35
References
0
Authors
2
Name | Order | Citations | PageRank
Savas Ozkan | 1 | 39 | 9.48
Gozde Bozdagi Akar | 2 | 129 | 20.15