Abstract |
---|
Infrared and visible image fusion, a powerful tool for object detection and recognition, has developed alongside the advent of various imaging modalities. However, the images produced by traditional methods often struggle to compromise between the modalities. This paper addresses this problem with a variable-weight fusion rule based on the non-subsampled contourlet transform (NSCT). The original images are combined in the multiscale space, and the fused image is obtained in a bio-inspired feature frame. Validation experiments on infrared and visible images serve two purposes: comparing different fusion rules, and assessing the impact of multiscale analysis on infrared and visible image fusion. To evaluate the proposed method, information entropy (IE), standard deviation (STD), spatial frequency (SF), and mutual information (MI) are adopted for comparison against Laplacian, wavelet, and NSCT methods, among others. Results show that every evaluation value of the proposed method is higher than those of the other methods, indicating that it is a better image fusion method. |
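The abstract evaluates fusion quality with information entropy (IE), standard deviation (STD), and spatial frequency (SF). These are standard metric definitions rather than anything specific to this paper; the sketch below shows one common way to compute them for a grayscale `uint8` image (function names and the NumPy implementation are my own, not from the paper).

```python
import numpy as np

def information_entropy(img):
    # IE: Shannon entropy of the gray-level histogram, in bits.
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # ignore empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

def standard_deviation(img):
    # STD: spread of gray levels; higher usually means more contrast.
    return img.astype(np.float64).std()

def spatial_frequency(img):
    # SF = sqrt(RF^2 + CF^2), where RF/CF are the RMS of the
    # horizontal/vertical first differences (gradient activity).
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)
```

Higher values of all three are read as better in the abstract's comparison; MI additionally measures how much histogram information the fused image shares with each source image.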
Year | DOI | Venue |
---|---|---|
2013 | 10.1109/IIH-MSP.2013.109 | IIH-MSP |
Keywords | Field | DocType
---|---|---|
image fusion, feature extraction, bio-inspired feature, infrared image | Object detection, Computer vision, Image fusion, Pattern recognition, Feature detection (computer vision), Computer science, Feature (computer vision), Fusion rules, Feature extraction, Mutual information, Artificial intelligence, Contourlet | Conference
Citations | PageRank | References
---|---|---|
0 | 0.34 | 5
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Suxia Xing | 1 | 0 | 0.34 |
Yumei Li | 2 | 0 | 0.34 |
Tianhua Chen | 3 | 4 | 0.84 |
Li Yang | 4 | 359 | 63.68 |