Non-subsampled shearlet transform remote sensing image fusion combined with parameter-adaptive PCNN

Bibliographic details
Main authors: CHENG Feifei, FU Zhitao, HUANG Liang, CHEN Pengdi, HUANG Kun
Format: article
Language: Chinese
Published: Surveying and Mapping Press, 2021
Online access: https://doaj.org/article/b3b5ceb1c43341599dc84f3c4fbfe817
Description
Abstract: To address the problem that the parameters of the pulse-coupled neural network (PCNN) cannot be adjusted adaptively in pan-sharpening image fusion, a non-subsampled shearlet transform (NSST) remote sensing image fusion method is proposed that combines a parameter-adaptive pulse-coupled neural network (PA-PCNN) model with an energy-attribute (EA) fusion strategy. First, the Y (luminance) component is extracted from the YUV color-space transform of the multispectral image and, together with the panchromatic image, is decomposed by NSST to obtain high- and low-frequency coefficients. Then the low-frequency sub-band coefficients are fused with the EA strategy, while the PA-PCNN model adaptively determines the PCNN parameters and fuses the high-frequency sub-band coefficients. Finally, the fused image is obtained by the inverse NSST and inverse YUV transforms. Six objective quality indexes are used to evaluate the spectral and spatial detail information of the fused images: spatial frequency, relative dimensionless global error in synthesis (ERGAS), correlation coefficient, visual information fidelity for fusion, gradient-based fusion performance, and the structural similarity index. The proposed method is validated on multiple sets of high- and low-resolution panchromatic and multispectral remote sensing images and compared with the SE, DGIF, COF, and PA-PCNN fusion methods. The results show that it is generally superior to traditional panchromatic and multispectral remote sensing image fusion methods in both objective evaluation and visual perception.
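
For illustration only, the following Python sketch mirrors the pipeline outlined in the abstract. It is not the authors' implementation: the NSST is replaced by a crude Gaussian low-pass/residual split, the EA rule by a local-energy comparison, and the PA-PCNN rule by absolute-maximum selection; the multispectral and panchromatic inputs are assumed to be co-registered arrays of the same size with values in [0, 1].

import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def rgb_to_yuv(rgb):
    """BT.601 RGB -> YUV; the Y channel carries the luminance detail."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.147, -0.289,  0.436],
                  [ 0.615, -0.515, -0.100]])
    return rgb @ m.T

def yuv_to_rgb(yuv):
    """Inverse BT.601 transform."""
    m = np.array([[1.0,  0.0,    1.140],
                  [1.0, -0.395, -0.581],
                  [1.0,  2.032,  0.0  ]])
    return yuv @ m.T

def decompose(img, sigma=2.0):
    """Stand-in for the NSST decomposition: a Gaussian low-pass band plus a
    single residual 'high-frequency' band. The paper uses a non-subsampled
    shearlet transform with multiple scales and directional sub-bands."""
    low = gaussian_filter(img, sigma)
    return low, [img - low]

def reconstruct(low, highs):
    """Inverse of the stand-in decomposition (sum of all bands)."""
    return low + sum(highs)

def fuse_low(low_ms, low_pan, win=3):
    """Stand-in for the energy-attribute (EA) low-frequency rule:
    keep the coefficient with the larger local energy."""
    e_ms = uniform_filter(low_ms ** 2, win)
    e_pan = uniform_filter(low_pan ** 2, win)
    return np.where(e_ms >= e_pan, low_ms, low_pan)

def fuse_high(h_ms, h_pan):
    """Stand-in for the PA-PCNN high-frequency rule: the paper compares the
    firing maps of two parameter-adaptive PCNNs; here we simply select the
    coefficient with the larger absolute value."""
    return np.where(np.abs(h_ms) >= np.abs(h_pan), h_ms, h_pan)

def pansharpen(ms_rgb, pan):
    """Fuse an MS image (H, W, 3) with a PAN image (H, W) of the same size."""
    yuv = rgb_to_yuv(ms_rgb)
    low_y, highs_y = decompose(yuv[..., 0])
    low_p, highs_p = decompose(pan)
    fused_low = fuse_low(low_y, low_p)
    fused_highs = [fuse_high(hy, hp) for hy, hp in zip(highs_y, highs_p)]
    yuv[..., 0] = reconstruct(fused_low, fused_highs)
    return np.clip(yuv_to_rgb(yuv), 0.0, 1.0)

The key design choice carried over from the abstract is that only the luminance channel is fused with the panchromatic image, so the chrominance (U, V) channels, and hence the spectral content, pass through unchanged before the inverse YUV transform.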