Adaptive Deep Co-Occurrence Feature Learning Based on Classifier-Fusion for Remote Sensing Scene Classification
Saved in:
Main Authors:
Format: article
Language: EN
Published: IEEE, 2021
Subjects:
Online Access: https://doaj.org/article/8375e9d955184c62b7a659acda78357a
Summary: Remote sensing scene classification has numerous applications in land cover and land use. However, classifying scene images into their correct categories is a challenging task, attributable to the diverse semantics of remote sensing images. This nature of remote sensing images makes effective feature extraction and learning complex. Effective image feature representation is essential in image analysis and interpretation for accurate scene image classification with machine learning algorithms. Recent literature shows that convolutional neural networks are powerful feature extractors for remote sensing scene classification, and that classifier fusion attains better results than individual classifiers. This article proposes adaptive deep co-occurrence feature learning (ADCFL). The ADCFL method utilizes a convolutional neural network to extract spatial feature information from an image in a co-occurrence manner with filters; this information is then fed to the multigrain forest for feature learning and classification through majority votes with ensemble classifiers. The effectiveness of ADCFL is evaluated on the public datasets RESISC45 and UC Merced, where the classification accuracy results attained demonstrate that the proposed method achieves improved results.
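The two-stage pipeline the summary describes, co-occurrence-style feature extraction followed by ensemble classification with majority voting, can be sketched minimally as below. This is an illustrative assumption only: the toy gray-level co-occurrence features and the function names (`cooccurrence_features`, `majority_vote`) stand in for the paper's actual CNN-based extraction and multigrain-forest classifier, which are not reproduced here.

```python
# Minimal sketch of a co-occurrence feature + majority-vote pipeline.
# Hypothetical simplification; the paper uses CNN filters and a multigrain forest.
import numpy as np

def cooccurrence_features(img, levels=4):
    """Toy gray-level co-occurrence matrix over horizontal neighbor pairs,
    normalized and flattened into a feature vector."""
    q = np.floor(img * levels).clip(0, levels - 1).astype(int)  # quantize to gray levels
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):      # count neighbor pairs
        glcm[a, b] += 1
    return (glcm / glcm.sum()).ravel()

def majority_vote(predictions):
    """Fuse per-classifier label predictions (rows) by majority vote per sample."""
    predictions = np.asarray(predictions)
    return [int(np.bincount(col).argmax()) for col in predictions.T]
```

In this sketch, each classifier in the ensemble would be trained on the co-occurrence feature vectors, and `majority_vote` fuses their per-sample predictions, mirroring the classifier-fusion idea in the abstract.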