Learned Hyperspectral Compression Using a Student’s T Hyperprior

Hyperspectral compression is one of the most common techniques in hyperspectral image processing. Most recent learned image compression methods have exhibited excellent rate-distortion performance for natural images, but they have not been fully explored for hyperspectral compression tasks. In this...

Full description

Saved in:
Bibliographic Details
Main Authors: Yuanyuan Guo, Yanwen Chong, Yun Ding, Shaoming Pan, Xiaolin Gu
Format: article
Language: EN
Published: MDPI AG 2021
Subjects:
Q
Online Access: https://doaj.org/article/bad49450caac49589fd4181c8fbd0e72
id oai:doaj.org-article:bad49450caac49589fd4181c8fbd0e72
record_format dspace
spelling oai:doaj.org-article:bad49450caac49589fd4181c8fbd0e72 2021-11-11T18:55:16Z
title Learned Hyperspectral Compression Using a Student’s T Hyperprior
doi 10.3390/rs13214390
issn 2072-4292
published 2021-10-01
fulltext https://www.mdpi.com/2072-4292/13/21/4390
journal https://doaj.org/toc/2072-4292
source Remote Sensing, Vol 13, Iss 21, p 4390 (2021)
authors Yuanyuan Guo; Yanwen Chong; Yun Ding; Shaoming Pan; Xiaolin Gu
publisher MDPI AG
keywords artificial neural networks; entropy model; hyperspectral compression; Student’s t distribution; Science; Q
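One of the distortion measures reported for this article, spectral angle mapping (SAM), is straightforward to compute: it is the mean angle between the original and reconstructed spectra at each pixel, and it is invariant to per-pixel scaling. Below is a minimal numpy sketch; the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def spectral_angle_mapping(x, y, eps=1e-12):
    """Mean spectral angle in radians between two (H, W, B) cubes:
    the angle between the B-dimensional spectra at each pixel."""
    xf = x.reshape(-1, x.shape[-1]).astype(np.float64)
    yf = y.reshape(-1, y.shape[-1]).astype(np.float64)
    cos = np.sum(xf * yf, axis=1) / (
        np.linalg.norm(xf, axis=1) * np.linalg.norm(yf, axis=1) + eps)
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))

rng = np.random.default_rng(0)
cube = rng.random((8, 8, 32))          # toy "hyperspectral" cube
noisy = cube + 0.05 * rng.random((8, 8, 32))  # a distorted reconstruction
print(spectral_angle_mapping(cube, cube))   # ≈ 0 (identical spectra)
print(spectral_angle_mapping(cube, noisy))  # small positive angle
```

Because SAM compares angles, a reconstruction that merely rescales each spectrum (e.g. a brightness change) has zero spectral angle, which is why it is used alongside PSNR and SSIM rather than instead of them.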
institution DOAJ
collection DOAJ
language EN
topic artificial neural networks
entropy model
hyperspectral compression
student’s T distribution
Science
Q
description Hyperspectral compression is one of the most common techniques in hyperspectral image processing. Recent learned image compression methods have exhibited excellent rate-distortion performance on natural images, but they have not been fully explored for hyperspectral compression tasks. In this paper, we propose a trainable network architecture for hyperspectral compression that not only accounts for the anisotropic characteristics of hyperspectral images but also embeds an accurate entropy model built on the non-Gaussian statistics of hyperspectral images and a nonlinear transform. Specifically, we first design a spatial-spectral block, comprising a spatial net and a spectral net, as the base component of the core autoencoder; this matches anisotropic hyperspectral cubes better than existing deep-learning-based compression methods. Then, we design a Student’s t hyperprior that merges the statistics of the latents and the side-information concept into a unified neural network, providing an accurate entropy model for entropy coding. This not only markedly enhances the flexibility of the entropy model, by adjusting the degrees of freedom, but also leads to superior rate-distortion performance. The results show that the proposed compression scheme outperforms the Gaussian hyperprior used in virtually all learned natural-image codecs, as well as the optimal linear transform coding methods for hyperspectral compression. Specifically, the proposed method provides a 1.51% to 59.95% average increase in peak signal-to-noise ratio (PSNR), a 0.17% to 18.17% average increase in the structural similarity index metric (SSIM), and a 6.15% to 64.60% average reduction in spectral angle mapping (SAM) on three public hyperspectral datasets, compared to the Gaussian hyperprior and the optimal linear transform coding methods.
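The intuition behind the Student's t hyperprior described in the abstract can be sketched conceptually (this is not the paper's network, only an illustration of the underlying statistics): a Student's t distribution with few degrees of freedom has much heavier tails than a Gaussian, so an outlier latent receives a far shorter ideal code length, and as the degrees of freedom grow, the t distribution converges to the Gaussian, which is the "flexibility knob" the abstract refers to. All names below are illustrative.

```python
import math

def student_t_logpdf(x, nu, mu=0.0, sigma=1.0):
    """Log-density of a location-scale Student's t with nu degrees of freedom."""
    z = (x - mu) / sigma
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi) - math.log(sigma)
            - (nu + 1) / 2 * math.log1p(z * z / nu))

def gaussian_logpdf(x, mu=0.0, sigma=1.0):
    """Log-density of a Gaussian with mean mu and std sigma."""
    z = (x - mu) / sigma
    return -0.5 * (z * z + math.log(2 * math.pi)) - math.log(sigma)

# Ideal code length in bits is -log2 p(x): an outlier latent at x = 6
# costs far fewer bits under a heavy-tailed t prior (nu = 3) than under
# a Gaussian with the same location and scale.
x = 6.0
bits_t = -student_t_logpdf(x, nu=3.0) / math.log(2)
bits_g = -gaussian_logpdf(x) / math.log(2)
print(f"t prior: {bits_t:.1f} bits, Gaussian prior: {bits_g:.1f} bits")

# As nu grows, the t log-density approaches the Gaussian log-density,
# so the degrees of freedom interpolate between heavy and light tails.
print(abs(student_t_logpdf(1.0, nu=1e6) - gaussian_logpdf(1.0)))
```

In a learned codec the rate term of the loss is exactly such a negative log-likelihood of the quantized latents, so a prior whose tails match the data (here, the non-Gaussian statistics of hyperspectral latents) directly lowers the bitrate.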
format article
author Yuanyuan Guo
Yanwen Chong
Yun Ding
Shaoming Pan
Xiaolin Gu
title Learned Hyperspectral Compression Using a Student’s T Hyperprior
publisher MDPI AG
publishDate 2021
url https://doaj.org/article/bad49450caac49589fd4181c8fbd0e72