DisasterGAN: Generative Adversarial Networks for Remote Sensing Disaster Image Generation

Rapid progress on disaster detection and assessment has been achieved with the development of deep-learning techniques and the wide application of remote sensing images. However, it is still a great challenge to train an accurate and robust disaster detection network, owing to the class imbalance of existing data sets and the lack of training data. This paper aims to synthesize remote sensing disaster images covering multiple disaster types and different degrees of building damage with generative adversarial networks (GANs), making up for the shortcomings of existing data sets. Existing models, however, are inefficient at multi-disaster image translation because of the diversity of disasters, and they inevitably alter building-irrelevant regions because they operate directly on the whole image. We therefore propose two models: disaster translation GAN generates disaster images for multiple disaster types with a single model, using an attribute to represent the disaster type and a reconstruction process to further constrain the generator; damaged building generation GAN is a mask-guided image generation model that alters only the attribute-specific region while keeping the attribute-irrelevant region unchanged. Qualitative and quantitative experiments demonstrate the validity of the proposed methods. Further experiments on a damaged-building assessment model show the effectiveness of the proposed models and their superiority over other data augmentation methods.
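The abstract describes two mechanisms: attribute-conditioned translation with a reconstruction pass, and mask-guided generation that leaves attribute-irrelevant regions untouched. Below is a minimal PyTorch sketch of both ideas; the network, its layer sizes, and all function names are hypothetical illustrations of the general techniques (attribute conditioning in the style of StarGAN, cycle-style reconstruction, mask blending), not the authors' actual DisasterGAN implementation.

```python
# Hypothetical sketch of the two ideas in the abstract (assumed PyTorch code;
# layer sizes, module names, and losses are illustrative, not the paper's).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyAttributeGenerator(nn.Module):
    """Toy generator: the disaster-type attribute (one-hot) is broadcast to
    spatial maps and concatenated to the image as extra input channels."""

    def __init__(self, img_channels: int = 3, n_attrs: int = 4) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(img_channels + n_attrs, 32, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, img_channels, 3, padding=1),
            nn.Tanh(),
        )

    def forward(self, x: torch.Tensor, attr: torch.Tensor) -> torch.Tensor:
        # attr: (B, n_attrs) one-hot disaster type -> (B, n_attrs, H, W)
        attr_map = attr[:, :, None, None].expand(-1, -1, x.size(2), x.size(3))
        return self.net(torch.cat([x, attr_map], dim=1))


def reconstruction_loss(gen, x, attr_src, attr_tgt):
    """Translate to the target disaster type, translate back to the source
    type, and penalise the difference (cycle-style reconstruction)."""
    fake = gen(x, attr_tgt)
    rec = gen(fake, attr_src)
    return F.l1_loss(rec, x)


def mask_guided_output(gen, x, attr_tgt, mask):
    """Only the masked (building) region is taken from the generator output;
    everything outside the mask is copied from the input, so the
    attribute-irrelevant region stays unchanged by construction."""
    fake = gen(x, attr_tgt)
    return mask * fake + (1.0 - mask) * x


if __name__ == "__main__":
    g = TinyAttributeGenerator()
    img = torch.rand(2, 3, 64, 64) * 2 - 1             # images in [-1, 1]
    src = F.one_hot(torch.tensor([0, 1]), 4).float()   # current disaster type
    tgt = F.one_hot(torch.tensor([2, 3]), 4).float()   # desired disaster type
    mask = (torch.rand(2, 1, 64, 64) > 0.5).float()    # building mask
    print(reconstruction_loss(g, img, src, tgt).item())
    print(mask_guided_output(g, img, tgt, mask).shape)
```

In this sketch the mask blending enforces the "attribute-irrelevant region unchanged" property explicitly, rather than relying on the generator to learn it.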

Bibliographic Details
Main Authors: Xue Rui, Yang Cao, Xin Yuan, Yu Kang, Weiguo Song
Format: article
Language: EN
Published: MDPI AG 2021
Subjects:
GAN
image generation
data augmentation
remote sensing disaster image
Science
Q
Online Access: https://doaj.org/article/c0619a0b74ed40bdb748e1496cd3b225
id oai:doaj.org-article:c0619a0b74ed40bdb748e1496cd3b225
record_format dspace
doi 10.3390/rs13214284
issn 2072-4292
record updated 2021-11-11T18:52:56Z
published 2021-10-01
url https://www.mdpi.com/2072-4292/13/21/4284
url https://doaj.org/toc/2072-4292
citation Remote Sensing, Vol 13, Iss 21, p 4284 (2021)
institution DOAJ
collection DOAJ
language EN
topic GAN
image generation
data augmentation
remote sensing disaster image
Science
Q
format article
author Xue Rui
Yang Cao
Xin Yuan
Yu Kang
Weiguo Song
author_sort Xue Rui
title DisasterGAN: Generative Adversarial Networks for Remote Sensing Disaster Image Generation
publisher MDPI AG
publishDate 2021
url https://doaj.org/article/c0619a0b74ed40bdb748e1496cd3b225