FastDerainNet: A Deep Learning Algorithm for Single Image Deraining


Bibliographic Details
Main Authors: Xiuwen Wang, Zhiwei Li, Hongtao Shan, Zhiyuan Tian, Yuanhong Ren, Wuneng Zhou
Format: Article
Language: EN
Published: IEEE, 2020
Subjects:
Online Access: https://doaj.org/article/24873bd045b2484eb0e879414213feba
Description
Summary: Existing neural network-based methods for de-raining single images yield unsatisfactory results owing to the inefficient propagation of features when objects with sizes and shapes similar to those of rain streaks are present in images. Furthermore, existing methods do not consider that the abundant information contained in rain-streaked images can interfere with the training process. To overcome these limitations, in this paper we propose a deep residual learning algorithm called FastDerainNet for removing rain streaks from single images. We design a deep convolutional neural network architecture, based on a deep residual network called the share-source residual module (SSRM), by relocating the origins of all shortcut connections to a single shared point. To further improve de-raining performance, we adopt the SSRM as the parameter layers in FastDerainNet and use image decomposition to modify the loss function. Finally, we train FastDerainNet on a synthetic dataset. By learning the residual mapping between the detail layers of rainy and clean images, the network reduces the mapping range and simplifies training. Experiments on both synthetic and real-world images demonstrate that the proposed method outperforms other state-of-the-art methods in de-raining while better preserving original image details.
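As a rough illustration of the decomposition idea in the abstract (not the paper's actual implementation), the sketch below splits an image into a smooth base layer and a detail layer, then forms the residual between rainy and clean detail layers. A plain box filter stands in for whatever smoothing filter the authors use; the images are synthetic random arrays. The point it demonstrates is that the residual target spans a much smaller value range than raw pixels, which is why residual learning on detail layers simplifies training.

```python
import numpy as np

def box_blur(img, k=5):
    """Box filter used as a stand-in low-pass filter (the paper's exact
    smoothing operator is not specified in this record)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def decompose(img, k=5):
    """Split an image into a smooth base layer and a detail layer."""
    base = box_blur(img, k)
    return base, img - base

# Synthetic stand-ins: a "clean" image and a rainy one with sparse,
# streak-like additive noise.
rng = np.random.default_rng(0)
clean = rng.uniform(0.0, 1.0, size=(32, 32))
rain = clean + 0.1 * (rng.uniform(size=(32, 32)) > 0.97)

_, detail_rain = decompose(rain)
_, detail_clean = decompose(clean)

# Residual target: what a FastDerainNet-style network would regress.
residual_target = detail_rain - detail_clean

# The residual varies far less than raw pixel intensities,
# i.e. the mapping range the network must learn is reduced.
print(residual_target.std() < rain.std())
```

The usage here is deliberately minimal: in the paper's setting the network predicts this residual from the rainy detail layer alone, and the clean image is recovered by subtracting the prediction and adding back the base layer.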