Sentiment Analysis of Review Text Based on BiGRU-Attention and Hybrid CNN


Bibliographic Details
Main Authors: Qiannan Zhu, Xiaofan Jiang, Renzhen Ye
Format: article
Language: EN
Published: IEEE, 2021
Online Access: https://doaj.org/article/7238031c83f244d6be63df3ef0b37e04
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3118537
Published in: IEEE Access, Vol 9, Pp 149077-149088 (2021)
Date of publication: 2021-01-01
Full text: https://ieeexplore.ieee.org/document/9562511/
Journal TOC: https://doaj.org/toc/2169-3536

Description: Convolutional neural networks (CNN), recurrent neural networks (RNN), attention mechanisms, and their variants are widely applied in sentiment analysis, and fusion models are expected to perform better than any single one. However, fusion models suffer from complicated structure, excessive trainable parameters, and long training time. Traditional models that use cross-entropy as the loss function also classify poorly, because they account for neither the imbalance between sample categories nor the varying difficulty of individual samples. To address these problems, the model BiGRU-Att-HCNN is proposed, built on a bidirectional gated recurrent unit (BiGRU), attention, and hybrid convolutional neural networks. In this model, BiGRU and self-attention are combined to acquire global information and to boost the weight of key information. Two parallel convolutions (a dilated convolution and a standard convolution) extract multi-scale feature information with relatively few parameters, and the standard convolution is further replaced by a two-step depthwise separable convolution. Traditional max-pooling and average-pooling are discarded: global average pooling replaces both the pooling layer and the fully-connected layer, substantially decreasing the number of model parameters and reducing over-fitting. Focal loss serves as the loss function to handle unbalanced sample categories and hard samples. Experimental results show that, on multiple indicators, the model outperforms all 15 benchmark models while having only an intermediate number of trainable parameters.
Topics: Bidirectional gated recurrent unit; depthwise separable convolution; dilated convolution; focal loss; self-attention mechanism
Subject classification: Electrical engineering. Electronics. Nuclear engineering (TK1-9971)
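The abstract pairs a BiGRU with self-attention so that each position can draw on globally important context. As a rough sketch of that idea (plain scaled dot-product scoring over raw hidden states; the paper's exact attention formulation, dimensions, and any learned projections are not given in this record and are assumptions here):

```python
import numpy as np

def self_attention(H):
    """Plain scaled dot-product self-attention over hidden states H
    (shape: seq_len x dim). Each position attends to every other,
    so its output mixes in globally important context."""
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)                 # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ H                            # attention-weighted context

H = np.random.default_rng(0).normal(size=(5, 8))  # 5 steps, 8-dim states
context = self_attention(H)
```

Because each row of the weight matrix is a softmax distribution, positions with salient, similar hidden states receive larger weight, which is the "supplementing the weight of key information" effect the abstract describes.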
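The abstract swaps the standard convolution for a two-step depthwise separable convolution to cut trainable parameters. A back-of-the-envelope parameter count (bias terms omitted; the channel counts and kernel size below are illustrative, not taken from the paper) shows the saving:

```python
def conv1d_params(c_in, c_out, k):
    """Standard 1-D convolution: every output channel filters all inputs."""
    return c_in * c_out * k

def depthwise_separable_params(c_in, c_out, k):
    """Step 1 (depthwise): one k-tap filter per input channel.
    Step 2 (pointwise): a 1x1 convolution that mixes channels."""
    return c_in * k + c_in * c_out

standard = conv1d_params(128, 128, 3)                # 49152 weights
separable = depthwise_separable_params(128, 128, 3)  # 384 + 16384 = 16768
```

At these sizes the separable version needs roughly a third of the weights, and the gap widens as the kernel grows.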
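Global average pooling, which the abstract uses in place of both the pooling layer and the fully-connected layer, simply collapses each feature map to its mean, so the classification head adds no weights at all. A minimal sketch (the channel and length shapes are illustrative):

```python
import numpy as np

def global_average_pool(feature_maps):
    """Reduce each channel's 1-D feature map to its mean: one score per
    channel, with zero parameters, in place of a fully-connected layer."""
    return feature_maps.mean(axis=-1)

x = np.arange(12, dtype=float).reshape(3, 4)  # 3 channels, length 4
pooled = global_average_pool(x)               # array([1.5, 5.5, 9.5])
```

With as many channels as classes, the pooled vector can feed a softmax directly, which is how removing the fully-connected layer decreases parameters and over-fitting.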
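Focal loss, the abstract's answer to unbalanced categories and hard samples, scales each example's cross-entropy by (1 - p_t)^gamma so that confident, easy examples contribute almost nothing. A minimal binary sketch (the alpha and gamma values are the common defaults from the focal-loss literature, not values reported by this paper):

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-9):
    """Binary focal loss: FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t).
    gamma down-weights easy examples; alpha re-balances the two classes."""
    p = np.clip(p, eps, 1 - eps)
    p_t = np.where(y == 1, p, 1 - p)              # prob. of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return float((-alpha_t * (1 - p_t) ** gamma * np.log(p_t)).mean())

easy = focal_loss(np.array([0.95]), np.array([1]))  # well-classified positive
hard = focal_loss(np.array([0.30]), np.array([1]))  # hard positive
```

With gamma = 0 and alpha = 1 this reduces to ordinary cross-entropy, which treats the easy and hard examples far more evenly.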