In text sentiment analysis, a crucial challenge is that conventional word
vectors fail to capture lexical ambiguity. The Gated Recurrent Unit
(GRU), an advanced variant of RNN, is extensively utilized in natural
language processing tasks such as information filtering, sentiment
analysis, machine translation, and speech recognition. GRU can retain
sequential information, but it lacks the ability to focus on the most
relevant features of a sequence. Therefore, this paper introduces a
novel RNN-based approach to text sentiment analysis, the Recurrent
Attention Unit (RAU), which incorporates an attention gate directly
within the traditional GRU cell. This addition enhances GRU's
capacity to retain long-term information and to concentrate
selectively on critical elements in sequential data. Furthermore, this
study integrates an improved self-attention (SA) technique with
RA-GRU, referred to as SA+RA-GRU; the improved self-attention
technique is applied to reallocate the weights of deep text sequences. While
attention techniques have recently become a significant innovation in
deep learning, their precise impact on sentiment analysis has yet to be
fully evaluated. The experimental findings show that the proposed
SA+RA-GRU approach attains accuracies of 92.17% and 82.38% on
the IMDB and MR datasets, respectively, outperforming traditional approaches.
Moreover, the SA+RA-GRU model demonstrates excellent
generalization and robust performance.
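To make the idea described above concrete, the following is a minimal sketch of an SA+RA-GRU-style model built from standard components: a GRU cell extended with an extra sigmoid "attention gate" on the candidate state (the RAU), followed by scaled dot-product self-attention that re-weights the sequence of hidden states. The abstract does not give the exact equations, so this is an assumption-laden illustration; all class and variable names (RAUCell, SelfAttentionPool, a_t) are illustrative, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RAUCell(nn.Module):
    """GRU cell with an additional attention gate on the candidate state (assumed form)."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.update = nn.Linear(input_size + hidden_size, hidden_size)     # z_t
        self.reset = nn.Linear(input_size + hidden_size, hidden_size)      # r_t
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)  # candidate state
        self.attend = nn.Linear(input_size + hidden_size, hidden_size)     # a_t (extra gate)

    def forward(self, x_t, h_prev):
        xh = torch.cat([x_t, h_prev], dim=-1)
        z_t = torch.sigmoid(self.update(xh))
        r_t = torch.sigmoid(self.reset(xh))
        a_t = torch.sigmoid(self.attend(xh))  # focuses the cell on salient input features
        h_tilde = torch.tanh(self.candidate(torch.cat([x_t, r_t * h_prev], dim=-1)))
        # Attention gate scales the candidate before the usual GRU interpolation.
        return (1 - z_t) * h_prev + z_t * (a_t * h_tilde)

class SelfAttentionPool(nn.Module):
    """Scaled dot-product self-attention that re-weights the hidden-state sequence."""
    def __init__(self, hidden_size):
        super().__init__()
        self.q = nn.Linear(hidden_size, hidden_size)
        self.k = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, hidden_size)

    def forward(self, h_seq):                      # h_seq: (batch, time, hidden)
        scores = self.q(h_seq) @ self.k(h_seq).transpose(1, 2)
        weights = F.softmax(scores / h_seq.size(-1) ** 0.5, dim=-1)
        return weights @ self.v(h_seq)             # re-weighted sequence

# Toy forward pass: batch of 2 sentences, 5 tokens, 100-dim word vectors.
cell, attn = RAUCell(100, 64), SelfAttentionPool(64)
x = torch.randn(2, 5, 100)
h = torch.zeros(2, 64)
states = []
for t in range(x.size(1)):
    h = cell(x[:, t], h)
    states.append(h)
attended = attn(torch.stack(states, dim=1))        # (2, 5, 64), ready for a classifier head
print(attended.shape)
```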
Dhurgham Ali Mohammed (University of Kufa, Iraq) and Kalyani A. Patel (K. S. School of Business Management and Information Technology, Gujarat University, India)
Sentiment Analysis, RNNs, GRU, Recurrent Attention Unit, Self-Attention Mechanism, Deep Learning
Published By : ICTACT
Published In : ICTACT Journal on Soft Computing (Volume: 15, Issue: 4, Pages: 3737 - 3745)
Date of Publication : January 2025
Page Views : 52
Full Text Views : 18