Open Access

Multichannel Approach for Sentiment Analysis Using Stack of Neural Network with Lexicon Based Padding and Attention Mechanism


Sentiment analysis (SA) has been an important focus of study in computational linguistics and data analysis for the past decade. Recently, deep neural network (DNN) models have achieved promising results on sentiment analysis tasks. Long short-term memory (LSTM) models and their derivatives, such as the gated recurrent unit (GRU), are increasingly popular in neural architectures for sentiment analysis. Although these models can handle sequences of arbitrary length, using them in the feature extraction layer of a DNN produces a high-dimensional feature space; a further limitation is that they weight every feature equally. Natural language processing (NLP) commonly relies on word embeddings such as those produced by word2vec, and deep neural networks have become the method of choice for many NLP tasks. Traditional deep networks, however, are unreliable at retaining contextual information, which makes sequential data such as text and audio difficult for them to handle. This research proposes a multichannel word-embedding method that employs a stack of neural networks with lexicon-based padding and an attention mechanism (MCSNNLA) for SA. The approach combines a convolutional neural network (CNN), a Bi-LSTM, and an attention mechanism. The proposed technique, tailored for sentence-level SA, consists of one embedding layer, two convolution layers with max-pooling, one LSTM layer, and two fully connected (FC) layers. To address the shortcomings of prior SA models for product reviews, the MCSNNLA model integrates a sentiment lexicon with deep learning technologies, combining the strengths of emotion lexicons with those of deep learning: the reviews are first processed with the sentiment lexicon to enhance the sentiment features. The experimental findings show that the model can substantially improve text SA performance.
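
The layer stack described above (multichannel embeddings, two convolution/max-pooling stages, a Bi-LSTM, attention, and two FC layers) can be illustrated with a minimal PyTorch sketch. The class name `MCSNNLASketch`, the layer sizes, the choice of two embedding channels (one trainable, one frozen as a stand-in for pretrained word2vec vectors), and the simple additive attention over the Bi-LSTM states are illustrative assumptions, not the authors' exact implementation; the lexicon-based padding step is not reproduced here.

```python
# Hypothetical sketch of the sentence-level SA architecture; sizes and details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MCSNNLASketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, conv_channels=128,
                 lstm_hidden=128, num_classes=2):
        super().__init__()
        # Two embedding channels (e.g., a trainable table and a frozen word2vec table).
        self.emb_trainable = nn.Embedding(vocab_size, emb_dim)
        self.emb_static = nn.Embedding(vocab_size, emb_dim)
        self.emb_static.weight.requires_grad = False
        # Two convolution layers, each followed by max-pooling.
        self.conv1 = nn.Conv1d(2 * emb_dim, conv_channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(conv_channels, conv_channels, kernel_size=3, padding=1)
        self.pool = nn.MaxPool1d(kernel_size=2)
        # One Bi-LSTM layer over the convolutional features.
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # Additive attention over the Bi-LSTM outputs.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        # Two fully connected layers for classification.
        self.fc1 = nn.Linear(2 * lstm_hidden, 64)
        self.fc2 = nn.Linear(64, num_classes)

    def forward(self, token_ids):                       # token_ids: (batch, seq_len)
        # Concatenate the two embedding channels along the feature axis.
        x = torch.cat([self.emb_trainable(token_ids),
                       self.emb_static(token_ids)], dim=-1)     # (B, T, 2E)
        x = x.transpose(1, 2)                           # (B, 2E, T) for Conv1d
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.transpose(1, 2)                           # (B, T', C)
        states, _ = self.bilstm(x)                      # (B, T', 2H)
        # Attention weights over time steps, then a weighted sum of the states.
        weights = torch.softmax(self.attn(states).squeeze(-1), dim=1)  # (B, T')
        context = torch.bmm(weights.unsqueeze(1), states).squeeze(1)   # (B, 2H)
        return self.fc2(F.relu(self.fc1(context)))

# Example: score a batch of 4 padded sentences of length 32.
model = MCSNNLASketch(vocab_size=20000)
logits = model(torch.randint(0, 20000, (4, 32)))
print(logits.shape)  # torch.Size([4, 2])
```

In this sketch the lexicon-enhanced, padded token sequence would be supplied as `token_ids`; the attention layer then lets the classifier weight sentiment-bearing positions rather than treating every feature equally.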

eISSN: 2255-8691
Language: English