Sentiment Analysis in the Light of LSTM Recurrent Neural Networks

Subarno Pal, Soumadip Ghosh, Amitava Nag
Copyright: © 2018 | Volume: 9 | Issue: 1 | Pages: 7
ISSN: 1947-9093 | EISSN: 1947-9107 | EISBN13: 9781522544685 | DOI: 10.4018/IJSE.2018010103
Cite Article

MLA

Pal, Subarno, et al. "Sentiment Analysis in the Light of LSTM Recurrent Neural Networks." International Journal of Synthetic Emotions, vol. 9, no. 1, 2018, pp. 33-39. http://doi.org/10.4018/IJSE.2018010103

APA

Pal, S., Ghosh, S., & Nag, A. (2018). Sentiment Analysis in the Light of LSTM Recurrent Neural Networks. International Journal of Synthetic Emotions (IJSE), 9(1), 33-39. http://doi.org/10.4018/IJSE.2018010103

Chicago

Pal, Subarno, Soumadip Ghosh, and Amitava Nag. "Sentiment Analysis in the Light of LSTM Recurrent Neural Networks." International Journal of Synthetic Emotions (IJSE) 9, no. 1 (2018): 33-39. http://doi.org/10.4018/IJSE.2018010103


Abstract

Long short-term memory (LSTM) is a special type of recurrent neural network (RNN) architecture designed to model temporal sequences and their long-range dependencies more accurately than simple RNNs. In this article, the authors work with different types of LSTM architectures for sentiment analysis of movie reviews. It has been shown that LSTM RNNs are more effective than deep neural networks and conventional RNNs for sentiment analysis. Here, the authors explore different architectures associated with LSTM models to study their relative performance on sentiment analysis. A simple LSTM is first constructed and its performance is studied. In subsequent stages, LSTM layers are stacked one upon another, which yields an increase in accuracy. Later, the LSTM layers are made bidirectional so that data is conveyed both forward and backward through the network. The authors hereby show that a layered deep LSTM with bidirectional connections achieves better accuracy than the simpler LSTM variants used here.
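The progression described above, a simple LSTM, then stacked layers, then a bidirectional layer feeding a sentiment output, can be sketched as a minimal NumPy forward pass. This is an illustrative sketch only: the dimensions, the `init` helper, and the untrained random weights are assumptions for demonstration, not the authors' actual models.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    # One LSTM step; the four gates (input, forget, candidate,
    # output) are stacked in a single (4H, ...) weight matrix.
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])
    f = sigmoid(z[H:2 * H])
    g = np.tanh(z[2 * H:3 * H])
    o = sigmoid(z[3 * H:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def run_lstm(seq, W, U, b, H):
    """Run one LSTM layer over a sequence, returning every hidden state."""
    h, c = np.zeros(H), np.zeros(H)
    states = []
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
        states.append(h)
    return states

def init(D, H, rng):
    # Small random weights, zero biases (illustrative, untrained).
    return (rng.normal(size=(4 * H, D)) * 0.1,
            rng.normal(size=(4 * H, H)) * 0.1,
            np.zeros(4 * H))

rng = np.random.default_rng(0)
D, H, T = 8, 4, 5                            # embedding dim, hidden size, length
seq = [rng.normal(size=D) for _ in range(T)]  # stand-in word embeddings

# Stacked (deep) LSTM: layer 2 consumes layer 1's hidden states.
layer1 = run_lstm(seq, *init(D, H, rng), H)
layer2 = run_lstm(layer1, *init(H, H, rng), H)

# Bidirectional layer: the same input is read forward and backward,
# and the two final states are concatenated into one feature vector.
fwd = run_lstm(layer2, *init(H, H, rng), H)[-1]
bwd = run_lstm(layer2[::-1], *init(H, H, rng), H)[-1]
feat = np.concatenate([fwd, bwd])

# A logistic output unit maps the feature vector to a
# positive-review probability for the sentiment decision.
w_out = rng.normal(size=2 * H)
pos_prob = sigmoid(w_out @ feat)
```

In a real system these weights would be learned end to end (e.g. with a framework such as Keras), but the data flow, each layer reading the sequence of states below it, with the bidirectional layer adding a reversed pass, is the same.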
