
Contextual Emotion Classification in Text Using Hybrid Word2Vec-BiLSTM Deep Learning Architecture


Authors : Nilla Sivasrinu; Patinavalasa Durga Prasad; Suneel Kumar Duvvuri

Volume/Issue : Volume 11 - 2026, Issue 3 - March


Google Scholar : https://tinyurl.com/2sps6ms8

Scribd : https://tinyurl.com/52w75ja8

DOI : https://doi.org/10.38124/ijisrt/26mar2017


Abstract : Textual emotion recognition is one of the most active directions in Natural Language Processing (NLP), driven by the widespread adoption of internet-based services such as Twitter and online forums for instant message exchange. Because text lacks the visible signals available in face-to-face communication, such as facial expressions and tone of voice, where both verbal and nonverbal cues aid inference, emotion must be detected from the textual content alone, which calls for a context-aware system. Traditional machine learning techniques, such as Naïve Bayes and Logistic Regression, rely on manual feature extraction methods such as Bag-of-Words (BoW) and TF-IDF. Although effective to some extent, these approaches fail to capture semantic meaning and contextual dependencies, limiting their performance on complex linguistic patterns. To overcome these limitations, this research proposes a hybrid deep learning model that combines Word2Vec (CBOW) embeddings with a Bidirectional Long Short-Term Memory (Bi-LSTM) network. Word2Vec converts text into dense vector representations, while the Bi-LSTM captures contextual information by processing sequences in both directions. The model is trained on a large dataset of 416,123 labelled samples across six emotion categories.

Keywords : Emotion Detection, Natural Language Processing (NLP), Deep Learning, Word2Vec (CBOW), Bidirectional LSTM (BiLSTM), Text Classification, Sentiment Analysis, Machine Learning.
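To make the abstract's pipeline concrete, the sketch below illustrates the idea of the hybrid architecture: each word is mapped to a dense Word2Vec-style embedding, and a Bi-LSTM reads the embedded sequence left-to-right and right-to-left, concatenating the two final hidden states into a feature vector for emotion classification. This is a toy numpy illustration, not the paper's implementation; the embeddings and LSTM weights are random (in the actual model the embeddings would come from a CBOW-trained Word2Vec model and the weights would be learned), and the vocabulary, dimensions, and example sentence are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "Word2Vec" lookup table: in the paper these dense vectors would be
# trained with the CBOW objective; here they are random placeholders.
EMB_DIM, HID_DIM = 8, 4
vocab = {"i": 0, "feel": 1, "so": 2, "happy": 3}
embeddings = rng.normal(size=(len(vocab), EMB_DIM))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell with fused gate weights (untrained)."""
    def __init__(self, emb_dim, hid_dim):
        self.W = rng.normal(scale=0.1, size=(4 * hid_dim, emb_dim + hid_dim))
        self.b = np.zeros(4 * hid_dim)
        self.hid_dim = hid_dim

    def step(self, x, h, c):
        # One recurrence step: input, forget, cell-candidate, output gates.
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        return h, c

def run_lstm(cell, seq):
    """Run the cell over a sequence and return the final hidden state."""
    h = np.zeros(cell.hid_dim)
    c = np.zeros(cell.hid_dim)
    for x in seq:
        h, c = cell.step(x, h, c)
    return h

sentence = ["i", "feel", "so", "happy"]
seq = [embeddings[vocab[w]] for w in sentence]

# Bidirectionality: two independent LSTMs, one reading the sequence
# forwards and one backwards; their final states are concatenated.
fwd = LSTMCell(EMB_DIM, HID_DIM)
bwd = LSTMCell(EMB_DIM, HID_DIM)
features = np.concatenate([run_lstm(fwd, seq), run_lstm(bwd, seq[::-1])])

print(features.shape)  # (8,) = 2 * HID_DIM
```

In a full model this concatenated feature vector would feed a dense softmax layer over the six emotion classes; frameworks such as Keras or PyTorch provide trained, batched versions of these layers.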

