Quantum neural networks for multimodal sentiment, emotion, and sarcasm analysis

Jaiteg Singh, Kamalpreet Singh Bhangu, Abdulrhman Alkhanifer, Ahmad Ali AlZubi, Farman Ali

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Sentiment, emotion, and sarcasm analysis in multimodal dialogues is crucial for understanding the underlying intentions and attitudes expressed by individuals. Traditional methods often struggle to capture the full intensity of these polarities, leading to less accurate results. To address this limitation, we propose a quantum-inspired approach leveraging Quantum Neural Networks (QNNs) for enhanced classification and intensity analysis. A key component of our method is the Variational Quantum Eigensolver (VQE), a hybrid quantum-classical algorithm that optimizes the parameters of the QNN by minimizing the eigenvalues of a Hamiltonian system. This optimization enables the network to learn complex relationships in multimodal data more effectively. Our approach surpasses state-of-the-art methods, achieving up to 7.5 % higher accuracy and 6.8 % greater precision. Experiments on benchmark datasets such as MUStARD, Memotion, CMU-MOSEI, and MELD demonstrate its effectiveness, with an F1-score of 87.3 % on CMU-MOSEI. This method is particularly beneficial in domains like social media, customer support, and entertainment, where both verbal and non-verbal cues play a critical role in accurate sentiment analysis.
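The abstract's core idea — a classical optimizer tuning quantum circuit parameters to minimize a Hamiltonian expectation value — can be sketched in plain NumPy. This is a toy illustration only: the one-qubit Hamiltonian, the Ry ansatz, and the grid-search optimizer below are all illustrative assumptions, not the paper's actual multimodal QNN.

```python
import numpy as np

# Pauli matrices for a toy one-qubit Hamiltonian H = Z + 0.5 * X (an assumption)
Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
H = Z + 0.5 * X

def ansatz(theta):
    """Single-qubit Ry(theta) ansatz applied to |0>: [cos(t/2), sin(t/2)]."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """VQE objective: the expectation value <psi(theta)| H |psi(theta)>."""
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# Classical outer loop: here a coarse grid search over the single parameter
# stands in for the gradient-based optimizers used in practice.
thetas = np.linspace(0, 2 * np.pi, 1000)
energies = [energy(t) for t in thetas]
best_theta = thetas[int(np.argmin(energies))]

# Compare the variational minimum against the exact ground-state eigenvalue.
exact = np.linalg.eigvalsh(H)[0]
print(f"VQE estimate: {min(energies):.4f}, exact ground energy: {exact:.4f}")
```

For this Hamiltonian the Ry ansatz can reach the true ground state, so the grid-search minimum matches the exact lowest eigenvalue closely; in the paper's setting the minimized eigenvalue instead drives the QNN's parameter updates for classification.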

Original language: English
Pages (from-to): 170-187
Number of pages: 18
Journal: Alexandria Engineering Journal
Volume: 124
State: Published - Jun 2025

Keywords

  • Emotion quantification
  • Multimodal dialogue
  • Quantum cognition
  • Quantum Neural Networks (QNN)
  • Variational Quantum Eigensolver (VQE)
