ISSN 2394-5125
 


    An Intelligent Framework for CNN-Based Multimodal Sentiment Analysis (2020)


    Lakshmi Manikyamba
    JCR. 2020: 4632-4644

    Abstract

    Many social media analytics tasks rely heavily on sentiment analysis of user-generated textual content from the internet. Adding pictures and videos to social media posts has become common practice among today's internet users as a means of conveying their points of view and experiences. Sentiment analysis of large-scale textual and visual data helps extract users' feelings about brands or topics. An intelligent multimodal sentiment analysis framework is urgently needed to effectively mine information from several modalities and keep up with the explosion of massive multimodal data. Previous research has mostly dealt with either textual or visual material. This study proposes a novel framework for multimodal sentiment analysis that uses a convolutional neural network (CNN) to extract features from several modalities. To demonstrate the effectiveness of our models, we conduct multimodal sentiment analysis on two publicly accessible datasets containing text, audio, and visual input.
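
    The abstract describes the framework only at a high level. The following is a minimal sketch, assuming PyTorch, of what CNN-based per-modality feature extraction with simple feature-level fusion for sentiment classification can look like; it is not the authors' released code, and all module names, layer sizes, and the concatenation-based fusion strategy are illustrative assumptions.

    # Minimal sketch (illustrative only): per-modality CNN feature extractors
    # followed by feature-level fusion and a sentiment classifier.
    import torch
    import torch.nn as nn


    class TextCNN(nn.Module):
        """1-D CNN over word-embedding sequences (text modality)."""

        def __init__(self, vocab_size=10000, embed_dim=100, n_filters=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size=3, padding=1)
            self.pool = nn.AdaptiveMaxPool1d(1)

        def forward(self, token_ids):                  # (batch, seq_len)
            x = self.embed(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
            x = torch.relu(self.conv(x))
            return self.pool(x).squeeze(-1)            # (batch, n_filters)


    class VisualCNN(nn.Module):
        """2-D CNN over image frames (visual modality)."""

        def __init__(self, n_filters=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, n_filters, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )

        def forward(self, images):                     # (batch, 3, H, W)
            return self.net(images).flatten(1)         # (batch, n_filters)


    class AudioCNN(nn.Module):
        """1-D CNN over acoustic frame features, e.g. MFCCs (audio modality)."""

        def __init__(self, n_mfcc=40, n_filters=64):
            super().__init__()
            self.conv = nn.Conv1d(n_mfcc, n_filters, kernel_size=5, padding=2)
            self.pool = nn.AdaptiveMaxPool1d(1)

        def forward(self, mfcc):                       # (batch, n_mfcc, frames)
            return self.pool(torch.relu(self.conv(mfcc))).squeeze(-1)


    class MultimodalSentimentNet(nn.Module):
        """Concatenates per-modality CNN features and classifies sentiment."""

        def __init__(self, n_classes=2):
            super().__init__()
            self.text, self.visual, self.audio = TextCNN(), VisualCNN(), AudioCNN()
            self.classifier = nn.Linear(64 * 3, n_classes)

        def forward(self, token_ids, images, mfcc):
            fused = torch.cat(
                [self.text(token_ids), self.visual(images), self.audio(mfcc)], dim=1
            )
            return self.classifier(fused)              # sentiment logits


    if __name__ == "__main__":
        model = MultimodalSentimentNet()
        logits = model(
            torch.randint(0, 10000, (2, 30)),          # dummy token ids
            torch.randn(2, 3, 64, 64),                 # dummy image batch
            torch.randn(2, 40, 100),                   # dummy MFCC batch
        )
        print(logits.shape)                            # torch.Size([2, 2])

    In a variant closer to the keywords listed below, the linear classifier head could be replaced by an SVM trained on the fused CNN features; that pairing is a common design in CNN-based multimodal sentiment work, but it is an assumption here rather than a detail taken from the abstract.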

    Keywords

    Sentiment analysis, multimodal data, CNN, SVM


    Volume & Issue

    Volume 7, Issue 4