
Context-aware Interactive Attention for Multi-modal Sentiment and Emotion Analysis

Code for the paper Context-aware Interactive Attention for Multi-modal Sentiment and Emotion Analysis (EMNLP 2019).

For the evaluation of our proposed CIA (Context-aware Interactive Attention) approach, we employ five multi-modal benchmark datasets, i.e., YouTube, MOUD, ICT-MMMO, CMU-MOSI, and CMU-MOSEI.
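As a rough orientation, the sketch below shows one common form of cross-modal (interactive) attention between two modality sequences in Keras. It is a minimal illustration only, not the paper's exact CIA architecture: the sequence length T, feature size D, and the choice of plain dot-product scoring are all assumptions.

    from keras.layers import Input, Activation, Lambda, Concatenate
    from keras.models import Model
    import keras.backend as K

    # Assumed sequence length and feature size; the real values depend on
    # the per-dataset feature extractors.
    T, D = 20, 64

    text = Input(shape=(T, D))    # e.g., textual utterance features
    audio = Input(shape=(T, D))   # e.g., acoustic utterance features

    # Score every text step against every audio step: (batch, T, T) ...
    scores = Lambda(lambda x: K.batch_dot(x[0], x[1], axes=[2, 2]))([text, audio])
    alpha = Activation('softmax')(scores)

    # ... then attend over the audio sequence from the text side: (batch, T, D).
    attended = Lambda(lambda x: K.batch_dot(x[0], x[1], axes=[2, 1]))([alpha, audio])

    # Fuse the original and attended representations for downstream layers.
    fused = Concatenate(axis=-1)([text, attended])
    model = Model(inputs=[text, audio], outputs=fused)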

Datasets

  • You can access these datasets from here, or

  • you can download the datasets from here.

  • Download the datasets from the given links, set the paths in the code accordingly, and create two folders: (i) results and (ii) weights (see the sketch after this list).
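A minimal sketch of this setup step, assuming the hypothetical DATASET_PATH below is edited to point at your local copy of the data:

    import os

    # Hypothetical path to the downloaded data; edit to match where you
    # placed the dataset, and set the same path inside the training scripts.
    DATASET_PATH = "./data/YouTube"

    # The training scripts expect these two folders to exist beforehand.
    for folder in ("results", "weights"):
        if not os.path.isdir(folder):
            os.makedirs(folder)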

How to Run:

For YouTube dataset:

For trimodal: python trimodal_YouTube.py

========================

Versions

python: 2.7

keras: 2.2.2

tensorflow: 1.9.0
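A quick way to confirm your environment matches before running the training scripts (the expected values are the ones listed above):

    import sys
    import keras
    import tensorflow as tf

    # Sanity-check the environment against the versions listed above.
    print(sys.version)        # expect 2.7.x
    print(keras.__version__)  # expect 2.2.2
    print(tf.__version__)     # expect 1.9.0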
