Recognition of Emotions using Energy Based Bimodal Information Fusion and Correlation

Author
Keywords
Abstract
Multi-sensor information fusion is a rapidly developing research area that underpins numerous essential technologies, such as intelligent robotic control, sensor networks, and video and image processing. In this paper, we develop a novel technique to analyze and correlate human emotions expressed in voice tone and facial expression. Audio and video streams are captured to populate bimodal data sets for sensing the emotions expressed in voice tone and facial expression, respectively. An energy-based mapping is performed to overcome the inherent heterogeneity of the recorded bimodal signals. The fusion process uses the sampled and mapped energy signals of both modalities' data streams and then recognizes the overall emotional state using a Support Vector Machine (SVM) classifier, achieving an accuracy of 93.06%.
Year of Publication
2014
Journal
International Journal of Interactive Multimedia and Artificial Intelligence
Volume
2
Issue
Special Issue on Multisensor User Tracking and Analytics to Improve Education and other Application Fields
Number
7
Number of Pages
17-21
Date Published
09/2014
ISSN Number
1989-1660
Citation Key
URL
http://www.ijimai.org/JOURNAL/sites/default/files/files/2014/09/ijimai20142_7_2_pdf_26941.pdf
DOI
10.9781/ijimai.2014.272
Attachment