Affect intensity estimation using multiple modalities

Document Type

Conference Proceeding

Publication Date

1-1-2014

Abstract

One of the challenges in affect recognition is accurate estimation of the emotion intensity level. This research proposes the development of an affect intensity estimation model based on a weighted sum of classification confidence levels, displacement of feature points, and speed of feature point motion. The parameters of the model were calculated from data captured using multiple modalities such as face, body posture, hand movement, and speech. A preliminary study was conducted to compare the accuracy of the model against the annotated intensity levels. An emotion intensity scale ranging from 0 to 1 along the arousal dimension of the emotion space was used. Results indicated that the speech and hand modalities contributed significantly to improving the accuracy of emotion intensity estimation with the proposed model.
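The weighted-sum model described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weight values, the [0, 1] normalization of each cue, and the simple averaging across modalities are all assumptions made for the example.

```python
# Hypothetical sketch of a weighted-sum intensity model: intensity is a
# weighted combination of classifier confidence, feature-point displacement,
# and feature-point speed, clipped to the [0, 1] arousal scale.
# The weights and per-modality input values below are illustrative only.

def estimate_intensity(confidence, displacement, speed,
                       weights=(0.5, 0.3, 0.2)):
    """Combine normalized cues (each assumed in [0, 1]) into an intensity in [0, 1]."""
    w_c, w_d, w_s = weights
    raw = w_c * confidence + w_d * displacement + w_s * speed
    return min(1.0, max(0.0, raw))

# Illustrative fusion: score each modality separately, then average.
modalities = {
    "face":    estimate_intensity(0.9, 0.6, 0.4),
    "posture": estimate_intensity(0.7, 0.5, 0.3),
    "hand":    estimate_intensity(0.8, 0.7, 0.6),
    "speech":  estimate_intensity(0.85, 0.4, 0.5),
}
overall = sum(modalities.values()) / len(modalities)
```

With weights that sum to 1 and cues normalized to [0, 1], the combined score stays within the 0-to-1 intensity scale used along the arousal dimension.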

Publication Source (Journal or Book title)

Proceedings of the 27th International Florida Artificial Intelligence Research Society Conference, FLAIRS 2014

First Page

130

Last Page

133
