Leveraging the Bayesian Filtering Paradigm for Vision-Based Facial Affective State Estimation

This publication appears in: IEEE Transactions on Affective Computing
Authors: M. Oveneke, I. Gonzalez, V. Enescu, D. Jiang and H. Sahli
Volume: 9
Issue: 4
Pages: 463-477
Publication Date: Oct. 2018
Abstract: Estimating a person's affective state from facial information is an essential capability for social interaction. Automating such a capability has therefore increasingly driven multidisciplinary research over the past decades. At the heart of this issue are very challenging signal processing and artificial intelligence problems driven by the inherent complexity of human affect. We therefore propose a principled framework for designing automated systems capable of continuously estimating the human affective state from an incoming stream of images. First, we model human affect as a dynamical system and define the affective state in terms of valence, arousal and their higher-order derivatives. We then pose the affective state estimation problem as a Bayesian filtering problem and provide a solution based on Kalman filtering (KF) for probabilistic reasoning over time, combined with multiple instance sparse Gaussian processes (MI-SGP) for inferring affect-related measurements from image sequences. We quantitatively and qualitatively evaluate our proposed framework on the AVEC 2012 and AVEC 2014 benchmark datasets and obtain state-of-the-art results using the baseline features as input to our MI-SGP-KF model. We therefore believe that leveraging the Bayesian filtering paradigm can pave the way for further enhancing the design of automated systems for affective state estimation.
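To make the filtering idea in the abstract concrete, the following is a minimal sketch of a linear-Gaussian Kalman filter over an affective state containing valence, arousal and their first derivatives, with a per-frame measurement of valence and arousal standing in for the paper's MI-SGP regressor. The transition and noise matrices, the assumed frame rate, and the predict_measurement() placeholder are illustrative assumptions and not the authors' exact formulation.

import numpy as np

# Assumed frame rate of the incoming image stream (not from the paper)
dt = 1.0 / 25.0

# State x = [valence, arousal, d(valence)/dt, d(arousal)/dt]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])   # assumed constant-velocity transition model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])    # only valence and arousal are measured
Q = 1e-3 * np.eye(4)            # process noise covariance (assumed)
R = 1e-1 * np.eye(2)            # measurement noise covariance (assumed)

x = np.zeros(4)                 # initial affective state estimate
P = np.eye(4)                   # initial state covariance


def predict_measurement(frame):
    """Placeholder for a per-frame valence/arousal regressor.
    The paper infers these measurements with MI-SGP on image features;
    here we simply return a dummy value."""
    return np.zeros(2)


def kalman_step(x, P, z):
    # Predict the next state from the dynamics model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measurement z = [valence, arousal]
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new


# Continuous estimation over an incoming stream of frames
for frame in []:  # replace [] with an actual image stream
    z = predict_measurement(frame)
    x, P = kalman_step(x, P, z)
    valence, arousal = x[0], x[1]

The sketch only illustrates the probabilistic reasoning over time; the quality of the estimates in the paper comes from combining this temporal model with the MI-SGP measurement model applied to the AVEC baseline features.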