Facial Expression Analysis in Neuropsychiatric Disorders

Facial expressions provide a window into an individual's affective state, cognitive activity, temperament, and perhaps personality and psychopathology. As facial expressions are used increasingly in the clinical investigation of neuropsychiatric disorders affecting the perception and expression of emotions, affect recognition has proved relatively tractable for quantitative research, while the quantification of expressions has emerged as a major obstacle to progress in this area. Because clinicians currently rely on purely manual and typically subjective methods of rating expressions, clinical research in schizophrenia and affective disorders has focused on patients' perception and recognition capabilities compared to healthy controls, and not so much on the ways in which patients express emotions differently from healthy controls. The primary goal of this project is the development of objective, automated methods of expression quantification from 2D and 3D image data.

The specific goals of the project are:

  1. To develop a computer-assisted method, using 2D images of the face and faces modeled as 3D surfaces, for quantifying changes in facial expression (a) between various intensities of the same emotion and (b) across individuals expressing the same or different emotions. We are using high-dimensional shape transformations to relate different expression states.
Expression displacement from neutral to happiness. The left two images are the neutral (a) and happy (b) facial images. The expression displacement field (c) characterizes the motion of facial regions using vectors. Panel (d) quantifies this displacement field, representing a spatial profile of the expansion and contraction of facial regions as part of the expression change.
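As a minimal sketch of how the expansion/contraction profile in panel (d) could be derived, assuming a dense 2D displacement field has already been obtained (e.g. from a non-rigid registration of the neutral and happy images, which is not shown here): the divergence of the displacement field is positive where facial regions expand and negative where they contract.

```python
import numpy as np

def expansion_profile(dx, dy):
    """dx, dy: H x W arrays giving per-pixel displacement (neutral -> happy).
    Returns the divergence of the field: >0 = expansion, <0 = contraction."""
    ddx_dcol = np.gradient(dx, axis=1)   # d(dx)/dx (columns)
    ddy_drow = np.gradient(dy, axis=0)   # d(dy)/dy (rows)
    return ddx_dcol + ddy_drow

# Toy field: uniform expansion away from the image centre
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w]
dx = 0.1 * (xs - w / 2)
dy = 0.1 * (ys - h / 2)
div = expansion_profile(dx, dy)   # constant positive divergence: pure expansion
```

In a real pipeline the field `(dx, dy)` would come from the shape transformation relating the two expression states; the toy linear field here only serves to check the sign convention.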
  1. To develop a fully automated method for expression quantification using video sequences of the face undergoing a change in expression. We will combine the spatial information with the inherent temporal information of the video to define a framework for fine, continuous tracking and quantification of the expression change. We hypothesize that the expression quantification provided by this full spatio-temporal analysis will be more comprehensive and distinctive than a comparison between two or three distinct states. We are using a combination of geometric and texture features to model the face, which is then tracked through a video, and we obtain a probabilistic profile of changes in different expressions.

  1. To extensively validate these methods and test their applicability in clinical investigations. Specifically, we propose:
  • To validate the methods against currently established clinical scales of expression rating.
  • To distinguish between patients clinically diagnosed with “flat” or “inappropriate” affect and healthy controls, and to validate our findings against clinically established results.
  • To study the ability of our method to obtain quantitative measures that go beyond what is currently feasible with clinically established techniques, with emphasis on detecting subtle affect abnormalities that are expected to characterize and differentiate healthy individuals from members of “high risk” populations (such as family members of patients). These measures could potentially serve as endophenotypic markers for genetic studies.
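A hypothetical sketch of the spatio-temporal quantification described in goal 2, assuming facial landmarks have already been tracked through a video (an array of T frames by K landmarks by 2 coordinates; the tracking itself is outside this snippet). Rather than comparing two static states, it summarizes the whole trajectory of the expression change frame by frame.

```python
import numpy as np

def expression_trajectory(landmarks):
    """landmarks: (T, K, 2) array of tracked positions; frame 0 assumed neutral.
    Returns a length-T curve of mean landmark displacement from neutral."""
    disp = landmarks - landmarks[0]                  # (T, K, 2) displacements
    return np.linalg.norm(disp, axis=2).mean(axis=1)

# Toy sequence: 20 landmarks drifting linearly over 10 frames
rng = np.random.default_rng(0)
neutral = rng.uniform(0, 100, size=(20, 2))
motion = rng.normal(0, 1, size=(20, 2))
frames = np.stack([neutral + t * motion for t in range(10)])
curve = expression_trajectory(frames)                # rising displacement profile
```

The resulting curve is one simple per-subject descriptor of an expression change; richer descriptors (per-region profiles, texture features) would follow the same pattern.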

We are currently using manifold-based methods to incorporate information from different expression trajectories created from the videos to determine group differences between patients and controls.
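The manifold methods themselves are beyond a short snippet, but a simple stand-in (classical multidimensional scaling, implemented here with plain NumPy) illustrates the general idea: embed each subject's expression trajectory, summarized by pairwise trajectory-to-trajectory distances, into a low-dimensional space where group differences can be examined. The feature vectors below are synthetic placeholders, not the project's actual trajectory features.

```python
import numpy as np

def classical_mds(dist, n_components=2):
    """dist: (n, n) symmetric matrix of distances between trajectories.
    Returns an (n, n_components) low-dimensional embedding."""
    n = dist.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    b = -0.5 * j @ (dist ** 2) @ j             # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# Toy example: two well-separated groups of trajectory descriptors in 5-D
rng = np.random.default_rng(1)
group_a = rng.normal(0, 0.1, size=(10, 5))     # e.g. "controls"
group_b = rng.normal(3, 0.1, size=(10, 5))     # e.g. "patients"
feats = np.vstack([group_a, group_b])
dist = np.linalg.norm(feats[:, None] - feats[None, :], axis=2)
emb = classical_mds(dist)                      # (20, 2) embedding
```

In the embedded space the two synthetic groups separate along the first coordinate; a nonlinear method (e.g. Isomap) would replace `classical_mds` when the trajectories lie on a curved manifold.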
