Perelman School of Medicine at the University of Pennsylvania

Section for Biomedical Image Analysis (SBIA)

participating with CBICA

Facial Expression Analysis in Neuropsychiatric Disorders

Facial expressions provide a window into the affective state, cognitive activity, temperament and, perhaps, personality and psychopathology of an individual. Facial expressions are increasingly used in the clinical investigation of neuropsychiatric disorders that affect the perception and expression of emotions. In this setting, affect recognition has proved relatively tractable for quantitative research, whereas the difficulty of quantifying expressions has emerged as a major obstacle to progress. Because clinicians currently rely on purely manual and typically subjective ratings of expressions, clinical research in schizophrenia and affective disorders has focused on patients' perception and recognition capabilities relative to healthy controls, and not so much on the way in which patients express emotions differently from healthy controls. The primary goal of this project is the development of objective, automated methods for quantifying expressions from 2D and 3D image data.

The specific goals of the project are:


  1. To develop a computer-assisted method, using 2D images of the face and faces modeled as 3D surfaces, for quantifying changes in facial expression (a) between various intensities of the same emotion and (b) across individuals expressing the same or different emotions. We are using high-dimensional shape transformations to relate different expression states.
Figure: Expression displacement from neutral to happiness. The left two images are neutral (a) and happy (b) facial images. The expression displacement field (c) characterizes the motion of facial regions using vectors. Panel (d) quantifies this displacement field and represents a spatial profile of the expansion and contraction of the facial regions, as part of the expression change.
  2. To develop a fully automated method for expression quantification using video sequences of the face undergoing a change in expression. We will incorporate the spatial information with the inherent temporal information of the video to define a framework for fine and continuous tracking and quantification of the expression change. We hypothesize that the expression quantification provided by this full spatio-temporal analysis of facial expression will be more comprehensive and distinctive than the comparison between two or three distinct states. We are using a combination of geometric and texture features to model the face, which is then tracked through a video. We obtain a probabilistic profile of changes in different expressions.

  3. To extensively validate these methods and test their applicability in clinical investigations. Specifically, we propose:
  • To validate the methods against currently established clinical scales of expression rating.
  • To distinguish between patients clinically diagnosed with “flat” or “inappropriate” affect and healthy controls, and to validate our findings against clinically established results.
  • To study the ability of our method to obtain quantitative measures that go beyond what is currently feasible with clinically established techniques, with emphasis on detecting subtle affect abnormalities which are expected to characterize and differentiate healthy individuals from members of “high risk” populations (such as family members of patients). These measures could potentially serve as endophenotypic markers for genetic studies.
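The spatial profile of expansion and contraction derived from an expression displacement field can be summarized by the divergence of that field: positive values mark locally expanding facial regions, negative values contracting ones. A minimal numpy sketch of this idea (the synthetic radial field and grid size are illustrative assumptions, not project data):

```python
import numpy as np

def expansion_map(dx, dy):
    """Divergence of a 2D displacement field on a regular pixel grid.

    dx, dy: arrays of per-pixel displacement components (neutral -> expressive).
    Positive divergence marks locally expanding facial regions (e.g. a
    widening mouth), negative divergence marks contracting ones.
    """
    ddx = np.gradient(dx, axis=1)  # d(dx)/dx
    ddy = np.gradient(dy, axis=0)  # d(dy)/dy
    return ddx + ddy

# Toy field: uniform radial expansion away from the image centre.
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w]
dx = 0.1 * (xs - w / 2)
dy = 0.1 * (ys - h / 2)
div = expansion_map(dx, dy)
# A purely expanding field has positive divergence everywhere (here 0.2).
print(div.mean())
```

In practice the displacement field would come from the shape transformation between the neutral and expressive images; here a synthetic field stands in for it.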

We are currently using manifold-based methods to incorporate information from different expression trajectories created from the videos to determine group differences between patients and controls.
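A toy sketch of comparing per-video expression trajectories across groups: each video becomes a sequence of feature vectors, trajectories are resampled to a common length, and a simple frame-wise distance separates the groups. The features, the distance, and the "damped" patient trajectories are illustrative assumptions, not the project's actual manifold method:

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(traj, n=20):
    """Linearly resample a (T, d) feature trajectory to n time points."""
    t_old = np.linspace(0, 1, len(traj))
    t_new = np.linspace(0, 1, n)
    return np.stack([np.interp(t_new, t_old, traj[:, j])
                     for j in range(traj.shape[1])], axis=1)

def traj_dist(a, b):
    """Mean frame-wise Euclidean distance between two resampled trajectories."""
    a, b = resample(a), resample(b)
    return np.linalg.norm(a - b, axis=1).mean()

def make_traj(scale):
    """Toy video trajectory: a noisy arc in a 2D feature space.

    `scale` damps the arc's amplitude, a crude stand-in for "flat" affect.
    """
    t = np.linspace(0, np.pi, rng.integers(15, 30))
    base = np.stack([np.sin(t), np.cos(t)], axis=1) * scale
    return base + 0.05 * rng.standard_normal(base.shape)

controls = [make_traj(1.0) for _ in range(10)]
patients = [make_traj(0.5) for _ in range(10)]

# Group separation: between-group distances should exceed within-group ones.
within = np.mean([traj_dist(a, b) for i, a in enumerate(controls)
                  for b in controls[i + 1:]])
between = np.mean([traj_dist(a, b) for a in controls for b in patients])
print(within < between)  # expected: True
```

Real trajectories would live in a much higher-dimensional feature space, and distances would be computed on the learned manifold rather than frame-wise in the ambient space.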


Publications:

  1. Ragini Verma, Christos Davatzikos, James Loughead, Tim Indersmitten, Ranliang Hu, Christian Kohler, Raquel E. Gur and Ruben C. Gur, "Quantification of facial expressions using high-dimensional shape transformations", Journal of Neuroscience Methods, Vol. 141, No. 1, January 2005.
  2. C. G. Kohler, E. A. Martin, N. Stolar, F. S. Barrett, R. Verma, C. Brensinger, W. Bilker, R. E. Gur and R. C. Gur, "Static Posed and Evoked Facial Expressions of Emotions in Schizophrenia", Schizophrenia Research, in press, 2008.
  3. Peng Wang, Christian Kohler, Elizabeth Martin, Neal Stolar and Ragini Verma, "Learning-based Analysis of Emotional Impairments in Schizophrenia", IEEE Computer Society Workshop on Mathematical Methods in Biomedical Image Analysis (MMBIA), Anchorage, Alaska, June 27-28, 2008.
  4. P. Wang, F. Barrett, E. Martin, M. Milanova, R. E. Gur, R. C. Gur, C. Kohler and R. Verma, "Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders", Journal of Neuroscience Methods, Vol. 168, No. 1, pp. 224-238, February 2008.
  5. Peng Wang, Fred Barrett, Christian Kohler, Raquel E. Gur, Ruben C. Gur and Ragini Verma, "Quantifying Facial Expression Abnormality in Schizophrenia", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 2007.
  6. Christian G. Kohler, Elizabeth A. Martin, Marina Milonova, Peng Wang, Ragini Verma, Colleen M. Brensinger, Warren Bilker, Raquel E. Gur and Ruben C. Gur, "Dynamic evoked facial expressions of emotions in schizophrenia", Schizophrenia Research, Vol. 105, No. 1-3, pp. 30-39, 2008.
  7. Christopher Alvino, Christian Kohler, Frederick Barrett, Raquel E. Gur, Ruben C. Gur and Ragini Verma, "Computerized Measurement of Facial Expression of Emotions in Schizophrenia", Journal of Neuroscience Methods, Vol. 163, No. 2, pp. 350-361, 2007.
  8. Jihun Hamm, Christian G. Kohler, Ruben C. Gur and Ragini Verma, "Automated Facial Action Coding System for dynamic analysis of facial expressions in neuropsychiatric disorders", Journal of Neuroscience Methods, Vol. 200, No. 2, pp. 237-256, September 2011.