AUDITORIUM PRESENTATION

A Virtual Character System for Understanding Complex Changes in Mood and Affect

Nicholas Woolridge, University of Toronto, Department of Surgery, Division of Biomedical Communication, Toronto, Canada

ABSTRACT:

The World Health Organization now ranks affective disorders (ADs hereafter: depression and bipolar disorder, plus variants) fourth among the leading causes of global illness burden, with some 20% of all individuals affected and 60-75% of all suicides occurring in AD sufferers. Moreover, the AD illness burden is rising in modern health care systems, where fewer than half of those who suffer major depression, and only about 60% of those with bipolar disorder, receive medical attention. Improving the education of health care providers, patients, and families in recognizing the symptoms and course of ADs remains a key goal. Recent work has established that the temporal progression of ADs is highly complex, involving intricate cascades of change in mood quality and mood intensity on many time scales, rather than a simple shift in average mood level. Technology-based multimedia may improve the medical student's grasp of these complex temporal patterns and dynamical properties. To better document the temporal patterns of mood change in AD, we are following groups of normal control (N=20) and rapidly cycling bipolar (N=20) subjects for a period of 18 months using a newly designed, 17-item mood symptom rating questionnaire. Each item is presented as a visual analog scale on a PalmOS-enabled cellular telephone, and the questionnaire activates twice each day. After each session, the subject's response data are automatically encrypted and uploaded over the cellular network to a central database for processing and analysis.
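The twice-daily sampling scheme described above can be sketched in outline. The field names, the 0-100 scale range, and the JSON encoding below are illustrative assumptions rather than the study's actual implementation, and a real deployment would apply genuine encryption before upload, which this sketch deliberately omits.

```python
import json
import time

NUM_ITEMS = 17             # 17-item mood symptom questionnaire (per the abstract)
VAS_MIN, VAS_MAX = 0, 100  # assumed numeric range for the visual analog scale


def make_session_record(subject_id, responses, timestamp=None):
    """Package one twice-daily questionnaire session for upload.

    `responses` is a list of 17 visual-analog-scale readings.
    """
    if len(responses) != NUM_ITEMS:
        raise ValueError(f"expected {NUM_ITEMS} items, got {len(responses)}")
    for r in responses:
        if not (VAS_MIN <= r <= VAS_MAX):
            raise ValueError(f"VAS reading {r} out of range")
    return {
        "subject_id": subject_id,
        "timestamp": timestamp if timestamp is not None else time.time(),
        "responses": list(responses),
    }


def serialize_for_upload(record):
    """Serialize to bytes; the real system encrypts before transmission."""
    return json.dumps(record, sort_keys=True).encode("utf-8")


record = make_session_record("S001", [50] * NUM_ITEMS, timestamp=0)
payload = serialize_for_upload(record)
```

Collecting each session as a self-describing record like this keeps the central database's processing independent of the handset software version.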

In this report we introduce a method that uses real-time, computer-generated facial animation to visualize the meaning of these complex, multivariate mood change data. The method translates the mood data time series into changes in emotional expression on a 3D virtual face. An idealized computer-generated 3D head has been designed and implemented in the Cinema 4D XL animation system, and key frame poses representing a wide variety of facial mood behaviors have been implemented for the virtual character. A "slicer interface" then maps a specified stream of mood time series data onto a trajectory through the space of key expressive poses, immersing the observer in the mood change pattern in real time. By adjusting the relation between the animation rate and the rate of passage through the mood data time series, dynamic patterns of mood change that normally evolve over months or years can be readily visualized in the laboratory or classroom setting.
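The mapping from a mood time series to a trajectory through pose space can be sketched as key-frame blending. The pose vectors, the two-segment neutral-to-extreme blend, and the time-compression factor below are simplifying assumptions for illustration, not the Cinema 4D XL implementation itself.

```python
# Key frame poses as vectors of facial control parameters
# (e.g., brow position, mouth curve, eye openness); the specific
# names and values here are hypothetical.
POSE_DEPRESSED = [-0.8, -0.6, 0.1]
POSE_NEUTRAL   = [0.0, 0.0, 0.0]
POSE_ELATED    = [0.9, 0.7, 0.4]


def blend(pose_a, pose_b, t):
    """Linearly interpolate between two key frame poses, t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(pose_a, pose_b)]


def mood_to_pose(mood):
    """Map a mood score in [-1, 1] onto the pose trajectory:
    negative scores blend neutral -> depressed,
    positive scores blend neutral -> elated."""
    if mood < 0:
        return blend(POSE_NEUTRAL, POSE_DEPRESSED, -mood)
    return blend(POSE_NEUTRAL, POSE_ELATED, mood)


def animate(mood_series, sample_interval_s, speedup):
    """Compress months of data into a short playback: each sample
    occupies sample_interval_s / speedup seconds of animation time."""
    frame_dt = sample_interval_s / speedup
    t = 0.0
    for mood in mood_series:
        yield t, mood_to_pose(mood)
        t += frame_dt


# Twice-daily samples (43200 s apart) replayed at 43200x:
# one sample per second of playback.
frames = list(animate([0.0, 1.0, -1.0], 43200, 43200))
```

Linear blending between a neutral pose and a single extreme per mood axis is the simplest choice; a production system would blend many key expressive poses simultaneously across the questionnaire's dimensions.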

Our presentation will review our wireless-protocol mood database design, present the facial animation models and method, demonstrate the real-time animation system, and summarize our plans for assessment of the method's efficacy as an AD teaching aid.

Work supported in part by the Bell University Laboratories program in the Faculty of Medicine.

BENEFIT TO PARTICIPANTS ATTENDING SESSION: Our presentation will review our protocol for wireless collection of mood data, present a system for visualization of mood data through facial animation, and summarize our plans for assessment of the method's efficacy as an affective disorders teaching aid.

Charles J. Lumsden
MSB, 7th Floor
1 King's College Circle
Toronto, ON
M5S 1A8
Phone: 416-978-7178
Fax: 416-978-3701
Email: charles.lumsden@utoronto.ca

CO-AUTHORS:
Nicholas Woolridge
David Kreindler
BMC
1 King's College Circle
Toronto, ON
Phone: 416-978-3910
Fax: 416-978-6891
Email: n.woolridge@utoronto.ca
david.kreindler@utoronto.ca
http://www.surg.med.utoronto.ca/BMC/BMCfaculty/Woolridge.html