
Year: 2008-
Members:
- Anna Gruebler
- Kenji Suzuki
Partners:
- Institute for Developmental Study
- Private Company (Cosmetics)
Tags:
- Social Playware
- Cybernics
- Augmented Human

Emotion Reader
A wearable interface for reading facial expressions

Facial expressions play a significant role in interpersonal communication by conveying additional information about the emotional state or intentions of the person displaying them. The traditional approach to recognizing emotional facial expressions uses video or photographic cameras, followed by computer vision algorithms that identify the expressions. These methods, however, require a camera directed at the person's face and tolerate little occlusion or variation in lighting conditions and camera angle.

The purpose of this study is to create an emotion reading system that recognizes the subject's emotions in real time and can present the output in different formats. The system must also be unobtrusive to the user, must not inhibit expressions, and must work under changing lighting conditions and subject positions. Facial expressions can also be recognized by analyzing facial surface electromyographic (EMG) signals. This work proposes using "cross-talk", the bioelectrical potentials captured on areas at the side of the face, to obtain information about facial expressions, which makes emotion reading unobtrusive. Because cross-talk is a mixture of signals from several facial muscles, the sampled signal must first be transformed. We propose a classification method that combines two techniques: Independent Component Analysis (ICA) to separate the signals into independent components, and an Artificial Neural Network (ANN) to accurately identify the facial expression.
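As a rough illustration of this two-stage pipeline, the following minimal sketch unmixes the cross-talk with ICA and classifies per-window features with a small neural network. It assumes scikit-learn; the electrode count, window length, RMS features, and expression labels are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the ICA + ANN pipeline described above, using
# scikit-learn's FastICA and MLPClassifier. Channel count, window length,
# RMS features, and class labels are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

N_CHANNELS = 4   # assumed number of electrodes at the side of the face
WINDOW = 256     # assumed analysis window length in samples

def window_features(ica, window):
    """RMS of each independent component over one analysis window."""
    sources = ica.transform(window)          # unmix the cross-talk signals
    return np.sqrt(np.mean(sources ** 2, axis=0))

def train(windows, labels):
    """windows: list of (WINDOW, N_CHANNELS) arrays; labels: e.g. 'smile'."""
    # Learn the unmixing matrix once from the concatenated training signal.
    ica = FastICA(n_components=N_CHANNELS, random_state=0)
    ica.fit(np.vstack(windows))
    features = np.array([window_features(ica, w) for w in windows])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(features, labels)
    return ica, clf

def classify(ica, clf, window):
    """Classify a single incoming EMG window in real time."""
    return clf.predict(window_features(ica, window).reshape(1, -1))[0]
```

In this sketch, ICA is fitted once on training data so that the same unmixing matrix can be applied cheaply to each incoming window, which is what allows per-window classification in real time.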

The goal of this research is to develop an emotional communication aid that improves human-human communication. The Emotion Reader has applications in several areas, especially in therapy and assistive technology. It can provide biofeedback to patients during rehabilitation and quantitative smile evaluation during smile training. It can also aid the visually impaired: a listener can perceive the speaker's facial expressions through alternative channels such as audio or vibro-tactile stimulation. It is likewise a tool for e-learning, distance communication, and computer games, because it can transmit the facial expression automatically without requiring high bandwidth. Another application lies in improving the quality of life of patients with facial paralysis, where the signals obtained from the healthy side of the face can be used to control a robotic mask that produces an artificial smile on the paralyzed side.
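To make the low-bandwidth point concrete, a recognized expression can be transmitted as a single byte per analysis window instead of a video stream. The sketch below is purely illustrative: the label table and socket transport are hypothetical, not part of the published system.

```python
# Illustrative only: sending the recognized expression as one byte per
# classified window rather than streaming video. The label table and
# transport are hypothetical assumptions.
import socket

EXPRESSIONS = {0: "neutral", 1: "smile", 2: "frown"}   # hypothetical labels
CODES = {name: code for code, name in EXPRESSIONS.items()}

def send_expression(sock: socket.socket, expression: str) -> None:
    sock.send(bytes([CODES[expression]]))     # one byte per classification

def receive_expression(sock: socket.socket) -> str:
    return EXPRESSIONS[sock.recv(1)[0]]
```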

This work is partly supported by Grants-in-Aid for Scientific Research, MEXT, Japan.

This study was supported in part by the Global COE Program on "Cybernics: fusion of human, machine, and information systems."

This research is partly supported by the BAMIS Center, University of Tsukuba.

Publications
  • Gruebler, A. and Suzuki, K., "Design of a Wearable Device for Reading Positive Expressions from Facial EMG Signals," IEEE Transactions on Affective Computing, 5(3):227-237, 2014.
  • Gruebler, A. and Suzuki, K., "Analysis of Social Smile Sharing Using a Wearable Device that Captures Distal Electromyographic Signals," Proc. of Third International Conference on Emerging Security Technologies, pp. 178-181, 2012.
  • Gruebler, A. and Suzuki, K., "Measurement of Distal EMG Signals Using a Wearable Device for Reading Facial Expressions," Proc. of Annual International Conference of the IEEE EMBS, pp. 4594-4597, 2010.
  • Gruebler, A. and Suzuki, K., "A Wearable Device for Human-Robot Emotional Interaction Using Distal Facial EMG Signals," Proc. of International Session of the 28th Annual Conference of the Robotics Society of Japan (RSJ 2010), 2010.
  • Gruebler, A. and Suzuki, K., "A Wearable Interface for Reading Facial Expressions Based on Bioelectrical Signals," Proc. of Intl Conf. on Kansei Engineering and Emotional Research, Online Proceedings, 2010.