Russian Researchers Propose New Approach to Studying Facial Emotion Recognition
Researchers from HSE University and Southern Federal University (SFedU) have tested a new method for studying the perception of facial emotional expressions. They suggest that asking subjects to recognise emotional expressions from dynamic video clips rather than static photographs can improve the accuracy of findings, for example in psychiatric and neurological studies. The paper is published in Applied Sciences.
Charles Darwin was one of the first scientists to explicitly describe facial expression in animals and its important evolutionary function. Recognising conspecifics' facial emotions helps animals adapt their behaviour to the environment. In human culture, the ability to recognise others' emotions through facial expression is essential in both interpersonal and professional contexts.
Among other methods, brain activity during perception and recognition of facial expressions is studied using event-related potentials (ERPs) of electroencephalograms (EEG).
In most such studies, subjects are asked to recognise emotions in static snapshots of human faces. A team of HSE and SFedU researchers suggests using animated facial images instead, for a more realistic setup.
Using morphing technology to create a smooth transitional visual effect, the researchers produced 48 videos, each consisting of 12 frames showing a gradual change from a neutral expression to one of the basic emotions: anger, sadness, joy, disgust, surprise, or fear. As a result of using computer animation, all the videos are identical in style and duration—something that would be impossible to achieve when filming a real person.
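The paper does not detail the morphing algorithm, and production-grade face morphing typically warps facial landmarks as well as blending pixels. As a minimal sketch of the core idea, though, a gradual neutral-to-emotional transition over 12 identical-length frames can be produced with a simple linear cross-dissolve (the image arrays and face names below are hypothetical):

```python
import numpy as np

def morph_frames(neutral, emotional, n_frames=12):
    """Generate a sequence of frames blending linearly from a neutral
    face image to a fully emotional one (simple cross-dissolve only;
    no landmark warping, unlike full morphing software)."""
    neutral = neutral.astype(np.float32)
    emotional = emotional.astype(np.float32)
    frames = []
    for i in range(n_frames):
        alpha = i / (n_frames - 1)  # 0.0 = neutral ... 1.0 = emotional
        blend = (1 - alpha) * neutral + alpha * emotional
        frames.append(blend.astype(np.uint8))
    return frames

# Hypothetical 64x64 greyscale stand-ins for face photographs
neutral_face = np.zeros((64, 64), dtype=np.uint8)
emotional_face = np.full((64, 64), 255, dtype=np.uint8)
clip = morph_frames(neutral_face, emotional_face)  # 12 frames, neutral first
```

Because every clip is generated from the same frame count and blending schedule, all stimuli are automatically matched in style and duration, which is the property the researchers highlight.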
The study involved 112 subjects, 59% men and 41% women, who watched 144 dynamic morphs made from 56 coloured, full-face photographs of four men and four women, and then named each recognised emotion verbally into a microphone.
Vladimir Kosonogov, head of the International Laboratory of Social Neurobiology
'We have only recently started using verbal labelling of emotions into a microphone. Before this, we used to ask respondents to press one of six buttons to indicate the emotion they could see. But that setup was inconvenient: remembering which of the six buttons to press was a struggle.'
The researchers recorded the subjects' ERPs from 32 EEG electrodes. The study participants had been assessed for speed and accuracy of facial emotion recognition and split into two groups of higher and lower performance. The method of using dynamic morphs proved to be effective for both subgroups: the expected responses were detected in the brain's occipital and temporal lobes, the former associated with vision in general and the latter with facial recognition.
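The article does not specify how the higher- and lower-performance groups were defined; a common convention in such studies is a median split on recognition accuracy. A minimal sketch under that assumption (the subject IDs and scores are invented for illustration):

```python
import statistics

def split_by_accuracy(scores):
    """Split participants into higher- and lower-performing groups by a
    median split on recognition accuracy (an assumed criterion; the
    paper's actual grouping rule is not stated in the article)."""
    median = statistics.median(scores.values())
    higher = {p for p, s in scores.items() if s >= median}
    lower = set(scores) - higher
    return higher, lower

# Hypothetical proportion-correct scores for five subjects
scores = {"s1": 0.92, "s2": 0.71, "s3": 0.85, "s4": 0.64, "s5": 0.78}
higher, lower = split_by_accuracy(scores)
```

ERP measures (amplitudes and latencies) can then be compared between the two groups, which is how the study relates recognition performance to the allocation of brain resources.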
The researchers were able to estimate the reaction time (how fast an expression was recognised) and the cognitive load (allocation of brain resources) for each type of facial expression. It was found that faster accurate recognition of emotional expressions required a greater allocation of processing resources, and that subjects with lower recognition accuracy needed to allocate even more brain resources for successful performance.
'The proposed method could facilitate studies focused on abnormal processing of facial expressions – a characteristic of people with emotional disorders, such as depression and anxiety, and those with autism, schizophrenia and ADHD. These individuals tend to have a reduced ability to recognise emotions from other people's faces', notes Kosonogov.
Vladimir Kosonogov
Senior Research Fellow, International Laboratory of Social Neurobiology