Russian Scientists Teach AI to Analyse Emotions of Participants at Online Events

HSE researchers have proposed a new neural network method for recognising people's emotions and engagement. The algorithms are based on the analysis of video images of faces and significantly outperform existing single models. The developed models are suitable for low-performance equipment, including mobile devices. The results can be integrated into video conferencing tools and online learning systems to analyse the engagement and emotions of participants. The results of the study were published in IEEE Transactions on Affective Computing.

The COVID-19 pandemic spurred the rapid development of online video conferencing tools and e-learning systems. Artificial intelligence technologies can help teachers remotely monitor the engagement of event participants. Algorithms for analysing student behaviour and identifying engagement in online environments are currently studied by experts in educational data mining, with automatic methods based on computer vision being the most widespread analysis tools. The researchers believe that the quality of many e-learning systems could be greatly improved by identifying the emotions and involvement of participants through video analytics.

HSE researchers have developed a new neural network algorithm for recognising emotions and engagement from video images of faces as part of the project “Neural network algorithms for analysing the dynamics of students' emotional state and engagement based on video imaging data” of the HSE AI Centre.

The researchers taught the neural network to extract emotional features using a specially developed robust optimisation technique and to process only the most important areas of the face. The pipeline works as follows: first, the neural network detects faces and extracts their emotional features, and the faces of each participant are grouped. Next, specially trained efficient neural network models extract the emotional features of each detected person, which are aggregated using statistical functions and classified. At the final stage, the fragments of the video lecture showing each participant's most pronounced emotions and varying degrees of engagement are visualised. As a result, the researchers created a new model that determines the emotions and engagement of every person in the video at once (a minimal sketch of such a pipeline is given below).
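The paper's exact models are not reproduced here, but a minimal Python sketch of the described per-frame pipeline could look as follows. It assumes OpenCV's stock Haar-cascade face detector and uses a placeholder where the trained emotion network would plug in; per-participant face grouping and the final classifier are omitted for brevity.

```python
import cv2
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutral", "sadness", "surprise"]

def extract_emotional_features(face_bgr: np.ndarray) -> np.ndarray:
    """Placeholder for the trained emotion CNN: a real system would return
    per-frame emotion scores for the face crop. Here we return a uniform
    distribution so the sketch runs end to end."""
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS), dtype=np.float32)

def analyse_video(path: str) -> np.ndarray:
    """Detect a face in each frame, extract per-frame emotional features,
    then aggregate them over time with statistical functions."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    per_frame = []
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # One face per frame for simplicity; the real system groups
        # every detected face by participant.
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5)[:1]:
            per_frame.append(extract_emotional_features(frame[y:y+h, x:x+w]))
    cap.release()
    if not per_frame:
        return np.zeros(4 * len(EMOTIONS), dtype=np.float32)
    scores = np.stack(per_frame)                      # (frames, emotions)
    # Statistical aggregation over frames, as the article describes:
    return np.concatenate([scores.mean(0), scores.std(0),
                           scores.min(0), scores.max(0)])
```

In the full system, the descriptor returned here would be fed to the final classifier, and the per-frame scores would also drive the visualisation step.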

Andrey Savchenko, Project Head, Professor at the Department of Information Systems and Technologies

‘We have demonstrated that the proposed algorithms outperform existing single models in terms of accuracy on several datasets. At the same time, unlike most well-known technologies, the developed models can process video in real time even on low-performance devices, including the mobile phones of participants in an online event. Together with Ilya Makarov from the Artificial Intelligence Research Institute, we have created an easy-to-use computer programme that processes a video recording of a webinar or online class and produces a set of video clips with the emotional features of each participant.’
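The programme itself is not detailed in the article, so the following is only a sketch of one plausible way its final step might select the clips: given per-frame emotion scores (for example, from the pipeline sketched above), take the top few non-overlapping time windows centred on the frames where an emotion is most pronounced. All names and parameters here are illustrative.

```python
import numpy as np

def most_expressive_clips(scores: np.ndarray, fps: float,
                          clip_sec: float = 5.0, top_k: int = 3):
    """Given per-frame emotion scores of shape (frames, emotions), return
    (start_sec, end_sec, emotion_index) for the top_k non-overlapping
    windows around the frames with the most pronounced emotions."""
    half = int(clip_sec * fps / 2)
    intensity = scores.max(axis=1)            # strongest emotion per frame
    clips, used = [], np.zeros(len(scores), dtype=bool)
    for f in np.argsort(intensity)[::-1]:     # frames, most expressive first
        if used[f]:
            continue
        lo, hi = max(0, f - half), min(len(scores), f + half)
        used[lo:hi] = True                    # keep clips from overlapping
        clips.append((lo / fps, hi / fps, int(scores[f].argmax())))
        if len(clips) == top_k:
            break
    return clips

# e.g. clips = most_expressive_clips(per_frame_scores, fps=30.0)
```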

The results can be built into video conferencing and online learning software to analyse the engagement and emotions of participants. For example, during preliminary testing of an online course, students’ reactions can show which parts of a lecture were most interesting and which proved difficult to understand and need revision. The researchers are currently working to integrate the developed models into the Jazz by Sber video conferencing service. The videos collected from open sources for this project will allow the researchers to take a step towards a service for identifying the emotions and engagement of students at online events.