METHOD OF ASSESSING USER ENGAGEMENT (Russian patent published in 2025, IPC G06V40/16)

Abstract RU 2837315 C1

FIELD: physics.

SUBSTANCE: the invention relates to a method for assessing user engagement. The method comprises the following steps:
  • obtaining a video stream from a video camera in real time;
  • extracting frames from the video stream with a computing device;
  • processing each extracted frame with a machine learning model based on a neural network (NN), during which key points of the pupils and cheekbones are extracted as 2D coordinates;
  • on the processed frame, using the computing device: converting the extracted 2D coordinates into 3D vectors and tracking the change in position of these 3D vectors to determine the direction of head rotation relative to the information display device;
  • calculating Euclidean distances between points and angles between vectors to determine pupil size and gaze direction;
  • measuring the time during which the user's gaze remains stably focused on the information display device: while the user looks at the device the time increases, and when the user is distracted it is reset and counting starts again;
  • counting the number of changes in the user's facial expressions during the measured period to determine emotional involvement;
  • determining the intensity of facial expression changes while the user looks at the information display device to determine the reaction;
  • calculating the user engagement indicator from the obtained data by: normalizing each variable so that all values lie in the same range; computing a weighted sum of the normalized gaze-focus duration, the product of the frequency and intensity of facial expression changes, and the spatial change in head position; dividing the sum by the number of parameters to obtain an average; and multiplying the result by 100 to convert it into a percentage. The final engagement value is the weighted average of the normalized parameters multiplied by 100, limited to the range from 0 to 100.
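The abstract does not specify normalization bounds or weights, so the following Python sketch only illustrates the described computation: Euclidean distances and vector angles for the geometric measurements, then a weighted average of normalized parameters (gaze-focus duration, the product of expression-change frequency and intensity, and head-pose change) scaled to a percentage and clamped to 0-100. All bounds, weights, and names (euclidean, angle_between, normalize, engagement_score) are illustrative assumptions, not taken from the patent.

import math

def euclidean(p, q):
    # Euclidean distance between two key points (2D or 3D), e.g. for pupil size.
    return math.dist(p, q)

def angle_between(u, v):
    # Angle in degrees between two 3D vectors, e.g. gaze or head-pose vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def normalize(value, lo, hi):
    # Scale a raw measurement into [0, 1], clamping at the bounds (assumed bounds).
    return min(max((value - lo) / (hi - lo), 0.0), 1.0) if hi > lo else 0.0

def engagement_score(gaze_focus_s, expr_change_freq, expr_change_intensity,
                     head_motion, weights=(1.0, 1.0, 1.0)):
    # Weighted average of normalized parameters, as a percentage clamped to 0-100.
    gaze = normalize(gaze_focus_s, 0.0, 30.0)                                    # assumed 30 s saturation
    expression = normalize(expr_change_freq * expr_change_intensity, 0.0, 10.0)  # assumed bound
    head = normalize(head_motion, 0.0, 1.0)                                      # assumed bound
    params = (gaze, expression, head)
    weighted_sum = sum(w * p for w, p in zip(weights, params))
    score = weighted_sum / len(params) * 100.0  # average of parameters, then percentage
    return min(max(score, 0.0), 100.0)

# Example: 12 s of stable gaze, 4 expression changes of moderate intensity,
# and a small head-pose change.
print(engagement_score(12.0, 4.0, 0.5, 0.1))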

EFFECT: high accuracy of determining user engagement.

1 cl

Similar patents to RU2837315C1

METHOD AND SYSTEM FOR CALCULATING ATTENTIVENESS INDEX (2024), RU2837379C1
  • Kuryan Sergej Mikhajlovich
METHOD AND SYSTEM FOR DETERMINING SYNTHETICALLY MODIFIED FACE IMAGES ON VIDEO (2021), RU2768797C1
  • Vyshegorodtsev Kirill Evgenevich
  • Balashov Aleksandr Viktorovich
  • Velmozhin Grigorij Alekseevich
  • Sysoev Valentin Valerevich
METHOD AND SYSTEM FOR EVALUATING QUALITY OF CUSTOMER SERVICE BASED ON ANALYSIS OF VIDEO AND AUDIO STREAMS USING MACHINE LEARNING TOOLS (2018), RU2703969C1
  • Maslov Aleksej Yurevich
METHOD AND SYSTEM FOR AUTOMATED GENERATION OF VIDEO STREAM WITH DIGITAL AVATAR BASED ON TEXT (2020), RU2748779C1
  • Zyryanov Aleksandr Vladimirovich
  • Kurilenkov Aleksandr Nikolaevich
  • Ivlenkov Sergej Vladimirovich
  • Levin Maksim Aleksandrovich
METHOD OF PROCESSING A TWO-DIMENSIONAL IMAGE AND A USER COMPUTING DEVICE THEREOF (2018), RU2703327C1
  • Glazistov Ivan Viktorovich
  • Karacharov Ivan Olegovich
  • Shcherbinin Andrey Yurievich
  • Kurilin Ilya Vasilievich
METHOD AND SYSTEM FOR REAL-TIME RECOGNITION AND ANALYSIS OF USER MOVEMENTS (2022), RU2801426C1
  • Bolshakov Emil Yuryevich
DEVICE FOR DETECTING SIGNS OF ANTISOCIAL BEHAVIOUR (2023), RU2798280C1
  • Vukolov Aleksandr Vladimirovich
  • Dolgij Aleksandr Igorevich
  • Kudyukin Vladimir Valerevich
  • Khakiev Zelimkhan Bagauddinovich
VIDEO PROCESSING METHOD FOR VISUAL SEARCH PURPOSES (2018), RU2693994C1
  • Podlesnyj Sergej Yurevich
  • Kucherenko Aleksej Valentinovich
METHOD AND SYSTEM FOR RECOGNIZING FACES AND CONSTRUCTING A ROUTE USING AUGMENTED REALITY TOOL (2019), RU2712417C1
  • Kudiyarov Dmitrij Sergeevich
  • Balashov Aleksandr Viktorovich
  • Evgrashin Aleksandr Sergeevich
METHOD OF DISPLAYING THREE-DIMENSIONAL FACE OF THE OBJECT AND DEVICE FOR IT (2017), RU2671990C1
  • Yugaj Evgenij Borisovich

RU 2 837 315 C1

Authors

Kuryan Sergej Mikhajlovich

Dates

Published: 2025-03-28

Filed: 2024-06-27