FIELD: physics.
SUBSTANCE: the invention relates to a method for assessing user engagement. The method comprises the steps of: obtaining a video stream from a video camera in real time; extracting frames from the video stream by a computing device; and processing each extracted frame with a machine learning model based on a neural network (NN), during which key points of the pupils and cheekbones are extracted in the form of 2D coordinates. For each processed frame the computing device then: converts the extracted 2D coordinates into 3D vectors and determines the change in the position of these vectors to establish the direction of head rotation relative to the information display device; calculates Euclidean distances between points and angles between vectors to determine pupil size and gaze direction; measures the time during which the user's gaze remains stably focused on the information display device (the timer increases while the user looks at the display and is reset, with counting restarted, when the user is distracted); counts the number of changes in the user's facial expressions over the measured period to determine emotional involvement; and determines the intensity of facial expression changes while the user looks at the display to determine the reaction. From the obtained data, the user engagement indicator is calculated by: normalizing each variable so that all values lie in the same range; computing a weighted sum of the normalized gaze focus duration, the product of the frequency and intensity of facial expression changes, and the spatial change of head position; dividing the sum by the number of parameters to obtain the average value; and multiplying the result by 100 to convert it into a percentage. The final engagement value is determined as the weighted average of the normalized parameters multiplied by 100 and limited to the range from 0 to 100.
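For illustration only, a minimal sketch (Python) of the aggregation described above is given below: each measurement is min-max normalized, a weighted sum of the normalized gaze focus duration, the product of expression-change frequency and intensity, and the head-position change is averaged over the number of parameters and scaled to a 0-100 percentage. The weights, normalization bounds, and function names are assumptions and are not specified by the patent.

```python
def normalize(value, lo, hi):
    """Min-max normalize a raw measurement to the [0, 1] range."""
    if hi <= lo:
        return 0.0
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)


def engagement_score(focus_time_s, expr_change_freq, expr_change_intensity,
                     head_movement, weights=(1.0, 1.0, 1.0)):
    """Weighted average of normalized parameters, scaled to 0-100.

    Bounds and weights below are illustrative assumptions, not values from the patent.
    """
    # Normalize each variable so all values lie in the same range.
    t = normalize(focus_time_s, 0.0, 60.0)                               # gaze focus duration, s
    fi = normalize(expr_change_freq * expr_change_intensity, 0.0, 10.0)  # emotional involvement x reaction
    h = normalize(head_movement, 0.0, 1.0)                               # spatial change of head position

    w_t, w_fi, w_h = weights
    weighted_sum = w_t * t + w_fi * fi + w_h * h
    # Divide by the number of parameters to obtain the average value.
    average = weighted_sum / 3
    # Convert to a percentage and limit to the 0-100 range.
    return min(max(average * 100.0, 0.0), 100.0)


print(engagement_score(focus_time_s=45, expr_change_freq=4,
                       expr_change_intensity=0.8, head_movement=0.2))
```

With equal weights this reduces to the plain arithmetic mean of the three normalized terms, which is then clamped to the 0-100 range.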
EFFECT: high accuracy of determining user engagement.
1 cl
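The gaze-focus timing rule described above (the counter grows while the user looks at the information display device and is reset when the user is distracted) can be sketched per frame as follows. The angular threshold, frame rate, and vector sources are illustrative assumptions, not values from the patent.

```python
import math


def gaze_angle(eye_vec, display_vec):
    """Angle (radians) between the gaze vector and the direction to the display."""
    dot = sum(a * b for a, b in zip(eye_vec, display_vec))
    norm = (math.sqrt(sum(a * a for a in eye_vec))
            * math.sqrt(sum(b * b for b in display_vec)))
    return math.acos(max(-1.0, min(1.0, dot / norm))) if norm else math.pi


def update_focus_time(focus_time_s, eye_vec, display_vec, dt, max_angle_rad=0.35):
    """Increase the focus timer while the gaze stays on the display; reset otherwise."""
    if gaze_angle(eye_vec, display_vec) <= max_angle_rad:
        return focus_time_s + dt   # the user keeps looking: time increases
    return 0.0                     # distracted: the counter is reset and counting restarts


# Example: ~30 fps stream, gaze roughly toward the display for 90 frames
t = 0.0
for _ in range(90):
    t = update_focus_time(t, eye_vec=(0.05, 0.0, 1.0),
                          display_vec=(0.0, 0.0, 1.0), dt=1 / 30)
print(round(t, 2))  # about 3.0 seconds of stable focus
```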
Title | Year | Number
---|---|---
METHOD AND SYSTEM FOR CALCULATING ATTENTIVENESS INDEX | 2024 | RU2837379C1
METHOD AND SYSTEM FOR DETERMINING SYNTHETICALLY MODIFIED FACE IMAGES ON VIDEO | 2021 | RU2768797C1
METHOD AND SYSTEM FOR EVALUATING QUALITY OF CUSTOMER SERVICE BASED ON ANALYSIS OF VIDEO AND AUDIO STREAMS USING MACHINE LEARNING TOOLS | 2018 | RU2703969C1
METHOD AND SYSTEM FOR AUTOMATED GENERATION OF VIDEO STREAM WITH DIGITAL AVATAR BASED ON TEXT | 2020 | RU2748779C1
METHOD OF PROCESSING A TWO-DIMENSIONAL IMAGE AND A USER COMPUTING DEVICE THEREOF | 2018 | RU2703327C1
METHOD AND SYSTEM FOR REAL-TIME RECOGNITION AND ANALYSIS OF USER MOVEMENTS | 2022 | RU2801426C1
DEVICE FOR DETECTING SIGNS OF ANTISOCIAL BEHAVIOUR | 2023 | RU2798280C1
VIDEO PROCESSING METHOD FOR VISUAL SEARCH PURPOSES | 2018 | RU2693994C1
METHOD AND SYSTEM FOR RECOGNIZING FACES AND CONSTRUCTING A ROUTE USING AUGMENTED REALITY TOOL | 2019 | RU2712417C1
METHOD OF DISPLAYING THREE-DIMENSIONAL FACE OF THE OBJECT AND DEVICE FOR IT | 2017 | RU2671990C1
Dates
Filed: 2024-06-27
Published: 2025-03-28