FIELD: medicine.
SUBSTANCE: a wearable augmented-reality-forming device, which is part of a research software and hardware complex, is attached to the subject. The researcher selects a motor or exercise test mode from the program library of the complex and explains it to the subject. A sequence of two light pulses of 50 ms duration each, separated by a pause of 150 ms and repeated at a constant interval of 1.5 s, is then presented via the augmented-reality device. In the first stage of measurements, the pause between the two light pulses is shortened at each subsequent presentation in discrete, predetermined constant steps of 0.5 ms until the subject reports the moment at which the two light pulses fuse into one. In the second stage of measurements, the interpulse interval between the light pulses in the pair is lengthened at each subsequent presentation in discrete, predetermined constant steps of 0.1 ms until the subject reports the moment at which the two light pulses in the pair are again perceived as separate; the interpulse interval at this moment is recorded. The time of perception of visual information by the human is taken as the sum of the light pulse duration and the pause duration between the two light pulses at the moment of perceived separation determined in the second stage. Further, during execution of the motor or exercise test, whether under stationary conditions or during ground, water or flight movements, the subject is periodically presented, with a given period, with pairs of light pulses having the last recorded pause duration. If the subject perceives the two light pulses in the pair as fused into one, the interpulse interval is lengthened at each subsequent presentation in discrete constant steps of 0.1 ms until the subject reports the moment of perceived separation of the two light pulses, and the interpulse interval at this moment is recorded. If the subject perceives the two light pulses in the pair as separate, the interpulse interval is shortened at each subsequent presentation in discrete constant steps of 0.1 ms until the subject reports the moment of fusion of the two light pulses into one, and the interpulse interval at this moment is recorded. The pause duration is recorded upon agreed actions or signals of the subject, which are registered by the software and hardware complex or by the researcher. When the subject is at a considerable distance from the researcher, communication and information exchange are carried out via a radio channel. A minimal sketch of the measurement logic is given after this section.
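The two-stage staircase and the in-test tracking described above can be summarized in code. The following Python sketch is illustrative only and is not the patented hardware/software complex: the AR device, the radio channel and the subject's actual reports are replaced by a simulated observer with an assumed fusion threshold, and all function names and parameter values (`simulated_observer`, `true_threshold_ms`, etc.) are hypothetical.

```python
"""Illustrative sketch of the two-stage staircase from the abstract.
Assumption: the subject's report is modeled by a simulated observer."""

PULSE_MS = 50.0          # duration of each light pulse in the pair (from abstract)
START_PAUSE_MS = 150.0   # initial interpulse pause (from abstract)
COARSE_STEP_MS = 0.5     # stage 1 step size (from abstract)
FINE_STEP_MS = 0.1       # stage 2 and in-test step size (from abstract)


def simulated_observer(pause_ms: float, true_threshold_ms: float = 62.3) -> bool:
    """Hypothetical stand-in for the subject's report: True means the two
    pulses are perceived as separate, False means they are perceived as fused."""
    return pause_ms > true_threshold_ms


def stage1_find_fusion(pause_ms: float, perceives_separate) -> float:
    """Stage 1: shorten the pause in 0.5 ms steps until the subject reports
    that the two pulses have fused into one."""
    while perceives_separate(pause_ms):
        pause_ms -= COARSE_STEP_MS
    return pause_ms


def stage2_find_separation(pause_ms: float, perceives_separate) -> float:
    """Stage 2: lengthen the pause in 0.1 ms steps until the subject again
    reports two separate pulses; this pause is the recorded value."""
    while not perceives_separate(pause_ms):
        pause_ms += FINE_STEP_MS
    return pause_ms


def track_during_test(pause_ms: float, perceives_separate) -> float:
    """During the motor/exercise test: re-present the last recorded pause and
    step it up (if fused) or down (if separate) in 0.1 ms increments until the
    opposite sensation is reported, then record the pause at that moment."""
    if perceives_separate(pause_ms):
        while perceives_separate(pause_ms):
            pause_ms -= FINE_STEP_MS
    else:
        while not perceives_separate(pause_ms):
            pause_ms += FINE_STEP_MS
    return pause_ms


if __name__ == "__main__":
    fused_at = stage1_find_fusion(START_PAUSE_MS, simulated_observer)
    separated_at = stage2_find_separation(fused_at, simulated_observer)
    # Per the abstract, perception time = pulse duration + pause at separation.
    perception_time_ms = PULSE_MS + separated_at
    print(f"pause at separation: {separated_at:.1f} ms")
    print(f"visual perception time: {perception_time_ms:.1f} ms")
    # During the exercise test the threshold is re-tracked from the last pause.
    print(f"pause recorded during test: {track_during_test(separated_at, simulated_observer):.1f} ms")
```

In a real implementation the `perceives_separate` callable would be replaced by the subject's signal registered by the software and hardware complex (or over the radio channel); the staircase logic itself is unchanged.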
EFFECT: the method enables determination of the time of perception of visual information while a motor or exercise test is performed under augmented-reality conditions.
1 cl
Title | Year | Number
---|---|---
METHOD OF RANKING ATHLETES ACCORDING TO THE TIME OF VISUAL PERCEPTION | 2017 | RU2647999C1
METHOD FOR DETERMINATION OF VISUAL ANALYZER ACTIVATION TIME | 2016 | RU2626597C1
METHOD OF SELECTING INDIVIDUALS BY RESOLUTION OF VISUAL EVENTS IN TIME | 2017 | RU2657198C1
METHOD OF RANKING THE ATHLETES ON THE RESOLVING POWER OF VISUAL EVENTS IN TIME | 2017 | RU2657386C1
METHOD FOR HUMAN VISION SYSTEM RESPONSE TIME DEFINITION | 2016 | RU2622180C1
METHOD FOR HUMAN VISION SYSTEM RESPONSE TIME DEFINITION | 2016 | RU2626686C1
METHOD FOR DETECTING LABILITY OF HUMAN VISION SYSTEM | 2003 | RU2233115C1
METHOD FOR DETERMINING HUMAN VISION SYSTEM PERSISTENCE TIME | 2004 | RU2252701C1
METHOD OF RANKING ATHLETES ACCORDING TO THE TIME OF VISUAL ANALYZER EXCITATION | 2017 | RU2647997C1
METHOD FOR DETERMINING VISION INFORMATION PERCEPTION TIME | 2002 | RU2209030C1
Dates
Filed: 2016-02-10
Published: 2017-07-31