FIELD: data processing.
SUBSTANCE: the invention relates to automatic video stream analysis. In the method, a stream of images of the observed space is provided as a sequence of raster frames; statistics of the parameters of each pixel of those frames are accumulated; a pixel-by-pixel frame prediction is formed, comprising a fast background and a slow background; the image of the current frame is received; the parameters of each pixel of the current frame are compared with the parameters of the corresponding pixel of the frame prediction; a candidate for a stationary object is formed as a part of the current frame from those pixels whose parameters differ insignificantly from the parameters of the corresponding pixels of the fast background but differ significantly from the parameters of the corresponding pixels of the slow background; a conclusion that an abandoned object has been detected is made by forming a spatially connected set of pixels of the stationary-object candidate, provided that this set satisfies specified conditions.
EFFECT: improved quality of object detection.
1 cl, 3 dwg
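The two-background scheme summarized above can be illustrated with a short sketch. This is only a minimal approximation of the idea, not the patented method: it models the fast and slow backgrounds as exponential running averages via OpenCV's `accumulateWeighted`, and the file name, learning rates, and thresholds (`surveillance.mp4`, `FAST_LR`, `SLOW_LR`, `DIFF_THRESHOLD`, `MIN_AREA`) are illustrative assumptions.

```python
import cv2
import numpy as np

# Illustrative parameters (assumptions, not values from the patent).
FAST_LR = 0.05        # fast background adapts within seconds
SLOW_LR = 0.001       # slow background keeps the long-term scene
DIFF_THRESHOLD = 25   # per-pixel intensity difference threshold
MIN_AREA = 500        # size condition for a connected candidate set

cap = cv2.VideoCapture("surveillance.mp4")  # hypothetical input stream
ok, frame = cap.read()
if not ok:
    raise RuntimeError("cannot read video stream")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
fast_bg, slow_bg = gray.copy(), gray.copy()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Per-pixel statistics: two running averages at different rates stand
    # in for the fast and slow backgrounds of the frame prediction.
    cv2.accumulateWeighted(gray, fast_bg, FAST_LR)
    cv2.accumulateWeighted(gray, slow_bg, SLOW_LR)

    # Candidate pixels for a stationary object: close to the fast
    # background (no longer moving) yet far from the slow background
    # (not part of the long-term scene).
    near_fast = np.abs(gray - fast_bg) < DIFF_THRESHOLD
    far_from_slow = np.abs(gray - slow_bg) >= DIFF_THRESHOLD
    candidate = (near_fast & far_from_slow).astype(np.uint8) * 255

    # Form spatially connected sets of candidate pixels; a set that
    # satisfies the size condition is reported as an abandoned object.
    n, _, stats, _ = cv2.connectedComponentsWithStats(candidate)
    for i in range(1, n):  # label 0 is the scene background
        if stats[i, cv2.CC_STAT_AREA] >= MIN_AREA:
            x, y, w, h = map(int, stats[i, :4])
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)

    cv2.imshow("abandoned object candidates", frame)
    if cv2.waitKey(1) == 27:  # Esc to stop
        break

cap.release()
cv2.destroyAllWindows()
```

A practical detector would also check the "specified conditions" the claim mentions, typically persistence of the connected set over many frames; the sketch omits that step for brevity.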
| Title | Year | Number |
| --- | --- | --- |
| VIDEO STREAM ANALYSIS METHOD | 2018 | RU2676026C1 |
| METHOD OF IDENTIFICATION OF OBJECT IN A VIDEO STREAM | 2018 | RU2676029C1 |
| APPARATUS FOR PROCESSING VIDEO INFORMATION OF SECURITY ALARM SYSTEM | 2009 | RU2484531C2 |
| METHOD AND SYSTEM FOR ANALYZING STATIC OBJECTS IN A VIDEO STREAM | 2020 | RU2723900C1 |
| METHOD FOR DETECTING HOLOGRAPHIC PROTECTION ON DOCUMENTS IN A VIDEO STREAM | 2021 | RU2771005C1 |
| METHOD AND SYSTEM FOR DISPLAYING SCALED SCENES IN REAL TIME | 2015 | RU2606875C2 |
| METHOD AND SYSTEM FOR DETECTING SMALL OR THIN OBJECTS ON IMAGES (VERSIONS) | 2013 | RU2546600C2 |
| SCALABILITY TECHNIQUES BASED ON CONTENT INFORMATION | 2006 | RU2378790C1 |
| METHOD OF DETECTING AND LOCALIZING TEXT FORMS ON IMAGES | 2016 | RU2697737C2 |
| METHOD OF COMPARING DIGITAL IMAGES | 2017 | RU2673396C1 |
Dates
Filed: 2018-03-14
Published: 2018-12-25