FIELD: physics.
SUBSTANCE: the invention relates to computer equipment. A device for generating a model of a three-dimensional space comprises an image acquisition interface configured to obtain image data generated by an image capture device, the image data representing an observation of the three-dimensional space such that there is relative movement between the three-dimensional space and the image capture device over time, and a simulation mechanism configured to process the image data obtained by the image acquisition interface and to compute a three-dimensional model of the three-dimensional space. The simulation mechanism comprises a model segmenter configured to segment the three-dimensional model into at least active and inactive portions based on at least one model property, wherein the simulation mechanism is configured to use the active portions of the three-dimensional model to update the model over time, and an alignment mechanism is configured to align the active portions of the three-dimensional model with the inactive portions of the three-dimensional model over time.
EFFECT: the technical result is improved accuracy and stability of the generated three-dimensional model.
23 cl, 16 dwg
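
A minimal Python sketch of the arrangement described in SUBSTANCE, under assumptions not stated in the abstract: model elements are point-like entries tagged with the frame at which they were last observed, the segmentation property is time since last update with a hypothetical ACTIVE_WINDOW threshold, and the alignment step is a toy centroid shift standing in for whatever registration the device actually performs. The names SpaceModel, ModelElement and align_active_to_inactive are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
import numpy as np

ACTIVE_WINDOW = 20  # frames; illustrative threshold separating active from inactive parts

@dataclass
class ModelElement:
    position: np.ndarray   # 3D point in model coordinates
    last_update: int       # frame index of the most recent observation of this element

class SpaceModel:
    def __init__(self):
        self.elements: list[ModelElement] = []

    def segment(self, current_frame: int):
        """Model segmenter: split the model into active and inactive parts
        based on one model property (here, time since last update)."""
        active = [e for e in self.elements
                  if current_frame - e.last_update <= ACTIVE_WINDOW]
        inactive = [e for e in self.elements
                    if current_frame - e.last_update > ACTIVE_WINDOW]
        return active, inactive

    def fuse_observation(self, points, current_frame: int):
        """Simulation mechanism: new observations extend the active part of the model."""
        for p in points:
            self.elements.append(
                ModelElement(position=np.asarray(p, dtype=float),
                             last_update=current_frame))

def align_active_to_inactive(active, inactive):
    """Alignment mechanism (toy stand-in): translate the active part so its centroid
    matches the inactive part's centroid. A real device would run a rigid or
    non-rigid registration (e.g. ICP) between the two parts instead."""
    if not active or not inactive:
        return np.zeros(3)
    shift = (np.mean([e.position for e in inactive], axis=0)
             - np.mean([e.position for e in active], axis=0))
    for e in active:
        e.position += shift
    return shift

if __name__ == "__main__":
    model = SpaceModel()
    model.fuse_observation(np.random.rand(50, 3), current_frame=0)          # old observations
    model.fuse_observation(np.random.rand(30, 3) + 0.05, current_frame=40)  # recent, drifted
    active, inactive = model.segment(current_frame=40)
    print("active:", len(active), "inactive:", len(inactive))
    print("applied correction:", align_active_to_inactive(active, inactive))
```

The sketch only shows how the three claimed roles fit together: the segmenter splits the model, the simulation mechanism updates only the active part, and the alignment mechanism registers the active part against the inactive one over time.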
Title | Year | Number
---|---|---
SYSTEM FOR CONSTRUCTING THREE-DIMENSIONAL SPACE MODEL | 2023 | RU2812950C1
SYSTEM AND METHOD FOR OBJECT TRACKING | 2004 | RU2370817C2
DEVICE AND METHOD FOR PREDICTION AUTOFOCUS FOR AN OBJECT | 2021 | RU2778355C1
MEDICAL IMAGE PROCESSING DEVICE, MEDICAL IMAGE PROCESSING METHOD AND DATA CARRIER | 2019 | RU2762146C1
COMPUTER-AIDED GENERATION OF TARGET IMAGE | 2010 | RU2560340C2
DEVICE, METHOD AND SYSTEM FOR RECONSTRUCTING 3D-MODEL OF OBJECT | 2015 | RU2642167C2
IMAGE PROCESSING | 2017 | RU2746431C2
AUTOMATIC ONLINE COMBINATION OF ROBOT AND IMAGES | 2012 | RU2624107C2
EMBODIMENT OF VISUAL REPRESENTATION USING STUDIED INPUT FROM USER | 2010 | RU2554548C2
METHOD AND DEVICE FOR DISPLAYING IMAGES BASED ON AUGMENTED REALITY AND MEDIUM FOR STORING INFORMATION | 2021 | RU2801917C1
Dates
Filed: 2016-05-17
Published: 2020-02-05