FIELD: image processing.
SUBSTANCE: the proposed method of interpolating video frames comprises the steps of: obtaining at least two key frames of a video; detecting areas of repeating structures on one of the key frames; estimating motion between said key frame and the frame to be interpolated by feeding the key frames and the areas of repeating structures to a trained motion estimation neural network, wherein the value of an error function is calculated when training the neural network; and obtaining the interpolated frame by performing motion compensation using said key frame and the obtained motion vectors.
EFFECT: higher quality of interpolated frames due to more accurate motion estimation in areas of repeating structures, which are taken into account during frame interpolation.
29 cl, 14 dwg, 1 tbl
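For illustration only, below is a minimal sketch of the claimed pipeline in PyTorch-style Python. The abstract does not disclose a concrete network architecture or repeating-structure detector, so `motion_net` and `detect_repeats` are hypothetical callables, and the warping convention is an assumption.

```python
# Hypothetical sketch of the interpolation pipeline described in the abstract.
# `detect_repeats` and `motion_net` are placeholders: the patent does not
# disclose their implementations.
import torch
import torch.nn.functional as F

def warp(frame, flow):
    """Backward-warp `frame` (N,C,H,W) with motion vectors `flow` (N,2,H,W),
    where flow[:, 0] is the horizontal and flow[:, 1] the vertical shift."""
    n, _, h, w = frame.shape
    # Base sampling grid in pixel coordinates.
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xs, ys), dim=0).float().unsqueeze(0).to(frame)
    # Shift by the estimated motion, then normalize to [-1, 1] for grid_sample.
    coords = grid + flow
    coords[:, 0] = 2.0 * coords[:, 0] / (w - 1) - 1.0
    coords[:, 1] = 2.0 * coords[:, 1] / (h - 1) - 1.0
    return F.grid_sample(frame, coords.permute(0, 2, 3, 1), align_corners=True)

def interpolate_frame(frame0, frame1, motion_net, detect_repeats):
    # Steps 1-2: obtain key frames; detect areas of repeating structures
    # on one of them (mask of shape (N,1,H,W), values in [0,1]).
    repeat_mask = detect_repeats(frame0)
    # Step 3: the trained network estimates motion between the key frame and
    # the frame to be interpolated, given both key frames and the mask.
    flow = motion_net(torch.cat((frame0, frame1, repeat_mask), dim=1))
    # Step 4: motion compensation yields the interpolated frame.
    return warp(frame0, flow)
```

During training, per the abstract, an error function would be computed between the output of `interpolate_frame` and a ground-truth intermediate frame; its exact form is not disclosed.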
| Title | Year | Number |
|---|---|---|
| METHOD FOR REGULARIZING GRADIENTS WHEN TRAINING DEEP NEURAL NETWORK IN ALGORITHM FOR INCREASING FRAME RATE OF VIDEO IMAGE WITH IMAGE DEFORMATION OPERATION AND DEVICE AND MACHINE-READABLE MEDIUM REALIZING SAID METHOD | 2024 | RU2838424C1 |
| FRC OCCLUSION PROCESSING WITH DEEP LEARNING | 2020 | RU2747965C1 |
| FRAME RATE CONVERSION METHOD SUPPORTING FRAME INTERPOLATION REPLACEMENT WITH MOTION COMPENSATION BY LINEAR FRAME COMBINATION AND DEVICE IMPLEMENTING IT | 2022 | RU2786784C1 |
| METHOD FOR SPATIALLY PARAMETERIZED ESTIMATION OF MOTION VECTORS | 2024 | RU2839709C1 |
| SYSTEM FOR GENERATING VIDEO WITH RECONSTRUCTED PHOTOREALISTIC 3D MODEL OF HUMAN, METHODS FOR SETTING UP AND OPERATING THIS SYSTEM | 2024 | RU2834188C1 |
| METHOD AND SYSTEM OF SUPERRESOLUTION BY COMBINED SPARSE APPROXIMATION | 2016 | RU2661537C2 |
| TEXTURED NEURAL AVATARS | 2019 | RU2713695C1 |
| VIDEO QUALITY ASSESSMENT TECHNOLOGY | 2010 | RU2540846C2 |
| METHOD OF CREATING FULL-LENGTH ANIMATED AVATAR OF PERSON FROM ONE IMAGE OF PERSON, COMPUTING DEVICE AND MACHINE-READABLE MEDIUM FOR IMPLEMENTATION THEREOF | 2023 | RU2813485C1 |
| VISUALIZATION OF RECONSTRUCTION OF 3D SCENE USING SEMANTIC REGULARIZATION OF NORMALS TSDF WHEN TRAINING NEURAL NETWORK | 2023 | RU2825722C1 |
Dates
Filed: 2024-05-20
Published: 2025-03-11