FIELD: image decoding.
SUBSTANCE: an image decoding method generates a merge candidate list for a first block, selects one of the merge candidates included in the list, and performs motion compensation for the first block based on the motion information of the selected merge candidate. An inter-region merge candidate, taken from an inter-region motion information list, is added to the merge candidate list based on the number of spatial merge candidates and temporal merge candidates already included in the list. The merge candidates comprise a first merge candidate and a second merge candidate, and a prediction unit for the first block is generated based on the first merge candidate and the second merge candidate. Index information merge_idx for the first merge candidate and index information merge_2nd_idx for the second merge candidate are obtained by parsing a bitstream; when the value of merge_2nd_idx is greater than or equal to the value of merge_idx, the index of the second merge candidate is obtained by adding 1 to merge_2nd_idx.
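The index-adjustment rule described above can be sketched as follows. This is a minimal illustrative sketch, not the patented decoder: the function name and parsing context are hypothetical; only the arithmetic on merge_idx and merge_2nd_idx follows the abstract. Because the second merge candidate can never be the same as the first, merge_2nd_idx is signalled relative to merge_idx, and parsed values at or above merge_idx are shifted up by one.

```python
def decode_merge_indices(merge_idx: int, merge_2nd_idx: int) -> tuple[int, int]:
    """Recover the positions of the two merge candidates in the merge
    candidate list from the parsed syntax elements (hypothetical helper).

    merge_2nd_idx is coded relative to merge_idx: since the second
    candidate cannot equal the first, a parsed value greater than or
    equal to merge_idx is incremented by 1.
    """
    first_idx = merge_idx
    if merge_2nd_idx >= merge_idx:
        second_idx = merge_2nd_idx + 1  # skip over the first candidate's slot
    else:
        second_idx = merge_2nd_idx
    return first_idx, second_idx
```

For example, with merge_idx = 2, a parsed merge_2nd_idx of 2 selects list position 3, while a parsed value of 1 selects position 1, so the two indices never collide.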
EFFECT: improved efficiency of inter-picture prediction.
20 cl, 43 dwg, 8 tbl
Title | Year | Author | Number |
---|---|---|---|
METHOD AND DEVICE FOR CODING/DECODING IMAGE SIGNALS | 2019 | | RU2808720C2 |
METHOD AND DEVICE FOR CODING/DECODING IMAGE SIGNALS | 2019 | | RU2801863C2 |
METHOD AND DEVICE FOR ENCODING/DECODING IMAGE SIGNALS | 2024 | | RU2838541C1 |
METHOD AND DEVICE FOR ENCODING/DECODING IMAGE SIGNALS | 2023 | | RU2828008C1 |
METHOD AND DEVICE FOR ENCODING/DECODING IMAGE SIGNALS | 2019 | | RU2818972C2 |
METHOD AND DEVICE FOR CODING/DECODING IMAGE SIGNALS | 2019 | | RU2801585C2 |
METHOD AND DEVICE FOR ENCODING/DECODING IMAGE SIGNALS | 2024 | | RU2841600C1 |
METHOD FOR ENCODING/DECODING IMAGE SIGNALS | 2024 | | RU2831950C1 |
METHOD AND DEVICE FOR ENCODING/DECODING IMAGE SIGNAL | 2024 | | RU2840725C1 |
METHOD AND DEVICE FOR ENCODING/DECODING IMAGE SIGNAL | 2019 | | RU2817331C2 |
Authors
Dates
Published: 2025-04-16
Filed: 2024-09-27