Opened 5 years ago

Closed 5 years ago

#409 closed defect (fixed)

Wrong indent in semantics of sig_coeff_flag

Reported by: chhuanb
Owned by: (none)
Priority: minor
Milestone: (none)
Component: spec
Version: VVC D6 vE
Keywords: (none)
Cc: ksuehring, bbross, XiangLi, fbossen, jvet@…

Description

In the semantics of sig_coeff_flag, the condition "( xC & ( ( 1 << log2SbW ) − 1 ), yC & ( ( 1 << log2SbH ) − 1 ) ) is equal to ( 0, 0 )" should be at the same indentation level as "inferSbDcSigCoeffFlag is equal to 1", i.e. it is one of the sub-conditions that must all be true.

Correction:
When sig_coeff_flag[ xC ][ yC ] is not present, it is inferred as follows:
– If ( xC, yC ) is the last significant location ( LastSignificantCoeffX, LastSignificantCoeffY ) in scan order or all of the following conditions are true, sig_coeff_flag[ xC ][ yC ] is inferred to be equal to 1:
  – ( xC & ( ( 1 << log2SbW ) − 1 ), yC & ( ( 1 << log2SbH ) − 1 ) ) is equal to ( 0, 0 ).
  – inferSbDcSigCoeffFlag is equal to 1.
  – coded_sub_block_flag[ xS ][ yS ] is equal to 1.
– Otherwise, sig_coeff_flag[ xC ][ yC ] is inferred to be equal to 0.
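To make the intended grouping explicit, here is a minimal C sketch of the corrected inference rule. The function name, its parameter list, and passing coded_sub_block_flag[ xS ][ yS ] as a single value are illustrative assumptions and not part of the draft text; only the boolean structure mirrors the correction above.

#include <stdbool.h>

/* Illustrative sketch: returns the inferred value of sig_coeff_flag[xC][yC]
 * when the syntax element is not present. Inputs mirror variables from the
 * draft text (LastSignificantCoeffX/Y, log2SbW/H, inferSbDcSigCoeffFlag,
 * coded_sub_block_flag[xS][yS]); this is not spec pseudocode. */
static int infer_sig_coeff_flag(int xC, int yC,
                                int lastSigCoeffX, int lastSigCoeffY,
                                int log2SbW, int log2SbH,
                                bool inferSbDcSigCoeffFlag,
                                int codedSubBlockFlag)
{
    /* First alternative: (xC, yC) is the last significant position in scan order. */
    bool isLastSig = (xC == lastSigCoeffX) && (yC == lastSigCoeffY);

    /* Second alternative: all three sub-conditions hold together. The point of
     * the ticket is that the sub-block DC check below belongs in this AND-group,
     * at the same level as the other two conditions. */
    bool isSbDc = ((xC & ((1 << log2SbW) - 1)) == 0) &&
                  ((yC & ((1 << log2SbH) - 1)) == 0);
    bool allConditions = isSbDc && inferSbDcSigCoeffFlag && (codedSubBlockFlag == 1);

    return (isLastSig || allConditions) ? 1 : 0;
}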

Change history (1)

comment:1 Changed 5 years ago by bbross

  • Resolution set to fixed
  • Status changed from new to closed

This will be fixed in the next version of draft 7 (JVET-O2001-v7).
