drm/amd/display: When enabling CRC, disable dither & enable truncation
When user-mode is using 8bpc, the hardware represents it internally using a higher bit depth. This causes problems when comparing CRCs for color management tests. We also need to disable dithering, since it makes CRC values non-deterministic.

It's easy to see why dithering needs to be disabled. The reason truncation also needs to be enabled is better described with an example. Consider the following, which tests the color transform matrix (CTM):

Expected CRC = FB_A -> Degamma (Bypassed) -> CTM (Bypassed)
                                                  |
                                                  v
                         Obtain CRC <- Regamma (Bypassed)

Actual CRC   = FB_B -> Degamma (Bypassed) -> CTM (0.5 * Identity)
                                                  |
                                                  v
                         Obtain CRC <- Regamma (Bypassed)

FB_A contains a solid red color at half intensity (127 @ 8bpc).
FB_B contains a solid red color at full intensity (255 @ 8bpc).

We expect Expected CRC == Actual CRC, but that is not the case. When the CTM is applied, the output is at half intensity, but also at a higher bit depth within hardware: 255 / 2 = 127.5, which is not representable at 8bpc but is at 10bpc. This causes the two CRCs to differ.

The solution is to truncate the output bit depth to the same depth as the input when enabling CRC capture. Since Linux only supports 8bpc, hard code that for now.

Signed-off-by: Leo (Sunpeng) Li <sunpeng.li@amd.com>
Reviewed-by: Harry Wentland <Harry.Wentland@amd.com>
Signed-off-by: Alex Deucher <alexander.deucher@amd.com>
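The arithmetic above can be checked with a small standalone program. This is an illustration only, not driver code; promoting 8bpc to 10bpc by a left shift of two bits is an assumption about how the hardware widens pixel values:

    /* Illustration: why a 0.5 CTM applied at a higher internal bit depth
     * changes the CRC unless the output is truncated back to 8bpc. */
    #include <stdio.h>

    int main(void)
    {
            unsigned int fb_b = 255;            /* FB_B: full-intensity red, 8bpc  */
            unsigned int fb_a = 127;            /* FB_A: half-intensity red, 8bpc  */

            /* Assumed 10bpc promotion: 255 -> 1020, 127 -> 508 */
            unsigned int internal = fb_b << 2;

            /* 0.5 * identity CTM: 1020 / 2 = 510, representable at 10bpc */
            unsigned int after_ctm = internal / 2;

            /* Without truncation the CRC sees 510, not 508: mismatch */
            printf("10bpc: actual %u vs expected %u\n", after_ctm, fb_a << 2);

            /* Truncating back to 8bpc: 510 >> 2 = 127, matching FB_A */
            printf("8bpc truncated: actual %u vs expected %u\n",
                   after_ctm >> 2, fb_a);

            return 0;
    }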