Hi. We are experiencing unexpected depth buffer behaviour when setting glDepthRange with equal min and max values. In the example below a single quad is rendered:
- one trace sets glDepthRangef(0.49, 0.5), which produces the expected result,
- another trace sets glDepthRangef(0.5, 0.5), which produces an unexpected result.

The Graphics Analyzer traces can be found at https://drive.google.com/drive/folders/1e_oDplD3EyXENUuVzsrnEi17e-T_CUfC?usp=sharing

GL_VENDOR = ARM, GL_RENDERER = Mali-G710, GL_VERSION = OpenGL ES 3.2 v1.r38p1-01eac0.55eb2d40cce8f18c0f57f61c686a946f

Result of single quad rendering with glDepthRangef(0.49, 0.5):
Result of single quad rendering (same state, uniforms, ...) with glDepthRangef(0.5, 0.5):
Also worth noting that this issue is not reproducible on other GPUs. So the question is: is this a known issue, and what are the recommended workarounds?

Thank you in advance, Aleksei
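For reference, a minimal sketch of the only state difference between the two traces (drawQuad is a hypothetical helper standing in for our actual draw call; everything else is identical):

```c
#include <GLES3/gl32.h>

/* Hypothetical helper: issues the quad draw. Shaders, uniforms,
 * vertex data, and all other state are identical in both cases. */
extern void drawQuad(void);

void reproduce(void)
{
    /* Case 1: near != far by 0.01 -> expected depth result */
    glDepthRangef(0.49f, 0.5f);
    drawQuad();

    /* Case 2: near == far -> unexpected depth result on Mali-G710 */
    glDepthRangef(0.5f, 0.5f);
    drawQuad();
}
```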
It also happens without the divide. The divide was actually introduced during debugging to see whether it could mitigate the issue, but the effect is the same: the result is identical whether we divide or not.
Just to confirm - we've reproduced the issue, and your existing workaround (keeping the min-max difference a small amount above zero) is what we would recommend as a software fix.
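For anyone hitting the same thing, a minimal sketch of that workaround: wrap glDepthRangef so a zero-width range is nudged open by a small epsilon. The wrapper name and epsilon value here are illustrative choices, not vendor-recommended constants, and the sketch assumes a non-reversed range (n <= f):

```c
#include <GLES3/gl32.h>

/* Illustrative epsilon; pick a value small enough not to visibly
 * affect depth output in your application. */
#define DEPTH_RANGE_EPS 1.0e-4f

/* Drop-in replacement for glDepthRangef that keeps the far value
 * at least DEPTH_RANGE_EPS above near, so the range never collapses. */
static void safeDepthRangef(GLfloat n, GLfloat f)
{
    if (f - n < DEPTH_RANGE_EPS) {
        if (n + DEPTH_RANGE_EPS <= 1.0f)
            f = n + DEPTH_RANGE_EPS;   /* widen the range upward */
        else
            n = f - DEPTH_RANGE_EPS;   /* widen downward near the top */
    }
    glDepthRangef(n, f);
}
```

With this, safeDepthRangef(0.5f, 0.5f) ends up issuing glDepthRangef(0.5f, 0.5001f), which behaves like the 0.49/0.5 trace above.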
Thx for the confirmation!