Hi Mali devs,
We are working on an Android game that uses the depth texture to implement a post-process effect, like fog of war. We sample the depth texture, convert it into world space, and do some calculation to produce a final UV, which we then use to sample another texture that stores the fog-of-war data. The result is good on Qualcomm devices with an Adreno 320 GPU, but wrong on Mali-400.
If we ignore the other steps and only sample the depth texture and linearize it, the result already looks incorrect (banding, etc.). Some devs think this is a precision issue; is that true? If it's a precision problem, is there any solution?
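For reference, by "linearize" we mean the usual conversion from a [0,1] depth-buffer value back to a view-space distance. A minimal sketch in Python (the `near`/`far` plane values here are made-up examples, and a standard GL perspective projection is assumed):

```python
def linearize_depth(d, near=0.1, far=100.0):
    """Convert a [0,1] depth-buffer sample back to view-space distance,
    assuming a standard GL perspective projection (assumed near/far)."""
    ndc_z = d * 2.0 - 1.0  # remap [0,1] depth to NDC [-1,1]
    return (2.0 * near * far) / (far + near - ndc_z * (far - near))

print(linearize_depth(0.0))  # -> 0.1 (near plane)
print(linearize_depth(1.0))  # -> 100.0 (far plane)
```

In a fragment shader the same expression runs per pixel, which is where the banding shows up.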
Thanks.
Hi rogerwang,
That does indeed sound like a precision issue. Mali-400's fragment shader only supports fp16 precision in its arithmetic units, so assuming you are converting a D24 texture, the result will inevitably have fewer bits than the format you are trying to linearise. That would be consistent with the banding artefacts you describe.
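You can see the size of the loss with a quick numeric experiment. fp16 has an 11-bit significand (10 stored bits), so depth values that are distinct in a D24 buffer collapse to the same fp16 value once they pass through fp16 arithmetic; the concrete numbers below are just illustrative:

```python
import struct

def to_fp16(x):
    """Round a Python float to the nearest IEEE-754 half-precision value
    (struct's 'e' format), then read it back as a float."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Two depth values one D24 step apart (step = 1/2^24)
a = 0.9
b = a + 1.0 / (1 << 24)

# In fp16 the ulp near 0.9 is 1/2048, far coarser than 1/2^24,
# so both inputs round to the same half-precision value.
print(to_fp16(a) == to_fp16(b))  # -> True: the D24 detail is gone
```

That quantisation is exactly what turns a smooth depth gradient into visible bands.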
> If it's a precision problem, is there any solution?
The Mali-400 fragment shader hardware only supports fp16 arithmetic, so there is no trivial workaround to add extra bits. All I can suggest is reworking the effect to use a different algorithm which is less sensitive to the precision difference.
I've never tried this on Mali, but one technique is to modify your vertex shader (which is fp32 precision on Mali) to emit a depth value with a different distribution. This can, for example, make the rasterized depth linear in the Z-buffer, rather than the usual non-linear distribution. It's not without issues (you get less Z-fighting in the distance, but risk more close to the camera) - but might be worth a try ...
For more details, try this article: http://www.mvps.org/directx/articles/linear_z/linearz.htm
Note that the depth texture fetch will still only return an fp16 value, so if your error occurs there then this technique won't help.
Hope that helps,
Pete