Hi,
I'm working on an algorithm where I need to detect whether the current `gl_FragCoord` is in an even or odd row/column. A typical implementation looks something like `if (mod(gl_FragCoord.x, 2.0) < 1.0) { ... } else { ... }`. After running into issues with this approach, a quick Google search pointed me to some good information:
This forum post in particular describes pretty much exactly the issue I'm looking into.
I've created this repository to experiment with the issue and to try to solve it. I'm testing on a MiBox MDZ-16-AB. The repository is an Android Studio project. It sets up a basic custom `GLSurfaceView` (the `TestGlView` class) that instantiates the `TestGlRenderer` class. In `TestGlRenderer` I create a simple filter shader that applies the `if (mod(..))` logic to draw different colors for odd/even columns. I create an FBO with a size of 3840 x 2160 to expose floating-point precision issues. When rendering into an FBO whose color attachment is a 1920 x 1080 texture, the issue is much less pronounced.
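For reference, a minimal sketch of what such a filter shader looks like (illustrative only; the exact colors and structure are assumptions, not necessarily the repo code):

```glsl
// Fragment shader (GLSL ES 2.0) -- minimal sketch of the even/odd
// column filter described above.
precision mediump float;

void main() {
    // gl_FragCoord.x holds the window-space x of the pixel center
    // (0.5, 1.5, 2.5, ...), so mod(x, 2.0) < 1.0 selects even columns.
    if (mod(gl_FragCoord.x, 2.0) < 1.0) {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // even column: red
    } else {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // odd column: black
    }
}
```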
In the image below (3840 x 2160) you can clearly see the issue: it should show alternating vertical red and black lines from left to right.
When rendering into a 1920 x 1080 FBO things get a little better, but still not 100% correct.
In `onSurfaceCreated()` of the `TestGlRenderer` class I create an instance of `GlRenderToTexture`, which is just a thin wrapper that creates an FBO with one texture attachment. The code includes a commented-out variant so you can create either a 1920x1080 or a 3840x2160 FBO.
Now I'm curious: what would be a workaround or solution that reliably distinguishes between odd and even rows/columns?
Thanks
If you go for tiled framebuffers: with FP16 you get 10 explicit bits of mantissa, so you should only be able to accurately address framebuffers up to 2^10 = 1024 pixels wide using `gl_FragCoord`. Pixel centers sit at half-integer coordinates, and above 1024 the spacing between representable FP16 values grows to 1.0, so the .5 is rounded away; e.g. 1025.5 rounds to 1026.0, and `mod(1026.0, 2.0)` then classifies an odd column as even.
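Given that limit, the usual workarounds avoid doing the parity test in FP16 altogether. Here's a minimal sketch, assuming a GLSL ES 3.0 context (where the spec declares `gl_FragCoord` as `highp`); it isn't taken from the repo:

```glsl
#version 300 es
// Workaround sketch (GLSL ES 3.0): do the even/odd test in integer
// arithmetic so no float mantissa rounding is involved. int(x)
// truncates pixel centers exactly (1.5 -> 1, 1026.5 -> 1026).
precision highp float;
out vec4 fragColor;

void main() {
    int column = int(gl_FragCoord.x);
    if ((column & 1) == 0) {
        fragColor = vec4(1.0, 0.0, 0.0, 1.0); // even column: red
    } else {
        fragColor = vec4(0.0, 0.0, 0.0, 1.0); // odd column: black
    }
}
```

On ES 2.0-only hardware, common fallbacks are doing the `mod()` math in `highp float` behind an `#ifdef GL_FRAGMENT_PRECISION_HIGH` check (mediump is only a spec minimum, so this helps on implementations that actually deliver `gl_FragCoord` at higher precision), or sampling a repeating 2x2 checkerboard texture with `GL_NEAREST`/`GL_REPEAT` at a 1:1 texel-to-pixel scale so the shader never touches large-magnitude float arithmetic.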