I am using a Mali-400 GPU on a Linux platform.
In my OpenGL ES application, I created an EGLImage using eglCreateImageKHR in RGBA8888 format, and I am sending a data buffer in RGBA8888 format to the GPU.
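For reference, my setup looks roughly like the sketch below. The function name `texture_from_pixmap` and the native pixmap handle are placeholders for my actual platform code; the exact native buffer type depends on the windowing system:

```c
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/* The KHR/OES entry points must be fetched at runtime. */
static PFNEGLCREATEIMAGEKHRPROC pEglCreateImageKHR;
static PFNGLEGLIMAGETARGETTEXTURE2DOESPROC pGlEGLImageTargetTexture2DOES;

/* Wrap a native RGBA8888 pixmap in an EGLImage and bind it to a GLES
 * texture. "pixmap" stands in for whatever native buffer handle the
 * platform provides. */
GLuint texture_from_pixmap(EGLDisplay dpy, EGLNativePixmapType pixmap)
{
    pEglCreateImageKHR = (PFNEGLCREATEIMAGEKHRPROC)
        eglGetProcAddress("eglCreateImageKHR");
    pGlEGLImageTargetTexture2DOES = (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
        eglGetProcAddress("glEGLImageTargetTexture2DOES");

    const EGLint attribs[] = { EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE };
    EGLImageKHR image = pEglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                           EGL_NATIVE_PIXMAP_KHR,
                                           (EGLClientBuffer)pixmap, attribs);

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* The texture now samples directly from the RGBA8888 pixmap memory. */
    pGlEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)image);
    return tex;
}
```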
The image is displayed properly.
When I check with the "fbdev" command, I can see that my framebuffer resolution is 1920x1080 at 16 bpp.
Where is the conversion from RGBA8888 to RGB565 happening?
Unorm textures are converted into floating point format when they are sampled in the shader program. Fragment shaders emit floating point color outputs to the blending stage, and the blending stage will convert the output to whatever the native framebuffer format is before writing back to memory.
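As a rough illustration (the uniform and varying names here are just placeholders), a typical fragment shader never mentions either storage format; both conversions happen in hardware around it:

```c
/* Fragment shader source: samples the RGBA8888 texture as floats and
 * writes a floating point color. The write-back stage then quantizes
 * that output to the native framebuffer format (RGB565 in your case),
 * so no shader code is needed for the conversion. */
static const char *fragment_src =
    "precision mediump float;                                 \n"
    "uniform sampler2D u_tex;  /* RGBA8888 unorm texture */   \n"
    "varying vec2 v_uv;                                       \n"
    "void main() {                                            \n"
    "    /* texture2D returns values already converted to     \n"
    "       floats in [0.0, 1.0], whatever the storage is */  \n"
    "    gl_FragColor = texture2D(u_tex, v_uv);               \n"
    "}                                                        \n";
```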
HTH, Pete
Thanks for the reply. Will the performance improve if I send data in the same format as the native framebuffer format?
In terms of ideal performance (ignoring cache misses, bandwidth, etc.), no - the format conversion is "free".
However, if you know the output is going to be 16-bit, then you are wasting bandwidth and texture cache by uploading a 32-bit texture and then throwing away half of the information it contains. So yes, it should help a bit in practice, because you will have fewer memory accesses to make and less data to store in the texture cache.
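As a sketch of what that would look like (the function name and parameters are illustrative, not from your code), uploading the source data pre-packed as RGB565 halves the texel size compared with RGBA8888:

```c
#include <GLES2/gl2.h>

/* Upload data already packed as RGB565: half the upload bandwidth and
 * half the texture cache footprint of an RGBA8888 texture. */
void upload_rgb565(GLuint tex, int w, int h, const unsigned short *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Rows of 16-bit texels are only 2-byte aligned for odd widths,
     * so relax the default 4-byte unpack alignment. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);
}

/* For comparison, the 32-bit upload the thread started with:
 * glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
 *              GL_RGBA, GL_UNSIGNED_BYTE, pixels32);
 */
```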