Hi there,
I'm trying to implement different image processing algorithms (grayscaling, filtering, quantizing, etc.) on Android smartphones. I've got the camera preview rendered to a quad as an external texture, and it works fine with grayscaling in a fragment shader. But when I try to do the same in OpenGL ES 3.1 with compute shaders, I get an invalid operation error when calling glBindImageTexture(...) before dispatching the compute shader. According to https://www.khronos.org/registry/gles/extensions/OES/OES_EGL_image_external_essl3.txt
if I bind an external texture (GL_TEXTURE_EXTERNAL_OES) with glBindImageTexture, I should be able to access the image via an image2D in GLSL. What am I doing wrong?
Just to help us reproduce - what device are you running on, and what is the string returned by glGetString for GL_RENDERER and GL_VERSION?
In terms of debug - firstly:
Does your platform actually support that extension? It's relatively recent. I guess it does as your shader compile didn't throw an error. Is it possible for you to share a little more code so we can see what you are trying to do? A standalone APK reproducer would be ideal if that's possible, as it's the fastest way for us to get started.
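If it helps, you can grab those strings from code with something like this (a minimal sketch; call it on the GL thread with a current context):
// android.opengl.GLES31 inherits glGetString from GLES20; android.util.Log for output.
String renderer = GLES31.glGetString(GLES31.GL_RENDERER);
String version = GLES31.glGetString(GLES31.GL_VERSION);
Log.i("GLInfo", "GL_RENDERER: " + renderer + ", GL_VERSION: " + version);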
Cheers, Pete
I have 2 devices to test, a Nexus 6P (which has an Adreno 430 GPU in it) and a Samsung S6 (which has Mali-T760 MP8). The version is OpenGL ES 3.1 V@127.0.
I was trying to bind an external texture to a compute shader so that I can use the image2D sampler to access it pixel by pixel:
GLES31.glBindImageTexture(0, texture[0], 0, false, 0, GLES31.GL_READ_ONLY, GLES31.GL_RGBA8);
However this requires the texture to be immutable and yesterday I figured out that I forgot to make the glTexStorage2D(...) call to make it immutable:
texture = new int[1];
GLES31.glGenTextures(1, texture, 0);
GLES31.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_WRAP_S, GLES31.GL_CLAMP_TO_EDGE);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_WRAP_T, GLES31.GL_CLAMP_TO_EDGE);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_MIN_FILTER, GLES31.GL_NEAREST);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_MAG_FILTER, GLES31.GL_NEAREST);
GLES31.glTexStorage2D(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 1, GLES31.GL_RGBA8, mPreviewSize.getWidth(), mPreviewSize.getHeight());
However, now at the last line of my code I get a GL_INVALID_ENUM (1280) error, since it seems glTexStorage2D does not accept GLES11Ext.GL_TEXTURE_EXTERNAL_OES.
This extension is needed to capture the camera preview and put it into a SurfaceTexture, which is what I want to pass through the compute shader for image processing.
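For context, the external texture is attached to the camera preview roughly like this (a minimal sketch; frameAvailableListener and the camera session wiring are placeholder names from my setup code):
// Attach a SurfaceTexture to the external texture so the camera can fill it.
SurfaceTexture previewTexture = new SurfaceTexture(texture[0]);
previewTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
previewTexture.setOnFrameAvailableListener(frameAvailableListener);
// ...hand new Surface(previewTexture) to the camera preview request...
// Later, on the GL thread, latch the newest camera frame into the external texture:
previewTexture.updateTexImage();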
Thanks,
David
Hi David,
For external surfaces you can't (and should not need to) allocate new storage. The texture memory has already been allocated externally by the system, and should be pre-populated with the data you need from the external media device.
The wording in the extension spec related to immutability is a "neither ..., nor ..." clause:
"An INVALID_OPERATION error is generated if <texture> is neither the name of an immutable texture object, nor the name of an external texture object."
... so the fact that it is an external texture object should be sufficient to allow the glBindImageTexture to succeed, even if the texture is not marked as immutable.
Hi Pete,
Thanks for your help so far, I've made the changes, so now this is how I generate the texture:
textures = new int[2];
GLES31.glGenTextures(2, textures, 0);
GLES31.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_WRAP_S, GLES31.GL_CLAMP_TO_EDGE);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_WRAP_T, GLES31.GL_CLAMP_TO_EDGE);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_MIN_FILTER, GLES31.GL_NEAREST);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_MAG_FILTER, GLES31.GL_NEAREST);
GLES31.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[1]);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_WRAP_S, GLES31.GL_CLAMP_TO_EDGE);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_WRAP_T, GLES31.GL_CLAMP_TO_EDGE);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_MIN_FILTER, GLES31.GL_NEAREST);
GLES31.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES31.GL_TEXTURE_MAG_FILTER, GLES31.GL_NEAREST);
The compute shader code:
#version 310 es
#extension GL_OES_EGL_image_external_essl3 : enable
layout(rgba8, binding = 0) uniform readonly image2D inTexture;
layout(rgba8, binding = 1) uniform writeonly image2D outTexture;
layout (local_size_x = 8, local_size_y = 8, local_size_z = 1) in;
void main()
{
    ivec2 storePos = ivec2(gl_GlobalInvocationID.xy);
    vec4 texColor = imageLoad(inTexture, storePos).rgba;
    float newPixel = .299f * texColor.r + .587f * texColor.g + .114f * texColor.b;
    imageStore(outTexture, storePos, vec4(newPixel, newPixel, newPixel, texColor.a));
}
And this is when I try to bind the texture to the shader:
GLES31.glBindImageTexture(0, textures[0], 0, false, 0, GLES31.GL_READ_ONLY, GLES31.GL_RGBA8);
GLES31.glBindImageTexture(1, textures[1], 0, false, 0, GLES31.GL_WRITE_ONLY, GLES31.GL_RGBA8);
I am receiving an invalid operation (1282) error on that call. Can you help me figure out what I'm doing wrong here?
Also, rendering the external texture (the SurfaceTexture from the camera) with just a fragment shader works fine.
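In case it matters, after the bindings I dispatch roughly like this (a sketch; computeProgram is my compute program handle and the group counts just cover the preview size with the 8x8 local size):
GLES31.glUseProgram(computeProgram);
// One 8x8 work group per 8x8 pixel tile of the preview frame.
int groupsX = (mPreviewSize.getWidth() + 7) / 8;
int groupsY = (mPreviewSize.getHeight() + 7) / 8;
GLES31.glDispatchCompute(groupsX, groupsY, 1);
// Make the image writes visible to the later texture reads when drawing.
GLES31.glMemoryBarrier(GLES31.GL_SHADER_IMAGE_ACCESS_BARRIER_BIT);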
Eventually I managed to figure it out. First render the external texture to a normal GL_TEXTURE_2D using a framebuffer (to set up a framebuffer, check this out: http://stackoverflow.com/questions/29003414/render-camera-preview-on-a-texture-with-target-gl-texture-2d). Then you can bind the normal (immutable) texture to the compute shader; just be aware of the bindings (because GL_TEXTURE0 is occupied by the external texture, the input and output texture bindings of the compute shader are 1 and 2 respectively). Then do the image processing by reading from GL_TEXTURE1 into GL_TEXTURE2 (both are created as immutable, and the first one is the one bound to the framebuffer to render the external texture into), and finally use the third texture (GL_TEXTURE2) to display the results on the screen.
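In case it helps anyone, the key part of the workaround looks roughly like this (a minimal sketch only; the framebuffer/quad pass that copies the external texture into tex[0] follows the Stack Overflow link above, and computeProgram is a placeholder name):
// Two ordinary, immutable GL_TEXTURE_2D textures: one copy target, one compute output.
int[] tex = new int[2];
GLES31.glGenTextures(2, tex, 0);
for (int i = 0; i < 2; i++) {
    GLES31.glBindTexture(GLES31.GL_TEXTURE_2D, tex[i]);
    GLES31.glTexStorage2D(GLES31.GL_TEXTURE_2D, 1, GLES31.GL_RGBA8, mPreviewSize.getWidth(), mPreviewSize.getHeight());
    GLES31.glTexParameteri(GLES31.GL_TEXTURE_2D, GLES31.GL_TEXTURE_MIN_FILTER, GLES31.GL_NEAREST);
    GLES31.glTexParameteri(GLES31.GL_TEXTURE_2D, GLES31.GL_TEXTURE_MAG_FILTER, GLES31.GL_NEAREST);
}
// 1) Render the external (camera) texture into tex[0] via the framebuffer (see the SO link).
// 2) Bind both immutable textures as images and run the compute shader.
GLES31.glUseProgram(computeProgram);
GLES31.glBindImageTexture(1, tex[0], 0, false, 0, GLES31.GL_READ_ONLY, GLES31.GL_RGBA8);
GLES31.glBindImageTexture(2, tex[1], 0, false, 0, GLES31.GL_WRITE_ONLY, GLES31.GL_RGBA8);
GLES31.glDispatchCompute((mPreviewSize.getWidth() + 7) / 8, (mPreviewSize.getHeight() + 7) / 8, 1);
GLES31.glMemoryBarrier(GLES31.GL_SHADER_IMAGE_ACCESS_BARRIER_BIT);
// 3) Draw a quad sampling tex[1] to show the processed frame.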
Glad you got it working - it's a shame you needed to copy the external surface to an internal surface though - it half defeats the point of zero-copy external imports. I'll keep digging to see if I can find a way of avoiding that ...
Thanks for the advice aborges - I did know that, but you can call it a bad habit: I mainly develop in C++ and got used to writing the namespaces every time in order to avoid misunderstandings.
Hi fireblade,
this is just a code tip, not meant to address your already-solved issue.
You can make your code less verbose by using static imports. This avoids having to type GLES31 in every single static method call.
Just add this to your imports:
import static android.opengl.GLES31.*;
Example for Math:
import static java.lang.Math.*;
public class HelloWorld {
    public static void main(String[] args) {
        // no need to type Math.sin() and Math.cos() - just sin() and cos()
        System.out.println("Hello static import: " + sin(0.2312f) * cos(0.2312f));
    }
}
Hello,
unfortunately you only gave us the "other" vendor's driver version, so I can't tell you whether your Mali device supports the requested behaviour or not.
The Mali graphics driver will allow binding of external targets via BindImageTexture if (and only if) the OES_EGL_image_external_essl3 extension is exposed, but not before, as the driver otherwise follows the GLES core spec (which does not allow it).
Support for OES_EGL_image_external_essl3 has been added with the r11p0 release.
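As a quick check from application code, you can query the extension string at runtime (a minimal sketch; needs a current GL context):
String extensions = GLES31.glGetString(GLES31.GL_EXTENSIONS);
boolean canBindExternalImages = extensions != null && extensions.contains("GL_OES_EGL_image_external_essl3");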
Hi, I'm encountering a similar issue. I have an OpenGL texture backed by an external EGL image. I'm trying to bind it to a compute shader using glBindImageTexture, but it returns INVALID_OPERATION.
Platform: Android O.
Device: Mate 10 Pro.
GPU: Mali G72.
First I create an OpenGL texture backed by EGL image:
EGLImageKHR sourceImage = eglCreateImageKHR(...);
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, sourceImage);
Then I try to bind it as a GL compute image:
glBindImageTexture(...) -> returns INVALID_OPERATION
Please advise.
The texture memory has already been allocated externally by the system, and should be pre-populated with the data you need from the external media device.
In case someone like me ends up here with a similar or related problem: the issue for me was accessing the external texture in the compute shader as follows
glBindImageTexture(texture_loc, texture_id, 0, false, 0, GL_READ_ONLY, format);
--------------------------------------------------------------
layout(rgba8, binding = 0) uniform readonly image2D inTexture;
...
vec4 texColor = imageLoad(inTexture, storePos).rgba;
Instead, you should access and use them the same way we do in vertex/fragment shaders, as follows:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture_id);
glUniform1i(texture_loc, 0);
--------------------------------------------------------------
uniform samplerExternalOES sTexture;
...
vec4 cameraColor = texture(sTexture, vec2(float(gl_GlobalInvocationID.x) / width, float(gl_GlobalInvocationID.y) / height));
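On the application side (in Android/Java), setting up the sampler and size uniforms for the snippet above would look roughly like this (a sketch; computeProgram is a placeholder handle, and width/height are assumed to be float uniforms in the shader):
GLES31.glUseProgram(computeProgram);
// Sampler uniform: external camera texture on texture unit 0.
GLES31.glActiveTexture(GLES31.GL_TEXTURE0);
GLES31.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture_id);
GLES31.glUniform1i(GLES31.glGetUniformLocation(computeProgram, "sTexture"), 0);
// width/height uniforms normalise gl_GlobalInvocationID into texture coordinates.
GLES31.glUniform1f(GLES31.glGetUniformLocation(computeProgram, "width"), previewWidth);
GLES31.glUniform1f(GLES31.glGetUniformLocation(computeProgram, "height"), previewHeight);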
I'm working on a similar issue. Thanks all of you, now everything is almost clear to me. Just one thing I'm wondering: could we output a YUV-format texture in the compute shader? Of course samplerExternalOES sTexture can't help in this case, so we might have to use image load/store (i.e. image2D, ...). However, the texture must be immutable and it seems that there is no YUV layout qualifier.
Thus, this seems to be impossible, but I've read the issue 10 on this link
https://www.khronos.org/registry/OpenGL/extensions/OES/OES_EGL_image_external_essl3.txt
and it said that image load/store may be supported.
Could anyone explain this?