
Problems getting SurfaceTexture to work with the Android video player on Mali-400

Note: This was originally posted on 5th April 2013 at http://forums.arm.com

Hi,

I've written an application that renders video into a texture, basically using the code attached below.

This example works perfectly fine on a mobile device and on a tablet, both Tegra-based. I've then tried the same code on a Mali-400 based Android system (Minix Neo5) and I can see that the video plays, but I mainly get a black screen with some garbage on it (just 3 or 4 lines).



import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
import android.view.Surface;

public class VideoSurfaceView extends GLSurfaceView {

    VideoRender mRenderer;
    private MediaPlayer mMediaPlayer = null;

    public VideoSurfaceView(Context context, MediaPlayer mp) {
        super(context);

        setEGLContextClientVersion(2);
        mMediaPlayer = mp;
        mRenderer = new VideoRender(context);
        setRenderer(mRenderer);
        mRenderer.setMediaPlayer(mMediaPlayer);
    }

    @Override
    public void onResume() {
        queueEvent(new Runnable(){
                public void run() {
                    mRenderer.setMediaPlayer(mMediaPlayer);
                }});

        super.onResume();
    }

    private static class VideoRender
        implements GLSurfaceView.Renderer, SurfaceTexture.OnFrameAvailableListener {
        private static final String TAG = "VideoRender";

        private static final int FLOAT_SIZE_BYTES = 4;
        private static final int TRIANGLE_VERTICES_DATA_STRIDE_BYTES = 5 * FLOAT_SIZE_BYTES;
        private static final int TRIANGLE_VERTICES_DATA_POS_OFFSET = 0;
        private static final int TRIANGLE_VERTICES_DATA_UV_OFFSET = 3;
        private final float[] mTriangleVerticesData = {
            // X, Y, Z, U, V
            -1.0f, -1.0f, 0, 0.f, 0.f,
            1.0f, -1.0f, 0, 1.f, 0.f,
            -1.0f,  1.0f, 0, 0.f, 1.f,
            1.0f,  1.0f, 0, 1.f, 1.f,
        };

        private FloatBuffer mTriangleVertices;

        private final String mVertexShader =
                "uniform mat4 uMVPMatrix;\n" +
                "uniform mat4 uSTMatrix;\n" +
                "attribute vec4 aPosition;\n" +
                "attribute vec4 aTextureCoord;\n" +
                "varying vec2 vTextureCoord;\n" +
                "void main() {\n" +
                "  gl_Position = uMVPMatrix * aPosition;\n" +
                "  vTextureCoord = (uSTMatrix * aTextureCoord).xy;\n" +
                "}\n";

        private final String mFragmentShader =
                "#extension GL_OES_EGL_image_external : require\n" +
                "precision mediump float;\n" +
                "varying vec2 vTextureCoord;\n" +
                "uniform samplerExternalOES sTexture;\n" +
                "void main() {\n" +
                "  gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
                "}\n";

        private float[] mMVPMatrix = new float[16];
        private float[] mSTMatrix = new float[16];

        private int mProgram;
        private int mTextureID;
        private int muMVPMatrixHandle;
        private int muSTMatrixHandle;
        private int maPositionHandle;
        private int maTextureHandle;

        private SurfaceTexture mSurface;
        private boolean updateSurface = false;

        private static final int GL_TEXTURE_EXTERNAL_OES = 0x8D65; // GLES11Ext.GL_TEXTURE_EXTERNAL_OES

        private MediaPlayer mMediaPlayer;

        public VideoRender(Context context) {
            mTriangleVertices = ByteBuffer.allocateDirect(
                mTriangleVerticesData.length * FLOAT_SIZE_BYTES)
                    .order(ByteOrder.nativeOrder()).asFloatBuffer();
            mTriangleVertices.put(mTriangleVerticesData).position(0);

            Matrix.setIdentityM(mSTMatrix, 0);
        }

        public void setMediaPlayer(MediaPlayer player) {
            mMediaPlayer = player;
        }

        public void onDrawFrame(GL10 glUnused) {
            synchronized(this) {
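                // updateTexImage() must be called on the GL thread, so the flag set
                // by onFrameAvailable() is latched there and consumed here.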
                if (updateSurface) {
                    mSurface.updateTexImage();
                    mSurface.getTransformMatrix(mSTMatrix);
                    updateSurface = false;
                }
            }

            GLES20.glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
            GLES20.glClear( GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);

            GLES20.glUseProgram(mProgram);
            checkGlError("glUseProgram");

            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, mTextureID);

            mTriangleVertices.position(TRIANGLE_VERTICES_DATA_POS_OFFSET);
            GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false,
                TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
            checkGlError("glVertexAttribPointer maPosition");
            GLES20.glEnableVertexAttribArray(maPositionHandle);
            checkGlError("glEnableVertexAttribArray maPositionHandle");

            mTriangleVertices.position(TRIANGLE_VERTICES_DATA_UV_OFFSET);
            // Only two texture-coordinate components (U, V) per vertex; using 3 here
            // would read past the end of the vertex buffer for the last vertex.
            GLES20.glVertexAttribPointer(maTextureHandle, 2, GLES20.GL_FLOAT, false,
                TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
            checkGlError("glVertexAttribPointer maTextureHandle");
            GLES20.glEnableVertexAttribArray(maTextureHandle);
            checkGlError("glEnableVertexAttribArray maTextureHandle");

            Matrix.setIdentityM(mMVPMatrix, 0);
            GLES20.glUniformMatrix4fv(muMVPMatrixHandle, 1, false, mMVPMatrix, 0);
            GLES20.glUniformMatrix4fv(muSTMatrixHandle, 1, false, mSTMatrix, 0);

            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
            checkGlError("glDrawArrays");
            GLES20.glFinish();

        }

        public void onSurfaceChanged(GL10 glUnused, int width, int height) {
            GLES20.glViewport(0, 0, width, height);
        }

        public void onSurfaceCreated(GL10 glUnused, EGLConfig config) {
            mProgram = createProgram(mVertexShader, mFragmentShader);
            if (mProgram == 0) {
                return;
            }
            maPositionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
            checkGlError("glGetAttribLocation aPosition");
            if (maPositionHandle == -1) {
                throw new RuntimeException("Could not get attrib location for aPosition");
            }
            maTextureHandle = GLES20.glGetAttribLocation(mProgram, "aTextureCoord");
            checkGlError("glGetAttribLocation aTextureCoord");
            if (maTextureHandle == -1) {
                throw new RuntimeException("Could not get attrib location for aTextureCoord");
            }

            muMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
            checkGlError("glGetUniformLocation uMVPMatrix");
            if (muMVPMatrixHandle == -1) {
                throw new RuntimeException("Could not get attrib location for uMVPMatrix");
            }

            muSTMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uSTMatrix");
            checkGlError("glGetUniformLocation uSTMatrix");
            if (muSTMatrixHandle == -1) {
                throw new RuntimeException("Could not get attrib location for uSTMatrix");
            }


            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);

            mTextureID = textures[0];
            GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, mTextureID);
            checkGlError("glBindTexture mTextureID");

            GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                                GLES20.GL_NEAREST);
            GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
                                GLES20.GL_LINEAR);

            /*
             * Create the SurfaceTexture that will feed this texture ID,
             * and pass it to the MediaPlayer.
             */
            mSurface = new SurfaceTexture(mTextureID);
            mSurface.setOnFrameAvailableListener(this);

            Surface surface = new Surface(mSurface);
            mMediaPlayer.setSurface(surface);
            surface.release();

            try {
                mMediaPlayer.prepare();
            } catch (IOException e) {
                Log.e(TAG, "media player prepare failed", e);
            }

            synchronized(this) {
                updateSurface = false;
            }

            mMediaPlayer.start();
        }

        synchronized public void onFrameAvailable(SurfaceTexture surface) {
            updateSurface = true;
        }

        private int loadShader(int shaderType, String source) {
            int shader = GLES20.glCreateShader(shaderType);
            if (shader != 0) {
                GLES20.glShaderSource(shader, source);
                GLES20.glCompileShader(shader);
                int[] compiled = new int[1];
                GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
                if (compiled[0] == 0) {
                    Log.e(TAG, "Could not compile shader " + shaderType + ":");
                    Log.e(TAG, GLES20.glGetShaderInfoLog(shader));
                    GLES20.glDeleteShader(shader);
                    shader = 0;
                }
            }
            return shader;
        }

        private int createProgram(String vertexSource, String fragmentSource) {
            int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);
            if (vertexShader == 0) {
                return 0;
            }
            int pixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
            if (pixelShader == 0) {
                return 0;
            }

            int program = GLES20.glCreateProgram();
            if (program != 0) {
                GLES20.glAttachShader(program, vertexShader);
                checkGlError("glAttachShader");
                GLES20.glAttachShader(program, pixelShader);
                checkGlError("glAttachShader");
                GLES20.glLinkProgram(program);
                int[] linkStatus = new int[1];
                GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
                if (linkStatus[0] != GLES20.GL_TRUE) {
                    Log.e(TAG, "Could not link program: ");
                    Log.e(TAG, GLES20.glGetProgramInfoLog(program));
                    GLES20.glDeleteProgram(program);
                    program = 0;
                }
            }
            return program;
        }

        private void checkGlError(String op) {
            int error;
            while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
                Log.e(TAG, op + ": glError " + error);
                throw new RuntimeException(op + ": glError " + error);
            }
        }

    }  // End of class VideoRender.

}  // End of class VideoSurfaceView.

I've tried to use the OpenGL ES tracer to check what is copied from the SurfaceTexture back to the texture itself, but it looks like the tracer cannot show it correctly (neither on the Tegra devices nor on the Mali system). Note that I don't get any OpenGL error or warning whatsoever. It looks like everything is working fine.

Is there any other tool that can show me what, if anything, is copied to the TEXTURE_EXTERNAL target? All TEXTURE_2D textures are visible in the tracer.
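
One low-tech check, since the tracer cannot show the TEXTURE_EXTERNAL contents: read a pixel back from the framebuffer right after the quad has been drawn. If the external texture is receiving frames, the value changes over time; if it stays at the clear colour, nothing is being copied. This is only a sketch and not part of the attached code; it assumes the surface size is passed in from onSurfaceChanged() and that the method is called at the end of onDrawFrame().

        // Debug helper: sample one RGBA pixel from the centre of the viewport
        // after glDrawArrays() to see whether the video quad contains real data.
        private void dumpCentrePixel(int width, int height) {
            ByteBuffer pixel = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
            GLES20.glReadPixels(width / 2, height / 2, 1, 1,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixel);
            Log.d(TAG, "centre pixel RGBA = " + (pixel.get(0) & 0xff) + " "
                + (pixel.get(1) & 0xff) + " " + (pixel.get(2) & 0xff) + " "
                + (pixel.get(3) & 0xff));
        }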

One strange thing I've noticed on the Mali system is that it can't play high-definition videos at all, and my 1280x720 videos report in the debugger that they are using a SoftwareRenderer. Perhaps that's why the frames are not copied correctly to the EGLImage that will be used as a texture.
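
The SoftwareRenderer message is worth chasing: it suggests the 720p clip is not going through the hardware decoder at all. On Android 4.1+ you can list which decoders the platform actually exposes via MediaCodecList; entries named OMX.google.* are software fallbacks, while hardware decoders carry the SoC vendor's prefix. This is only a sketch (logAvcDecoders() is a hypothetical helper, not part of the attached app, and it needs android.media.MediaCodecInfo and android.media.MediaCodecList imported):

        // Log every H.264 decoder the device exposes; "OMX.google.*" entries are
        // software decoders, anything else should be hardware-accelerated.
        private void logAvcDecoders() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (type.equalsIgnoreCase("video/avc")) {
                        Log.d(TAG, "AVC decoder: " + info.getName());
                    }
                }
            }
        }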

I guess I need a better understanding of what's going on under the hood, but I expected either an error telling me the system can't do it, or for it to simply work, even if the performance were bad.

I'm attaching the source code for the testing app I've written.
  • Note: This was originally posted on 5th April 2013 at http://forums.arm.com

    Hi,

    Interesting question. Can you tell us which version of Android is on the Mali-400 platform?

    Perhaps the quickest test would be to try the APK on another Mali-400 device, to understand whether it's a system integration issue on the specific device you have or a generic problem with how the code runs on Mali-400 devices.

    Would you be able to post a binary APK of your simple example?

    Thanks, Pete
  • Note: This was originally posted on 5th April 2013 at http://forums.arm.com

    Hi Pete,

    Many thanks for your reply. It's driving me crazy trying to find the problem.
    I've tested it on a Minix Neo5 running 4.1.1 and on a Giadatech Q11 running 4.0.3.
    Perhaps their "flavor" of Android is broken?

    I'm attaching the debug-signed app. I had to split it into two RAR files, as the video in the resources makes the file larger.
    Please note that I've changed the extension to .txt; please remove it.
  • Note: This was originally posted on 18th April 2013 at http://forums.arm.com

    I could confirm the error reported by Marco with two devices:
    - Gadmei E8-3D (Android 4.0.4)
    - M3 Enjoy TV Box (Android 4.0.4)

    It works fine on Samsung Galaxy Tab 2 10.1.

    As far as I can tell from my own (but very similar) test application:
    - the video is playing "behind" the OpenGL scene (you can see it by making the GL surface transparent; a sketch of that follows this post)
    - the onFrameAvailable() method is also only called once

    Please find my screenshot attached. The video should appear on the rotating quad in the centre.
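
    To reproduce the "video behind the GL scene" observation, the GLSurfaceView needs an alpha channel and a translucent window format. A minimal sketch, assuming the calls are added to the VideoSurfaceView constructor before setRenderer() and that android.graphics.PixelFormat is imported:

        setEGLConfigChooser(8, 8, 8, 8, 16, 0);         // request an EGL config with an alpha channel
        getHolder().setFormat(PixelFormat.TRANSLUCENT); // make the window surface translucent

    Clearing with GLES20.glClearColor(0f, 0f, 0f, 0f) in onDrawFrame() then lets whatever the platform renders underneath (here, the decoded video) show through. Depending on how the platform composites its layers, SurfaceView.setZOrderOnTop() may also need adjusting.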
  • Note: This was originally posted on 19th April 2013 at http://forums.arm.com


    I could confirm the error reported by Marco with two devices:
    - Gadmei E8-3D (Android 4.0.4)
    - M3 Enjoy TV Box (Android 4.0.4)


    Thanks for the reply. Meanwhile I've tested on an additional Mali-400 device:

    - Point of View 10" Protab 3XXL, Android 4.1

    No success here either; same problem.
    The Samsung Galaxy Tab 2 10.1 has a PowerVR GPU! So I guess it's a Mali problem, as the test app also runs fine on an NVIDIA Tegra 3.

    EDIT:
    By the way, I've played around with:

                GLES20.glDisable(GLES20.GL_CULL_FACE);
                GLES20.glFrontFace(GLES20.GL_CCW);


    No success.

    Edit 22.04:
    I've now tested on a Google Nexus 10 (Mali-T604) and it works fine.

  • Note: This was originally posted on 29th April 2013 at http://forums.arm.com

    Hi,

    Someone from the Mali Developer Relations team will have a look at the APK you sent, and we will see if we can provide you with any more assistance. I will update you with any progress.

    Kind regards,
    Stephen
  • Note: This was originally posted on 27th May 2013 at http://forums.arm.com

    Hello,

    Is there any news on this topic? The Mali-400 GPU seems quite common in Android TV set-top boxes, and it would be really nice to be able to embed videos in the OpenGL ES world!

    Thank you in advance!
  • Note: This was originally posted on 21st June 2013 at http://forums.arm.com

    Hi marco3d,

    Have you been able to fix the problem?
  • Note: This was originally posted on 4th July 2013 at http://forums.arm.com

    Hello,

    I think we're experiencing the same problem - on a Rikomagic MK802III, which runs Android 4.1.1.
    We get the few lines of garbage at the top where the video texture should be.

    We also observe a regular 'stutter' when animating simple shapes in OpenGL ES -
    every half-second or so (it's nice and smooth between the stutters). I'm not sure if
    this is related to the video issue or not.

    It looks like there hasn't been an official response on this thread since the end of April.
    It would be really handy to get one please :-)

    The current behaviour is not acceptable for our application, and we need to know
    whether all chipsets with a Mali-400 are affected and whether there's a fix
    (or a fix planned).

    Thanks,

    Geoff.
  • Note: This was originally posted on 5th July 2013 at http://forums.arm.com

    Hi Geoff, all,

    I think S3DE's post, and this thread:

    http://stackoverflow.com/questions/15721450/problems-getting-surfacetexture-work-with-android-videoplayer-on-mali-400

    are pointing to the same conclusion. I don't believe the video decoder has been fully integrated into Android on those devices, hence the video being decoded only to a back buffer rather than into an OpenGL ES texture object.

    I'm afraid the integration of the video decode hardware into OpenGL ES is outside of our capability. The ARM Mali driver supports such an operation, but it must be correctly integrated into the system by the platform builder.

    Attached is a video of marco3d's test APK running on a stock Samsung Galaxy S2 (Mali-400) and producing the same visual result as we saw on a Nexus 4, Nexus 7 and Nexus 10.



    I think it will be necessary to contact the device manufacturers and ask whether they can support the integration of their video decode hardware into the OpenGL ES driver. It may also be that the device manufacturer needs to contact their System-on-Chip vendor to ask them to do the integration, since they were the company that combined the video decode hardware and the 3D accelerator. It depends on who integrated the video decode hardware and the 3D accelerator driver into Android.

    My suspicion is that a mobile chipset such as the one in the S2 has the video decode hardware correctly integrated to allow use cases such as mapping frames into an OpenGL ES texture, but that on more set-top-box oriented platforms (such as those listed above) the video decoding has only been configured to decode to a "back buffer", which is good enough for most STB use cases (including having the GPU overlay an Electronic Programme Guide using an alpha channel).

    HTH, Pete
  • Note: This was originally posted on 26th July 2013 at http://forums.arm.com

    Hi,
    I have a similar problem on an HP Slate 7 2801. Interestingly, while an MP4 file stored on the device gives only a black texture, the exact same file displays perfectly when streamed (from my web site).

    Is this just the Slate, or is it common to other Mali devices?

    Does anyone know of a web server that can be embedded inside an app to serve up video files? It might be required if we are to support Mali GPUs! (A rough sketch follows this post.)

    Regards

    Gerard.
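
    Regarding an in-app web server: a bare-bones loopback HTTP server is enough for progressive playback, though the sketch below has no HTTP range support, so seeking may not work and MediaPlayer may re-open the connection. This is hypothetical and untested on the Slate (LocalVideoServer is a made-up name; it needs the usual java.io, java.net and android.util.Log imports):

        // Minimal single-file loopback HTTP server: HTTP/1.0, one request at a time,
        // no range requests. For prototyping only.
        public class LocalVideoServer extends Thread {
            private final File file;
            private final ServerSocket serverSocket;

            public LocalVideoServer(File file) throws IOException {
                this.file = file;
                this.serverSocket = new ServerSocket(0);   // pick any free port
            }

            public String getUrl() {
                return "http://127.0.0.1:" + serverSocket.getLocalPort() + "/video.mp4";
            }

            @Override
            public void run() {
                while (!isInterrupted()) {
                    try {
                        Socket client = serverSocket.accept();
                        BufferedReader in = new BufferedReader(
                                new InputStreamReader(client.getInputStream()));
                        String line;
                        while ((line = in.readLine()) != null && !line.isEmpty()) {
                            // Skip the request line and headers.
                        }
                        OutputStream out = client.getOutputStream();
                        out.write(("HTTP/1.0 200 OK\r\n"
                                + "Content-Type: video/mp4\r\n"
                                + "Content-Length: " + file.length() + "\r\n"
                                + "Connection: close\r\n\r\n").getBytes("US-ASCII"));
                        FileInputStream body = new FileInputStream(file);
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = body.read(buf)) != -1) {
                            out.write(buf, 0, n);
                        }
                        body.close();
                        out.flush();
                        client.close();
                    } catch (IOException e) {
                        Log.e("LocalVideoServer", "request failed", e);
                    }
                }
            }
        }

    After server.start(), passing server.getUrl() to MediaPlayer.setDataSource() should behave like the streaming case Gerard describes.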
  • Hi Everybody,

    I'm reviving this post, as this problem is still out there and I still have no solution for it.

    I have a Mali-T764 based Android 4.4.2 device from OpenHour called the Chameleon. It's fast and works well, but it still has the same problem we discussed here.

    The video runs in the background and it is NOT copied to the surface (I can see this by changing my shader to add transparency).

    The problem is what Pete says: "Mali supports it, but it must be correctly integrated into the system by the platform builder".

    Here is the problem: these manufacturers are just ignoring it. It should be there; it's simply not implemented.

    Besides, there is a second irritating thing on many of the devices I've tested: they use the Log the wrong way and log verbose messages as errors, which is a real pain.

    Anyway, consider this a warning before buying no-name hardware.

    I'm still looking for a way to get this done. I think the only option is to grab the frames and upload them to the texture myself (a rough sketch follows this post).

    The performance will be really bad, but it should at least work.

    - Marco (marco3d - originally)
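
    A rough sketch of that brute-force fallback (hypothetical and untested on these devices): pull individual frames as Bitmaps with MediaMetadataRetriever and upload them into an ordinary GL_TEXTURE_2D with GLUtils.texImage2D(); the fragment shader then uses a plain sampler2D instead of samplerExternalOES. It bypasses the SurfaceTexture/external-image path completely, but decoding a frame per call is far too slow for smooth playback. It needs android.graphics.Bitmap, android.media.MediaMetadataRetriever, android.opengl.GLES20 and android.opengl.GLUtils imported; BitmapFrameUploader is a made-up name.

        // Fallback: fetch decoded frames as Bitmaps and upload them into a normal
        // 2D texture. Works on any device, but is far too slow for real-time video.
        public class BitmapFrameUploader {
            private final MediaMetadataRetriever retriever = new MediaMetadataRetriever();
            private final int textureId;

            public BitmapFrameUploader(String videoPath, int textureId) {
                retriever.setDataSource(videoPath);
                this.textureId = textureId;
            }

            // Must be called on the GL thread; timeUs is the presentation time in microseconds.
            public void uploadFrame(long timeUs) {
                Bitmap frame = retriever.getFrameAtTime(timeUs,
                        MediaMetadataRetriever.OPTION_CLOSEST);
                if (frame == null) {
                    return;
                }
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
                GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, frame, 0);  // replaces the whole texture
                frame.recycle();
            }

            public void release() {
                retriever.release();
            }
        }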