I got the hardware video decoder working on my Firefly/RK3288. I can blit the YUV/NV21 output frame into an RGB frame buffer using the Firefly's dedicated 2D acceleration hardware. All I need now is a memory pointer directly into texture memory, so I can use the decoded video frame in shaders. On Android I would create a GraphicBuffer object, hand it to eglCreateImageKHR(), and bind it with glEGLImageTargetTexture2DOES(). Right now I copy with glTexSubImage2D() and get about 12 fps for 1920x1080 frames, which is too slow for showing a movie.
What is the best way to do this on Firefly Linux 3.10 with the fbdev Mali-T760 driver? How can I simply get a pointer to texture memory shared between CPU and GPU? And is there maybe some sample code available?
Right now I'm stuck and waiting for X11 Mali drivers that support XCreatePixmap, which could hopefully get me further. My goal is XBMC, MPlayer, etc. running on Linux on the Firefly.
Any help would really be appreciated! TNX!
accelerated video / video processor (VPU) running on linux on RK3288 / firefly - FreakTab.com
Hi, I have the same issue with slightly different parameters: I'm running Android 4.4.4 on my Firefly and want to render ultra-high-resolution video (3840x2160) at 60 fps. Any sort of copy would make this unfeasible.
Based on research on the net and in these forums, it seems this can be accomplished with zero copies by directly accessing the physical pointer to the texture memory via UMP or DMABUF. Is there documentation on how to go about achieving this?
EDIT: It seems I can answer my own question: the Android implementation of GraphicBuffer provides exactly this.
Using direct textures on Android
Using GL_OES_EGL_image_external on Android