The Movie Vision App: Part 2

Eric Gowland
November 11, 2014
5 minute read time.

Introduction

Following on from The Movie Vision App: Part 1, in Part 2 we move straight on to two more of the image filters implemented for the project.

Movie Vision Filters: “An Old Man Dies…”

[Screenshot: the “An Old Man Dies…” filter applied to the camera preview]

This filter is only slightly more complex than the “I’ll be back” effect described in the previous blog. The camera preview image is filtered to a black and white, grainy ‘comic book’ style, but any objects detected as red retain their colour.

The Java portion of the filter does the standard RenderScript initialisation. A Script Intrinsic is used to convert the YUV camera preview data to an RGB array, and a second custom script applies the actual visual effect.
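In outline, the per-frame work on the Java side looks something like the following. This is a minimal sketch rather than the app’s actual code: it assumes NV21 preview frames of previewWidth × previewHeight, and the field names (mRs, mYuvToRgb for the ScriptIntrinsicYuvToRGB instance, mOldManScript for the generated effect script, and the Allocation fields) are illustrative.

//A minimal sketch of the host-side pipeline: YUV -> RGB intrinsic, then the
//custom effect kernel. Field names are illustrative, not from the app source.
private void processFrame(byte[] yuvFrame) {
    if (mYuvAlloc == null) {
        //One-off setup of the Allocations reused every frame.
        mYuvAlloc = Allocation.createSized(mRs, Element.U8(mRs), yuvFrame.length);
        Type rgbaType = new Type.Builder(mRs, Element.RGBA_8888(mRs))
                .setX(previewWidth).setY(previewHeight).create();
        mRgbaAlloc = Allocation.createTyped(mRs, rgbaType);
        mOutAlloc = Allocation.createTyped(mRs, rgbaType);
        mYuvToRgb.setInput(mYuvAlloc);
    }
    //Convert the YUV preview data to RGBA with the Script Intrinsic...
    mYuvAlloc.copyFrom(yuvFrame);
    mYuvToRgb.forEach(mRgbaAlloc);
    //...then run the custom effect kernel over every pixel.
    mOldManScript.forEach_root(mRgbaAlloc, mOutAlloc);
    //Copy the filtered frame back for display.
    mOutAlloc.copyTo(mOutputBitmap);
}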

/*
Function: root
param uchar4 *in        The current RGB pixel of the input allocation.
param uchar4 *out       The current RGB pixel of the output allocation.
param uint32_t x        The X position of the current pixel.
param uint32_t y        The Y position of the current pixel.
*/
void root(const uchar4 *in, uchar4 *out, uint32_t x, uint32_t y){
    //The black and white output pixel.
    uchar4 bw;
    //Range between -120 and 120, 120 being the highest contrast.
    //We're applying this to make a high-contrast image.
    int contrast = 120;
    //Float literals so the division is not truncated to an integer.
    float factor = (255.0f * (contrast + 255)) / (255.0f * (255 - contrast));
    int c = trunc(factor * (in->r - 128) + 128) - 50;
    if(c >= 0 && c <= 255)
        bw.r = c;
    else
        bw.r = in->r;
    //Apply a 'grain' effect to every 4th pixel in both directions.
    if(x % 4 == 0 && y % 4 == 0)
        bw.r &= in->g;
    //Finally determine if this pixel is 'red' enough to be left as red.
    //Red colour threshold.
    if (in->r > in->g+55 && in->r > in->b+60) {
        //Only show the red channel.
        bw.g = 0;
        bw.b = 0;
    } else {
        //Make all colour channels the same (Black & White).
        bw.g = bw.r;
        bw.b = bw.r;
    }
    //Keep the output pixel fully opaque.
    bw.a = 0xff;
    //Set the output pixel to the new black and white one.
    *out = bw;
}

“An Old Man Dies” RenderScript Kernel root function

First, we apply a formula to calculate a colour value for the pixel that will result in a high contrast black & white image. The value for every fourth pixel is further modified to stand out, resulting in a slight grain effect. Finally, if the pixel’s red colour value exceeds a certain threshold, only the red channel for that pixel is shown. Otherwise, the blue and green channels are set to the same value as the red to achieve the black & white look.
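To see how the contrast formula behaves, contrast = 120 gives a scaling factor of (255 × 375) / (255 × 135), which cancels to 375 / 135 ≈ 2.78 (using the floating-point division noted in the snippet). A mid-range red value of 100 maps to trunc(2.78 × (100 − 128) + 128) − 50 ≈ 0, while a bright value such as 200 is pushed above 255 and, by the range check, left unchanged; this hard push towards the extremes is what produces the high-contrast, comic-book look.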

Once again, can you guess the movie that inspired this filter?

Movie Vision Filters: “Get To The Chopper…”

This filter creates a ‘thermal camera’ effect and applies a Heads Up Display (HUD) style decoration. The colour filtering uses RenderScript, whilst the HUD leverages Android’s built-in face detection. Real thermal cameras map infrared intensity onto a narrow palette of output colours; this filter mimics that by using a set of look-up tables to map the colours of the input image onto a similar palette.

/**
* Creates the lookup table used for the 'heat map', splitting the image into
* 16 different colours.
*/
private void createLUT() {
    final int SPLIT = 8;
    for (int ct = 0; ct < mMaxColour/SPLIT; ct++){
        for (int i = 0; i < SPLIT; i++){
            switch (ct) {
                /**
                * The following cases define a set of colours.
                */
                case (7):
                    mRed[(ct*SPLIT) +i] = 0;
                    mGreen[(ct*SPLIT) +i] = 255;
                    mBlue[(ct*SPLIT) +i] = 0;
                    break;
                case (6):
                    mRed[(ct*SPLIT) +i] = 128;
                    mGreen[(ct*SPLIT) +i] = 255;
                    mBlue[(ct*SPLIT) +i] = 0;
                    break;
                …
                …
            }
        }
    }
    redLUT.copyFrom(mRed);
    greenLUT.copyFrom(mGreen);
    blueLUT.copyFrom(mBlue);
    mChopperScript.set_redLUT(redLUT);
    mChopperScript.set_greenLUT(greenLUT);
    mChopperScript.set_blueLUT(blueLUT);
}

“Get to the Chopper” creating look-up tables

On the Java side, along with the usual setup of Allocation objects to pass input to and receive output from RenderScript, three look-up tables are defined: one each for the red, green and blue colour channels. Each look-up table is essentially an array of 256 values, giving the output value for each possible input value (0–255) of that colour channel. Each frame is again first converted to RGB before being passed to the image effect RenderScript kernel. After the RenderScript filtering, the decoration drawing callback is used to draw a red, triangular ‘targeting’ reticule on any faces that were detected by the Android face detection API.
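The declarations of the look-up-table Allocations themselves are not shown in this post, but they might be created along the following lines. This is a sketch assuming single-byte (U8) elements and the same field names as the createLUT() snippet above.

//One 256-entry, single-byte lookup table per colour channel (a sketch).
redLUT = Allocation.createSized(mRs, Element.U8(mRs), 256);
greenLUT = Allocation.createSized(mRs, Element.U8(mRs), 256);
blueLUT = Allocation.createSized(mRs, Element.U8(mRs), 256);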

...
    //Basic skin detection: these thresholds roughly select skin-coloured pixels.
    if(in->r > in->g+10 && in->r > in->b+5 && in->g < 120) {
        //If skin has been detected, apply the 'hotter' colours.
        out->r = in->r & 40;
        out->g = in->g & 40;
        out->b = 24;
        out->a = 0xff;
    } else {
        //Otherwise use the external lookup allocations to determine the colour.
        out->r = *(uchar*)rsGetElementAt(redLUT, in->r);
        out->g = *(uchar*)rsGetElementAt(greenLUT, in->g);
        out->b = *(uchar*)rsGetElementAt(blueLUT, in->b);
    }
...

“Get to the Chopper” RenderScript Kernel root function

The RenderScript kernel for this effect is very simple. For each pixel, it first checks whether the RGB values fall within a range considered a ‘skin tone’. If so, the output is forced to the ‘hot’ output colours. Otherwise, the output values for the pixel are set directly from the pre-configured look-up tables for each channel.
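The decoration drawing callback mentioned earlier is not shown in the original snippets, but a rough sketch of it might look like the following. This assumes the android.hardware.Camera face detection callback, and the field and method names (mFaces, mReticlePaint, drawHud) are invented for illustration; the app’s actual callback, coordinate handling and drawing code may well differ.

//Cache the latest face detections and draw a triangular reticule over each one.
private Camera.Face[] mFaces = new Camera.Face[0];
private final Paint mReticlePaint = new Paint();

private final Camera.FaceDetectionListener mFaceListener =
        new Camera.FaceDetectionListener() {
    @Override
    public void onFaceDetection(Camera.Face[] faces, Camera camera) {
        //Keep the most recent detections for the drawing callback.
        mFaces = faces;
    }
};

private void drawHud(Canvas canvas) {
    mReticlePaint.setColor(Color.RED);
    mReticlePaint.setStyle(Paint.Style.STROKE);
    mReticlePaint.setStrokeWidth(4f);
    for (Camera.Face face : mFaces) {
        //Face rectangles arrive in the camera's -1000..1000 coordinate space;
        //map them to view pixels before drawing.
        RectF r = new RectF(face.rect);
        Matrix m = new Matrix();
        m.postScale(canvas.getWidth() / 2000f, canvas.getHeight() / 2000f);
        m.postTranslate(canvas.getWidth() / 2f, canvas.getHeight() / 2f);
        m.mapRect(r);
        //Draw a red, downward-pointing triangular 'targeting' reticule.
        Path tri = new Path();
        tri.moveTo(r.left, r.top);
        tri.lineTo(r.right, r.top);
        tri.lineTo(r.centerX(), r.bottom);
        tri.close();
        canvas.drawPath(tri, mReticlePaint);
    }
}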

Which movie inspired “Get to the Chopper”? As a hint, it features the same actor as “I’ll be back”.

That concludes this second Movie Vision App blog. Read on for the most complex image effects of the Movie Vision App and some concluding comments in The Movie Vision App: Part 3!

Creative Commons License

This work by ARM is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. However, in respect of the code snippets included in the work, ARM further grants to you a non-exclusive, non-transferable, limited license under ARM’s copyrights to Share and Adapt the code snippets for any lawful purpose (including use in projects with a commercial purpose), subject in each case also to the general terms of use on this site. No patent or trademark rights are granted in respect of the work (including the code snippets).
