Mobile, Graphics, and Gaming blog

Virtual Reality – Blatant Latency and how to Avoid it

Freddi Jeffries
March 1, 2016
5 minute read time.

Welcome back to the third post in my Virtual Reality (VR) blog series. We’ve already discussed why we see VR as a major driving force for graphics development, and even in the time between posts VR uptake has grown tangibly: the industry is clearly moving ahead fast. Mass-market VR is now only a matter of time. Yes, it will probably start with high-end gaming, but adoption is set to grow so quickly that the time to take notice is now.

The what and the why

In blog two we considered the difficulty of focus when developing for VR and some of the ways of managing it. This time around we’re looking at how to develop low latency VR. The latency in question is the time it takes to turn the actual motion of your head into the image you see on the screen of your VR headset. The two events need to be close enough together that you don’t notice any disconnect – just like in the real world. If latency is too high or too variable the immersion feels unnatural, and the disparity with your brain’s understanding of normal movement will start to cause nausea or dizziness – not a super fun experience. Industry research tells us that this “motion-to-photons” latency should be consistently less than 20 milliseconds (ms) for a smooth and natural VR experience. This is a tough call at a standard refresh rate of 60Hz, where a single refresh already takes roughly 16.7ms, but it is attainable with the right approach.
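The arithmetic behind that budget can be sketched quickly (an illustration only, not tied to any particular headset or pipeline):

```python
# Back-of-the-envelope motion-to-photons budget check (illustrative).
def frame_interval_ms(refresh_hz):
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

# At 60 Hz a single refresh already consumes ~16.7 ms of the ~20 ms
# budget, leaving very little slack for sensor reads, rendering,
# composition, and scanout.
budget_ms = 20.0
for hz in (60, 90, 120):
    interval = frame_interval_ms(hz)
    print(f"{hz} Hz: {interval:.1f} ms per refresh, "
          f"{budget_ms - interval:.1f} ms of budget remaining")
```

This makes plain why higher refresh rates and mid-frame techniques are attractive: the frame interval alone dominates the budget at 60Hz.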

The winning combination

There are a few elements which, when combined, contribute greatly to successfully building a low latency VR system. The first is front buffer rendering. Double or triple buffering is commonly used in graphics systems, including Android, to increase smoothness: the GPU draws pixels into a rotating set of off-screen buffers, which are swapped with the on-screen buffer at the end of each display refresh. This irons out variations between neighbouring frame times, but it also increases latency – the opposite of what we want in a VR application. In front buffer rendering, the GPU bypasses the off-screen buffers and renders directly to the buffer the display reads from, reducing latency. Rendering to the front buffer needs careful synchronisation with the display to ensure the GPU is always writing ahead of the display reading. The context_priority extension available on Mali GPUs enables prompt scheduling of tasks on the GPU, allowing front buffer rendering processes such as Timewarp to take priority over less urgent work and improve the user experience.
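The synchronisation constraint can be modelled abstractly. The sketch below is not a real driver API – slice counts and scheduling are my assumptions – but it captures the invariant: the GPU should write to a region of the front buffer just ahead of the beam, never the region currently being scanned out.

```python
# Illustrative model of "racing the beam" in front buffer rendering.
# The frame is split into horizontal slices; while the display scans
# out slice t, the GPU renders slice t+1 so writes stay ahead of reads.
def slice_schedule(num_slices):
    """Return (render_slice, scanout_slice) pairs, one per scanout step."""
    steps = []
    for scanout in range(num_slices):
        render = (scanout + 1) % num_slices  # GPU stays one slice ahead
        steps.append((render, scanout))
    return steps

# Invariant: the GPU never writes the slice the display is reading.
for render, scanout in slice_schedule(4):
    assert render != scanout
```

In a real system this schedule would be driven by vsync and scanline timing signals from the display controller rather than a fixed loop.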

[Image: front buffer rendering]

Another important part of this puzzle is selecting the right type of display for your VR device. Organic Light Emitting Diode (OLED) displays work differently from the familiar and well-established LCD, and they can markedly improve a VR experience. Each pixel in an OLED display provides its own light source, driven by the thin-film transistor array behind it, as opposed to the white LED backlight of an LCD. The brightness of an OLED display is set by the current driven through the organic film. Because colours are managed by individually varying the tiny red, green and blue sub-pixel emitters, it is possible to get brighter and sharper hues with stronger saturation. Sections of the panel can be switched off entirely, achieving a deeper, truer black than is typically possible on an LCD, which has to block out its backlight. This is the usual selling point for OLED panels, but critically for VR it also provides an easier means of achieving low persistence through partial illumination. A full persistence display is lit continuously, which means the scene view is correct only briefly and then very quickly out of date. A low persistence display keeps the image lit only while the view is correct and then goes dark – a process which is imperceptible at a high refresh rate, providing the illusion of a continuous image.

[Image: low persistence screens]

This is important for reducing blur for the user. The flexibility with which you can illuminate the pixels of an OLED panel means the display can show multiple partial images during a single refresh, reacting mid-frame to changes fed to it by the sensors in the headset and allowing the head location to update while the scan moves across the screen. This is not possible with an LCD panel without replacing the global backlight. The ability to drive an OLED panel by drawing to the front buffer in sections or slices, through a Timewarp-like process, is key to achieving a lower latency VR experience, because the image you see on screen can adapt to your head movements much more quickly than is otherwise possible.
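The benefit of updating per slice rather than per frame can be quantified with simple arithmetic (a hedged sketch – real pose staleness also depends on sensor and render latency, which are ignored here):

```python
# If the head pose is re-sampled once per slice instead of once per
# frame, the worst-case age of the pose behind any displayed pixel
# shrinks in proportion to the slice count.
def worst_case_pose_age_ms(refresh_hz, slices_per_frame):
    """Staleness of the newest pose sample a pixel can reflect."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / slices_per_frame

# Whole-frame update at 60 Hz vs. four slices per frame.
whole_frame = worst_case_pose_age_ms(60, 1)   # ~16.7 ms
four_slices = worst_case_pose_age_ms(60, 4)   # ~4.2 ms
```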

[Image: low persistence]

Why warp?

Now we come to one of the keystones of this combination: the Timewarp process. Because scene changes in an immersive VR application are comparatively gradual, the image changes between views by a small and therefore relatively predictable amount. Warping is basically shifting an image rendered at an older head location to match a newer one. This partially decouples the application frame rate from the refresh rate and allows the system to enforce a latency guarantee that some applications may not provide. The shift can account for changes in head rotation but not for head position or scene animation, so it is something of an approximation – but it provides an effective safety net, and it enables an application running at 30 FPS to appear (at least in part) as if it’s tracking the user’s head at 60 FPS or above.
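To make the idea concrete, here is a toy approximation of the rotational case. The pinhole-camera model and the function names are my assumptions for illustration, not the actual Timewarp implementation: a late change in yaw can be approximated as a horizontal image shift of f·tan(Δyaw), where f is the focal length in pixels.

```python
import math

# Toy rotational warp: re-aim an image rendered at an older yaw by
# shifting it horizontally, using a pinhole-camera approximation.
def yaw_shift_pixels(delta_yaw_rad, focal_length_px):
    """Horizontal shift compensating for a small late yaw change."""
    return focal_length_px * math.tan(delta_yaw_rad)

# A 1-degree late head turn with a 1000 px focal length shifts ~17 px.
shift = yaw_shift_pixels(math.radians(1.0), 1000.0)
```

A real implementation warps the whole image per pixel (and per eye) on the GPU, but this captures why small rotational corrections are cheap while positional changes – which alter parallax – are not.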

[Image: warping]

Tricks of the trade

In this post we've discussed the tight integration that needs to exist between GPU and display, but this is only one part of the stack. The complexity doesn’t end there: we may also need to play videos, perhaps DRM-protected ones, and integrate system notifications. High quality VR support requires that your multimedia components are well synchronised and communicate in bandwidth-efficient ways, providing not only the best experience to the end user but also the best power efficiency and performance. The ARM® Mali™ Multimedia Suite (MMS) of GPU, Video and Display processors is integrated and supported by efficiencies such as ARM Frame Buffer Compression (AFBC) and ARM TrustZone®, making it a leader in VR development technology.

Join us at GDC to find out more!

  • Bruce Lee over 9 years ago

    Thanks!

    So the GPU can render data at 'warped' coordinates and write it directly to the front buffer with Mali!

    Thank you for your support. I'd gladly wait for the Part 4.

  • Freddi Jeffries over 9 years ago

    Hi, glad it was helpful! Sure, it’s “ahead” in the sense that the GPU writes to the buffer just before the display reads it, so it can account for last-minute changes. Does that make more sense?

    Part 4 of the VR series is coming soon, stay tuned!

  • Bruce Lee over 9 years ago

    Hello.

    Thank you for an epic post! It was very helpful for me.

    But I have a question about how front buffer rendering works:

    Front buffer rendering lets the GPU render directly to the front buffer, but the data should be written ahead of the display reading.

    Does "ahead" mean the area the display has already read – so there is no need for buffer swapping, because writing and reading happen in the same buffer (given that the display reads it sequentially)?

    Or does "ahead" mean the area the display is about to read, so the GPU can correct changes that appear while the display is refreshing?

    I'm confused about this concept. Please help me clear it up.

  • Sean Lumly over 9 years ago

    Thanks Sam!

    I will absolutely thoroughly look through the Oculus Mobile API!

    Thanks again,

    Sean

  • Sam Martin over 9 years ago

    Hi Sean,

    Yep - the eyebuffer is double buffered in GearVR. There are other buffers as well for different layers – have a look at the Oculus Mobile SDK API (comments in the headers) for more info.

    Cheers,

    Sam
