Welcome back to the third post in my Virtual Reality (VR) blog series. We’ve already discussed the reasoning behind our interest in VR as a major driving force for graphics development, but even in the time between posts, VR uptake has grown tangibly and it’s clear the industry is moving ahead fast. It’s only a matter of time until VR goes fully mass market. Adoption will probably start with high-end gaming, but it is set to grow so quickly that the time to take notice is now.
In blog two we considered the difficulty of focus when developing for VR and some of the ways of managing this. This time around we’re looking at how to develop low latency VR. The latency measurement in question is the time it takes for a physical movement of your head to be reflected in the image you see on the screen of your VR headset. The two events need to be close enough together that you don’t notice any disconnect – just like in the real world. If latency is too high or too variable the immersion feels unnatural, and the disparity with your brain’s understanding of normal movement will start to cause nausea or dizziness – not a super fun experience. Industry research tells us that this “motion-to-photons” latency should be consistently less than 20 milliseconds (ms) for a smooth and natural VR experience. This is a tough call at a standard refresh rate of 60Hz, where a single refresh alone takes roughly 16.7ms of that budget, but it is attainable with the right approach.
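As a quick back-of-envelope check on that budget (plain arithmetic only – the 20ms target is the figure above, and the split of the remaining headroom across sensing, rendering and composition is not detailed here):

```c
/* Sanity check of the 20ms motion-to-photons target at a 60Hz refresh rate. */
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 60.0;
    const double frame_ms   = 1000.0 / refresh_hz;  /* ~16.7 ms per refresh */
    const double target_ms  = 20.0;                 /* motion-to-photons ceiling */

    /* A single scanout consumes most of the budget, leaving only a few
     * milliseconds for everything else in the pipeline. */
    printf("frame period %.1f ms, headroom %.1f ms\n",
           frame_ms, target_ms - frame_ms);
    return 0;
}
```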
There are a few elements which, when combined, contribute greatly to successfully building a low latency VR system. The first is front buffer rendering. Double or triple buffering is commonly used in graphics applications, including Android, to increase smoothness: the GPU draws pixels to a rotating set of off-screen buffers, swapping one with the on-screen buffer at the end of each display refresh. This process helps iron out variations between neighbouring frame times, but it also increases latency, which is of course the opposite of what we are looking for in a VR application. In front buffer rendering, the GPU bypasses the off-screen buffers and renders directly to the buffer the display reads from, reducing latency. Rendering to the front buffer needs careful synchronisation with the display to ensure the GPU is always writing ahead of the display reading. The context_priority extension available on Mali GPUs enables prompt scheduling of work on the GPU, allowing time-critical front buffer rendering processes such as Timewarp to pre-empt less urgent tasks and improve the user experience.
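To make that concrete, here is a minimal sketch of how an application might request such a high-priority context through the EGL_IMG_context_priority extension. The extension and its enums are real, but check for it in the EGL extension string at runtime, and treat the code as illustrative rather than a complete front buffer setup:

```c
#include <EGL/egl.h>
#include <EGL/eglext.h>

/* Create an OpenGL ES 3.x context whose work the driver is asked to
 * schedule ahead of normal-priority contexts, so a Timewarp pass is
 * never starved by the main application's rendering. */
EGLContext create_timewarp_context(EGLDisplay dpy, EGLConfig cfg,
                                   EGLContext shared)
{
    const EGLint attribs[] = {
        EGL_CONTEXT_CLIENT_VERSION,     3,
        /* A hint, not a guarantee: the driver may clamp the priority. */
        EGL_CONTEXT_PRIORITY_LEVEL_IMG, EGL_CONTEXT_PRIORITY_HIGH_IMG,
        EGL_NONE
    };
    return eglCreateContext(dpy, cfg, shared, attribs);
}
```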
Another important part of this puzzle is selecting the right type of display for your VR device. Organic Light Emitting Diode (OLED) displays can improve a VR experience, and they work differently from the familiar and well-established LCD. Each and every pixel in an OLED display is its own light source, driven by the Thin Film Transistor array which sits behind it, as opposed to relying on the white LED backlight of an LCD. The brightness of an OLED pixel is determined by the current driven through its organic film. Because colours are produced by individually varying tiny red, green and blue emitters, it is possible to get brighter and sharper hues with stronger saturation. Individual pixels or sections of the panel can be switched off entirely, achieving a deeper, truer black than is typically possible on an LCD, which has to block out its backlight. This is the usual selling point for OLED panels, but critically for VR it also offers an easier route to low persistence through partial illumination. A full persistence display is lit continuously, which means the scene view is correct only briefly and then very quickly out of date. A low persistence display keeps the image lit only while the view is correct and then goes dark, a process which is imperceptible at a high refresh rate and provides the illusion of a continuous image.
This is important for reducing motion blur for the user. The additional flexibility with which you can illuminate the pixels in an OLED panel means the display can show multiple partial images during a single refresh, reacting mid-frame to the changes fed to it by the sensors in the headset and allowing the head location to update while the scan is moving across the screen. This is not possible with an LCD panel because of its global backlight. The ability to drive an OLED panel by drawing to the front buffer in sections or slices through a Timewarp-like process is key to achieving a lower latency VR experience, because the image you see on screen can adapt to your head movements much more quickly than is otherwise possible.
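A simplified sketch of what such sliced front buffer rendering might look like is below. Every helper name here (wait_for_vblank, wait_for_scanline, sample_head_pose, warp_and_draw_slice) is a hypothetical placeholder for platform- and driver-specific machinery, not a real API:

```c
typedef struct { float orientation[4]; } Pose;  /* head orientation quaternion */

Pose sample_head_pose(void);                /* read the headset's IMU sensors  */
void wait_for_vblank(void);                 /* block until vertical blank      */
void wait_for_scanline(int y);              /* block until scanout reaches y   */
void warp_and_draw_slice(const Pose *p, int y, int height);

#define NUM_SLICES 4

void render_frame_in_slices(int screen_height)
{
    const int slice_height = screen_height / NUM_SLICES;

    /* Draw slice 0 during the vertical blank, then each subsequent slice
     * while the panel is still scanning out the one before it, so GPU
     * writes to the front buffer stay one slice ahead of the reads. */
    wait_for_vblank();

    for (int i = 0; i < NUM_SLICES; ++i) {
        if (i > 0)
            wait_for_scanline((i - 1) * slice_height);

        /* Re-sample the sensors just before each slice, so the freshest
         * head orientation is used for that part of the screen. */
        Pose pose = sample_head_pose();

        /* Warp the most recently rendered eye buffer for this slice and
         * write the result directly into the front buffer. */
        warp_and_draw_slice(&pose, i * slice_height, slice_height);
    }
}
```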
Now we consider one of the keystones of this combination: the Timewarp process. Because the scene changes comparatively gradually in an immersive VR application, the image changes between views by a small and therefore relatively predictable amount. Warping is essentially shifting an image rendered at an older head orientation to match a newer one. This partially decouples the application frame rate from the display refresh rate and allows the system to enforce a latency guarantee that some applications may not provide on their own. The shift can account for changes in head rotation but not for head position or scene animation, so it is something of an approximation, but it provides an effective safety net and enables an application running at 30 FPS to appear (at least in part) as if it is tracking the user’s head at 60 FPS or above.
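At its core, this rotational warp amounts to re-projecting the rendered image through the delta between the render-time and display-time head rotations. A minimal sketch follows; the math helpers (quat_to_mat4, mat4_mul, mat4_inverse) are assumed, and the exact composition order depends on your conventions (here the quaternions are head-to-world rotations):

```c
typedef struct { float m[16]; } mat4;
typedef struct { float x, y, z, w; } quat;

mat4 quat_to_mat4(quat q);          /* assumed math helpers, not a real API */
mat4 mat4_mul(mat4 a, mat4 b);
mat4 mat4_inverse(mat4 m);

/* Build a matrix that re-projects an image rendered with head rotation
 * render_rot so it is correct for the newer head rotation latest_rot. */
mat4 timewarp_matrix(mat4 projection, quat render_rot, quat latest_rot)
{
    /* Rotation taking a view direction from the freshly sampled pose
     * back into the pose the frame was rendered with. */
    mat4 delta = mat4_mul(mat4_inverse(quat_to_mat4(render_rot)),
                          quat_to_mat4(latest_rot));

    /* Undo the projection, apply the delta rotation, re-project. The
     * result is used as a texture transform when drawing the existing
     * eye buffer for the new head orientation. */
    return mat4_mul(projection,
                    mat4_mul(delta, mat4_inverse(projection)));
}
```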
In this post we’ve discussed the tight integration that needs to exist between GPU and display, but this is only one part of the stack. The complexity doesn’t end there: a complete VR system also needs to play videos, perhaps DRM-protected ones, and integrate system notifications. High quality VR support requires that your multimedia components are well synchronised and communicate in bandwidth-efficient ways, in order to provide not only the best experience to the end user but also the best power efficiency and performance. The ARM® Mali™ Multimedia Suite (MMS) of GPU, Video and Display processors is integrated and supported by technologies such as ARM Frame Buffer Compression (AFBC) and ARM TrustZone®, making it a leader in VR development technology.
Join us at GDC to find out more!
Thanks!