At the Arm Tech Symposia events towards the end of 2018, attendees had the privilege of previewing an exciting new display processor from Arm called Mali-D77. It has the potential to shape the way future application processors drive Virtual Reality (VR) and Augmented Reality (AR) head-mounted displays (HMDs). In addition to offering best-in-class 4Kp90 display performance and image processing functions for standard smartphone-, tablet- and DTV-sized displays, Mali-D77 adds fixed-function hardware for three VR processing tasks that HMDs require. All three functions were demonstrated in the VR demo.
The first of these, Lens Distortion Correction (LDC), uses programmable distortion splines to pre-distort the images so that, when viewed through the lenses of a VR headset, they appear correct and undistorted. The second, Chromatic Aberration Correction (CAC), pre-separates the colour channels in the opposite direction to counteract the colour blurring caused by the lenses: a lens acts as a prism, refracting each wavelength by a slightly different amount, so the image undergoes colour channel splitting as it passes through. Last, but by no means least, Mali-D77 performs Asynchronous TimeWarp (ATW), which re-projects the content so that its "real-world" orientation remains constant. In other words, it rotates and/or translates the scene according to the headset's latest head pose and position in 3D space.
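The per-channel radial correction behind LDC and CAC can be illustrated with a minimal sketch. The coefficients below are hypothetical stand-ins for a real headset's distortion spline (which a lens vendor would supply), and a simple polynomial stands in for the programmable spline; the idea is that each colour channel is sampled with a slightly different radial scale, so the shared radial term undoes the lens distortion (LDC) while the per-channel differences pre-separate the colours (CAC):

```python
# Hypothetical per-channel distortion coefficients; a real headset's
# distortion spline would come from the lens vendor's calibration.
K = {"r": (1.0, 0.22, 0.10),   # CAC is modelled by giving each colour
     "g": (1.0, 0.24, 0.11),   # channel a slightly different radial
     "b": (1.0, 0.26, 0.12)}   # scale (blue refracts most strongly)

def predistort(x, y, channel):
    """Map an output (lens) coordinate back to a source-image coordinate
    for one colour channel.  (x, y) are normalised to the lens centre;
    the radial scale inverts the lens's pincushion distortion."""
    r2 = x * x + y * y
    k0, k1, k2 = K[channel]
    scale = k0 + k1 * r2 + k2 * r2 * r2   # polynomial stand-in for the spline
    return x * scale, y * scale

# Sampling each channel at its own pre-distorted coordinate produces the
# counter-distorted, colour-separated image the lens then "undoes".
samples = {c: predistort(0.5, 0.5, c) for c in "rgb"}
```

At the lens centre no correction is applied, and the correction (and the separation between channels) grows towards the edge of the lens, which matches how both distortion and chromatic aberration behave in practice.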
The Mali-D77 demo at the Tech Symposia events consisted of a premium smartphone running Google’s ARCore, which, together with the phone's Inertial Measurement Unit (IMU), provides Six Degrees of Freedom (6DOF) co-ordinates to the Mali-D77 display hardware running inside an FPGA. The diagram below shows the high-level data flow of the demo in action.
Mali-D77 fetches static images from DRAM, caches them efficiently, and performs LDC, CAC and ATW, all in a single pass through memory. The short video below shows a snippet of the demo from the events, in which Asynchronous TimeWarp (ATW) is performed on the cached images based on the 6DOF co-ordinates received from ARCore and the IMU: three rotations (pitch, yaw and roll) and three translations (up/down, forward/back, right/left).
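The re-projection ATW performs can be sketched with a simple rotation-only model. The Euler-angle convention and the rotation-only simplification below are assumptions for illustration (real ATW also handles the translational components, and Mali-D77's exact formulation is not described here); the key idea is that each output pixel is treated as a ray, rotated by the head-pose delta measured since the frame was rendered, and re-projected onto the image plane:

```python
import math

def rot(pitch, yaw, roll):
    """3x3 rotation matrix from Euler angles in radians.
    The composition order (roll @ pitch @ yaw) is an assumed convention."""
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    cr, sr = math.cos(roll), math.sin(roll)
    rx = [[1, 0, 0], [0, cp, -sp], [0, sp, cp]]      # pitch
    ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]      # yaw
    rz = [[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]]      # roll
    def mm(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]
    return mm(rz, mm(rx, ry))

def timewarp(u, v, delta_pose):
    """Re-project a normalised image coordinate (u, v) by the head-pose
    delta accumulated since the frame was rendered (rotation only;
    the three translations are omitted for brevity)."""
    R = rot(*delta_pose)
    # Treat (u, v, 1) as a ray direction, rotate it, then divide by
    # depth to project back onto the image plane.
    x = R[0][0] * u + R[0][1] * v + R[0][2]
    y = R[1][0] * u + R[1][1] * v + R[1][2]
    z = R[2][0] * u + R[2][1] * v + R[2][2]
    return x / z, y / z
```

With a zero pose delta the image is passed through unchanged; a small yaw shifts the whole image horizontally, which is exactly the "latest head pose" correction the demo applies just before scan-out.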
When rendering a VR scene, LDC, CAC and ATW typically consume up to 20 percent of the processing time on the Mali GPU. Giving that time back to the GPU means more time to render the VR scene at much higher resolutions and frame rates. Rendering at higher resolutions provides greater image clarity, which is crucial for VR head-mounted displays: because the user's eyes sit so close to the display, artefacts such as the screen-door effect (caused by the unlit spaces between pixels) would otherwise be visible. In addition, rendering at much higher frame rates (90 to 120 frames per second (fps)) eliminates ghosting artefacts on low-persistence LCD panels and reduces overall motion-to-photon latency, helping to prevent motion sickness. Performing ATW at the last stage of the multimedia pipeline with Mali-D77 (rather than on the GPU) is also favourable: the VR scene is composed and time-warped just-in-time before being sent out to the display, avoiding the delay associated with an extra read and write through memory.
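Some rough back-of-the-envelope arithmetic shows what eliminating that extra memory pass is worth. The figures below are assumptions based on the resolutions quoted in this post (2160x2160 per eye at 90 fps, assuming 4 bytes per pixel); when the GPU performs the time warp, the warped frame must be written to DRAM and then read back by the display controller:

```python
# Assumed figures: 2160x2160 per eye (the 4Kp90 target quoted in this
# post), two eyes, 4 bytes per pixel (RGBA8), 90 frames per second.
width, height = 2160, 2160
eyes, bytes_per_pixel, fps = 2, 4, 90

frame_bytes = width * height * eyes * bytes_per_pixel
# A GPU-side time-warp pass costs one extra write plus one extra read
# of every frame; warping in the display processor avoids both.
extra_bandwidth = 2 * frame_bytes * fps

print(f"{frame_bytes / 1e6:.1f} MB per frame")
print(f"{extra_bandwidth / 1e9:.1f} GB/s of DRAM traffic avoided")
```

Under these assumptions the saving is several gigabytes per second of DRAM traffic, which is significant within a mobile power budget, quite apart from the latency benefit of warping just-in-time.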
Overall, the Mali-D77 display processor can ensure that the next generation of VR headset devices achieves more than double the pixel throughput of what is achievable today – 4Kp90 (or 2160x2160p90 per eye) at >100° Field of View (FOV) for world-scale wireless VR/AR within the power constraints of a mobile device. VR users will no longer need to plug the headset into a PC or console to drive an immersive VR experience. Instead, they will be able to achieve the same immersive experience with a much lighter, more comfortable and untethered VR headset that delivers more hours of battery life.
Mali-D77 is a disruptive technology in the way we think about heterogeneous computing for future VR headsets. It not only enables significant quality improvements for VR content, but can also lead the way to lighter, more comfortable standalone headsets free from any cables. It has the potential to bring premium VR performance to the area-constrained mainstream market, meaning more revenue for mobile VR app vendors and more affordable VR headsets for users – a much-needed boost for the entire VR ecosystem.
Is 2019 the year when VR displays will really start to ramp up as we strive to meet the optimum immersive viewing experience? We certainly believe that this can be achieved through Mali-D77, with more information about the product in our launch blog. At the SID Display Week exhibition taking place between 14 and 16 May, Arm will be exhibiting the Mali-D77 VR demo in action. We would encourage SID Display Week attendees, our ecosystem partners and other interested parties to visit our booth #842 to find out more!