Tech Symposia VR demo previews Arm’s next DPU

At the Arm Tech Symposia events towards the end of 2018, attendees had the chance to preview an exciting new display solution from Arm called Mira. It has the potential to shape how future application processors drive Virtual Reality (VR) and Augmented Reality (AR) head-mounted displays. In addition to offering best-in-class 4Kp120 display performance and image-processing functions for standard smartphone-, tablet- and DTV-sized displays, Mira adds fixed-function hardware for three VR/AR processing tasks mandated for head-mounted displays. All three functions were demonstrated in the VR demo.

The first of these, Lens Distortion Correction (LDC), pre-distorts the images using programmable distortion splines to counter the effect of the lens, so that when they are viewed through the lenses of any VR headset they appear correct and undistorted. The second, Chromatic Aberration Correction (CAC), pre-separates the colour channels in the opposite direction to counteract the colour blurring caused by the lenses: a lens acts like a prism, so light passing through it is split into its colour channels, and this splitting is what produces the blurring. Last, but by no means least, Mira performs Asynchronous TimeWarp (ATW), which re-projects the content so that its "real-world" orientation remains constant. In other words, it rotates and/or translates the scene according to the latest pose and position of the headset in 3D space.
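To make the two lens corrections concrete, the sketch below pre-distorts image coordinates with a simple radial polynomial and gives each colour channel its own coefficients, so red, green and blue are shifted by slightly different amounts, which is the essence of CAC. The polynomial model and the coefficient values are illustrative assumptions only; Mira's actual programmable distortion splines are not described here.

```python
import numpy as np

def radial_predistort(coords, k):
    """Pre-distort normalized (x, y) coordinates, centred on the lens
    axis, with a simple radial model: r' = r * (1 + k1*r^2 + k2*r^4).
    This is an illustrative stand-in for a distortion spline."""
    r2 = np.sum(coords**2, axis=1, keepdims=True)
    scale = 1.0 + k[0] * r2 + k[1] * r2**2
    return coords * scale

def predistort_with_cac(coords, k_per_channel):
    """Apply per-colour-channel pre-distortion: red, green and blue
    each get their own radial coefficients, so the lens's chromatic
    dispersion is cancelled when the channels recombine at the eye."""
    return {ch: radial_predistort(coords, k)
            for ch, k in k_per_channel.items()}

# Hypothetical coefficients; a real headset profile would supply
# measured per-channel spline or polynomial values.
k_rgb = {"r": (0.22, 0.08), "g": (0.21, 0.07), "b": (0.20, 0.06)}
pts = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, -0.5]])
warped = predistort_with_cac(pts, k_rgb)
```

Because the coefficients grow with the radius, points at the centre of the lens are left in place while points near the edge are pushed outward, and the red channel is pushed slightly further than the blue one.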

LDC, CAC and ATW on Mira

The Mira demo at the Tech Symposia events consisted of a premium smartphone running Google’s ARCore, which, together with the Inertial Measurement Unit (IMU), provides the Six Degrees of Freedom (6DOF) co-ordinates to the Mira display hardware running inside an FPGA. The diagram below shows the high-level data flow of the demo in action.
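As a rough sketch of that handoff, the snippet below models the 6DOF co-ordinates as a small record that the phone side hands to the display hardware each frame. The field names, units and packing are purely illustrative assumptions; the real interface between ARCore, the IMU and the Mira FPGA is not public.

```python
from dataclasses import dataclass, astuple

@dataclass
class Pose6DOF:
    # Three rotations and three translations, as supplied by ARCore
    # plus the IMU. Names and units here are illustrative only.
    pitch: float
    yaw: float
    roll: float
    x: float
    y: float
    z: float

def send_pose_to_display(pose):
    """Stand-in for handing the latest 6DOF co-ordinates to the Mira
    display hardware in the FPGA (e.g. a register or descriptor
    write); here it simply flattens the record."""
    return astuple(pose)

latest = Pose6DOF(pitch=0.01, yaw=-0.20, roll=0.00, x=0.0, y=1.6, z=0.0)
packet = send_pose_to_display(latest)
```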

Mira FPGA demo high-level data flow

Mira fetches static images from DRAM and caches them intelligently before applying LDC, CAC and ATW, all in a single pass through memory. The short video below shows a snippet of the demo from the events, in which Asynchronous TimeWarp (ATW) is performed on the cached images based on the 6DOF co-ordinates Mira receives from ARCore and the IMU: the three rotations (pitch, yaw and roll) and the three translations (up/down, forward/back, left/right).
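The re-projection step of ATW can be sketched in a few lines: rotate the scene by the latest head orientation, then translate it by the head position, so that content stays fixed in the "real world" as the headset moves. This is a minimal software illustration of the idea only, not Mira's fixed-function implementation, and the rotation-order convention chosen here is an assumption.

```python
import numpy as np

def atw_reproject(points, pose_delta):
    """Re-project view-space points by the head-pose change since the
    frame was rendered: rotate by pitch/yaw/roll, then translate.
    pose_delta = (pitch, yaw, roll, tx, ty, tz) in radians/metres."""
    pitch, yaw, roll, tx, ty, tz = pose_delta
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])   # pitch
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])   # roll
    R = Rz @ Ry @ Rx  # illustrative rotation order
    return points @ R.T + np.array([tx, ty, tz])

# A point straight ahead in view space; a quarter-turn of yaw swings
# it to the side, which is exactly the correction ATW applies.
ahead = np.array([[0.0, 0.0, -1.0]])
turned = atw_reproject(ahead, (0.0, np.pi / 2, 0.0, 0.0, 0.0, 0.0))
```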