Mobile gaming is now the largest and most valuable segment of the global gaming market. Porting a game to mobile is not straightforward, however: the form factor imposes a range of constraints and considerations that must be addressed for an optimal experience. Virtual reality (VR) places further demands on top of these, since much of the gameplay and user interaction traditionally relies on external cameras or sensors.
Arm technology is at the heart of the majority of smartphones and other portable gaming devices, enabling the mobile gaming revolution we see today. We recently partnered with Fast Travel Games, a VR games studio based in Stockholm, Sweden, to explore how to bring their successful debut title Apex Construct to mobile.
Figure 1: Apex Construct
Apex Construct is an action/adventure game offering a narrative-driven VR experience, with a balanced combination of combat, exploration and puzzle solving. Armed with a bow, the player takes the role of a human returning to Earth after an apocalyptic event that tore the planet apart, leaving it shattered and with no organic life.
Figure 2: GearVR headset
Being able to run VR with only a smartphone and a portable and affordable headset means we can provide immersive experiences that can be played in any place at any time. Also, as we are no longer bound to the limits of a room for tracking the virtual space, interesting opportunities combining virtual and real-world elements arise.
Apex Construct is a complex game built for PC and console, and it makes heavy use of positional tracking and motion controllers. However, it was already well optimized, since it had to support a wide range of performance levels, and this provided a great starting point for porting one of its levels to mobile and GearVR.
Apex Construct relies on two main interaction mechanisms:

- Aiming and shooting the bow, using two positionally tracked motion controllers.
- Moving around physically, with the player's position tracked in the play space.
At first glance, both seem impossible on the GearVR due to the absence of any external tracking. The GearVR has 3 degrees of freedom (3DOF), meaning that only orientation is tracked. This is true for both the headset and the controller.
For the first mechanism above, we worked around the single controller by replacing the bow and arrow with a gun, which worked smoothly. How about the second mechanism, though? Once again it seems that we lack the hardware support to replicate the positional tracking achieved in the console experience. Luckily, we found an interesting way around it: we can enable Google's ARCore on a Samsung Galaxy S8/S9 and use just the SLAM (simultaneous localization and mapping) pose it computes to achieve inside-out positional tracking. This, together with the headset's rotational tracking, lets us achieve 6DOF on mobile.
Figure 3: 6DOF on mobile
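Conceptually, the fusion is simple: position comes from ARCore's SLAM, orientation comes from the headset's high-frequency IMU tracking. The following is a minimal Python sketch of that idea; all names and the `position_scale` tweak are illustrative assumptions, not the actual Unity/C# plugin code.

```python
# Sketch: build a 6DOF camera pose from two tracking sources.
# - slam_position: head position from ARCore's SLAM (phone camera)
# - headset_orientation: quaternion from the GearVR's own 3DOF tracking
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # (x, y, z) in metres
    orientation: tuple   # quaternion (w, x, y, z)

def fuse_pose(slam_position, headset_orientation, position_scale=1.0):
    """Combine SLAM translation with HMD rotation into one 6DOF pose.

    position_scale lets the virtual play space be larger or smaller
    than the physical one (a common comfort/level-design tweak).
    """
    x, y, z = slam_position
    scaled = (x * position_scale, y * position_scale, z * position_scale)
    return Pose(position=scaled, orientation=headset_orientation)

# Example: the player has walked 0.5 m forward since tracking started.
pose = fuse_pose((0.0, 0.0, -0.5), (1.0, 0.0, 0.0, 0.0))
print(pose.position)   # (0.0, 0.0, -0.5)
```

In the real integration this fused pose is applied to the Unity camera every frame, with the rotation still driven by the headset because its IMU updates far faster than the camera-based SLAM.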
Since ARCore was not designed for inside-out tracking, we had to come up with some inventive ways of adapting the platform to our needs. As a starting point we followed this tutorial written by Roberto Lopez Mendez, which describes our early experiments with this technology. Since Roberto's instructions target ARCore Developer Preview 1, a few changes are needed to adapt the main tracking script to newer versions of the API. However, this script alone is not enough for a smooth experience in Apex Construct, due to its side-effects on performance and UX.
Regarding performance, ARCore did not negatively affect CPU frame time. It runs in a thread parallel to the game, and in our tests it tended to fully utilise one of the cores, which increases overall CPU load without adding overhead to the render thread. The overhead from the tracking script itself was also small (~0.5 ms). In our tests we did, however, see an unexpected GPU load related to ARCore of about 4 ms every two frames, which is probably linked to the camera stream updating at 30 Hz.
There is one key Unity setting enforced by the ARCore Unity plugin: multi-threaded rendering is disabled. This means that Unity uses a single thread for all CPU work, both game logic and render logic. Since the game was CPU-bound at that stage, disabling multi-threaded rendering was a serious performance hit. The reason Unity enforces single-threaded rendering is synchronization of the AR background texture, which can be accessed by both ARCore and Unity. For inside-out tracking we do not need this texture at all. If, at the time you read this, Unity still does not provide an option to re-enable multi-threaded rendering, and ARCore does not separate camera-stream access from the SLAM processing, then the only solution is to modify the ARCore Unity plugin.
In the Developer Previews of ARCore, the plugin was making mostly direct calls to the underlying API, while the initialization was handled by a separate plugin. We replaced this plugin with a custom one which initialized ARCore without linking the AR background texture to a Unity texture, which solved our problem. However, since then version 1.0 has been released, and there is a new library that handles all calls to ARCore. While it might still be possible to modify the Unity plugin to work around the multi-threading lock, it will be harder since we do not have access to the new library's code.
Figure 4: Apex Construct
Regarding the UX, here is a recap of all the tweaks we had to implement:
The 3DOF controller in a 6DOF environment did not cause any UX issues: on GearVR the controller's position is computed relative to the player, which still feels natural even when the user moves around.
ARCore is not designed for this kind of non-AR application, but luckily it allowed us to implement inside-out tracking using only the phone's camera, achieving an experience comparable to what consoles can offer.
To offer the best experience on mobile, it was not enough to adapt the gameplay; the game also had to run smoothly and efficiently to preserve the device's battery life. This involved a process of profiling and iterative optimization.
VR constrains the hardware to stabilise performance; for instance, CPU and GPU frequencies are reduced. Profile in VR as often as possible, to avoid side-tracking your investigation and optimizing areas that are not a bottleneck in VR. Oculus provides basic performance stats through logcat, including the frequencies in use:
adb logcat -s VrApi
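VrApi prints a stats line to logcat roughly once per second. A small script can pull out the numbers you care about; the sample line below is illustrative, and the exact fields vary between VrApi versions, so treat the regular expressions as a starting point rather than a stable format.

```python
# Quick-and-dirty parser for the per-second stats line that VrApi
# prints to logcat (adb logcat -s VrApi). The sample line is
# illustrative; field names and order differ across VrApi versions.
import re

sample = ("FPS=60,Prd=32ms,Tear=0,Early=0,Stale=0,VSnc=1,Lat=1,"
          "CPU4/GPU=2/2,1729/315MHz,OC=FF")

def parse_vrapi(line):
    """Extract frame rate and CPU/GPU clock frequencies, if present."""
    stats = {}
    fps = re.search(r"FPS=(\d+)", line)
    if fps:
        stats["fps"] = int(fps.group(1))
    freq = re.search(r"(\d+)/(\d+)MHz", line)   # e.g. "1729/315MHz"
    if freq:
        stats["cpu_mhz"] = int(freq.group(1))
        stats["gpu_mhz"] = int(freq.group(2))
    return stats

print(parse_vrapi(sample))   # {'fps': 60, 'cpu_mhz': 1729, 'gpu_mhz': 315}
```

Logging these values over a play session makes it easy to spot when the clocks are throttled down in VR mode, which is exactly why profiling outside VR can mislead you.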
If you profile without VR, assume that in reality:

- CPU and GPU frequencies will be lower than the ones you are profiling at.
- Every frame will be rendered twice, once per eye, with additional distortion-correction overhead.
Fast Travel Games provided an excellent starting point for the mobile prototype by designing Apex Construct with performance in mind from the beginning. As we shall see, many of the optimizations and best practices discussed here apply to consoles too.
The Apex Construct team implemented several tools that make the game scalable. For instance, it uses a sub-level system to divide the large number of objects and allow multiple people to work on a single scene at a time, and they are all merged at build time. At runtime, it avoids loading unnecessary resources with a bucket streaming system. Sets of objects that are usually active together are grouped into buckets which are enabled/disabled depending on the player's location. Finally, several quality settings are also adjusted at runtime to adapt to the hardware the game is running on.
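The bucket streaming idea can be illustrated with a small sketch. Everything here, the zone names, the data layout, the `update_streaming` function, is a hypothetical simplification for illustration, not Fast Travel Games' actual system (which runs inside Unity).

```python
# Sketch of a bucket streaming system: objects are grouped into buckets,
# each tagged with the player zones from which its contents can be seen.
# Only buckets relevant to the current zone stay loaded.

buckets = {
    "lobby_props":   {"zones": {"lobby"},             "loaded": False},
    "corridor_set":  {"zones": {"lobby", "corridor"}, "loaded": False},
    "arena_enemies": {"zones": {"arena"},             "loaded": False},
}

def update_streaming(player_zone):
    """Enable buckets visible from player_zone, disable the rest."""
    for bucket in buckets.values():
        bucket["loaded"] = player_zone in bucket["zones"]

update_streaming("lobby")
loaded = sorted(n for n, b in buckets.items() if b["loaded"])
print(loaded)   # ['corridor_set', 'lobby_props']
```

The key design point is that the load/unload decision is driven by authored visibility data (which zones can see which buckets), not by distance alone, so large hidden areas never cost memory or draw calls.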
Fast Travel Games provided us with a level that we could profile together and further optimize for mobile. With Arm's DS-5 Streamline we gathered initial performance data and located bottlenecks. This tool displays counter values from the Mali GPU as graphs on a timeline. Since the level was quite complex, we started by tracing a scripted path around it to locate the heaviest scenes, those that took the longest to render. At this point Streamline reported that the game was fragment-bound.
Figure 5: DS-5 Streamline
With Arm's Mali Graphics Debugger, we could analyse the scene draw call by draw call, including all render targets, textures and shaders. We identified that the vertex blending used for shading large assets was particularly expensive. By baking these blended textures offline, we saved around 7 ms. This is the kind of change we recommend prototyping separately first, then applying to the game if it looks promising. In our case we built a texture merging tool to accelerate applying this technique to the whole level.
Figure 6: Mali Graphics Debugger
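The principle behind the baking step is straightforward: instead of the fragment shader blending source textures per pixel every frame, the blend is computed once offline and the result is shipped as a single texture. A minimal sketch, with textures reduced to flat lists of greyscale texels and a hypothetical `bake_blend` helper:

```python
# Sketch of baking a runtime blend offline. The shader would compute
# lerp(a, b, w) per pixel every frame; here we precompute it once.
# Textures are represented as flat lists of greyscale texels.

def bake_blend(tex_a, tex_b, weights):
    """Precompute lerp(a, b, w) per texel, as the shader would at runtime."""
    return [a + (b - a) * w for a, b, w in zip(tex_a, tex_b, weights)]

rock  = [0.25, 0.5, 0.75]
moss  = [1.0, 1.0, 1.0]
blend = [0.0, 0.5, 1.0]   # e.g. from a painted blend mask

baked = bake_blend(rock, moss, blend)
print(baked)   # [0.25, 0.75, 1.0]
```

The trade-off is classic compute-versus-memory: the baked texture costs storage and loses the ability to animate the blend, but removes per-pixel arithmetic and extra texture fetches from the hot path, which is where the ~7 ms came from.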
It is worth noting that at this point we were working with a non-VR version of the game, to simplify the analysis. We were now hitting a vertex-count bottleneck in the heavy scenes, caused by high-detail meshes, parts of which were not even visible from the player's location. Fast Travel Games created a tool that simplified these meshes, which reduced one of the scenes from 2.3M triangles to 1.44M.
Figure 7: Top – the mesh simplification tool reduced geometry by up to 55% in some cases, with minimal visual impact. Bottom – Fast Travel Games developed a mesh cutting tool to further reduce the number of invisible vertices.
We then hit a fragment bottleneck again, this time caused by the lighting shaders. We prototyped moving from the engine's Standard shading to Blinn-Phong and Lambert lighting, which further reduced rendering time.
Figure 8: Left: sample scene using Standard lighting. Right: Simplified lighting shaders improved rendering time with little visual impact.
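For reference, the simplified lighting model is just a Lambert diffuse term plus a Blinn-Phong specular term. The sketch below evaluates it on the CPU for a single point, purely to show the maths; in the game this logic lives in a shader, and the function names here are illustrative.

```python
# Lambert diffuse + Blinn-Phong specular for one directional light.
# Vectors are unit-length (x, y, z) tuples.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def shade(normal, light_dir, view_dir, shininess=32.0):
    """Return (diffuse, specular) intensities for one light."""
    n_dot_l = max(dot(normal, light_dir), 0.0)   # Lambert term
    if n_dot_l == 0.0:
        return 0.0, 0.0                          # surface faces away
    # Blinn-Phong: specular from the half vector between light and view.
    half = normalize(tuple(l + v for l, v in zip(light_dir, view_dir)))
    n_dot_h = max(dot(normal, half), 0.0)
    return n_dot_l, n_dot_h ** shininess

# Light and viewer both directly above the surface:
d, s = shade((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
print(d, s)   # 1.0 1.0
```

Compared with a full physically based Standard shader, this drops the expensive BRDF evaluation and environment terms, which is why it is so much cheaper per fragment while looking close enough for many assets.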
Finally, after applying all these optimizations and reaching 60 FPS in the non-VR version, we moved to GearVR, where performance dropped significantly. This time Streamline reported the application to be CPU-bound, due to the large number of draw calls involved. We remedied this by merging objects that share the same material, removing some props, and simplifying materials to allow for further driver and hardware optimizations. We also refined the occlusion parameters and made Fast Travel Games' bucket streaming mechanism more aggressive about not loading sections the player cannot see from their location.
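The draw-call win from merging comes from a simple observation: N static objects sharing M distinct materials can collapse into M merged meshes, so the CPU submits M draw calls instead of N. A toy sketch of the grouping step (scene contents and names are made up for illustration):

```python
# Group static objects by material: each group can be merged into one
# mesh and rendered with a single draw call.
from collections import defaultdict

scene = [
    ("crate_01", "wood"), ("crate_02", "wood"), ("crate_03", "wood"),
    ("pipe_01", "metal"), ("pipe_02", "metal"),
    ("sign_01", "emissive"),
]

def merge_by_material(objects):
    """Return {material: [object names]} — one merged mesh per material."""
    groups = defaultdict(list)
    for name, material in objects:
        groups[material].append(name)
    return dict(groups)

merged = merge_by_material(scene)
print(len(scene), "->", len(merged), "draw calls")   # 6 -> 3 draw calls
```

In practice the merge also has costs, larger meshes defeat per-object culling, so it is best applied to objects that are usually visible together, which is exactly what the bucket grouping already encodes.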
@ApexConstructVR has been ported to a mobile #VR prototype in collaboration with tech company @Arm to be shown at @Official_GDC on March 18-21. Our CTO @MrBenj4min will hold talks with Arm's @mostlypablo on how this port was created #GDC2018 #ApexConstruct— Fast Travel Games (@fasttravelgames) March 8, 2018
In the end, these optimizations allowed the game to offer a smooth and complete experience on GearVR, including ARCore-based inside-out tracking. Hopefully this provides some useful advice and encourages you to try porting your own game to mobile; remember that it will be much easier if you keep an eye on performance and best practices from the start.
To optimize your own games and port them to mobile, we encourage you to try Arm’s free DS-5 Streamline and Mali Graphics Debugger developer tools that allow you to profile and debug your applications.
As demonstrated for a level of Fast Travel Games' Apex Construct, by following some general rules and designing games with efficiency in mind, it is possible to port complete immersive experiences to mobile. This will only become easier as mobile GPUs such as Arm Mali improve every year and Vulkan consolidates as the graphics API of choice, allowing more efficient use of the CPU. We are closer than ever to closing the gap in gaming experiences between mobile and console.