Game performance is arguably one of the most important contributors to a game’s success. It is particularly critical in mobile games, where each player’s device type and capability can vary greatly. Arm Mobile Studio provides tools that help Android game and app developers benchmark, optimize, and debug CPU, GPU, and even bandwidth use. It can be used to analyze standalone applications or to complement building apps with the Unity and Unreal game engines.
The latest addition to the suite, Arm Performance Advisor, enables you to run a performance profile and extract clear, easy-to-understand data, key metrics and issues, and optimization advice. It even transforms a profile into an HTML report that you can share with the entire team.
What makes it even better is that Arm Mobile Studio Professional Edition includes headless Continuous Integration (CI) support for Performance Advisor, which game studios can use to easily set up automated on-device performance testing across an entire device farm. In this article, we examine how the headless Performance Advisor works with CI systems. We will also give an overview of the steps you can take to automatically generate useful analyses and visualizations for your team regularly. We will then see how the generated data can help identify issues when they are introduced during the development process to ensure a high-performance gameplay experience for your players at launch.
It is simple. The best mobile games and apps are fun, look great, hold steady frame rates, and use the battery efficiently, keeping users engaged and playing longer. Without good performance and a smooth gameplay experience, your players will stop playing. There is a direct correlation between frame rate and player retention, and maintaining steady, high performance in mobile games and applications is a tough challenge. It is complicated further for Android app developers, who have to support a wide range of devices with varied capabilities.
If you are a mobile games studio that is serious about reaching a broad audience for your games, you want to target as many devices as possible. That requires a lot of testing on different Android devices, which is a daunting task if done manually. Assigning testers to continually run through performance tests on every device in your device farm is time-consuming and expensive, so you want to avoid manual testing wherever you can. Ideally, you test your game’s performance on a multitude of devices throughout development so that you can fix problems as soon as they appear rather than trying to track them down and patch them up later. A bit of upfront effort to analyze, debug, and test performance throughout development can save your team a lot of time in the long run. It could even save your game. The main things you need to set this up for your team or organization are a performance analysis tool that can run in headless mode and a good CI system to control it through remote commands.
Let us explore the high-level steps for running Performance Advisor across an Android device farm with a CI tool for automated testing.
First, configure each of your test devices so that Performance Advisor can capture a performance profile from it to the host machine.
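As an illustration, a minimal readiness check for the host machine might look like the sketch below. It only assumes that the Android SDK platform-tools (adb) are installed on the host; the script name and the check itself are examples, not part of Performance Advisor, and the actual per-device setup commands come from the Performance Advisor documentation.

```python
# check_devices.py -- confirm every farm device is visible to adb before a capture run.
# Assumes the Android SDK platform-tools (adb) are on the host machine's PATH.
import subprocess
import sys

def connected_devices():
    """Return the serial numbers of all devices that adb reports as ready."""
    output = subprocess.run(
        ["adb", "devices"], capture_output=True, text=True, check=True
    ).stdout
    devices = []
    for line in output.splitlines()[1:]:      # skip the "List of devices attached" header
        if not line.strip():
            continue
        serial, state = line.split()[:2]
        if state == "device":                 # skip "unauthorized" or "offline" devices
            devices.append(serial)
    return devices

if __name__ == "__main__":
    serials = connected_devices()
    if not serials:
        sys.exit("No ready devices found. Check USB connections and adb authorization.")
    for serial in serials:
        model = subprocess.run(
            ["adb", "-s", serial, "shell", "getprop", "ro.product.model"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(f"{serial}: {model} is ready for capture")
```

Running a check like this at the start of every automated run catches loose cables and revoked USB-debugging permissions before they silently shrink your test coverage.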
Once Performance Advisor is set up for all of the devices connected to a host machine, set up a CI tool such as Jenkins, TeamCity, or Buildbot for your network. Any CI system will do, as long as it can run commands on your host machines from a central CI server to orchestrate the actions.
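For example, a CI job could simply reach out to each host machine and start the capture run there. The sketch below is one way to do that; it assumes passwordless SSH access to the hosts and a hypothetical `run_captures.py` script on each of them, so the host names and script path are placeholders to adapt to your own farm.

```python
# trigger_hosts.py -- CI-server-side sketch that kicks off capture runs on each host machine.
# Assumes passwordless SSH access and a run_captures.py script on every host
# (both the host names and the script path are placeholders for this example).
import subprocess

HOSTS = ["farm-host-01", "farm-host-02"]   # host machines with devices attached (placeholders)

def run_on_host(host):
    """Run the capture script on one host machine and return its exit code."""
    result = subprocess.run(
        ["ssh", host, "python3", "~/ci/run_captures.py"],
        capture_output=True, text=True,
    )
    print(f"[{host}] exit code {result.returncode}")
    print(result.stdout)
    return result.returncode

if __name__ == "__main__":
    failures = [host for host in HOSTS if run_on_host(host) != 0]
    if failures:
        raise SystemExit(f"Capture run failed on: {', '.join(failures)}")
```

Most CI tools can express the same fan-out natively (Jenkins agents, TeamCity agents, Buildbot workers); the point is only that the CI server drives the host machines, and the host machines drive the devices.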
After you have finished setting up your CI server, you are just a couple of steps away from continuous automated testing.
Here are the steps you could follow for daily performance testing (a minimal host-side orchestration sketch follows the list):

1. Have the CI server trigger a build of the latest version of your game.
2. Deploy the build to every device in the farm.
3. Run a headless Performance Advisor capture of the game on each device.
4. Generate the HTML and JSON reports from each capture and collect them on the CI server for your team.
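To make steps 2 to 4 concrete, here is a rough per-device sketch for the host machine. The adb install step is standard, but the capture-and-report command is deliberately a placeholder, because the exact headless Performance Advisor invocation depends on your Mobile Studio version; substitute the commands from the CI integration tutorial referenced below, and treat the file paths and helper names here as assumptions.

```python
# run_captures.py -- host-side sketch: install the nightly build and capture a profile per device.
# The capture/report command is a placeholder; use the exact headless Performance Advisor
# commands from the Arm Mobile Studio CI tutorial for your installed version.
import subprocess
from pathlib import Path

APK_PATH = "builds/nightly/mygame.apk"   # placeholder path to the nightly build
REPORT_DIR = Path("reports")

def capture_for_device(serial):
    """Install the build on one device, run the headless capture, and generate reports."""
    device_report_dir = REPORT_DIR / serial
    device_report_dir.mkdir(parents=True, exist_ok=True)
    # 1. Deploy the nightly build, replacing any previous install.
    subprocess.run(["adb", "-s", serial, "install", "-r", APK_PATH], check=True)
    # 2. Run the headless capture and report generation. Placeholder command:
    #    replace it with the Performance Advisor invocation from the CI tutorial.
    subprocess.run(
        ["./run_performance_advisor.sh", serial, str(device_report_dir)],
        check=True,
    )

if __name__ == "__main__":
    from check_devices import connected_devices   # helper sketched earlier
    for serial in connected_devices():
        capture_for_device(serial)
```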
For a more in-depth walkthrough of this process, see the full tutorial on integrating Mobile Studio into a CI workflow.
This final step depends on what you want to do with your performance data.
Perhaps you could run a CI task to compile the HTML reports from each device into an emailed summary for your team to look through during the next daily stand-up meeting. Quickly see how the average FPS has changed since yesterday, and analyze any CPU or GPU issues Performance Advisor has identified. (The video Performance Reporting with Arm Mobile Studio demonstrates this process).
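As a rough illustration of that CI task, the sketch below gathers the generated HTML reports and emails them as attachments using Python’s standard library. The SMTP server, sender, and recipient addresses are placeholders, and the report directory layout matches the earlier sketches rather than anything Performance Advisor mandates.

```python
# email_reports.py -- sketch of a CI task that emails the day's HTML reports to the team.
# The SMTP server, sender, and recipient addresses are placeholders for this example.
import smtplib
from email.message import EmailMessage
from pathlib import Path

def send_report_summary(report_dir="reports"):
    msg = EmailMessage()
    msg["Subject"] = "Nightly performance reports"
    msg["From"] = "ci@example.com"
    msg["To"] = "game-team@example.com"
    msg.set_content("Attached are last night's Performance Advisor reports, one per device.")

    # Attach every per-device HTML report produced by the nightly run.
    for report in Path(report_dir).glob("**/*.html"):
        msg.add_attachment(
            report.read_bytes(),
            maintype="text", subtype="html",
            filename=report.name,
        )

    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    send_report_summary()
```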
In one scenario that many teams have already adopted, you push the data from each exported JSON file to Elasticsearch, which lets you visualize it easily with a tool such as Kibana. We will go through some example visualizations in the next section.
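A minimal sketch of that push step, using the plain Elasticsearch REST API via the requests library, could look like the following. The Elasticsearch URL, index name, and added fields are assumptions for illustration; the JSON content itself is whatever your Performance Advisor version exports.

```python
# push_to_elasticsearch.py -- sketch: index each Performance Advisor JSON export as a document.
# The Elasticsearch URL and index name are placeholders; the document fields come from
# whatever your Performance Advisor version writes into its JSON export.
import json
from datetime import datetime, timezone
from pathlib import Path

import requests

ES_URL = "http://localhost:9200"   # placeholder Elasticsearch endpoint
INDEX = "pa-captures"              # placeholder index name

def push_reports(report_dir="reports"):
    for json_file in Path(report_dir).glob("**/*.json"):
        doc = json.loads(json_file.read_text())
        # Tag each document with the capture time and the per-device directory name so
        # that Kibana can build time-series and per-device visualizations from it.
        doc["@timestamp"] = datetime.now(timezone.utc).isoformat()
        doc["device"] = json_file.parent.name
        response = requests.post(f"{ES_URL}/{INDEX}/_doc", json=doc, timeout=30)
        response.raise_for_status()

if __name__ == "__main__":
    push_reports()
```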
While this process may seem like a bit of work, keep in mind that it is a one-time cost: once the pipeline is in place, your team gets continuous automated testing on every build from then on. It is worth it. The result also scales with your device farm, because any new test device only needs to be configured once.
Now that we have taken a high-level look at how you can set up automated performance testing in an Android device farm, let us check out some of the ways you can work with the data it generates.
Look at these basic performance graphs and visualizations generated using Kibana with data pushed into Elasticsearch. They showcase a few examples of the kind of automated performance insights that can be made available to your team.
Frame rate is a fundamental metric for recognizing whether an app is running smoothly. This gauge visualization shows all test devices falling short of the targeted 60fps, with the high-end devices very near the target frame rate and the low-end devices significantly lower at an average of 51.207fps, which may indicate a need for performance optimization.
Time-series graphs can be used to gain valuable insight into progress as well. This line graph visualizes the average FPS of the performance captures over a period of two weeks. It reveals when the project’s performance degraded and when the team took a day to address the performance issues within the game.
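Kibana builds this kind of graph visually, but it can be useful to see what the underlying query looks like, for example if you want to fail a CI job when average FPS drops. The sketch below runs the equivalent daily-average aggregation against the index from the earlier sketch; the `fps_average` field name is an assumption, so use whichever field your JSON exports actually contain.

```python
# avg_fps_over_time.py -- sketch of the aggregation behind an FPS-over-time graph.
# The index name and the "fps_average" field are assumptions carried over from the
# earlier push_to_elasticsearch.py sketch.
import requests

ES_URL = "http://localhost:9200"
INDEX = "pa-captures"

query = {
    "size": 0,
    "aggs": {
        "per_day": {
            "date_histogram": {"field": "@timestamp", "calendar_interval": "day"},
            "aggs": {"avg_fps": {"avg": {"field": "fps_average"}}},
        }
    },
}

response = requests.post(f"{ES_URL}/{INDEX}/_search", json=query, timeout=30)
response.raise_for_status()
for bucket in response.json()["aggregations"]["per_day"]["buckets"]:
    avg = bucket["avg_fps"]["value"]
    day = bucket["key_as_string"]
    print(f"{day}: avg FPS {avg:.1f}" if avg is not None else f"{day}: no captures")
```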
Knowing the CPU and GPU utilization is especially important for mobile applications because it directly affects battery efficiency. Players have to stop playing when they need to recharge their device, and that interruption can cause them to lose interest in the game. Here we can see that CPU utilization on the high-end devices is slightly lower than on the low-end devices, while their GPU utilization is significantly lower, which suggests more room for GPU optimization on the low-end and mid-range devices.
Being able to split performance profiles in Mobile Studio by region helps you further categorize data, as shown in the following graphic. By keeping an eye on the shader and GPU cycles, you can keep track of shader operation complexity, which can severely impact the frame rate and battery power usage on low-end devices. The read/write GPU bandwidth metric can assist in identifying areas in your game that depend too heavily on textures and render targets.
Ideally, each pixel in a scene is drawn only once per frame for maximum performance, so an average overdraw value significantly greater than 1.0 in this bar graph hints at room for optimization in the object rendering of all scenes. Even a simple metric such as pixel overdraw can help tremendously in making sure your game or app runs optimally.
You have now seen how visualizations like these ensure that everyone on the team, from managers to developers to artists, can quickly see and understand how the current build is performing and take immediate action when automated testing surfaces performance issues. The metrics you want to track and analyze will depend on the game or app. A game with heavy graphics and special effects will likely focus on steady frame rate and shader complexity, while a game with AR camera features may be more concerned with battery efficiency. Anticipating which metrics matter most to your project up front helps you choose the right ones to track.
You have seen how to approach setting up automated builds, deployments, and performance testing in an Android device farm. You also saw the kind of amazing visualizations you can produce from the JSON data that Performance Advisor generates. Imagine how easily and quickly your team could identify and fix performance issues with automated testing. It could mean the difference between the next top game and the next flop. Ready to try it yourself? Get in touch with us and we will get you started.