Android software frequently sags under the sheer weight of all the different devices it’s required to support. This is because developers can’t fine-tune the performance of their apps and games with the same ease and speed that they can on iOS, where consumer choice over hardware is kept to a bare minimum. Indeed, it can be a major effort to make an Android game run crash-free on popular devices, let alone optimise its frame rate, RAM requirements, battery consumption or other aspects of its usability.
There's nothing earth-shattering in these observations, and nothing to make us appreciate Google's operating system any less. What's new, however, is that we're just starting to get a handle on the precise scale of Android's performance deficit relative to iOS, as measured from the perspective of real phone users. This is an important step towards fixing the issue and ultimately making Android experiences more responsive, less resource-hungry and more energy-efficient.
Our team at GameBench recently completed a unique comparison between the Galaxy S6 and the latest iPhones, based on how well each phone handles a sample of ten popular cross-platform games. The GS6 is the best-performing Android phone we've tested so far, but we found that it lagged behind the iPhone 6 Plus to the tune of around 5 percent, and behind the regular-sized iPhone 6 by around 15 percent. Other Android phones fared worse: the HTC One M9 and Google Nexus 6 both showed a shortfall of 19 percent, while the LG G4 lagged by 21 percent, compared to the iPhone 6.
We think this information is interesting and others do too, judging from the way journalists and product reviewers have responded to it. GameBench's cross-platform comparisons also offer a way to speed up and scale up cooperative efforts between hardware and software engineers across the mobile industry, which is why OEMs, chip designers and game studios are starting to make use of our data and tools. However, the data will only be truly constructive if they're interpreted the right way: not as judgements of hardware or software, but as evidence of how pairings of devices and apps come together to produce good or bad user experiences. This distinction still leaves a lot of people stumped.
After we published our last report, we saw plenty of commentators using our work as ammunition to argue that "my phone is better than your phone." Some hardware-centric readers even suggested that our evidence proved certain technical superiorities in the iPhone's GPU, involving its texture compression formats, pixel data storage formats, and the precision of its arithmetic logic units. These notions all ignore the influence of game developers and the software optimisation process, so they are not logically supported by our data.
Texture compression is actually one area where the developer's decisions are crucial to the end result. The developer may choose an older (and worse) type of texture compression for the sake of compatibility with older devices, or simply because they are unaware that better options are available. If the game then looks bad or performs poorly on a very modern device, whose more up-to-date texture compression capabilities go unexploited, this can't really be blamed on the hardware.
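To make this concrete, here's a minimal sketch of the kind of format-selection logic a cross-platform engine might run at startup. The extension string and the OpenGL ES version rule are real (ASTC is advertised via `GL_KHR_texture_compression_astc_ldr`, and ETC2 support is mandatory from ES 3.0), but the function itself is a hypothetical illustration, not any particular engine's code. A developer who skips this step and hard-codes ETC1 everywhere gets exactly the failure mode described above.

```python
def choose_texture_format(gl_extensions, gles_version):
    """Pick the best compressed-texture format a device supports.

    ASTC offers the best quality per byte but needs a recent GPU;
    ETC2 is guaranteed on any OpenGL ES 3.0 device; ETC1 is the
    lowest common denominator on older Android hardware.
    """
    if "GL_KHR_texture_compression_astc_ldr" in gl_extensions:
        return "ASTC"
    if gles_version >= (3, 0):
        return "ETC2"  # core in ES 3.0, no extension string needed
    return "ETC1"      # safe fallback, but visibly worse quality

# A modern flagship advertising ASTC support gets the best format:
best = choose_texture_format({"GL_KHR_texture_compression_astc_ldr"}, (3, 1))

# An older device stuck on ES 2.0 falls back to ETC1:
fallback = choose_texture_format(set(), (2, 0))
```

The point is that the result depends on a runtime decision in the game's code, not just on what the silicon can do.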
Our performance tests don’t apportion credit or blame to hardware factors for the simple reason that our methodology wasn't designed for this. Specifically, unlike traditional hardware benchmarks, we don’t fix the software load that is applied to different devices. We wouldn't even try to control this variable, because doing so would require synthetic workloads rather than the real workloads that we wish to measure (and that users actually care about).
To illustrate this point about measuring real workloads, and why this is useful even though it doesn't necessarily identify causal factors, let's look at the cross-platform sci-fi strategy game, XCOM: Enemy Within. From a pure engineering perspective, the iOS and Android editions of XCOM technically constitute different software loads and therefore can't underpin any sort of hardware comparison: they don't have the same code, they don't play at the same resolution and they probably don't exploit available hardware capabilities to the same degree. From a user's perspective however, XCOM is marketed as the same game on both platforms, with the same price tag and the same promise of letting you defend the earth against an alien invasion. So we absolutely can use it to compare user experiences -- and when we do, the results are pretty interesting.
GameBench shows that XCOM plays smoothly on the iPhone 6, iPhone 6 Plus and the GS6, at a steady 30 frames per second (fps). On the other hand, the game stumbles along at just 22fps on the LG G4. The game also murders the G4's battery, draining it around 50 percent quicker than it does on the GS6, despite the fact that the GS6's battery has a smaller physical capacity.
We can't know from these top-level figures what's hurting the user experience on the G4, but we can be pretty sure it's not just hardware. If we tried to lay it at the feet of the GPU, for example, we would then have to explain why the Google Nexus 6 plays XCOM rather better, at 25fps, and with less battery drain, despite having very similar GPU specs and the same 1440p display resolution as the G4.
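The battery comparison is worth translating into actual power draw, because percentage drain understates the gap. A back-of-the-envelope calculation, assuming the phones' published capacities (roughly 2,550mAh for the GS6 and 3,000mAh for the G4):

```python
# Rough sketch: convert "drains its battery ~50 percent quicker" into a
# ratio of actual power draw. Capacities are the published specs for each
# phone (approximate); the 1.5x factor is the drain-rate observation from
# our XCOM tests.

GS6_CAPACITY_MAH = 2550   # Galaxy S6 (published spec, approximate)
G4_CAPACITY_MAH = 3000    # LG G4 (published spec, approximate)
DRAIN_RATE_RATIO = 1.5    # G4 loses battery *percentage* ~50% faster

# Percentage drained per hour scales with power draw over capacity, so:
#   P_g4 / C_g4 = 1.5 * (P_gs6 / C_gs6)
power_ratio = DRAIN_RATE_RATIO * (G4_CAPACITY_MAH / GS6_CAPACITY_MAH)
print(f"G4 draws roughly {power_ratio:.2f}x the GS6's power in XCOM")
```

With these numbers the G4 is burning around 1.76 times the energy of the GS6 to run the same game, worse, which is exactly the kind of gap that software-level investigation should explain.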
A whole range of different factors could be at play, but what matters most is that the G4's problem with XCOM is properly highlighted and not just dismissed as a hardware issue. Ideally, it would be investigated through the sharing of performance data between the OEM and the game developer, and then fixed for the benefit of LG customers who want to indulge in some smooth, stutter-free killing of extraterrestrials. If this same approach could be used to assess and optimise many popular device-app pairings on Android, preferably during pre-release testing, then the platform's performance deficit relative to iOS would very likely disappear.