Hi
Does DVFS on the Mali-400MP make sense?
My platform is an ODROID board, which is equipped with a Mali-400MP GPU.
I tried to find out how the GPU affects the power consumption of the platform.
The CPU frequency and bus frequency were both fixed.
The Mali-400MP supports only two frequencies, 160 MHz and 267 MHz, with corresponding voltages of 950000 µV and 1000000 µV respectively.
mali_dvfs_table mali_dvfs[MALI_DVFS_STEPS] = {
    /* step 0 */ {160, 1000000,  950000},
    /* step 1 */ {267, 1000000, 1000000}
};
I compared the power consumption of the ODROID under the two GPU frequencies by fixing them one at a time while playing 3D games.
But the power consumption of the ODROID varied only slightly (less than 5%).
I don't think this result is reasonable: if less than 5% power saving is achieved, why do GPU DVFS at all?
According to the papers I have read, the power consumption of the GPU should be increasingly significant.
But our results show that the GPU accounts for only a tiny percentage of the platform's power consumption.
Are there any problems in my experiments?
Thank you very much.
Beilei Sun
Hi Beilei Sun,
How are you measuring power? If you measure the battery power output then that will often include the whole platform, including DDR memory and the display panel, for example, which will skew your analysis.
Power is proportional to V^2, so all other things being equal, energy per operation should follow a square law:
(0.95^2) / (1.0^2) ≈ 0.9x
... so I would expect a ~10% improvement in energy efficiency per clock (note this is energy per operation (J), not power (W)). The GPU is clocked at 60% of the rate (160/267), so under sustained load I would expect instantaneous power on the GPU power rail to drop:
0.6 * 0.9 ≈ 0.54x
There are lots of assumptions in there of course - is the load sustained all the time, or is the GPU going idle, how hot does this SoC get (it will leak more when it gets warmer), etc. Note that this is the analysis for _only_ the GPU power rail - power draw for everything else will not be affected.
Mali GPUs are designed for energy efficiency; for most use cases it is not uncommon for the GPU to only be sipping at the battery ...
HTH, Pete
Hi Pete
We can measure the power consumption of the GPU directly with our available measurement devices.
... (0.95^2) / (1.0^2) ≈ 0.9x ... so I would expect a ~10% improvement in energy efficiency per clock (note this is energy per operation (J), not power (W)). The GPU is clocked at 60% of the rate (160/267), so under sustained load I would expect instantaneous power on the GPU power rail to drop: 0.6 * 0.9 ≈ 0.54x ...
...
Minyong Kim pointed out that the GPU accounts for 12%~20% of total system power consumption in his 2013 paper <Accurate GPU Power Estimation for Mobile Device Power Profiling>, although he did not specify his experimental platform. Combining his figures with the calculation above, the platform-level saving should be the GPU's share times the fraction of GPU power removed:
(1 - 0.54) * 12% ~ (1 - 0.54) * 20%
so about 5.5%~9% of the platform's power consumption should be saved. But this percentage is still higher than our results.
The power consumption of the Mali-400MP accounts for only a small part of the platform's, so DVFS on the Mali-400MP may have only an insignificant power-saving effect at the platform level.
But in the future, will the GPU become a new power-saving bottleneck for smartphones?
Best regards!
Beilei
Minyong Kim pointed out that GPU accounts for 12%~20% of total system power consumption in his paper <Accurate GPU Power Estimation for Mobile Device Power Profiling> in 2013.
I think such generalizations are very difficult to make. If someone puts an 8-core Mali-450 alongside a single Cortex-A7, you are going to get a very different relative power load than a 1-core Mali-450 alongside a 4-core Cortex-A15. It really depends on many factors: system configuration, what workload you are running, how the GPU was synthesized in silicon, what silicon process, temperature, what content, etc. - so my gut feel is that such generalizations are oversimplified.
I'd rather have the 5% power saving than none at all =) In most modern devices that translates into hours more usage on a single battery charge. It should also be noted that on some SoCs the voltage and frequency ranges can be much wider than this (the available range depends heavily on silicon process and implementation) - for example, a voltage range of 0.7 V to 1.0 V would give a ~50% improvement in energy efficiency per operation, rather than the 10% you see on your platform. In this scenario DVFS for very light workloads would be a large benefit.
But in the future, will GPU become a new power saving bottleneck of smart phone?
See above - it really depends on the specific SoC and content you are running.
Thanks very much.
I agree with you that the power saving of GPU DVFS for the platform really depends on the specific SoC and the contents.
But in the future, will GPU become a new power saving bottleneck of smart phone? See above - it really depends on the specific SoC and content you are running.
The GPU in emerging smartphones tends to be more power-critical because of higher frequencies and increasing core counts.
If, in the future, the GPU is used to execute tasks that are handled by the CPU today, will it become a new power-saving bottleneck for smartphones?
Thanks again for your great suggestions.