I was watching the Mali GPU training video "2-2 Best Practices Principles". At around 3:50 there is a cycles-per-pixel calculation presented.
It shows (ShaderCores * Frequency) / (TargetFPS * Pixels).
So my question is: why is there no warp size in this? Shouldn't it be more like (ShaderCores * WarpSize * Frequency) / (TargetFPS * Pixels)?
Of course, the "derate" factor would then have to be smaller to account for "useless" helper lanes (lanes that don't contribute to the final image). Could it be that including the warp size only makes a theoretical difference, and in real-world use it doesn't change the practical cycles-per-pixel budget much? Or is there something wrong with my base assumption that the warp size belongs in the calculation at all?
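For reference, here is how I understand the budget calculation, as a small sketch with made-up numbers (the core count, clock frequency, resolution, and warp width below are all my own assumptions, not values from the video):

```python
# Hypothetical example -- all hardware numbers are assumptions for illustration.
def cycles_per_pixel(shader_cores, frequency_hz, target_fps, pixels):
    """Per-pixel cycle budget: total shader-core cycles available per second
    divided by the number of pixels that must be shaded per second."""
    return (shader_cores * frequency_hz) / (target_fps * pixels)

# e.g. an assumed 4-core GPU at 850 MHz rendering 2560x1440 at 60 FPS
budget = cycles_per_pixel(4, 850e6, 60, 2560 * 1440)  # roughly 15.4 cycles/pixel

# Multiplying by an assumed warp width (e.g. 16 lanes) would instead give
# lane-cycles per pixel, which is what my question above is getting at.
warp_budget = budget * 16
```

As I read it, the video's formula gives the budget in core-cycles, so the warp size would only matter if you wanted to count individual lane-cycles instead.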
Loving the Mali GPU training videos so far! :)