Quote from: Devin on November 30, 2020, 10:46:14
Why? If the displays are the same and at max brightness use a majority of the power budget, then you would expect the difference in runtime to be much less at high brightness.
Why? Elementary mathematics. From the numbers given you can calculate how much power each configuration was drawing on average, and then look at the differences. If everything else is held equal between the tests and you only change the brightness setting, the difference should come down to the display, and it should be the same regardless of the processor if the displays and settings are identical. Similarly, if the brightness is set the same and the displays are the same and only the processors differ, the difference should come down to the processor (really, to the entire rest of the system, but those are tied together) and shouldn't depend on the brightness setting. That's just logic. And it doesn't work out, for either comparison.
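For what it's worth, here's roughly the kind of check I mean, as a small Python sketch. The battery capacity and runtimes in it are made-up placeholders, not the review's figures; the point is only how the two differences are formed and compared.

```python
# Back-of-the-envelope consistency check on rundown-test numbers.
# All values below are hypothetical placeholders, NOT the actual review
# figures; plug in the real capacity and runtimes to redo the check.

def avg_power_w(battery_wh: float, runtime_h: float) -> float:
    """Average system draw implied by a battery rundown test (Wh / h = W)."""
    return battery_wh / runtime_h

BATTERY_WH = 50.0  # assumed capacity, same battery in both machines

# Hypothetical runtimes in hours: {processor: {brightness: runtime}}
runtimes = {
    "cpu_a": {"min_brightness": 14.0, "max_brightness": 9.0},
    "cpu_b": {"min_brightness": 16.0, "max_brightness": 10.0},
}

power = {
    cpu: {b: avg_power_w(BATTERY_WH, h) for b, h in levels.items()}
    for cpu, levels in runtimes.items()
}

# If the panels and settings are identical, the extra draw attributable to
# max brightness should come out (roughly) the same on both machines...
display_delta_a = power["cpu_a"]["max_brightness"] - power["cpu_a"]["min_brightness"]
display_delta_b = power["cpu_b"]["max_brightness"] - power["cpu_b"]["min_brightness"]

# ...and the extra draw attributable to the processor (plus the rest of the
# platform) should come out the same at both brightness levels.
cpu_delta_min = power["cpu_a"]["min_brightness"] - power["cpu_b"]["min_brightness"]
cpu_delta_max = power["cpu_a"]["max_brightness"] - power["cpu_b"]["max_brightness"]

print(f"display delta:      {display_delta_a:.2f} W vs {display_delta_b:.2f} W")
print(f"cpu/platform delta: {cpu_delta_min:.2f} W vs {cpu_delta_max:.2f} W")
```

If the two pairs of deltas disagree by more than measurement noise, something other than "same display, different processor" is going on.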
Of course, there is also the question of manufacturing variance. And if you're comparing a single run to a single run, there is a lot of uncertainty; you have no clue how repeatable those numbers are. Are we seeing manufacturing variance? Is it a different panel with the same maximum brightness but lower efficiency? Is the brightness actually higher? Was it a fluke? The numbers just don't add up. That's a fact, unless I made an error; it was just a quick back-of-the-envelope check.
We're talking about less than a watt here, but with 3-6 watts in total for the entire system, that's significant. You're, of course, right that the impact of the display goes up as total consumption goes down.
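To put a rough number on that last point (the 0.8 W figure is made up for illustration, not taken from the review):

```python
# Illustrative only: how much a sub-watt difference matters at low totals.
# Runtime scales as 1/power, so the relative loss is extra / (base + extra).
extra_w = 0.8  # hypothetical extra display draw at max brightness
for base_w in (3.0, 6.0):
    runtime_loss = extra_w / (base_w + extra_w)
    print(f"{base_w:.0f} W system: +{extra_w} W cuts runtime by about {runtime_loss:.0%}")
```

So the same sub-watt delta costs you roughly a fifth of the runtime on a 3 W system, but only about an eighth on a 6 W one.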