Quote from: Andreas Osthoff on December 01, 2020, 16:25:08
Thank you for the update. When you test the MBP13 M1, can you also do a run at 400 cd/m2? To compare against the MBA.
Hi, since the result for the Wi-Fi runtime at max. brightness was so much lower, we did repeat the test. But the result was the same, within a few minutes.
Quote from: Svallone on November 30, 2020, 19:07:10
Of course the difference is largely due to the consumption of the display. Going from 150 cd/m2 to maximum brightness almost doubles consumption. I'm just saying that not necessarily all of it is. If I take the difference between maximum and 150 cd/m2 from the Intel version and apply it to the M1 version, I gain about 42 minutes. That's 9:10, IIRC. It's not a world of difference, but it's still 42 minutes. You're still losing almost 7 hours due to brightness alone. I do wonder how efficient the display is compared to the low-power displays used by Lenovo, for example. I might run the numbers later.
You're getting at an important point: variance tied to the component type. Nevertheless, even that supposed variance due to a display difference, assuming the displays are different, is far from explaining the gap between the low- and high-brightness power levels. Therefore, I think what the other reader wrote is fundamentally right: the difference in power consumption observed here is largely down to the display. In other words, if OEMs want devices that last longer, they should look at reducing the power consumption of the SoC, for sure, but also and especially of the screen.
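For anyone who wants to reproduce that kind of estimate, here is a minimal sketch of the arithmetic described in the quoted post: derive each configuration's average power draw from battery capacity and runtime, take the Intel model's 150-to-max brightness delta, and apply it to the M1. Every number below is a placeholder assumption, not a figure from the review; substitute the article's measured values to get the roughly 42-minute result.

```python
# Back-of-the-envelope version of the estimate described above.
# All inputs are placeholder assumptions, NOT figures from the review;
# plug in the measured battery capacities (Wh) and runtimes (h) from the article.

def avg_power_w(capacity_wh: float, runtime_h: float) -> float:
    """Average system draw implied by draining the battery over runtime_h hours."""
    return capacity_wh / runtime_h

# Hypothetical inputs
intel_capacity_wh, intel_150_h, intel_max_h = 58.0, 10.0, 6.5
m1_capacity_wh, m1_150_h, m1_max_h = 58.2, 16.0, 8.5  # m1_max_h = measured runtime

# Extra power the Intel panel draws going from 150 cd/m2 to maximum brightness
intel_display_delta_w = (avg_power_w(intel_capacity_wh, intel_max_h)
                         - avg_power_w(intel_capacity_wh, intel_150_h))

# Assume the M1 panel needs the same extra power and predict its max-brightness runtime
m1_predicted_max_h = m1_capacity_wh / (avg_power_w(m1_capacity_wh, m1_150_h)
                                       + intel_display_delta_w)
gain_min = (m1_predicted_max_h - m1_max_h) * 60

print(f"Intel display delta: {intel_display_delta_w:.2f} W")
print(f"Predicted M1 runtime at max brightness: {m1_predicted_max_h:.2f} h "
      f"({gain_min:+.0f} min vs. the measured value)")
```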
Quote from: _MT_ on November 30, 2020, 17:57:40
Quote from: Devin on November 30, 2020, 10:46:14
Why?
Elementary mathematics. You can calculate from the numbers given how much power was being used in each case on average. And you can look at the differences. If you keep all else equal between the tests and you just change the brightness setting, then the difference should come down to the display. And it should be the same regardless of the processor if the display is the same and you use the same settings. Similarly, if the brightness is set the same, displays are the same and just the processors differ, then the difference should come down to the processor (actually, the entire rest of the system but they're tied together) and shouldn't depend on brightness setting. That's just logic. And it doesn't work out. Neither one of them.
Why? If the displays are the same and at max brightness use a majority of the power budget, then you would expect the difference in runtime to be much less at high brightness.
Of course, there is also the question of manufacturing variance. And if you're comparing a single run to a single run, there is a lot of uncertainty. You've got no clue how repeatable those numbers are. Are we seeing manufacturing variance? Is it a different panel with the same maximum brightness, just lower efficiency? Is the brightness actually higher? Was it a fluke? The numbers just don't add up. That's a fact. Unless I made an error; it was just a quick back-of-the-envelope check.
We're talking about less than a watt here. But with 3-6 watts in total for the entire system, it's significant. You're, of course, right that the impact of the display goes up as total consumption goes down.
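A minimal sketch of the cross-check outlined in the quoted post, under the same caveat: the capacities and runtimes below are assumed placeholders, not the review's numbers. The idea is that, with identical panels, the 150-to-max power delta should come out roughly the same on both machines, and at a fixed brightness the Intel-vs-M1 delta should isolate the rest of the system.

```python
# Consistency check sketched above: average power = capacity / runtime,
# then compare the differences. All inputs are placeholder assumptions.

def avg_power_w(capacity_wh: float, runtime_h: float) -> float:
    return capacity_wh / runtime_h

# (battery capacity in Wh, runtime @150 nits in h, runtime @max brightness in h)
machines = {"Intel": (58.0, 10.0, 6.5), "M1": (58.2, 16.0, 8.5)}

for name, (cap, t150, tmax) in machines.items():
    delta = avg_power_w(cap, tmax) - avg_power_w(cap, t150)
    # With identical panels, this display delta should be about the same on both machines.
    print(f"{name}: display delta {delta:.2f} W")

for label, idx in (("@150 nits", 1), ("@max brightness", 2)):
    intel_p = avg_power_w(machines["Intel"][0], machines["Intel"][idx])
    m1_p = avg_power_w(machines["M1"][0], machines["M1"][idx])
    # With identical panels and settings, this gap should isolate the rest of
    # the system and therefore be similar at both brightness levels.
    print(f"System delta {label}: {intel_p - m1_p:.2f} W")
```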
Quote from: Devin on November 30, 2020, 10:46:14
Why?
Elementary mathematics. You can calculate from the numbers given how much power was being used in each case on average. And you can look at the differences. If you keep all else equal between the tests and you just change the brightness setting, then the difference should come down to the display. And it should be the same regardless of the processor if the display is the same and you use the same settings. Similarly, if the brightness is set the same, displays are the same and just the processors differ, then the difference should come down to the processor (actually, the entire rest of the system but they're tied together) and shouldn't depend on brightness setting. That's just logic. And it doesn't work out. Neither one of them.
Why? If the displays are the same and at max brightness use a majority of the power budget, then you would expect the difference in runtime to be much less at high brightness.
Quote from: Anony on November 30, 2020, 00:07:19
150 nits test ahahahahahaha
Test in real conditions next time please
Quote from: _MT_ on November 30, 2020, 10:31:24
If they indeed have the same displays (Intel and M1), then the numbers don't add up. Either the M1 must have a higher maximum brightness, or its display must be a lot less efficient (by about 20 %, I would say), or there was a background process or something that skewed the results.
Quote from: Alejandro on November 27, 2020, 18:42:56
Technically, you could charge your laptop using a 5 W power supply. It would take a long time, but it should charge just fine (as long as it was powered off, obviously).
I have tried a MacBook 12 m3 and the new MacBook Air M1, and both can charge with an Anker PD 3.0 20 W charger as well as the newer Apple 20 W one.
It could be very interesting in logistics terms: Apple could make just one power brick (25 W, perhaps, as Samsung does) and another 65-96 W one for the MacBook Pro.
That could save money for Apple, and for consumers as well, given the similar consumption.
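As a rough illustration of the quoted point about slow chargers: dividing battery capacity by the charger's effective output gives a ballpark charge time, ignoring the tapering near full charge. The roughly 50 Wh capacity and 90% conversion efficiency below are assumptions for the sketch, not Apple specifications.

```python
# Ballpark charge-time estimate: capacity / effective charger power.
# Ignores charge tapering near full; capacity and efficiency are assumptions.

def charge_time_h(capacity_wh: float, charger_w: float, efficiency: float = 0.9) -> float:
    """Hours to fill an empty battery with the laptop powered off."""
    return capacity_wh / (charger_w * efficiency)

capacity_wh = 50.0  # roughly a MacBook Air-class battery (assumed)
for watts in (5, 20, 30, 65):
    print(f"{watts:>2} W charger: ~{charge_time_h(capacity_wh, watts):.1f} h")
```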
Quote from: Astar on November 27, 2020, 18:40:55
I can only imagine they use this value for historic reasons (you don't want to change methodology too often). They have been doing it that way for a long time. Displays used to be quite a bit dimmer than good displays are today. I remember a time when premium business laptops were around that value and some of the cheaper displays couldn't hope to reach such heights. I believe back then, the NBC Wi-Fi test was done at minimum brightness. If you can believe that. Choosing a value that some displays won't be able to meet is problematic. Although, bumping it to, say, 250 would be worth considering (at least 200-220 - hopefully nothing of consequence is below that today). Ideally, you'd test a range of values. This is further complicated by different materials and finishes. A classic matte display can do with lower brightness for the same legibility. And believe me, those dim displays were used in the daytime as well, not just at night (150 is actually quite unpleasant at night, at least to my eyes).
Battery runtime test conducted with a display at 150 nits??!? That is so ridiculously dim! What the hell can you see? Did you run that test in a completely dark room?!
The general quality of the Notebookcheck reviews and articles has been seriously going downhill!