Quote from: SlimOwner on November 24, 2023, 22:43:44
Regarding the high idle watt usage. You need to install an older version of the nvidia driver (mid october 2023). It's a bug. Then, the idle watt usage is around 10 watts. I have around 10 hours of battery life...
Well, what kind of "work" is there in idle mode, right? 100 Wh / 10 W = 10 hours in the ideal case. In real work everything will be much worse, unless your job is typing text in a word processor. Even ordinary information consumption during real web surfing will immediately cut battery life by a factor of 1.5-2.
Realistically, for even moderately heavy everyday surfing to last at least 10 hours, total system consumption should not exceed 10 W, and that is simply not achievable. Add anything else running in the background and it gets even worse.
Battery-life figures from the local tests here can safely be divided by 1.5-2x; reviews of the same models on competing sites prove it.
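To make the arithmetic above explicit, here is a minimal sketch (Python, purely illustrative; the 100 Wh capacity and the 10 W / 1.5-2x figures are the numbers from this thread, not measurements):

```python
# Rough battery-life estimate: runtime (h) = capacity (Wh) / average draw (W).
# All numbers are the figures quoted in the discussion above, not measured data.

CAPACITY_WH = 100.0  # nominal battery capacity from the thread


def runtime_hours(avg_draw_w: float, derating: float = 1.0) -> float:
    """Estimated runtime for a given average power draw.

    derating > 1.0 models the usual gap between an idle-based estimate
    and real mixed use (the 1.5-2x factor mentioned above).
    """
    return CAPACITY_WH / (avg_draw_w * derating)


print(f"Idle at 10 W:         {runtime_hours(10):.1f} h")  # the ideal 10 h case
print(f"Real surfing, 1.5x:   {runtime_hours(10, 1.5):.1f} h")
print(f"Heavier surfing, 2x:  {runtime_hours(10, 2.0):.1f} h")
```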
----
This is precisely the problem with modern laptops: they consume more and more at the slightest serious burst of load.
The goal should be the opposite: to reduce the overall consumption of laptops (and smartphones) while simultaneously increasing overall performance. Instead, Intel, hopelessly stuck on outdated process nodes for the last 8 years and enjoying a near-monopoly position (x86 sales roughly 5:1 in its favor), cheated by pushing consumption up, and dragged AMD and even Apple into this race of ever-growing power draw.
Ideally, only two high-consumption subsystems should remain in a laptop (or smartphone), three if you count longer-range wireless communication: image output and sound output. Those depend on the physical sensitivity of the human organs of vision and hearing (within an ergonomic, safe range, of course), so their thresholds cannot be lowered, and in general their physical limits sit much higher than those of the chips and circuits that drive them.
Right now things are bad at the level of the key chips and circuits: they simply consume a lot and do it inefficiently, and laptops long ago left the power-consumption comfort zone of both their cooling systems and their battery cells. Meanwhile progress in performance per watt keeps slowing down.
In batteries there is either no progress, or it is deliberately being withheld from the public, for safety reasons for example.
That is, to bring laptops back to a sane level of consumption within what current mass-production battery technology can deliver, the average operating consumption of x86 laptops would have to be cut by literally 3-4x, all at once (roughly from the 30-40 W that real use draws today down toward the ~10 W mentioned above). For obvious reasons that would immediately mean a sharp drop in their overall performance.
Progress in recent years has come not from a genuinely rapid increase in performance per watt, as it used to, but from a "boost" (doping) of consumption far beyond any reasonable limit for this class of devices.
Apple is the best here for obvious reasons: its chips are made on the best process node on the planet, TSMC's current "3nm", and it takes a sensible approach of capping the TDP boost rather than chasing performance with it. As a result its laptops lose practically no performance on battery compared to running on mains power, while also lasting longer. Try the same trick with your Slim and you will quickly see how much its performance suffers on battery compared to the power supply.