The following holds true for ALL laptops, not only this one: we actually have to account for the difference between the measured maximum of 35 degrees and the test ambient temperature of only 20 degrees. That is a difference of 15 degrees; however, if the test had been performed at 22 degrees, the difference would have been 16 degrees, and at 24 degrees it would have been 17 degrees, leading to a maximum of 41 degrees! The reason is very complicated to explain and also involves some chemistry knowledge, but the fact is that in reality nobody has 20 degrees at home or in the office. In summer it is probably between 26 and 29 degrees; if hotter, air conditioning is used, but not necessarily when the temperature is below 29. In winter the temperature is probably between 23 and 25 degrees; if lower than 23, central heating is used, but an actual 20 degrees is not realistic. That would give a difference of 18 degrees in winter, for a total of 42-44 degrees, and almost 20 degrees in summer, for a total of 46-49 degrees. And these are the real temperatures when the laptop is absolutely new. After a year, just before replacing the thermal paste, you can easily add another 20% of heat, which is hardly bearable. Now, all these temperatures apply to ULV processors, the coolest CPUs in the coolest laptops, and they are already barely at the tolerability limit. Imagine how you could bear another laptop that yields a 20 to 25 degree difference, which applies to most laptops in existence: a couple of thousand euros spent on something you can use exactly like a desktop, and not at all on top of your lap.
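To make the arithmetic above easier to follow, here is a minimal sketch in Python. The linear rule for the growing delta (roughly half a degree of extra delta per degree of ambient) and the 20% aging penalty are assumptions read off the figures in this comment (15 degrees at 20 ambient, 16 at 22, 17 at 24), not measured values; note that the winter and summer totals quoted above run a degree or two higher than this straight line.

```
# Sketch of the extrapolation above. DELTA_GROWTH and AGING_FACTOR are
# assumptions taken from the figures in this comment, not measurements.

TEST_AMBIENT = 20.0   # degrees C, ambient during the review's test
TEST_DELTA = 15.0     # degrees C: measured 35 max minus 20 ambient
DELTA_GROWTH = 0.5    # extra delta per degree of ambient above the test
AGING_FACTOR = 1.2    # ~20% more heat with year-old thermal paste

def estimated_surface_temp(ambient: float, aged: bool = False) -> float:
    """Estimate the laptop's hottest surface temperature at a given ambient."""
    delta = TEST_DELTA + DELTA_GROWTH * (ambient - TEST_AMBIENT)
    if aged:
        delta *= AGING_FACTOR
    return ambient + delta

for ambient in (20, 24, 26, 29):
    print(f"ambient {ambient} C -> new: {estimated_surface_temp(ambient):.1f} C, "
          f"after a year: {estimated_surface_temp(ambient, aged=True):.1f} C")
```

At 24 degrees ambient this reproduces the 41-degree maximum mentioned above, and the aged column shows why a year-old machine feels so much worse on the lap.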
This is the reality: the smart electronics engineers try to lower the TDP and to increase the ratio of CPU operations to TDP, while the mechanical engineers stupidly lower the cooling system specs, reduce the battery capacity, and so on, so that nobody actually benefits from the progress; the manufacturers try to profit from the difference. The sad thing is that they gain 10 to 20 dollars more, while we spend hundreds more for nothing.