The obvious problem is that load matters. It makes no sense to compare numbers obtained from different tests. The 3950X can take about 150 W AFAIK, if not more (again, depends on the workload). Which is great. It's just a lot more than "less than 100 W".
Also, I thought it was well understood that having more cores at a lower frequency is more efficient than having fewer cores at a higher frequency, as long as the load scales out practically linearly. Up to a point, because the cores have to be connected to memory, and the more cores you have, the bigger a problem the interconnect becomes.
So, for Intel, it's not just about the efficiency of the node. They're at a disadvantage from the simple fact that they push higher frequencies. That's physics. Higher frequency can yield better results in practical loads (many loads don't scale out well at all), but in a benchmark designed to scale out, you're screwed. If you have to run at 5+ GHz and your competitor can match you at under 3 GHz (because they have double the cores, so they can run at half the frequency with the same IPC), you really are screwed.
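To make the "double the cores at half the frequency" point concrete, here's a back-of-envelope sketch. It assumes the common first-order model that dynamic power per core scales roughly with f³ (since voltage rises about linearly with frequency in the DVFS range), and it ignores leakage and interconnect/uncore power — so treat it as illustrative, not a measurement:

```python
# Rough model: dynamic power ~ C * f * V^2, with V roughly linear in f,
# so power per core ~ f^3. Ignores leakage and uncore power.

def relative_power(cores: int, freq_ghz: float) -> float:
    """Relative package power for `cores` cores, each at `freq_ghz`."""
    return cores * freq_ghz ** 3

def throughput(cores: int, freq_ghz: float) -> float:
    """Aggregate throughput, assuming equal IPC and linear scaling."""
    return cores * freq_ghz

narrow = (8, 5.0)   # hypothetical 8-core chip pushed to 5 GHz
wide = (16, 2.5)    # hypothetical 16-core chip at half the frequency

assert throughput(*narrow) == throughput(*wide)  # same work per second
print(relative_power(*narrow) / relative_power(*wide))  # → 4.0
```

Under these assumptions the narrow, high-clocked chip burns about 4x the power for the same throughput, which is the whole disadvantage in a nutshell.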
I'm not trying to defend the 10980HK. A 5+ GHz chip in a laptop is kind of bonkers; it just doesn't fit with the mobile aspect. Consider how much power the desktop 8-core 5+ GHz chips take. Laptops don't exist in a separate universe; the same laws of physics apply. But articles like this are trash. Rather than providing interesting information and educating people, they just seek sensation. It pisses me off.