Quote from: 88&88 on August 29, 2022, 16:50:20
Quote
Because the GTX 1080 Ti is "less efficient" than a 4080 Ti. If you recreate the 1080 Ti with the 4080 Ti's node and specs, it will consume more power. Lovelace is architecturally superior and consumes less energy while giving more performance. It's just NVIDIA pushing the power limit to the max with Ampere and Lovelace.
I doubt what you wrote. I'm not a hardware engineer, but according to tech sites, every new node, compared to the old one, either consumes less power or improves performance at the same power. By that logic, recreating the same GTX 1080 architecture on a smaller node at the same performance should consume less power, and adding newer memory like GDDR6X would give even better performance. I hope TSMC can do this comparison; they have everything they need.
If you recreate the 1080 Ti on the N5 node, you'll get a more energy-efficient 1080 Ti.
But if you also tweak its architecture a bit (implement proper async compute, add native FP16, add more warp/dispatch units, plus the rest of the generational improvements), you'll find that it's even more efficient.
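To make the FP16 point concrete, here's a minimal CUDA sketch (my own illustration, not anything from NVIDIA): on parts with full-rate FP16 (Volta/Turing and later), one packed half2 instruction performs two FP16 operations, so the same work needs half the instruction issues and half the memory traffic of the FP32 equivalent.

Code:
#include <cuda_fp16.h>

// Illustrative AXPY kernel in packed FP16 (half2).
// Each __hfma2 does two fused multiply-adds per instruction,
// which is where the perf/W win over plain FP32 comes from
// on GPUs with full-rate FP16 hardware.
__global__ void axpy_half2(int n, __half2 a, const __half2 *x, __half2 *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = __hfma2(a, x[i], y[i]);  // y = a*x + y, two lanes at once
}

Consumer Pascal (including the 1080 Ti) executes FP16 at a tiny fraction of its FP32 rate, so a kernel like this would actually run slower there. That's exactly the kind of generational gap I mean.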
NVIDIA doesn't follow your approach because you can't reuse the old node's die masks on a newer node anyway. And if the masks have to be redone regardless, why not tweak the architecture during the node switch?
Also, if you compare equal-performance GPUs from different generations more closely, you'll find that the 1080 Ti consumes about 250 W, while the comparable 2070 SUPER on almost the same node (TSMC 16FF vs. TSMC 12FFN) needs less (215 W).
Samsung 8N isn't a full node jump either; it's a slightly density-tweaked variant of Samsung 10 nm (part of the 1x nm class of nodes), so its perf/W gain isn't that large, but it's still there: the comparable 3060/3060 Ti come in at 170-200 W.
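Putting those numbers together, and assuming for illustration only that the cards land at roughly equal performance (so relative efficiency is just the inverse power ratio), the arithmetic looks like this, as a trivial host-side C snippet rather than a benchmark:

Code:
#include <stdio.h>

/* Board-power specs quoted above; performance normalized to 1.0
 * under the equal-performance assumption. Not a benchmark. */
int main(void)
{
    const struct { const char *name; double watts; } cards[] = {
        { "GTX 1080 Ti    (TSMC 16FF)",   250.0 },
        { "RTX 2070 SUPER (TSMC 12FFN)",  215.0 },
        { "RTX 3060 Ti    (Samsung 8N)",  200.0 },
    };
    for (int i = 0; i < 3; i++)
        printf("%-30s %3.0f W -> %2.0f%% less power than 1080 Ti\n",
               cards[i].name, cards[i].watts,
               100.0 * (1.0 - cards[i].watts / cards[0].watts));
    return 0;
}

That works out to roughly 14% and 20% less power for 1080 Ti-class performance, even without a big node jump, which is the architectural gain I'm describing.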