Quote from: kekeqs on June 15, 2024, 19:08:54
I think you are confusing Intel's Xe iGPU with their Arc GPUs. You can see tons of videos where the Xe iGPU works fine with most games (and no complaints about drivers), compared to their dedicated GPUs.
I didn't mix it up. Examples of Iris Xe graphical issues:
- Crysis Remastered: a square section of the sky is rendered red
- Avatar: Frontiers of Pandora: everything is tinted red-yellow
- Alan Wake 2: same issue as in Avatar
- Assassin's Creed Valhalla: the colors of the clouds (should be white) and the sky (should be blue) are swapped (clouds are blue, sky is white)
- Using the Iris Xe in the Unreal Engine 5 Editor produces similar graphical artifacts
Some people will now say "who would want to use it for gaming anyway?", but that isn't an excuse. An iGPU must be able to display colors correctly. If the iGPU is slow or stutters, that's acceptable, since by nature it has fewer resources than a dGPU. But it is absolutely NOT ACCEPTABLE if it produces serious graphical glitches. The Iris Xe issues have gone unfixed for three years, and at this point I doubt they ever will be.
Note that even the four-year-old Vega 8 iGPU has not a single one of these errors. And don't get me started on the Iris Xe's frametimes. They're terrible. The truth is, the Iris Xe hasn't gotten much love. As such, I would never recommend a 12th-, 13th-, or 14th-generation Intel CPU to anyone who wants to do 3D work or casual gaming on the iGPU. That person should either go with AMD or go for Intel's Arrow Lake.
Intel has gotten better with Arc graphics, which I believe has fixed the color issues mentioned above, and frametimes are slightly better than before. But overall, in both frametimes and fps, the Arc iGPU still loses in far more games than it wins against AMD's iGPUs. I've watched benchmarks over and over again, and while Arc can keep up with or slightly beat AMD in some cases, the majority of games simply run better on AMD.
What I specifically mean by "Intel could learn a lot from AMD" is that they should make the iGPU work better together with the CPU. AMD has nearly perfected iGPU-CPU cooperation and is roughly 1.3 to 2 times faster with the same number of shader cores. Put the other way around, Intel needs roughly 1.3 to 2 times the shader cores to get the same gaming results as AMD, as the examples below show.
For example:
- Intel needs 1024 shader cores to achieve the same performance as AMD with 768 shader cores (Intel Arc 8-Xe-core iGPU vs. AMD Radeon 780M), a 1024/768 ≈ 1.33x core-count disadvantage.
- Intel needs 512 shader cores to achieve the same performance as AMD with 256 shader cores (Intel Arc 4-Xe-core iGPU vs. AMD Radeon 740M), where AMD is actually 2.0 times more efficient per shader core.
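To make that per-shader-core math concrete, here is a minimal Python sketch. The pairings and core counts are taken from my examples above; the "roughly equal performance" premise is my impression from benchmarks, not a measured result:

```python
# Minimal sketch: per-shader-core efficiency gap, assuming the two iGPUs
# in each pairing deliver roughly equal gaming performance (the premise
# stated above). Pairings and core counts come from the examples.

pairings = [
    # (Intel iGPU, Intel shader cores, AMD iGPU, AMD shader cores)
    ("Arc 8 Xe cores", 1024, "Radeon 780M", 768),
    ("Arc 4 Xe cores", 512, "Radeon 740M", 256),
]

for intel_name, intel_cores, amd_name, amd_cores in pairings:
    # Equal performance from fewer cores means AMD extracts more work per
    # shader core; the ratio of core counts is exactly that efficiency gap.
    ratio = intel_cores / amd_cores
    print(f"{intel_name} ({intel_cores} cores) vs {amd_name} ({amd_cores} cores): "
          f"AMD is ~{ratio:.2f}x more efficient per shader core")
```

Running it prints ~1.33x for the 780M pairing and ~2.00x for the 740M pairing, which is where my 1.3-to-2x range comes from.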
I would like to see Intel bring its iGPU performance up to AMD's level at similar hardware specs, but as of now that's not the case.