Topic summary

Posted by RobertJasiek
 - June 16, 2024, 16:13:11
Quote from: Alan on June 16, 2024, 15:29:45
If you look at the very large number of Macbooks being sold today, a lot more people don't really care about a dGPU.

What do you mean by "today"? 2024-06-16? Or an expectation for the whole year 2024?

What is the number of Macbooks being sold "today" so that you call it "very large"?

How do you assess "a lot" as a measurement of the number of people that don't care about a dGPU?

Surely you must be aware that the group of people who don't care about a dGPU includes users of Macbooks, users of other Apple devices without a dGPU, and both corporate and private users of non-Apple computers without a dGPU. It is well known that most of such computers run Windows. Therefore, I do not understand what your "a lot more people" refers to. A lot more users of computers without a dGPU than with one? A lot more users of Macbooks but not of other Apple devices without a dGPU?

I guess you might have something to say, but your text comes across as Apple PR claiming that the most significant group of non-dGPU users would be Macbook users - a clearly false statement. Therefore, please explain what you actually mean!
Posted by Hotz
 - June 16, 2024, 15:47:32
Quote from: Alan on June 16, 2024, 15:29:45
Quote from: Hotz on June 15, 2024, 18:50:48
But as we all know these strong AMD iGPUs will only ever come in small quantities for laptops, and 9 out of 10 devices have an additional dedicated GPU, which makes the main strength of the APU obsolete.
... a lot more people don't really care about a dGPU.

Personally, I would much rather prefer a Ryzen 7840U/680M that can handle medium gaming in a thin & light form factor than a heavy gaming laptop

So while your claim that 9 out of 10 laptops will have a dGPU may be true, I doubt 9 out of 10 buyers are looking for laptops with dGPU

That was precisely my point as well (maybe I didn't express it clearly enough?). Many people looking for a laptop actually don't want a dGPU anymore. They would be satisfied with a decent iGPU. But those awesome iGPUs from AMD are mostly built into laptops which also have a dGPU built in. As in 9/10 laptops. Often gaming laptops. And thus these laptops are also more expensive. This has been the case for most of the 680M and 780M laptops, and the same stupidity will probably be repeated with most of the 880M laptops. Thus the situation is kind of a joke for the customer.
Posted by Alan
 - June 16, 2024, 15:29:45
Quote from: Hotz on June 15, 2024, 18:50:48
But as we all know these strong AMD iGPUs will only ever come in small quantities for laptops, and 9 out of 10 devices have an additional dedicated GPU, which makes the main strength of the APU obsolete.

If you look at the very large number of Macbooks being sold today, a lot more people don't really care about a dGPU.

Personally, I would much rather prefer a Ryzen 7840U/680M that can handle medium gaming in a thin & light form factor than a heavy gaming laptop that barely runs for 4 hours unplugged. Heck, even the older Vega 8 chips like the 15W TDP Ryzen 7730U work perfectly fine for today's workloads. I've seen friends play older AAA games on it and actually have fun.

So while your claim that 9 out of 10 laptops will have a dGPU may be true, I doubt 9 out of 10 buyers are looking for laptops with dGPU. There just aren't that many gamers looking for a laptop as a primary gaming device when a desktop is so much more cost effective. Plus, not all laptop buyers are gamers, so an improved iGPU will benefit people who aren't looking for a laptop with a dGPU. I believe that's why AMD has still been investing in iGPUs instead of aggressively pushing their dGPUs for the last few years. Can you even guess how many laptop models come with an AMD dGPU?
Posted by Starjack
 - June 16, 2024, 10:25:39
Sorry to go off topic, I just wanted to bring this up.
Posted by Starjack
 - June 16, 2024, 10:12:41
"I think you are confusing Intel's Xe iGPU with their Arc GPUs.

You can see tons of videos where Xe iGPU works fine with most games (and no complaints about drivers) compared to their dedicated GPUs. Do note that they are on their first gen dedicated gpu, too, but I wouldn't worry about driver support. Intel has the resources to get good at it, so it's just a matter of time."


The thing is, I don't know whether Intel designed these iGPUs for gaming, or to be good at gaming, when they seem a bit conservative in what they offer. Take this from my experience: my current laptop has a 12th Gen i3 CPU with a UHD Graphics iGPU that has 64 EUs. It runs games much better than my last AMD-powered laptop, but at least one game showed a graphics glitch. While the Xe and Arc series are much faster than the HD and GMA series iGPUs of the past, they still suffer in the same way because of issues with Intel's graphics or display drivers. It makes it seem that Intel's Xe i/dGPUs run games in greater quantity but with lower quality. The quality may only show when rendering high-definition resolutions on TV/monitor screens.
Although I did wonder, when Raja Koduri joined Intel after leaving AMD, whether what he brought to Intel might help push them into making competent graphics solutions for both the business and gaming markets. Other than delivering the Xe series, why didn't he have them change the architecture too? Instead of relying on Execution Units (EUs), why not use stream processors or shaders outright? The number of these graphics cores, especially combined with good drivers, is what makes AMD and Nvidia competent at running games. The same might apply to Apple and their custom-made iGPUs as well. And yes, I know EUs contain shaders or ALUs.
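For reference, a minimal sketch of that EU-to-shader arithmetic, assuming Intel's published Xe-LP layout of 8 ALUs per Execution Unit; the 96-EU Iris Xe entry is added here only as an illustrative comparison, not something from this thread:

# EU-to-ALU conversion sketch; 8 FP32 ALUs per EU is the Xe-LP layout.
ALUS_PER_EU = 8

igpus = {
    "12th Gen i3 UHD Graphics (64 EUs)": 64,   # the laptop mentioned above
    "Iris Xe G7 (96 EUs)": 96,                 # illustrative higher-end variant
}

for name, eus in igpus.items():
    print(f"{name}: {eus} EUs -> {eus * ALUS_PER_EU} ALUs/shaders")

So raw shader counts of recent Intel and AMD iGPUs land in the same ballpark, which is why the thread keeps coming back to drivers rather than core counts.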

"An iGPU must be able to display colors correctly. If the iGPU is slow or stutters - it's acceptable due to the nature of it having less resources than a dGPU. But it's absolutely NOT ACCEPTABLE, if it produces serious graphical glitches."

I agree, although sometimes the problem may not lie with the iGPU alone; it could also come from the Windows OS or the game itself. The Windows OS, because we might tweak its settings to get better gaming performance if we're not pleased with how the iGPU runs. The game, because it may be incompatible, run on an old graphics engine, or not support the latest DirectX version.
Posted by heffeque
 - June 16, 2024, 09:16:57
Going back to the topic, I'm wondering if Zen 5's efficiency is part of the reason that it's faster: not directly, as in "better CPU equals better performance", but indirectly, as in "since less power is needed for the CPU, more power can be dedicated to the GPU".

Obviously faster RAM, more compute units, and architecture improvements also add up.
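To make that indirect effect concrete, here is a minimal sketch with purely hypothetical wattages; the 28 W package limit and both CPU draw figures are assumptions for illustration, not measured values:

# Hypothetical shared package power budget: whatever the CPU does not
# consume under the limit can be sustained by the iGPU instead.
PACKAGE_BUDGET_W = 28.0  # assumed sustained package limit

def igpu_headroom(cpu_draw_w):
    """Power left for the iGPU once the CPU has taken its share."""
    return PACKAGE_BUDGET_W - cpu_draw_w

for label, cpu_w in [("older core (assumed)", 16.0), ("more efficient core (assumed)", 12.0)]:
    print(f"{label}: CPU {cpu_w} W -> iGPU headroom {igpu_headroom(cpu_w)} W")

Under the same limit, every watt the CPU saves is a watt the iGPU can hold onto in a sustained load.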

IMO, the HX 370 will be a beast that many people into mini-PCs, handhelds, and entry-level laptops will love.

I do see the need for an equivalent with a much smaller iGPU for devices with a higher-end discrete GPU (similar to the 7945HX).
It does seem a shame for such a great iGPU to go unused.
Posted by George
 - June 15, 2024, 22:24:54
Quote from: Hotz on June 15, 2024, 20:14:31
I would like to see Intel improve their iGPU performance to a similar state as AMD with similar hardware specs, but as of now it's not yet the case.

The problem with team Blue is, and always will be, that they are LATE to the game.

AMD and Nvidia have been doing this for decades, while Intel until fairly recently stayed away from the graphics market, as strong graphics are not really needed for displaying "office" applications.

Then the "AI" thing came around with Nvidia leading, AMD playing catch up and team Blue having nothing to show.

YES, they MAY eventually 'catch up'. However, as of 2024 Q2, to anyone wishing to use an iGPU for anything more complicated than 'office applications' I'd recommend a device with an AMD APU, and those hoping for 'serious gaming' or professional CAD/CAM will still require a device with a dGPU.

Actually, to give credit where it might be due, Intel has made great strides in the GPU department over what they were offering just a few years ago; however, that is not to say that they (AND AMD!) don't still have room for improvement.
Posted by Hotz
 - June 15, 2024, 20:14:31
Quote from: kekeqs on June 15, 2024, 19:08:54
I think you are confusing Intel's Xe iGPU with their Arc GPUs.

You can see tons of videos where Xe iGPU works fine with most games (and no complaints about drivers) compared to their dedicated GPUs.


I didn't mix it up. Examples for Iris Xe graphical issues:

- Crysis Remastered: a square part of the sky is red-colored
- Avatar: Frontiers of Pandora: everything is red-yellow
- Alan Wake 2: same issue as in Avatar
- Assassin's Creed Valhalla: the colors of the clouds (should be white) and the sky (should be blue) are inverted (clouds are blue, sky is white)
- Using the Iris Xe in the Unreal Engine 5 Editor produces similar graphical artifacts


Some people would now say "who would want to use it for gaming anyways?", but that isn't an excuse. An iGPU must be able to display colors correctly. If the iGPU is slow or stutters, that's acceptable given that it has fewer resources than a dGPU. But it is absolutely NOT ACCEPTABLE if it produces serious graphical glitches. The Iris Xe issues haven't been fixed in 3 years and will never be fixed.

Note that even a 4-year-old Vega 8 iGPU has not a single one of these errors. And don't get me started on the frametimes of the Iris Xe. They're sooo bad. The truth is, the Iris Xe hasn't gotten much love. As such, I would never recommend anyone buy an Intel CPU of the 12th, 13th, or 14th generation if that person wants to do any 3D work or casual gaming on the iGPU. That person should either use AMD or wait for Intel Arrow Lake.


Intel has become better with Arc graphics, which I think has fixed the above-mentioned color issues, and the frametimes are also slightly better than before. But overall, in frametimes and fps, the Arc iGPU still loses in far more games than the AMD iGPU. I've watched benchmarks over and over again, and while in some cases Arc can keep up with or slightly beat AMD, the majority of games simply run better on AMD.


What I specifically mean by "Intel could learn a lot from AMD" is that they should make the iGPU work better together with the CPU. AMD has nearly perfected iGPU-CPU cooperation and is thus roughly 1.5+ times faster with the same number of shader cores. Or, the other way round, Intel needs 1.5+ times the shader cores to get the same gaming results as AMD.

For example:
Intel needs 1024 shader cores to achieve the same performance as AMD with 768 shader cores (Intel Arc 8-Xe-core iGPU vs AMD Radeon 780M).
Intel needs 512 shader cores to achieve the same performance as AMD with 256 shader cores (Intel Arc 4-Xe-core iGPU vs AMD Radeon 740M, where AMD is actually 2.0 times more efficient).
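A minimal sketch of that per-shader comparison, using only the core counts from the two pairings above and taking the poster's "same performance" premise as given (no measured FPS figures are assumed):

# Per-shader advantage implied by "equal performance at different shader counts".
pairings = [
    ("Arc 8-Xe-core iGPU vs Radeon 780M", 1024, 768),
    ("Arc 4-Xe-core iGPU vs Radeon 740M", 512, 256),
]

for name, intel_shaders, amd_shaders in pairings:
    advantage = intel_shaders / amd_shaders
    print(f"{name}: AMD delivers ~{advantage:.2f}x the performance per shader")
# -> roughly 1.33x and 2.0x for these two pairings.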


I would like to see Intel improve their iGPU performance to a similar state as AMD with similar hardware specs, but as of now it's not yet the case.
Posted by Bluegizmo83
 - June 15, 2024, 20:02:16
Quote
It could bring more powerful mini PCs and, most importantly, better-performing gaming handhelds (preorder ROG Ally X on Best Buy).

That's messed up. Don't intentionally mislead people in your articles like this! The way that "preorder the Ally X" was thrown onto the end of that sentence makes it sound like you're telling people to pre-order the Ally X to get the better gaming performance of the new AMD Strix 890M chip, which the Ally X does NOT have!
Posted by CnH
 - June 15, 2024, 19:39:13
Quote from: kekeqs on June 15, 2024, 19:08:54
no complaints

No, Xe has serious issues too. Notably, in RPCS3 use it draws a lot more power than AMD's iGPUs.

Hope Lunar Lake fixes this and makes massive strides in this area.

Honestly a bit disappointed with RDNA 3.5 only matching an RTX 2050, but I suppose it's massively bandwidth-limited, being an APU using dual-channel shared memory.
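For context on the bandwidth point, a minimal sketch of peak memory bandwidth arithmetic; the memory configurations below are assumptions chosen for illustration, not figures from the article:

# Peak bandwidth = transfer rate (MT/s) * bus width (bits) / 8 bits per byte.
def peak_bandwidth_gbs(transfer_mts, bus_width_bits):
    return transfer_mts * bus_width_bits / 8 / 1000  # GB/s

# Assumed configurations for illustration:
print(f"Dual-channel LPDDR5X-7500 (128-bit): {peak_bandwidth_gbs(7500, 128):.0f} GB/s, shared with the CPU")
print(f"64-bit GDDR6 at 14000 MT/s (RTX 2050-class): {peak_bandwidth_gbs(14000, 64):.0f} GB/s, dedicated")

The shared-memory APU also has to split its bandwidth between CPU and iGPU, which is the limitation being pointed at here.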
Posted by julia_beautiful
 - June 15, 2024, 19:13:44
Now I am clear: the future of my next ultrabook will be an AMD Zen 5 with one of these iGPUs, and the product I want is the MINISFORUM V3, but updated with Zen 5. 70% of the population is in this type of market niche, and practically 100% of SMEs and small businesses, so I see a promising future for this iGPU and AMD processors. It is also a sector where everyone will want this AMD Zen 5 processor: it will be essential for handheld consoles to have maximum power with minimum consumption.
Posted by kekeqs
 - June 15, 2024, 19:08:54
Quote from: Hotz on June 15, 2024, 18:50:48
That's impressive, especially with the knowledge that AMD graphics perform much better in gaming than in synthetic benchmarks. Intel is exactly the reverse: good in synthetic benchmarks but bad in games. Intel could learn a lot from AMD driver-wise.

But as we all know these strong AMD iGPUs will only ever come in small quantities for laptops, and 9 out of 10 devices have an additional dedicated GPU, which makes the main strength of the APU obsolete. (It probably differs for mini-PCs, but that is another story with all these no-name brands offering no warranty and support.) At the same time, Intel iGPUs will come in big quantities but will be significantly weaker.

The whole situation is a joke for the customer.

I think you are confusing Intel's Xe iGPU with their Arc GPUs.

You can see tons of videos where Xe iGPU works fine with most games (and no complaints about drivers) compared to their dedicated GPUs. Do note that they are on their first gen dedicated gpu, too, but I wouldn't worry about driver support. Intel has the resources to get good at it, so it's just a matter of time.

As for who is stronger, I don't know, but a product that's easily available is better than one that's not. AMD has had such issues keeping enough laptop stock that you start to wonder if there might still be yield problems.
Posted by Hotz
 - June 15, 2024, 18:50:48
That's impressive, especially with the knowledge that AMD graphics perform much better in gaming than in synthetic benchmarks. Intel is exactly the reverse: good in synthetic benchmarks but bad in games. Intel could learn a lot from AMD driver-wise.

But as we all know these strong AMD iGPUs will only ever come in small quantities for laptops, and 9 out of 10 devices have an additional dedicated GPU, which makes the main strength of the APU obsolete. (It probably differs for mini-PCs, but that is another story with all these no-name brands offering no warranty and support.) At the same time, Intel iGPUs will come in big quantities but will be significantly weaker.

The whole situation is a joke for the customer.

Posted by Redaktion
 - June 15, 2024, 17:50:03
Golden Pig Upgrade shared the 3DMark Time Spy benchmark results of the Radeon 890M iGPU inside the Ryzen AI 9 HX 370 on Bilibili. The integrated graphics card with 16 compute units (CUs) reportedly scores over 3600 in the benchmark, which puts it very close to the Nvidia RTX 2050 GPU.
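As a rough illustration of how close "over 3600" is, a minimal sketch; the RTX 2050 reference score below is an assumed placeholder, since actual Time Spy results for that GPU vary considerably with TGP and laptop model:

# Reported 890M score vs an assumed RTX 2050 reference (placeholder value).
radeon_890m_score = 3600   # lower bound reported in the leak
rtx_2050_score = 3500      # assumed placeholder, not from the article

gap_percent = (radeon_890m_score - rtx_2050_score) / rtx_2050_score * 100
print(f"890M vs assumed RTX 2050: {gap_percent:+.1f}%")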

https://www.notebookcheck.net/AMD-Radeon-890M-iGPU-of-Ryzen-Strix-Point-reportedly-scores-close-to-RTX-2050-on-3DMark-Time-Spy.848589.0.html