
Topic summary

Posted by GeorgeS
 - January 29, 2025, 01:30:13
Quote from: Russel on January 28, 2025, 21:57:50
When AMD claims superiority over the 4070, it just invites ridicule, as we can see in the comment section. 😂

Those of us with memories of the last few weeks/months/years might recall the fairly recent, similar-sounding claim that "the PS5 Pro can outperform a 4070".

I think we all know how that turned out. :)

Granted, AMD has made GREAT strides in their APUs, and I can dream of a day when I can get a 12-14" AMD-powered laptop for < $1000 and play AAA games on it at 1080p without needing a dGPU.

IMHO: out of AMD, Apple, Intel, Qualcomm & Nvidia (if they ever enter the APU laptop market), I'm betting on AMD to deliver APUs gamers can play games on.

Posted by Russel
 - January 28, 2025, 21:57:50
Quote from: APT on January 27, 2025, 12:42:24
Quote from: Russel on January 27, 2025, 11:41:02
The comparison is more about the incremental upgrade of the Z13 from the previous generation to the current one.

In performance? Sure. But I think people will be pleasantly surprised when it comes to efficiency and battery life. When everything is done on a single chip package, without having to communicate with external chips over longer distances or with GDDR memory, there are often huge energy savings (which also benefit the device's heat, thermals, and noise).

I think people in the US are better off getting last year's G14, since deals can be had there for $1,000, so paying this much doesn't make much sense; you can buy a cheap tablet if you really need stylus support. But for people in Europe, where most RTX 4070 devices are over 2,000 anyway, this doesn't seem like a bad deal anymore.

I don't think of this as a barely improved 2023 ROG Flow Z13, but more as a bigger ROG Ally with three times the performance and more memory and storage.

The only worry I have is the warranty. I don't really trust Asus when it comes to their warranty service or reliability. I hope this device is sold at retail stores with good after-sale policies (e.g., Costco).

I wasn't trying to deny them any progress made in efficiency.
I was merely stating that it's rather deceptive to portray this as a victory over the 4070 Mobile, because a 4070 Mobile in a Legion 5 Pro would perform much better.
So rather than saying that AMD's chip comes out on top, it's more correct to say that Asus has improved what it offers with the Z13 by a lot.

When AMD claims superiority over the 4070, it just invites ridicule, as we can see in the comment section. 😂
Posted by Bennyg
 - January 28, 2025, 07:21:31
"more efficient than A 4070" just doesn't do the baiting of clicks that "more efficient than THE 4070" does...

A TDP-gimped 4070 is NOT representative of the vast majority of 4070s, which run at >=100 W.

With this kind of misleading garbage, AMD really is becoming the new Intel.
Posted by paulzrecs
 - January 28, 2025, 05:23:54
That 4070 is running at 60 W.
Posted by opckieran
 - January 28, 2025, 04:34:03
Quote from: GeorgeS on January 27, 2025, 19:48:22
While it is generally up to the DEVICE OEM to decide what (if anything!) they wish to pair an APU with, this level of performance in an iGPU pretty much negates the need or desire for a dGPU in anything but professional CAD/CAM applications.

Sadly, the technology and performance may never trickle down into sub-$700 devices (or might be a LONG TIME coming).

TL;DR: While I was writing my initial response below, it dawned on me that no Halo laptop will ever cost $700. That would severely undermine the entirety of AMD's and Intel's mobile SoC lineups: why would a non-Halo laptop be worth more than $700 if the Halo SoCs are far and away AMD's and Intel's best chips? The answer would have to be nVidia, and neither AMD nor Intel would want to create a situation where they significantly undervalue their own lineups while simultaneously handing nVidia, by their own admission, an opening to market itself as the premium or must-have option (i.e., "it isn't a real laptop without nVidia"). There's no way they'd ever allow consumers to perceive AMD or Intel so poorly, and nVidia so highly, at the same time.

In more detail:
AMD initially said their 300 series wouldn't allow expandable RAM, but this was proven incorrect/bypassed in the NUC market. As for Strix Halo not allowing a dGPU, they've already got the more CPU-oriented 9955HX(3D) SoC, with 16 classic Zen 5 cores and a smaller iGPU, specifically geared for maximum CPU performance and the gaming market, so there's no need for them to step on their own toes here.

AMD doesn't primarily see Strix Halo as a performance achievement. They've actually been making large-iGPU SoCs since the PS4/Xbox One era, so for nearly 15 years (they were taping out well before those consoles launched). To them, this is purely an engineering showcase, and that's why they feel justified in asking a supposed $2,200+ for Strix Halo laptops (again, my money is on a $1,600-1,800 launch). In reality, it won't be any faster than a 5060 on a good day at the same wattage, but because it's a SINGLE chip, customers are supposed to be impressed (whoopee), even though battery life isn't gonna be much better. For $2,200, I'd rather just get a slim 5070 Ti laptop with a really good screen and maybe even save $300-500 at the same time. Unless running 20-32B-parameter LLMs or other similarly large AI workloads locally is that important to you; even then, I'd demand a minimum of 48 GB of RAM to do that comfortably.

IMO, this is most likely AMD's way of exiting the laptop dGPU market, as Intel already has (they saw the writing on the wall after a single generation). They're struggling enough as is in the desktop dGPU market, and are almost certainly doing significantly WORSE in laptops. I'd estimate that if they've got 8-15% market share in desktop dGPUs, they're probably at 0.2-0.8% for laptop dGPUs. You barely see ANY laptops with AMD dGPUs in them, probably because efficiency matters a lot more in laptops AND you don't get the crucial perf/$ and/or VRAM-capacity advantage that you do in the desktop space. AMD's own site, TODAY, lists a grand total of NINE RX 7000 series laptops, with more than half of those listings being for the Chinese market. It's depressing. Not even boutiques like Eurocom, Sager, or Alienware offer AMD dGPU laptops today. Shoot, the 7800M launched back in September, and the only product it's currently offered in is an outrageously priced $1,150 OcuLink+USB4 dock, which only gets you desktop 4070 or mobile 4080 performance anyway! No price advantage! I'd rather just DIY at that point and save the money!

Point is, that's just how non-competitive AMD is in the laptop dGPU space, and has been for easily a decade now, not even exaggerating. I'd bet money that AMD is finally throwing in the towel on laptop dGPUs, if not after the RX 7000S/M(XT), then definitely after the RX 90M. But what sucks here is that even at a supposedly "low" $1,800, AMD is STILL overestimating its brand compared to nVidia. Who on earth is actually going to spend $1,800+ on a non-Apple device and be satisfied that it didn't come with nVidia? At that price, it's probably 1-3% of the high-end laptop market: a niche, most likely high-end Linux laptop enthusiasts. And unless AMD provides an actually substantial alternative to CUDA, it won't get any better for what they're trying to sell to that userbase.

And that isn't even taking into account the up-and-coming Lunar Lake Halo! So AMD really has their work cut out for them. In the event of a price war, Strix Halo will be dragged by Intel (ironically, with an iGPU of all things) kicking and screaming down to the more feasible $1,200-1,300 price range, where it will then be seen as an "interesting" alternative to the 5070M at best. I doubt $700 would happen, since Krackan Point is supposed to live in the $700s and below. Plus, AMD is currently dominating x86 CPUs in high-end perf and perf/watt anyway, so they have a perfectly justifiable reason to charge more than they did back in the day. Who knows, maybe Lunar Lake Halo could hit $1,100 in such a scenario (the increased BoM for on-package RAM might make that difficult). But then that's a stretch for a different reason: it's always been easier for Intel to charge more for less, in part due to their terrible ethics shutting AMD out of higher-quality laptop builds.
Posted by GeorgeS
 - January 27, 2025, 19:48:22
Quote from: APT on January 27, 2025, 10:12:03
Quote from: GeorgeS on January 27, 2025, 03:55:37
However, will APUs with this iGPU really need to be paired with a dGPU?

Just to be clear, AMD has already stated multiple times now that this chip will 100% not be coming with any dGPUs.

While it is generally up to the DEVICE OEM to decide what (if anything!) they wish to pair an APU with, this level of performance in an iGPU pretty much negates the need or desire for a dGPU in anything but professional CAD/CAM applications.

Sadly, the technology and performance may never trickle down into sub-$700 devices (or might be a LONG TIME coming).
Posted by opckieran
 - January 27, 2025, 18:49:07
Quote from: APT on January 27, 2025, 10:12:03
Quote from: opckieran on January 26, 2025, 22:00:37
or they're about to launch a chip that will most likely be placed in devices starting at $1600-1800 on the low end

It's already been listed on the Asus US web store at $2,199.99 for the 16-core / 32 GB variant. I think the $1,600 expectation might be a bit low; I'd expect the lower-end 12-core version to be listed around $1,999.99. It might drop to the prices you listed in a year's time, during a Cyber Monday / Christmas sale, or whenever Nvidia launches their mobile PC SoC as competition.

Quote from: opckieran on January 26, 2025, 22:00:37
just to compete with the upcoming 5060, which will be far more accessible for gaming (likely to launch in devices around $900-1000). Given enough VRAM
...
provided the GPU is allocated enough VRAM.

And that's the problem with the way Nvidia has segmented their lower-end mobile budget lineup. Who in their right mind is going to buy an 8 GB VRAM GPU in 2025 (even if the RTX 5060 is found in $1,000 laptops, which I doubt due to annual price increases, not to mention Trump tariffs) when there's already plenty of evidence on YouTube showing that it clearly isn't enough now? The Nvidia mobile GPUs that come with more VRAM likely won't be any cheaper than the Flow Z13.

I'm hesitant to put too much stock in pre-order listings this far out, but at $2,200 this APU simply makes no sense for a 16-core / 32 GB RAM config, given that nVidia announced that 5080 laptops would also start at $2,200. A $2,200 Halo APU laptop would have to come with 64 GB of RAM at minimum to stand apart from any 5080 laptop. When I speculated a $1,600-1,800 entry launch price, that was to allow them to differentiate the 395 from the 370 while still being a sane pick against a 5070 or a proposed Lunar Halo laptop (though the more I think about it, there's no way I wouldn't just go with the 5070 Ti, which nVidia's official site rates at 60-115 W, meaning it could also pair with a 370 and stay in the 120 W range).


It's entirely likely that 5060 laptops will launch around $1,000, or at most $1,100, given that x60-series laptops have all launched at that price +/- $100 for at least the past 3 generations, and that nVidia also announced that 5070 laptops will start at $1,300. I can't see the entry price being much above $1,000 at launch, assuming no tariffs.


Speaking of which, tariffs haven't been confirmed yet, but they would impact the market as a whole, so it's not as if they'd grant a price advantage to any particular manufacturer anyway.


Finally, I hold the wildly unpopular opinion that 8 GB of VRAM is actually fine for a midrange GPU. For starters, the most actively played games today are several years old and have very low VRAM requirements; it's those games and gamers that 8 GB GPUs are designed for, IMO. Additionally, most users would not be able to tell the difference between medium and high/ultra details in fast-paced games where FPS matters more. I think nVidia understands this better than AMD or Intel, especially given how much work they are putting into accelerating AI and DLSS. Unfortunately, nVidia is very much aware that there is currently no equal to them in those areas, and thus they can leverage that to charge oodles more for a pittance of extra VRAM.
Posted by APT
 - January 27, 2025, 12:42:24
Quote from: Russel on January 27, 2025, 11:41:02
The comparison is more about the incremental upgrade of the Z13 from the previous generation to the current one.

In performance? Sure. But I think people will be pleasantly surprised when it comes to efficiency and battery life. When everything is done on a single chip package, without having to communicate with external chips over longer distances or with GDDR memory, there are often huge energy savings (which also benefit the device's heat, thermals, and noise).

I think people in the US are better off getting last year's G14, since deals can be had there for $1,000, so paying this much doesn't make much sense; you can buy a cheap tablet if you really need stylus support. But for people in Europe, where most RTX 4070 devices are over 2,000 anyway, this doesn't seem like a bad deal anymore.

I don't think of this as a barely improved 2023 ROG Flow Z13, but more as a bigger ROG Ally with three times the performance and more memory and storage.

The only worry I have is the warranty. I don't really trust Asus when it comes to their warranty service or reliability. I hope this device is sold at retail stores with good after-sale policies (e.g., Costco).
Posted by Russel
 - January 27, 2025, 11:41:02
The comparison is more about the incremental upgrade of the Z13 from the previous generation to the current one.
AMD vs. Intel or AMD vs. Nvidia is probably not where the focus should be 😂
Posted by APT
 - January 27, 2025, 10:12:03
Quote from: Kiri11 on January 26, 2025, 21:39:04
Does it support FSR 4? If yes, this is amazing!

No. FSR4 is based on specific features that require RDNA4 hardware, and this is RDNA 3.5, unfortunately. They might eventually backport some small improvements to older architectures (RDNA 3/3.5), but don't expect that anytime soon, and don't expect such a version to be remotely close in quality to FSR4 running on the hardware it was purpose-built for.

Quote from: opckieran on January 26, 2025, 22:00:37
or they're about to launch a chip that will most likely be placed in devices starting at $1600-1800 on the low end

It's already been listed on the Asus US web store at $2,199.99 for the 16-core / 32 GB variant. I think the $1,600 expectation might be a bit low; I'd expect the lower-end 12-core version to be listed around $1,999.99. It might drop to the prices you listed in a year's time, during a Cyber Monday / Christmas sale, or whenever Nvidia launches their mobile PC SoC as competition.

Quote from: opckieran on January 26, 2025, 22:00:37
just to compete with the upcoming 5060, which will be far more accessible for gaming (likely to launch in devices around $900-1000). Given enough VRAM
...
provided the GPU is allocated enough VRAM.

And that's the problem with the way Nvidia has segmented their lower-end mobile budget lineup. Who in their right mind is going to buy an 8 GB VRAM GPU in 2025 (even if the RTX 5060 is found in $1,000 laptops, which I doubt due to annual price increases, not to mention Trump tariffs) when there's already plenty of evidence on YouTube showing that it clearly isn't enough now? The Nvidia mobile GPUs that come with more VRAM likely won't be any cheaper than the Flow Z13.

Quote from: GeorgeS on January 27, 2025, 03:55:37
AMD could have simply compared their new product to one with a 4050M or 4060M (or even the mentioned 4070M), with ALL at full power (or max TDP).

Indeed, there is no shame in 'only' being as fast as a full-powered 4050M-4060M, when those are up to 300-400% faster than the next-closest currently sold iGPU. I'm guessing they did this because RTX 4070 laptops are sold at higher margins / an additional premium (relative to 4050M / 4060M devices), and they're trying not to sell Strix Halo for any less than $2,000. It would look bad comparing it to a 4050M when those can already be found in $1,000 laptops, wouldn't it?

Quote from: GeorgeS on January 27, 2025, 03:55:37
After all, on an APU, how much "control" does AMD have over the ACTUAL power draw of a subsection of the APU, and how well can they measure it?

I think we'll have control over this, if not through Radeon software then through third-party tools. Correct me if I'm wrong, as it's been a while since I last saw their videos, but when ThePhawx and Geekerwan do APU reviews, don't they sometimes control how much wattage goes to either the GPU or the CPU in their benchmark tests?
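
On the measurement side, here's a minimal sketch (mine, purely illustrative) of how one might log the iGPU's reported power draw on Linux, assuming the amdgpu driver exposes a hwmon "power1_average" sensor in microwatts. Sensor names and availability vary by kernel, driver, and APU, so don't take the exact path as a guaranteed interface:

import glob
import os
import time

def find_amdgpu_power_sensor():
    # Scan hwmon devices for one registered by the amdgpu driver
    # that also exposes an average power reading.
    for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
        try:
            with open(name_file) as f:
                if f.read().strip() != "amdgpu":
                    continue
        except OSError:
            continue
        sensor = os.path.join(os.path.dirname(name_file), "power1_average")
        if os.path.exists(sensor):
            return sensor
    return None

sensor = find_amdgpu_power_sensor()
if sensor is None:
    print("No amdgpu power sensor found (kernel/driver dependent).")
else:
    for _ in range(10):
        with open(sensor) as f:
            watts = int(f.read().strip()) / 1_000_000  # hwmon reports microwatts
        print(f"iGPU power draw: {watts:.1f} W")
        time.sleep(1)

Whether that number covers the GPU tile alone or more of the package depends on the chip's telemetry, which is exactly the uncertainty GeorgeS is pointing at.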

Quote from: GeorgeS on January 27, 2025, 03:55:37
However, will APUs with this iGPU really need to be paired with a dGPU?

Just to be clear, AMD has already stated multiple times now that this chip will 100% not be coming with any dGPUs.
Posted by Mr Majestyk
 - January 27, 2025, 05:28:15
Quote from: GeorgeS on January 27, 2025, 00:21:30
Quote from: Mr Majestyk on January 27, 2025, 00:07:03
Quote from: GeorgeS on January 26, 2025, 22:15:54
So quite obviously, for clickbait, AMD severely limited the 4070M's TDP to be 'similar' to the 8060S.

So is anyone terribly surprised that the 8060S is more efficient than the 4070M?

The 4070M has a TDP of 115 W and the Strix 390 has a TDP of 120 W, so I doubt the 4070M is being gimped too badly in this comparison. Let's not forget the 4070M is a mobile version of the pathetic desktop 4060, not the 4070.

ROTFLMAO!!!!

So the Strix 390 has a TDP of 120 W. Of that 120 W, how much is the 8060S using? 20 W? 40 W?

To conclude: they GIMPED a 4070M to <= 1/3 power (or even less) to chart some fanboy clickbait!!

LOL!!!

You can quickly tell how stupid someone is when all they can do is throw insults. I don't own any AMD laptop, so I couldn't care less about Halo or the 4070M, but it sounds like you are butthurt. You should be posting on wccftech with all the other losers.
Posted by GeorgeS
 - January 27, 2025, 03:55:37
Quote from: opckieran on January 27, 2025, 00:55:00
With that in mind, though, the 13900H was a terrible CPU to pair it with, given how terribly thirsty it runs. It's mind-boggling that AMD didn't compare their 8060S with ASUS's Zephyrus or ProArt models, which are currently available equipped with AMD's own Strix Point HX 370 SoC as well as a 4070.

But then again, such an apples-to-apples comparison probably wouldn't help AMD's narrative. Considering this, the upcoming iGPU will probably be more competitive with a 75 W 4060M or 5050M as opposed to a 100 W 4070, which isn't saying much to begin with, since the 4070M really should have been a 4060 Ti Mobile.

AMD could have simply compared their new product to one with a 4050M or 4060M (or even the mentioned 4070M), with ALL at full power (or max TDP).

Then we'd be talking apples vs. apples; after all, who 'gimps' their own system?

Fully 'trading blows' with dedicated mobile GPUs would be IMPRESSIVE in itself, right?

BUT that would not be impressive enough for the narrative & resulting hype AMD wanted to generate, so they instead used untraceable/unrepeatable methods to 'gimp' a 4070M to base their tests on.

After all, on an APU, how much "control" does AMD have over the ACTUAL power draw of a subsection of the APU, and how well can they measure it? (The same goes for the 4070M.)

Will the new iGPU outperform prior iGPUs? Sure.

However, will APUs with this iGPU really need to be paired with a dGPU? Maybe for the BEST gaming performance, but it is rather debatable how many of the general public will want MORE performance than what this APU can provide.

IMHO: the death of the dGPU in all but workstations/desktops might be near. :)
Posted by opckieran
 - January 27, 2025, 00:55:00
Quote from: GeorgeS on January 27, 2025, 00:21:30
Quote from: Mr Majestyk on January 27, 2025, 00:07:03
Quote from: GeorgeS on January 26, 2025, 22:15:54
So quite obviously, for clickbait, AMD severely limited the 4070M's TDP to be 'similar' to the 8060S.

So is anyone terribly surprised that the 8060S is more efficient than the 4070M?

The 4070M has a TDP of 115 W and the Strix 390 has a TDP of 120 W, so I doubt the 4070M is being gimped too badly in this comparison. Let's not forget the 4070M is a mobile version of the pathetic desktop 4060, not the 4070.

ROTFLMAO!!!!

So the Strix 390 has a TDP of 120 W. Of that 120 W, how much is the 8060S using? 20 W? 40 W?

To conclude: they GIMPED a 4070M to <= 1/3 power (or even less) to chart some fanboy clickbait!!

LOL!!!

Doing a little digging, it appears that the tested 4070M laptop was a ROG Flow Z13, where the 4070M was limited to 65 W. That isn't terrible considering that, in reality, it tops out around 100 W in terms of any appreciable performance gain, so it's effectively running at about two-thirds power.
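
To put quick numbers on that (the 65 W limit is from the PDF linked below; the ~100 W "useful ceiling" is my own rough assumption, and 115 W is Nvidia's spec maximum for the 4070 Mobile):

tested_tgp = 65    # W, the Flow Z13's 4070M power limit
useful_cap = 100   # W, assumed point past which extra TGP yields little gain
spec_max = 115     # W, Nvidia's maximum TGP for the 4070 Mobile
print(f"vs. useful ceiling: {tested_tgp / useful_cap:.0%}")  # 65%, about two-thirds
print(f"vs. spec maximum:   {tested_tgp / spec_max:.0%}")    # roughly 57%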

With that in mind, though, the 13900H was a terrible CPU to pair it with, given how terribly thirsty it runs. It's mind-boggling that AMD didn't compare their 8060S with ASUS's Zephyrus or ProArt models, which are currently available equipped with AMD's own Strix Point HX 370 SoC as well as a 4070.

But then again, such an apples-to-apples comparison probably wouldn't help AMD's narrative. Considering this, the upcoming iGPU will probably be more competitive with a 75 W 4060M or 5050M as opposed to a 100 W 4070, which isn't saying much to begin with, since the 4070M really should have been a 4060 Ti Mobile.


https://www.amd.com/content/dam/amd/en/documents/partner-hub/ryzen/ryzen-ai-max-series-how-to-sell-guide-competitive.pdf
Posted by GeorgeS
 - January 27, 2025, 00:21:30
Quote from: Mr Majestyk on January 27, 2025, 00:07:03
Quote from: GeorgeS on January 26, 2025, 22:15:54
So quite obviously, for clickbait, AMD severely limited the 4070M's TDP to be 'similar' to the 8060S.

So is anyone terribly surprised that the 8060S is more efficient than the 4070M?

The 4070M has a TDP of 115 W and the Strix 390 has a TDP of 120 W, so I doubt the 4070M is being gimped too badly in this comparison. Let's not forget the 4070M is a mobile version of the pathetic desktop 4060, not the 4070.

ROTFLMAO!!!!

So the Strix 390 has a TDP of 120 W. Of that 120 W, how much is the 8060S using? 20 W? 40 W?

To conclude: they GIMPED a 4070M to <= 1/3 power (or even less) to chart some fanboy clickbait!!

LOL!!!
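
For anyone who wants the back-of-the-envelope version (the 40 W share for the 8060S is pure guesswork on my part, NOT an AMD figure):

igpu_guess = 40    # W, guessed 8060S share of the 120 W Strix package TDP
full_4070m = 115   # W, Nvidia's maximum TGP for the 4070 Mobile
print(f"{igpu_guess / full_4070m:.0%} of full 4070M power")  # ~35%, roughly 1/3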

Posted by Mr Majestyk
 - January 27, 2025, 00:07:03
Quote from: GeorgeS on January 26, 2025, 22:15:54
So quite obviously, for clickbait, AMD severely limited the 4070M's TDP to be 'similar' to the 8060S.

So is anyone terribly surprised that the 8060S is more efficient than the 4070M?

The 4070M has a TDP of 115 W and the Strix 390 has a TDP of 120 W, so I doubt the 4070M is being gimped too badly in this comparison. Let's not forget the 4070M is a mobile version of the pathetic desktop 4060, not the 4070.