Topic summary

Posted by missingxtension
 - June 02, 2024, 14:45:48
Screw ray tracing, I am not rendering anything in Blender at the moment. Long-time video card user, since the Riva TNT, Radeon 9800, etc. I got an A770 because I don't consider any card on the market acceptable. I was going to make the jump from a 670 to a 2060 but went to an RX 580 instead. Narrow buses with low RAM and bigger caches are not the kind of thing I want to be supporting. It's about the damn principle; I am not buying a stupid 3070 (a rehashed 1060 with RTX) at the pricing of a 670-series card. Get the f*** outta here with that s***. Then AMD jumps on that bandwagon with a rebranded RX 480, only to follow it up with an RX 6600 or even RX 6700. That's not an upgrade path I want to be a part of.
Posted by TLDR
 - June 01, 2024, 17:00:14

Yes we can.

As long as it's after we stop pretending AMD is taking any client CPU market share and that they're still a poor underdog despite making billions. That it's acceptable their client division still can't supply enough mobile chips for 4+ generations in a row due to TSMC supply.

Funny how this very same source seems to have no issues supplying Epyc CPUs for enterprise data centres, or Meteor Lake laptops, or soon SXE laptops. It's almost as if they don't really care about actually competing, just like with their Radeon dGPUs (both on pricing and driver quality).

Isn't it also funny how there's been no mention of Strix Point handhelds, just 7840U and Hawk Point refreshes? At the rate AMDelay is operating, we're likely going to see Lunar Lake handhelds before seeing anything STX-based.
Posted by Drefsab
 - June 01, 2024, 10:48:43
Wow, talk about missing the whole point. No one was claiming that Arc was the best out there. Arc really is still for early adopters, but that misses the critical factor: Arc is important because it adds another player to a GPU market where Nvidia is king and AMD is playing catch-up.

Just look at how each Arc driver release brings major performance improvements; add that to the hardware improvements in Battlemage and Celestial, and in a few short hardware generations we should have a good option for the wider market. First-gen hardware was never going to go toe to toe with Nvidia etc.

I honestly expect that come Druid you will see a truly viable option for the wider public, and hopefully at reasonable prices. Show me another company trying to take on Nvidia and AMD with any measure of success.
Posted by TJ
 - June 01, 2024, 01:25:47
Gamers don't care about how much silicon or how many transistors are used in a GPU, you dumb f.
They care about price to performance.

Who hired/let you write this POS, and how did this end up in my feed?

Is there no screening for common sense before publishing articles nowadays?
Posted by DrCoonan
 - May 31, 2024, 23:49:05
What a cute opinion piece. Puff piece even. Usually the editor is older and knows more but that's ok. You're a smart kid.
Intel IS beating Radeon in several areas. Great performance in all titles is the last area in which Intel is behind. However, ATI/RTG has been in this game longer than Nvidia, is constrained only by the budget AMD allocates it, and hasn't beaten Nvidia in any meaningful way since before AMD owned them. Adopting pricing similar to Nvidia's was their downfall. It gave Intel the time and space it needed to get a foothold in the business: come in low-priced, continue driver development alongside GPU production, and wait for awesome drivers before selling a high-end GPU that beats AMD and rolls up on Nvidia's best. And so far, so good. AMD still sells for too much. Nvidia prices are whack. Intel prices have been good, worth it even recently. They all need to be reduced and streamlined, but for now, pre-Battlemage (which we all know is in testing), they are still great value for money and keep getting better, unlike Intel's stock.

I see all of these as markers along the path to long-term stability and usability. Intel focused far more on productivity, 3D rendering, CAD, streaming, encode/decode and performance in the apps that enable them from day one, like AMD did with Ryzen. Once the process was refined they poured in the nitrous, and now Ryzen is a top gaming chip for the first time since the Athlon 64 X2 vs Pentium D. Intel is doing the same: building up rock-solid fundamentals so their GPUs make sense in the average work notebook or desktop and can play games with the newest features and buzzwords, at a price that craps on all comers.

"The better the drivers get, the better your deal gets" has been a great ride too. My A750 went from losing to a GTX 1060 to crushing my ol' GTX 1080. Can't laugh at that. Plus there are games that my GTX 1080 will never truly be optimized to run, despite its incredible raster performance, due to planned obsolescence; games that my A750 can finally run rings around it in. The A750 is on par with a 3060 in the games it's optimized for. Granted, Nvidia is optimized for almost everything, but the fact that a $190 MSRP GPU with 8 GB of VRAM is beating a card with more VRAM at 4K isn't something to scoff at. It beats my GTX 1080 at 4K, and my GTX 1080 was my first 4K card and was damn good at it too. I'm not much for frame manipulation like XeSS or frame gen, but they can be used, even if you use AMD's FSR and frame gen on it. I don't need double frames if it comes with double latency and it's noticeable, thank you.

In the end I just wanted to point out, in a way anyone could read and tell, that Intel is doing God's work right now as far as GPUs go, and I will only be upset if Battlemage increases in price. If AMD doesn't pour that sugar into its GPU R&D soon, Intel WILL surpass them. It's a when, not an if.
Posted by vertigo
 - May 31, 2024, 20:35:42
Competition is almost always a good thing, even if it's not great, though A's logic is sound and, as they said, in this case it may be good and bad. Still, as much as I hate Intel, I actually considered the A770, and might have gone with it if it were cheaper, but at least when I was looking it wasn't enough of a savings over the 6700 XT I went with, which has similar-to-better performance and is likely more stable, with fewer of the growing-pains issues.

But as rapid as Intel's progress has been, with how hard their driver team is working, I don't find it unreasonable at all to see Battlemage being very competitive with AMD on price and performance, and with Nvidia on price and possibly price/performance. It also seems likely it will require far less tinkering than Alchemist, which is the other reason I decided on AMD: I didn't want to have to mess with it, I just wanted it to work (not to say AMD is perfect in that sense, but far better than Intel from what I've seen).

I suspect this isn't just about getting into the gaming dGPU market, or even about AI (though that's likely a big part of it), but also about improving their iGPUs to make their mobile chips more competitive. So even if they lost money on Alchemist, and even if they continue to do so on Battlemage, it may be worth it overall, especially in the long run, and the refusal by many companies to spend/lose money now to make more later is why they're now finding themselves struggling to begin with. Either way, even if they're selling them at a loss, that's certainly not "setting gamers up for disappointment." How would it be a disappointment to get a card with competitive performance at a competitive price just because Intel is losing money on it?

The main concern would be that it carves away sales from AMD and then Intel drops out, leaving a weakened AMD to continue against Nvidia. But with Nvidia increasingly ignoring low- and mid-range gamers and AMD ignoring the high end, at least Intel is providing some competition in part of the market, where without them each segment wouldn't really have much.
Posted by RobertJasiek
 - May 31, 2024, 19:25:11
You sound as if Intel, a relatively new GPU competitor, could quickly compete at the top. This is not how the chip race works!

AMD and Intel might do a bit better by licensing CUDA and Tensor cores, but I do not know whether Nvidia is obliged to offer them for licensing. Even then, they would need to rewrite libraries for these cores, and that is also very difficult, with speed factors of up to 6x for a given hardware generation. Nvidia has decades of code experience, and AMD and Intel cannot catch up quickly just by licensing cores.
Posted by A
 - May 31, 2024, 18:27:59
Quote from: Another Person on May 31, 2024, 07:49:34
I think most folks are too focused on the hardware. This isn't about Intel trying to take over the graphics market, IMO. This is about slowing down the free printed money that Nvidia has been able to leverage to this point. It's a long game. Intel may never have the most powerful GPU, but they CAN make money off it and chip away at the other two companies' profits. Same way others are chipping away at the CPUs. If anything, Intel should have started the push years earlier.

The problem is a bit more complex, though: if their target market is the budget GPU market, they aren't chipping away at Nvidia, they are chipping away at AMD. The end result would be AMD having less budget to compete with Nvidia.

It wouldn't be so bad if Intel aimed for the top end, or if Nvidia were more even with AMD in terms of market share. But as-is, unfortunately it isn't helping take down Nvidia; quite the opposite, it helps them.
Posted by RobertJasiek
 - May 31, 2024, 08:38:19
Quote from: Jok3r on May 31, 2024, 06:01:49
Competition is good for all of us also.

Yes.

Quote: fanboys who will never give up their Nvidia no matter what the circumstances.

It is not about fanboyism, but about application speeds, stability, types of cores, and the existence of libraries for those cores that enable much higher application speeds for number-crunching applications such as machine learning. There, Intel and Apple are behind Nvidia by large factors. For some such applications, AMD can compete with Nvidia; for most such applications, AMD is also way behind, though not as far as Intel.
Posted by 0802AM
 - May 31, 2024, 08:12:01
Intel pulls all kinds of dirty tricks to reduce power consumption. Some of those tricks are bad for your eyes, people. Don't buy anything that comes with Intel graphics. Period.
Posted by Another Person
 - May 31, 2024, 07:49:34
I think most folks are too focused on the hardware. This isn't about Intel trying to take over the graphics market, IMO. This is about slowing down the free printed money that Nvidia has been able to leverage to this point. It's a long game. Intel may never have the most powerful GPU, but they CAN make money off it and chip away at the other two companies' profits. Same way others are chipping away at the CPUs. If anything, Intel should have started the push years earlier.
Posted by 1Life
 - May 31, 2024, 06:36:07
Very much disagree; this article is a disservice to the GPU market, which is being monopolized! I love my A750. I play multiple games, all in 2K, and everything works perfectly. Even stable overclocking in the control centre gives noticeable improvements in games. The Arc cards are a very interesting competitor and I'll be buying next gen.
Posted by Jok3r
 - May 31, 2024, 06:01:49
I bought the A770 at the point when they were just cleaning up the drivers, and I've been super pleased with it. For the amount of money I spent on it, it was a gamble, but it was a great deal in the long run and it's only improving. I play on a 32" 160 Hz 1440p monitor, and there's pretty much nothing I wanted to play that I haven't been able to max out and get acceptable, playable performance out of. My only real issue with it is its lack of VR support. Competition is good for all of us also. It forces AMD and Nvidia to rethink their pricing schemes, but just like anything else you are gonna have your fanboys who will never give up their Nvidia no matter what the circumstances.
Posted by M.M.
 - May 31, 2024, 02:16:49
Alchemist used N6, which was not a competitive fab node. Battlemage uses N4. The same thing will happen: it'll undoubtedly compare to the Nvidia 4000 midrange, just as its predecessor did to the 3000 midrange.

Posted by George
 - May 31, 2024, 01:21:11
Forgetting for the moment that anything 'Gaming' or actual 'Graphics' for Team Blue's chips was ALWAYS an afterthought - the MAIN interest for them in making their own GPUs is that Nvidia (and even AMD) are KILLING them with AI!!

These chips were intended to be used on AI applications.

However, given Team Blue's usual 'gee, why not toss a few billion dollars and take over the market' Kool-Aid, they take for granted the DECADES of research and work Nvidia & AMD have done in this space, as if Team Blue could surely outperform THEM if they wanted to. :)

I'm afraid that, much like any of their other blunders, these chips will likely get AXED and/or sold/spun off as a losing business for them.