Topic summary

Posted by Andy Blakely
 - Yesterday at 23:30:25
Price on paper and in real life are two very different things. What's the street price going to be?

Personally, I'm waiting for AMD to jump back in after their skipped generation. I'm watching AMD's AI progress, too. 7900 XTX here, mostly for 3440x1440 gaming, but getting into AI work on the side now.
Posted by Erik Szpyra
 - Yesterday at 22:20:15
It does not cost that much extra to add the RAM; this pricing is an insane markup, simply because some fools are willing to pay, not because their production costs are anywhere near that high. Some will say supply and demand, but it's just greed.
Posted by RobertJasiek
 - Yesterday at 20:01:32
I could use lots of CUDA and Tensor cores for higher speed, so, enough cash presumed, I might buy, say, 7 or 8 mid- to high-end Nvidia cards. However,

- such 90-tier cards alone would amount to roughly €15,000 ~ €20,000 (plus PC hardware with water cooling)

- the yearly power bill (in Germany) would be several thousand euros

A compromise with 70-tier or 80-tier cards (7 or 8 of them) would bring their price down to €5,000 ~ €10,000, but the yearly power bill would still be a few thousand euros.

AI inference would be fun at that speed but prohibitively expensive even with moderate wealth. It really boils down to "more than 1 card, or at best 2 cards, is luxury", unless the faster work pays for itself in one's business at least proportionally to the expense.
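A quick back-of-the-envelope sketch of that power bill, in Python. The wattage, daily utilisation, and the ~€0.35/kWh rate are illustrative assumptions, not figures from the post:

```python
# Rough yearly power cost for a multi-GPU inference box.
# All inputs are illustrative assumptions, not measured values.

CARDS = 8                 # upper end of the post's 7-8 card estimate
WATTS_PER_CARD = 450      # assumed sustained draw per 90-tier card
SYSTEM_OVERHEAD_W = 300   # assumed CPU, fans, pump, PSU losses
HOURS_PER_DAY = 8         # assumed daily utilisation
PRICE_PER_KWH = 0.35      # assumed German household rate in EUR

total_kw = (CARDS * WATTS_PER_CARD + SYSTEM_OVERHEAD_W) / 1000
yearly_kwh = total_kw * HOURS_PER_DAY * 365
yearly_cost = yearly_kwh * PRICE_PER_KWH

print(f"{total_kw:.1f} kW draw -> {yearly_kwh:.0f} kWh/year -> ~EUR {yearly_cost:.0f}/year")
# 3.9 kW draw -> 11388 kWh/year -> ~EUR 3986/year, i.e. "several thousand euros"
```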
Posted by Bard
 - Yesterday at 18:52:23
Quote from: Numnutts on October 28, 2024, 23:03:52
32GB of VRAM is beyond overkill. No reason to even need that type of power when no title will come close to using that. I just don't see why anyone would purchase a 5090 when a 4080 Super (currently $949) would play any title flawlessly for the foreseeable future.

I will say no, not for my use. I don't play games; I work with AI modelling. I had hoped it would come with even more VRAM: I need two of these to fine-tune my model, and that is the bare minimum. In comparison I would need three 4090s, so two of these are cheaper than three 4090s. For gamers I have no idea what they need, but I guess the 5080 will do fine.
So the 5090 is, like the 4090, overkill for most but a really good tool for some, which is what makes it a niche card. Not in the A6000 or A100 class, but for those with lower budgets.
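For what it's worth, the card-count arithmetic behind that comparison looks like this; the 64GB requirement and the prices are hypothetical stand-ins, since the post doesn't give exact figures:

```python
import math

# Hypothetical fine-tuning working set; the post doesn't state the real number.
REQUIRED_VRAM_GB = 64

# Assumed list prices, for illustration only.
for card, vram_gb, price_usd in [("RTX 5090", 32, 2000), ("RTX 4090", 24, 1600)]:
    n = math.ceil(REQUIRED_VRAM_GB / vram_gb)  # whole cards needed to fit the job
    print(f"{card}: {n} cards, ~${n * price_usd} total")
# RTX 5090: 2 cards, ~$4000 total
# RTX 4090: 3 cards, ~$4800 total -> two 5090s undercut three 4090s
```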
Posted by Chris Jones
 - Yesterday at 13:07:14
Nvidia's graphics cards business is a nuisance at this point... it's practically a rounding error, and something they don't even want to deal with anymore... given how much they are making off their AI hardware and ventures, graphics cards are an afterthought.
Nvidia is NOT going out of business any time soon; to even suggest otherwise is an uninformed take.
Posted by Warrior24_7
 - Yesterday at 08:32:39
So? The card will sell out anyway and be scalped, which will just make it more expensive! Nothing will change from the 4090 era. AMD wishes it had it like this! 😂 It is an enthusiast card aimed at that market. The most popular card in the world is going to be the 5060 (fifty sixty). You liked that, didn't you? 🫵 You're welcome!
Posted by HulkMode666
 - Yesterday at 06:37:49
Quote from: Numnutts on October 28, 2024, 23:03:52
32GB of VRAM is beyond overkill. No reason to even need that type of power when no title will come close to using that. I just don't see why anyone would purchase a 5090 when a 4080 Super (currently $949) would play any title flawlessly for the foreseeable future.
You must not play anything new, or at 4K, or hyper-modded, or have paid any attention to the devs' collective VRAM outcry, or do more than game...

Or all of the above.

These things have far more use than pushing pixels. Have for ages.
Posted by HulkMode666
 - Yesterday at 06:32:11
Quote from: Greg on October 28, 2024, 16:31:28
RTX 5090...that's darling!
I'm gonna hang onto my RTX 4090 for a minute.
5090 isn't the huge leap forward that the 4090 was.
Maybe the 6090 (assuming Nvidia is still in business) will be something special.
This class of video processor is much like the Laser in 1969!
A technological solution in search of a problem it can solve.
Hydrate before coffee, or not.
Why would Nvidia go out of business? They're making stonks selling to prosumer, HPC, and server markets.

It's Intel who looks like they may be 2 inches from the bottom of a spike pit.
Posted by HulkMode666
 - Yesterday at 06:29:52
Well, we're here.

The simple-minded have helped the leather man have his way.

"It's more performance so it sHoUlD be more expensive," even though for ages the norm was that the entire stack simply got REPLACED at the same price points.

This is why normal folk need to be left out of everything. We have fewer and fewer nice things, and the nice things we do get are pushed further into the vanity realm because "reasons."

Y'all better be proud of yourselves.
Posted by Vik
 - Yesterday at 05:40:55
Thank you Nvidia for convincing me to keep my 3090 for 2 more years!
Posted by KC
 - Yesterday at 03:33:50
Who has that kind of cash in this economy for a gaming GPU????
Posted by Splatterling
 - Yesterday at 01:22:01
Quote from: Numnutts on October 28, 2024, 23:03:52
32GB of VRAM is beyond overkill. No reason to even need that type of power when no title will come close to using that. I just don't see why anyone would purchase a 5090 when a 4080 Super (currently $949) would play any title flawlessly for the foreseeable future.

It may be for games, but I am dependent on CUDA for work, and my renders literally starve and crash the software, or in Blender's case crash Windows, with the measly 8GB VRAM of my old 3070. I lost my 4090, and it literally costs me twice the time and nerves to work on things.

16 gigs are not enough either; I have loads of scenes that ate more than that on my ex-4090.

So no, it's not overkill. 24GB is what the xx80 should have; 32 instead of 24 is nice, but at that price I want 64GB of VRAM.

So yeah, that sucks. I and literally all my freelancer colleagues are forced to buy high-end Nvidia GPUs because:

- having CUDA or not is non-negotiable
- GPU renderers don't handle VRAM well, so either you have more than you need or you say goodbye to your free time and weekends (see the sketch after this list). Redshift in Maya has gotten a lot better; Blender is literally the worst and unusable with 8GB VRAM (and Cycles is not very fast either)
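As a concrete illustration of the "more than you need" point, here is a minimal sketch that checks free VRAM headroom before kicking off a heavy render. It assumes an NVIDIA driver and the pynvml package are installed; the 20% headroom threshold is an arbitrary illustrative choice, not a renderer requirement:

```python
import pynvml

# Query free VRAM on GPU 0 via NVML before starting a render job.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

free_gb = mem.free / 1024**3
total_gb = mem.total / 1024**3
print(f"VRAM: {free_gb:.1f} GiB free of {total_gb:.1f} GiB")

# Illustrative rule of thumb: refuse to start with under 20% headroom,
# since GPU renderers tend to crash rather than spill gracefully to system RAM.
if mem.free < 0.2 * mem.total:
    raise RuntimeError("Not enough VRAM headroom; close other GPU apps first.")

pynvml.nvmlShutdown()
```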

So yeah, looking forward to buying that overpriced thing...

But yeah, the 4090 is overkill and not necessary for games, no doubt about it, unless you don't know what you are doing (using 8x MSAA, setting things to ultra, which in 99% of cases does nothing more than use uncompressed textures you will never notice, but it's a good reason to tell players there's a point to having a 4090).
Posted by RoyT
 - Yesterday at 00:58:39
Without P2P memory sharing, the consumer cards took a huge dive in value. The last card with memory sharing over PCIe was the 1080 Ti. P2P was intentionally gimped on every later consumer card.

It makes sense to gimp a consumer card to distinguish it from a real AI card with memory sharing (over PCIe or NVLink)... but still charging a premium for it is a bridge too far.
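If you want to check what your own cards report, here is a minimal sketch using PyTorch's CUDA utilities, assuming a machine with at least two NVIDIA GPUs and PyTorch installed; on recent consumer GeForce cards this will typically report no P2P support:

```python
import torch

# Check whether each GPU pair can access the other's memory directly (P2P),
# i.e. without bouncing transfers through host RAM.
n = torch.cuda.device_count()
assert n >= 2, "Need at least two CUDA devices to test peer-to-peer access."

for a in range(n):
    for b in range(n):
        if a != b:
            ok = torch.cuda.can_device_access_peer(a, b)
            print(f"GPU {a} -> GPU {b}: P2P {'supported' if ok else 'NOT supported'}")
```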
Posted by The electrician
 - Yesterday at 00:41:17
At this point, you're going to need to bring a 220V service into your living room.