Quote from: iooipiop on February 02, 2024, 12:44:20
proof or gtfo

LOL. Cry, baby...
Quote from: iooipiop on February 02, 2024, 12:44:20
Proof.
Quote from: NikoB on January 30, 2024, 17:06:07
FP64 calculation units are deliberately strangled there
proof or gtfo
Quote from: NikoB on January 30, 2024, 17:06:07
FP64 calculation units are deliberately strangled there

proof or gtfo
Quote
All gaming cards are mediocre in professional tasks for a simple reason
Quote
To what end?
Quote from: Neenyah on January 29, 2024, 00:01:24
Quote from: davidm on January 28, 2024, 23:48:18
OK, well thanks for that. But it's not exactly the same; it would be helpful if typical leading tasks (Stable Diffusion, llama) were front and centre benchmarks, without having to decipher from how many frames per second Lara Croft is rendered. Is there a Lara Croft FPS to llama token/s calculator I'm not aware of?

I'm not an expert in the AI/DL/ML field but I'm slowly learning (not too interested in it, tbh, that's why I'm snailing; who knows, perhaps I would be faster if there was Lara Croft involved 😋). Judging by the info I've gathered in recent months, mainly from r/machinelearning, r/egpu, egpu.io and various YT videos on the matter, you can expect about a 5-15% loss over what you'd get with the same GPU in a desktop. With that being said, I believe a used/refurbished 3090 24 GB would do wonders for a way lower price than the 4090, giving a better price:performance return.
Quote from: davidm on January 28, 2024, 23:48:18
OK, well thanks for that. But it's not exactly the same; it would be helpful if typical leading tasks (Stable Diffusion, llama) were front and centre benchmarks, without having to decipher from how many frames per second Lara Croft is rendered. Is there a Lara Croft FPS to llama token/s calculator I'm not aware of?

I'm not an expert in the AI/DL/ML field but I'm slowly learning (not too interested in it, tbh, that's why I'm snailing; who knows, perhaps I would be faster if there was Lara Croft involved 😋). Judging by the info I've gathered in recent months, mainly from r/machinelearning, r/egpu, egpu.io and various YT videos on the matter, you can expect about a 5-15% loss over what you'd get with the same GPU in a desktop. With that being said, I believe a used/refurbished 3090 24 GB would do wonders for a way lower price than the 4090, giving a better price:performance return.
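A quick back-of-the-envelope sketch of that rule of thumb (my own illustration, not from the review): it just applies the assumed 5-15% Thunderbolt penalty to a desktop figure. The desktop token/s number below is a made-up placeholder, not a measured result.

```python
# Hypothetical illustration of the "5-15% loss over desktop" rule of thumb.
# The desktop tokens/s value is a placeholder, not a benchmark result.

def egpu_estimate(desktop_tokens_per_s: float,
                  loss_low: float = 0.05,
                  loss_high: float = 0.15) -> tuple[float, float]:
    """Return a (pessimistic, optimistic) tokens/s range behind Thunderbolt."""
    return (desktop_tokens_per_s * (1 - loss_high),
            desktop_tokens_per_s * (1 - loss_low))

if __name__ == "__main__":
    desktop = 100.0  # placeholder desktop llama tokens/s, purely illustrative
    low, high = egpu_estimate(desktop)
    print(f"Desktop {desktop:.0f} tok/s -> roughly {low:.0f}-{high:.0f} tok/s over Thunderbolt")
```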
Quote from: Neenyah on January 28, 2024, 23:41:45
Quote from: davidm on January 28, 2024, 23:39:28
The fact it is called "gaming" doesn't really mean anything, except that it's *marketed* toward gamers. The fact is, especially given its 24 GB VRAM, it's a decent chip for AI and creative work. This is Notebookcheck; it's not limited to notebooks, and it shouldn't be limited to gamers and their one-track minds (and I suspect many of them are more obsessed with stats than actual experience).

Well, I know, David, and I agree with you completely; I'm just saying that if the manufacturer literally calls it "gaming" in its name, it would be weird for Notebookcheck not to include gaming performance. They did a great job covering a lot of possible usages, IMHO, from serious apps to various games. But games are also an excellent benchmark precisely for eGPUs, because in pro apps you can't bottleneck an eGPU setup as easily and as quickly as in games (where you can see the bottleneck in less than 20 seconds, as soon as shaders load/cache), so if you pay attention to the wider picture you can draw your own conclusions even if they [your desired and used apps] weren't mentioned in the article. Again - I agree with you, don't think the opposite.
As I mentioned, the performance characteristics of the entire system are very different for AI/creative.
Quote from: davidm on January 28, 2024, 23:39:28
The fact it is called "gaming" doesn't really mean anything, except that it's *marketed* toward gamers. The fact is, especially given its 24 GB VRAM, it's a decent chip for AI and creative work. This is Notebookcheck; it's not limited to notebooks, and it shouldn't be limited to gamers and their one-track minds (and I suspect many of them are more obsessed with stats than actual experience).

Well, I know, David, and I agree with you completely; I'm just saying that if the manufacturer literally calls it "gaming" in its name, it would be weird for Notebookcheck not to include gaming performance. They did a great job covering a lot of possible usages, IMHO, from serious apps to various games. But games are also an excellent benchmark precisely for eGPUs, because in pro apps you can't bottleneck an eGPU setup as easily and as quickly as in games (where you can see the bottleneck in less than 20 seconds, as soon as shaders load/cache), so if you pay attention to the wider picture you can draw your own conclusions even if they [your desired and used apps] weren't mentioned in the article. Again - I agree with you, don't think the opposite.
As I mentioned, the performance characteristics of the entire system are very different for AI/creative.
Quote from: Neenyah on January 28, 2024, 23:29:18
When manufacturers stop being obsessed with gamers primarily? Gigabyte Aorus RTX 4090 Gaming Box 🤔
From Gigabyte's official product site:
"Powerful GeForce RTX™ 4090 delivers incredible performance for gamers and creators"
"For GAMERs
The AORUS RTX 4090 GAMING BOX transforms an ultrabook laptop into the ultimate gaming rig, delivering incredible performance for real-time ray tracing and graphics-intensive games. A network chip that allows you to connect to a wired network is built into the GAMING BOX. You don't have to worry about transmission interference during the game. Install the GIGABYTE CONTROL CENTER to adjust the RGB lighting and performance for your preference."

Quote from: davidm on January 28, 2024, 23:13:57
I would bet a large and increasing proportion of 4090 users are doing AI and creative work.

Literally any current GPU, even with TB over two PCIe lanes, is a massive upgrade over any existing iGPU. Data is shown in the review tho.
Btw, my comment about gaming was related to George's very accurate comment above mine, about the 4090 being too expensive and overkill in terms of price:performance, because you get about the same performance with a much cheaper GPU. I don't care about AI/ML, but my 6800 XT in an eGPU setup with my X1 Carbon is faster in After Effects than a 4060 in a desktop, if that means anything useful to you.
Quote from: davidm on January 28, 2024, 23:13:57
When is notebookcheck going to grow up and stop being obsessed with "gamers."

When manufacturers stop being obsessed with gamers primarily? Gigabyte Aorus RTX 4090 Gaming Box 🤔
Quote from: davidm on January 28, 2024, 23:13:57
I would bet a large and increasing proportion of 4090 users are doing AI and creative work.

Literally any current GPU, even with TB over two PCIe lanes, is a massive upgrade over any existing iGPU. Data is shown in the review tho.
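To put "TB over two PCIe lanes" into rough numbers, here is a simple sketch (my own, not from the review) of theoretical one-direction PCIe link bandwidth per lane count; real Thunderbolt eGPU throughput sits well below these figures because of protocol and controller overhead.

```python
# Theoretical one-direction PCIe bandwidth per link width (before any
# protocol/Thunderbolt overhead), just to compare x2 vs x4 hookups.

GT_PER_LANE = {"PCIe 3.0": 8.0, "PCIe 4.0": 16.0}  # gigatransfers/s per lane
ENCODING = 128 / 130                                # 128b/130b line encoding

def link_gb_per_s(gen: str, lanes: int) -> float:
    """Raw bandwidth in GB/s for a given PCIe generation and lane count."""
    return GT_PER_LANE[gen] * ENCODING * lanes / 8   # bits -> bytes

for gen, lanes in [("PCIe 3.0", 2), ("PCIe 3.0", 4), ("PCIe 4.0", 4)]:
    print(f"{gen} x{lanes}: ~{link_gb_per_s(gen, lanes):.1f} GB/s")
```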