Quote from: JabbaBacca on May 28, 2023, 15:09:46
8 GB is simply not big enough to run 16B+ parameter LLMs. We need bigger VRAM for self-hosted AI.

We need bigger VRAM for a lot of things. Unfortunately, there only seem to be VRAM modifications for desktop cards, not laptop cards.
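For anyone wondering why 8 GB falls short for a 16B model, here's a rough back-of-the-envelope sketch (my own numbers, not from the posts above): weight memory is roughly parameter count times bytes per parameter, before you even count the KV cache and framework overhead.

[code]
# Rough VRAM estimate for holding model weights only.
# Assumption: memory ~= params * bytes_per_param; real usage adds
# KV cache, activations, and runtime overhead on top of this.

def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate GB needed just for the weights."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for label, bpp in [("fp16", 2.0), ("int8", 1.0), ("4-bit", 0.5)]:
    print(f"16B @ {label}: {weight_vram_gb(16, bpp):.1f} GB")

# 16B @ fp16:  ~29.8 GB
# 16B @ int8:  ~14.9 GB
# 16B @ 4-bit:  ~7.5 GB  -> barely squeezes into 8 GB before overhead
[/code]

So even aggressively quantized to 4-bit, a 16B model leaves almost no headroom on an 8 GB card.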
Quote from: okthatsenough on May 26, 2023, 04:42:44
Is 8 GB of VRAM bad now? I'm making do with 2 GB on a 1050 mobile and can play more than enough games on Steam at 1080p60. The self-proclaimed "gamers" just need to stop complaining so much.

It is when you can pick up a used RTX 3060 with 12 GB for $200, which is what I did.