Here is another example: AMD is advertising local LLM / AI again (the first time it did so may have been with the Strix Halo introduction):
Quote from techpowerup.com/review/amd-ai-bundle (Jan 21st, 2026): "AMD is finally making local AI easy. With the new AI Bundle, you can run image generation and LLMs directly on your PC, no cloud, no subscriptions, no data leaving your system. We could even run a massive 120B parameter model on a small laptop, something impossible on any consumer GPU."
It is correct: one can run gpt-oss-120b with 64 GB of RAM plus a dedicated GPU.
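A rough back-of-the-envelope estimate shows why this works. The figures below are my assumptions, not from the quoted article: gpt-oss-120b ships with its MoE weights quantized to MXFP4, roughly 4.25 bits per parameter (4-bit values plus one shared 8-bit scale per 32-element block).

```python
# Rough weight-footprint estimate for a ~120B-parameter model in MXFP4.
# Assumption: ~4.25 bits/parameter (4-bit values + one 8-bit scale per
# 32-element block); ignores KV cache and activation memory.
params = 120e9                 # ~120B total parameters
bits_per_param = 4.25          # MXFP4: 4 bits + shared scale overhead
weight_gb = params * bits_per_param / 8 / 1e9
print(f"approx. weight footprint: {weight_gb:.1f} GB")
# approx. weight footprint: 63.8 GB
```

The weights alone land right at the 64 GB mark, which is why a dedicated GPU is still needed: it takes some layers plus the KV cache off system RAM, leaving room for the OS and runtime.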