
Topic summary

Posted by lmao
 - February 28, 2024, 18:29:27
They claim that running something like Mixtral 8x7B on their ASIC is about 5-10 times faster than on an H100. It's not clear whether the model was quantized in their tests; they don't give many details.

Card price is $20K.
Posted by Redaktion
 - February 28, 2024, 17:38:46
The LPU Inference Engine from Groq is designed to be considerably faster than GPGPUs at processing LLM data. To achieve this, the LPU makes better use of the sequential nature of LLM inference and is paired with on-chip SRAM instead of DRAM or HBM.

https://www.notebookcheck.net/Groq-presents-specialized-language-processing-unit-significantly-faster-than-Nvidia-s-AI-accelerators.808177.0.html
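A rough way to see why memory bandwidth dominates this comparison: at batch size 1, each generated token requires streaming every active weight from memory once, so tokens per second is roughly bandwidth divided by active model size. The sketch below is a back-of-envelope estimate, not a benchmark; the Mixtral active-parameter count, the FP16 assumption, and especially the hypothetical SRAM bandwidth figure are illustrative assumptions, not vendor-verified specs.

```python
# Back-of-envelope: single-stream LLM token generation is memory-bandwidth bound.
# All figures are illustrative assumptions, not measured or vendor-confirmed.

def tokens_per_second(active_params_billion: float,
                      bytes_per_param: float,
                      bandwidth_tb_s: float) -> float:
    """Each generated token streams every active weight once from memory."""
    bytes_per_token = active_params_billion * 1e9 * bytes_per_param
    bandwidth_bytes_s = bandwidth_tb_s * 1e12
    return bandwidth_bytes_s / bytes_per_token

# Assumptions: Mixtral 8x7B activates ~13B parameters per token (2 of 8 experts),
# FP16 weights (2 bytes/param), H100 HBM3 at ~3.35 TB/s (published spec figure).
hbm_bound = tokens_per_second(13, 2, 3.35)

# Hypothetical aggregate SRAM bandwidth an order of magnitude higher, to show
# how a 10x bandwidth advantage maps directly onto a ~10x throughput claim:
sram_bound = tokens_per_second(13, 2, 33.5)

print(f"HBM-bound: ~{hbm_bound:.0f} tok/s, "
      f"SRAM-bound: ~{sram_bound:.0f} tok/s, "
      f"ratio: {sram_bound / hbm_bound:.1f}x")
```

Under these assumed numbers the model predicts roughly a 10x single-stream speedup from bandwidth alone, which is consistent with the 5-10x range claimed above; quantization would shrink bytes per parameter and raise both figures proportionally.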