The newest system from Cerebras can handle multi-trillion-parameter generative AI problems at twice the performance of its predecessor, while a partnership with Qualcomm will help it cut inference ...
The company tackled inference on the Llama-3.1 405B foundation model and crushed it. And for the crowds at SC24 this week in Atlanta, the company also announced it is 700 times faster than ...
SUNNYVALE, Calif. & VANCOUVER, British Columbia--(BUSINESS WIRE)--Today at NeurIPS 2024, Cerebras Systems, the pioneer in accelerating generative AI, announced a groundbreaking achievement in ...
The University of Edinburgh has announced the installation of an AI cluster consisting of four CS-3 systems built on Cerebras's 3rd-generation Wafer Scale Engine processors. Operated by the Edinburgh ...
If you thought a chip like AMD's MI300A was big at 146 billion transistors, you ain't seen nothing yet. AI company Cerebras announced its third-generation AI chip, the CS-3, a "wafer-scale" silicon ...