Chip startup Cerebras launches new AI processor

By Max A. Cherney

SAN FRANCISCO, March 13 (Reuters) - Artificial intelligence startup Cerebras Systems announced a new version of its dinner-plate-sized chip on Wednesday, claiming the hardware offers twice the performance of its predecessor for the same price.

WHY IT IS IMPORTANT

Santa Clara, California-based Cerebras' AI chips compete with the advanced hardware produced by Nvidia that helps OpenAI develop the underlying software powering apps such as ChatGPT. Instead of stitching together thousands of chips to build and run AI applications, Cerebras has bet that its roughly foot-wide chip can outperform Nvidia's clusters of chips.


KEY QUOTE

"So the largest chip that we made was our first generation. People said we couldn't make it," Cerebras CEO Andrew Feldman said to reporters on Tuesday. "Eighteen months later we did it in seven nanometer. Eighteen months (after that), we've announced a five-nanometer part. This is the largest part by more than three and a half trillion transistors."

CONTEXT

Power consumption is a critical problem for AI processing. Cerebras' third-generation chip uses the same amount of energy as its predecessor while delivering superior performance, at a time when the power costs of building and running AI applications have soared. Cerebras does not sell the chips on their own, but says the systems constructed around them are a more efficient way to build AI applications, a process called training.

BY THE NUMBERS

The new Wafer-Scale Engine 3 (WSE-3) packs 4 trillion transistors and is capable of 125 petaflops of computing. It was built on Taiwan Semiconductor Manufacturing Co's 5nm manufacturing process.

Feldman said Cerebras is cash flow positive.

WHAT IS NEXT

Cerebras also said on Wednesday that it planned to sell its WSE-3 systems together with Qualcomm AI 100 Ultra chips to help run artificial intelligence applications, a process known as inference. (Reporting by Max A. Cherney in San Francisco; Editing by Leslie Adler)