Wafer Scale Engine 3: The world's largest computer chip contains 4 trillion transistors

Supercomputer company Cerebras has designed and manufactured the world's largest computer chip, containing 4 trillion transistors (active semiconductor components used as amplifiers or electronic switches), to power supercomputers, including the upcoming Condor Galaxy 3 with a capacity of 8 exaFLOPs.

Wafer Scale Engine 3 (WSE-3) is Cerebras' third-generation platform, designed to power AI supercomputers that train large artificial intelligence (AI) models such as OpenAI's GPT-4 and Anthropic's Claude 3 Opus.


Like its 2021 predecessor, the WSE-2, the chip is built from a 21.5 x 21.5 cm silicon wafer, and it contains 900,000 AI cores. The WSE-3 draws the same amount of power as the WSE-2 but delivers twice the performance.

Wafer Scale Engine 3 contains 4 trillion transistors, roughly 50 times as many as one of the most powerful chips currently used to train AI models, Nvidia's H200 graphics processing unit (GPU), which has 80 billion transistors.
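As a quick sanity check, the comparison can be reproduced by dividing the two transistor counts quoted above; the short Python sketch below uses only the article's figures:

```python
# Back-of-envelope check of the transistor-count comparison.
# Both figures are the ones quoted in the article.
wse3_transistors = 4_000_000_000_000   # 4 trillion (Cerebras WSE-3)
h200_transistors = 80_000_000_000      # 80 billion (Nvidia H200)

ratio = wse3_transistors / h200_transistors
print(f"WSE-3 has about {ratio:.0f}x as many transistors as the H200")
# -> WSE-3 has about 50x as many transistors as the H200
```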

In the future, the WSE-3 will power the Condor Galaxy 3 supercomputer, currently under construction in Dallas, Texas. Condor Galaxy 3 will be built from 64 Cerebras CS-3 AI systems as its base blocks, each driven by a WSE-3 chip, giving the complete system a computational capacity of up to 8 exaFLOPs. Once Condor Galaxy systems 1, 2 and 3 are combined, the entire network will reach a total capacity of 16 exaFLOPs (one exaFLOP equals 1,000 petaFLOPs, or a quintillion calculations per second). That is far more powerful than today's most powerful supercomputer, Oak Ridge National Laboratory's Frontier, which has a capacity of about 1 exaFLOP.
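For readers who want to check the unit conversions, the small Python sketch below simply restates the article's figures (8 exaFLOPs for Condor Galaxy 3, 16 exaFLOPs for the combined network, and roughly 1 exaFLOP for Frontier); the numbers are the article's, not independently verified:

```python
# Rough unit conversions for the FLOP figures quoted in the article.
EXAFLOP = 10**18    # 1 exaFLOP = 1,000 petaFLOPs = 10^18 operations per second
PETAFLOP = 10**15

condor_galaxy_3 = 8 * EXAFLOP      # planned capacity of Condor Galaxy 3
combined_network = 16 * EXAFLOP    # Condor Galaxy 1 + 2 + 3 combined
frontier = 1 * EXAFLOP             # approximate capacity of ORNL's Frontier

print(f"1 exaFLOP = {EXAFLOP / PETAFLOP:.0f} petaFLOPs")
print(f"Condor Galaxy 3 vs Frontier: {condor_galaxy_3 / frontier:.0f}x")
print(f"Combined network vs Frontier: {combined_network / frontier:.0f}x")
```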

The Condor Galaxy 3 supercomputer will be used to train future AI systems.
