Nvidia is developing a line of AI GPUs with 144GB HBM3E memory

As revealed by TrendForce, Nvidia is developing its next-generation B100 and B200 GPUs based on the Blackwell architecture.

The new GPUs are expected to hit the market in the second half of this year and will be targeted at the CSP segment, i.e., cloud service providers that operate large-scale cloud computing infrastructure. Nvidia will also add a streamlined variant, the B200A, aimed at enterprise OEM customers with AI needs.

It is reported that TSMC's CoWoS-L packaging capacity (used for the B200 series) is still limited. The B200A series is therefore likely to switch to the relatively simple CoWoS-S packaging technology. This makes sense, as Nvidia plans to position the B200A for enterprise OEM customers while reserving the constrained CoWoS-L capacity for the CSP-focused B200 series.

B200A technical specifications

Because the B200A is still in internal development, its technical specifications have not been fully clarified. For now, it can only be confirmed that HBM3E memory capacity will be reduced from 192GB to 144GB. Reports also indicate that the number of HBM3E memory stacks is halved from 8 to 4, while the capacity of a single stack increases from 24GB to 36GB.
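
A quick arithmetic check shows why the reported figures only add up if the halving refers to the number of HBM3E stacks (as read above) rather than layers within a single stack:

$$8 \times 24\,\mathrm{GB} = 192\,\mathrm{GB}\ \text{(B200)} \qquad \text{versus} \qquad 4 \times 36\,\mathrm{GB} = 144\,\mathrm{GB}\ \text{(B200A)}.$$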

Notably, the B200A's power consumption is said to be lower than the B200's, and the chip does not require liquid cooling. Its air cooling system should also make the new GPUs easier to deploy. The B200A is expected to be available to OEMs around the second quarter of 2025.

Supply chain surveys indicate that Nvidia's main high-end GPU shipments in 2024 will be based on the Hopper platform, with the H100 and H200 serving the North American market and the H20 serving the Chinese market. Since the B200A will not hit shelves until around Q2 2025, it is not expected to impact the H200, which launches in Q3 2024 or later.
