Saturday, October 1, 2022

Nvidia’s Hopper H100 pictured, features 80GB HBM3 memory and impressive VRM

Bottom line: Nvidia took the wraps off its Hopper architecture at GTC 2022, announcing the H100 server accelerator but only showing off renders of it. Now we finally have some in-hand pictures of the SXM variant of the card, which features a mind-boggling 700W TDP.

It has been a bit over a month since Nvidia unveiled its H100 server accelerator based on the Hopper architecture, and so far, we had only seen renders of it. That changes today, as ServeTheHome has just shared pictures of the card in its SXM5 form factor.

The GH100 compute GPU is fabricated on TSMC's N4 process node and has an 814 mm² die size. The SXM variant features 16,896 FP32 CUDA cores, 528 Tensor cores, and 80GB of HBM3 memory connected over a 5120-bit bus. As can be seen in the photos, there are six 16GB stacks of memory around the GPU, but one of them is disabled.
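Those memory figures follow directly from the stack count: each HBM3 stack exposes a 1024-bit interface, and with one of the six on-package stacks disabled, five remain active. A minimal sketch of the arithmetic:

```python
# Sketch: deriving the H100 SXM's memory bus width and capacity
# from its five active HBM3 stacks (one of six is disabled).
STACK_INTERFACE_BITS = 1024  # standard interface width of one HBM3 stack
STACK_CAPACITY_GB = 16       # capacity of each stack on the H100

total_stacks = 6
active_stacks = total_stacks - 1  # one stack disabled

bus_width_bits = active_stacks * STACK_INTERFACE_BITS
capacity_gb = active_stacks * STACK_CAPACITY_GB

print(bus_width_bits)  # 5120
print(capacity_gb)     # 80
```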

Nvidia also quoted a staggering 700W TDP, 75% higher than its predecessor, so it's no surprise that the card comes with an extremely impressive VRM solution. It features 29 inductors, each equipped with two power stages, plus a further three inductors with one power stage each. Cooling all of those tightly packed components will probably be a challenge.
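The numbers above check out with some quick arithmetic; the 400W figure for the predecessor (the A100 SXM) is an assumption based on its published spec, not stated in this article:

```python
# Sketch: arithmetic behind the quoted TDP increase and VRM size.
predecessor_tdp_w = 400  # assumed A100 SXM TDP, per its public spec
h100_tdp_w = int(predecessor_tdp_w * 1.75)  # "75% higher" -> 700 W

dual_stage_inductors = 29    # inductors with two power stages each
single_stage_inductors = 3   # inductors with one power stage each
total_power_stages = dual_stage_inductors * 2 + single_stage_inductors

print(h100_tdp_w)         # 700
print(total_power_stages) # 61
```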

Another noticeable change is the connector layout for SXM5. There's now a short and a long mezzanine connector, whereas earlier generations featured two identically sized longer ones.

Nvidia will begin shipping H100-equipped systems in Q3 of this year. It's worth mentioning that the PCIe version of the H100 is currently listed in Japan for 4,745,950 yen ($36,300) after taxes and shipping, though it has fewer CUDA cores, downgraded HBM2e memory, and half the TDP of the SXM variant.
