At its semi-annual GPU Technology Conference (GTC), Nvidia showcased its latest server-class GPU family, Blackwell, which offers a jump in both computing power and bandwidth and, most importantly, a significant increase in integrated memory.
As mentioned in our July 2023 article, memory is the main bottleneck in the training of large GenAI models. Because data-hungry AI processing chips need large and fast memory pools, memory banks are now integrated directly into the CPU/GPU package to reduce latency (as pictured above).
Here, we are not talking about the traditional memory found in PCs but about HBM (High Bandwidth Memory) modules, currently the fastest available solution. The HBM market has been booming since the release of ChatGPT, as every AI infrastructure rollout comes with terabytes of HBM modules.
Micron (a portfolio company), the world’s third-largest memory manufacturer, has just confirmed the insatiable demand for high-end memory chips. Last night, the company announced stellar quarterly numbers, with revenue increasing by more than 57% year-on-year, a growth level not seen since 2017. The outlook was also raised well above estimates: the company guided for revenue of $6.6 billion (10% above consensus) and a gross margin of 26.5% (vs. consensus at 20.4%), leading to an EPS guide of $0.45, crushing the $0.09 expected.
Importantly, Micron stressed that it has visibility of more than 12 months, as the company’s production of HBM chips is sold out for 2024 and already almost fully allocated for 2025… This volume visibility comes with strong pricing conditions: pricing in Q1 was up double digits quarter-on-quarter, and Micron expects prices to rise each quarter, giving it strong confidence in its margin outlook and earnings power.
These figures and comments illustrate how fast HBM is taking off and how powerful its impact is on industry capacity and pricing. The memory semiconductor industry has thus entered very positive, uncharted territory, as it has historically had to deal with far shorter timeframes and more volatile order flows.
The “memory cartel” of Samsung Electronics, SK Hynix and Micron is thus rushing to increase production volumes, an arms race that is also benefiting some semiconductor equipment makers, as HBM modules are built by stacking dies in 3D and therefore require specific (hybrid) bonding, testing and advanced packaging machines.