
For the first time in years, memory has become interesting again. Once the most predictable component in a computer’s bill of materials, it has quietly turned into a global bottleneck, and artificial intelligence sits at the center of the shift.
The disruption begins with high-bandwidth memory, or HBM, a premium version of DRAM built to keep increasingly data-hungry AI accelerators saturated. By stacking memory dies and connecting them in three dimensions, HBM delivers extraordinary throughput. It also devours manufacturing capacity. Industry estimates suggest a single gigabyte of HBM can consume roughly three times the wafer area of standard DDR5, turning production inside memory fabs into a zero-sum tradeoff.
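The zero-sum tradeoff can be sketched with back-of-the-envelope arithmetic. The ~3x area ratio is the industry estimate cited above; the fab capacity figure is a made-up illustration, not any real fab's output.

```python
# Sketch of the wafer-area tradeoff between HBM and conventional DDR5.
# HBM_AREA_RATIO is the article's rough estimate; fab capacity is illustrative.
HBM_AREA_RATIO = 3.0            # wafer area per GB of HBM vs. per GB of DDR5
fab_capacity_gb_ddr5 = 1_000    # hypothetical monthly output if the fab made only DDR5

def ddr5_displaced(hbm_gb: float, ratio: float = HBM_AREA_RATIO) -> float:
    """GB of DDR5 output forgone to produce `hbm_gb` of HBM."""
    return hbm_gb * ratio

# Shifting 20% of the fab's wafer area to HBM:
hbm_out = 0.2 * fab_capacity_gb_ddr5 / HBM_AREA_RATIO   # only ~66.7 GB of HBM
ddr5_out = 0.8 * fab_capacity_gb_ddr5                   # 800 GB of DDR5 remain
```

Every fifth wafer devoted to HBM yields only a fifteenth of the fab's former bit output in HBM form, which is why even modest reallocation tightens the conventional DRAM supply.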
By volume, HBM remains a niche, representing only a small share of total DRAM bits shipped. In revenue terms, it punches far above its weight: analysts estimate HBM accounted for more than 30 percent of total DRAM revenue in 2025, thanks to prices that can run several times higher than conventional memory. For chipmakers, the incentive is unmistakable.
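The volume-versus-revenue gap follows directly from the price gap. A minimal sketch, using illustrative numbers (roughly 10 percent of bits at roughly five times the price per bit; the article says only "a small share" and "several times higher", so both inputs are assumptions):

```python
def revenue_share(bit_share: float, price_multiple: float) -> float:
    """Fraction of total DRAM revenue from HBM, given its share of bits
    shipped and its price per bit relative to conventional DRAM."""
    hbm_revenue = bit_share * price_multiple
    other_revenue = (1 - bit_share) * 1.0   # conventional DRAM at baseline price
    return hbm_revenue / (hbm_revenue + other_revenue)

# Illustrative: ~10% of bits at ~5x the price per bit
share = revenue_share(0.10, 5.0)   # ~0.36, i.e. over a third of revenue
```

With those inputs, a tenth of the bits yields over a third of the revenue, consistent with the 30-percent-plus estimate.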
That math has driven the industry’s heavyweights – SK hynix, Samsung, and Micron – to redirect capacity at a pace rarely seen in memory manufacturing. SK hynix, long the second-largest DRAM supplier, surged past Samsung in overall DRAM revenue in 2025 on the strength of HBM. Researchers estimate it now controls roughly half of the HBM market, with Samsung in second place and Micron smaller but scaling quickly. SK hynix has said its HBM output is effectively sold out through most of 2026, underscoring how tightly supply is constrained.
Micron has been unusually candid about what comes next, warning that memory shortages are likely to extend beyond 2026 as AI demand consumes capacity faster than new fabs can be built. Samsung, meanwhile, is pushing aggressively toward next-generation HBM4 while courting major AI customers, a sign of just how intense the race has become.
The consequences are spilling well beyond AI hardware. Prices for conventional DRAM have surged, quadrupling in some spot markets and rising more than 170 percent year over year in late 2025. PC and smartphone makers are already signaling higher prices as memory reemerges as a meaningful cost factor.
Storage vendors are feeling the knock-on effects too. Western Digital and Seagate have benefited from AI-fueled enthusiasm – Seagate’s stock more than tripled in 2025 – but tight memory supply and shifting data-center spending patterns are reshaping how storage is purchased and deployed.
What this moment reveals is the fragility of an industry long assumed to scale effortlessly. Memory fabs take years and tens of billions of dollars to build, and the advanced packaging HBM depends on is even harder to expand than silicon production itself.
The higher price of your next laptop or the longer wait for an enterprise server reflects a deliberate, industry-wide pivot toward AI, where high-margin HBM now takes precedence and everything else gets squeezed. In the age of artificial intelligence, even RAM has become front-page news.