AI CPUs boost DDR5 demand; memory-chip "super cycle" may run into 2027

15:02 02/05/2026
GMT Eight
As demand for general-purpose DRAM beyond HBM rises, the end of the memory-chip "super cycle" may be pushed back from the previously expected 2026 to 2027.
The transformation of AI inference architecture is reshaping demand for memory chips, and this supply-demand imbalance may last longer than the market expects. According to a Saturday report in the Seoul Economic Daily, with companies such as Intel launching AI CPUs carrying up to 400GB of memory, server-side demand for DDR5 is expanding rapidly. Analysts note that the existing capacity of Samsung Electronics and SK Hynix is struggling to keep pace with simultaneous growth in both GPU and CPU demand, and the DRAM shortage is expected to persist into 2027.

The shift is already visible in spot prices: Korean securities data show that the price of 16GB DDR5 rose 2.8% in April from the previous month, while legacy DDR4 fell 16%, widening the gap between the two standards. Industry insiders estimate the current DRAM shortfall at roughly 10 percentage points of demand. As demand for general-purpose DRAM beyond HBM continues to rise, the end of the memory-chip "super cycle" may be pushed back from the previously expected 2026 to 2027.

CPUs rise as "AI coordinators," with memory demand doubling

The core driver of this round of demand expansion is the AI industry's strategic shift from training to inference. In the past, AI data centers were built around GPUs as the core compute infrastructure, with servers typically pairing 8 GPUs with 1 CPU for large-scale parallel training. As inference workloads grow more complex, however, the CPU is evolving from an auxiliary processor into an "AI coordinator": scheduling multiple AI agents, managing the outputs of various modules, and orchestrating the overall workflow. The key to this new role is "contextual memory."
CPUs must store and reference the outputs of multiple AI agents in real time to coordinate the complete inference process, making high-capacity memory a hard requirement. Intel executives said on a recent earnings call that the CPU-to-GPU compute ratio in AI inference infrastructure has moved from 1:8 to 1:4, and is narrowing further toward 1:1. Against this backdrop, CPU makers are raising the DRAM configuration of AI CPUs to 300-400GB, up to four times the 96-256GB of traditional CPU products.

With GPU and CPU demand overlapping, the DDR5 supply-demand gap continues to widen

The contest for memory capacity is spreading from the GPU side to the CPU side, and demand is growing rapidly. On the GPU side, NVIDIA's next-generation AI chip "Vera Rubin" carries 288GB of memory across 8 HBM stacks, while AMD's next-generation GPU, the MI400, offers up to 432GB. Google's newly released eighth-generation tensor processing unit, TPU 8i, also carries 288GB of HBM. On the CPU side, once Intel's "Xeon" and AMD's "Epyc" AI CPU lines adopt up to 400GB of DDR5 at scale, the imbalance between general-purpose DRAM supply and demand will intensify further. Unlike HBM, which is supplied mainly by a few companies such as SK Hynix, expanding DDR5 demand directly affects the supply balance of the entire general-purpose DRAM market.

Price divergence in the spot market already reflects this structural change: DDR5 prices are strengthening against the broader trend while DDR4 remains under pressure, a sharp contrast that signals accelerating migration of demand toward the newer standard.
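The ratio shift described above translates into a steep rise in CPU-attached memory per server. A back-of-envelope sketch, using only the figures quoted in this article (the function and constants are illustrative, not vendor specifications):

```python
# Back-of-envelope estimate of CPU-side DDR5 in an 8-GPU server as the
# CPU:GPU ratio narrows. Figures are those quoted in the article; this
# is an illustration, not a vendor specification.

DDR5_PER_AI_CPU_GB = 400      # high-end AI CPU memory configuration cited
GPUS_PER_SERVER = 8           # typical training-era GPU count per server

def cpu_side_ddr5_gb(cpu_to_gpu_ratio: float) -> int:
    """Total DDR5 attached to CPUs in an 8-GPU server at a given CPU:GPU ratio."""
    cpus = GPUS_PER_SERVER * cpu_to_gpu_ratio
    return int(cpus * DDR5_PER_AI_CPU_GB)

for label, ratio in [("1:8 (training era)", 1 / 8),
                     ("1:4 (today, per Intel)", 1 / 4),
                     ("1:1 (projected)", 1.0)]:
    print(f"{label}: {cpu_side_ddr5_gb(ratio)} GB of DDR5 per server")
```

Under these assumptions, moving from a 1:8 to a 1:1 ratio lifts CPU-side DDR5 per server from 400GB to 3,200GB, an eightfold increase over the training-era configuration.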
Samsung and SK Hynix face capacity pressure; super-cycle expectations revised upward

Supply-side constraints make it difficult to relieve this memory shortage in the near term. As the leading global DRAM suppliers, Samsung Electronics and SK Hynix have long had their expansion pace limited by fab construction cycles and the yield ramp of advanced process nodes. With much of their HBM capacity already locked in, the effective capacity both companies can devote to general-purpose DDR5 is limited, making it hard to respond quickly to the incremental demand from AI CPUs.

Industry insiders point out that overall DRAM supply currently falls short of demand by about 10 percentage points. Commodity-grade DRAM prices have more than doubled from their lows, driving record profits for memory makers such as Samsung and SK Hynix. With GPU-side and CPU-side demand continuing to overlap and the expected end of the super cycle pushed from 2026 to 2027, the memory-chip upcycle may prove more durable than previously anticipated.

This article was reprinted from "Wall Street News," by Zhao Ying; GMTEight editor: Chen Siyu.