
Memory and Storage Market Update: Recent Signals Across DRAM, HBM, and NAND
The global memory and storage market is entering a phase where pricing, supply, and technology roadmaps are increasingly shaped by AI demand rather than traditional consumer cycles. Recent data and vendor disclosures point to tightening conditions across DRAM, HBM, and NAND, with ripple effects now visible throughout servers, SSDs, and downstream systems.
Macro Market Signals
DRAM pricing has accelerated sharply. Recent industry reporting shows that DRAM contract prices surged in late 2025, driven by continued demand from AI, cloud, and data center customers. Samsung’s outlook for late 2025 reflects a dramatic rise in DRAM prices, with contract pricing reportedly up more than 300% year-over-year in some categories and a further ~55–60% increase expected in early 2026 as suppliers prioritize memory for AI workloads. This pricing strength has been linked to tight supply and strong enterprise demand.
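For a sense of scale, the two reported moves compound multiplicatively. The quick check below treats "more than 300% year-over-year" as a 4× multiple (a 300% increase) and takes the midpoint of the ~55–60% range — both interpretive assumptions, not sourced figures:

```python
# Illustrative compounding of the reported DRAM contract price moves.
# Assumption: "300% year-over-year" means the price reached 4x its
# year-ago level (a 300% increase), not 3x.
yoy_multiple = 1 + 3.00          # +300% YoY -> 4.0x
early_2026_multiple = 1 + 0.575  # midpoint of the ~55-60% expected rise

combined = yoy_multiple * early_2026_multiple
print(f"Implied cumulative multiple: {combined:.2f}x")  # 6.30x
```

Under those assumptions, a DRAM category that saw both moves would be priced at roughly 6× its level from a year earlier — the kind of swing usually seen only at cycle extremes.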
Independent market commentary also notes that memory supply constraints—particularly arising from the shift toward high-bandwidth memory and advanced server DRAM—are contributing to rising prices at both contract and spot levels.
NAND flash markets are tightening. While TrendForce data shows more modest NAND contract price growth in late 2025 — with typical contract increases in the 5%–10% range — this still represents a reversal from earlier price declines and reflects tightening supply as demand rebounds in higher-value segments such as enterprise SSDs and cloud storage workloads.
Alongside overall price increases, market commentary from storage suppliers and controller makers has confirmed that NAND flash prices have more than doubled over recent quarters in some configurations, driven by AI and data center buying — and that all 2026 NAND production capacity is effectively sold out, supporting continued price strength.
AI demand is the common denominator. AI-driven demand for high-bandwidth memory (HBM) and advanced DRAM continues to reshape the memory market. Suppliers are prioritizing capacity for AI and cloud customers, which has contributed to tighter overall memory supply and persistently elevated pricing across DRAM, DDR5 server memory, and enterprise SSDs. Independent commentary on memory shortages extending into 2026 links these supply pressures directly to AI workload growth and data center deployments requiring specialized memory types.
Technology Roadmap Developments
On the advanced memory front, HBM4 remains a focal point, but its mass production timeline has shifted later than many early expectations. Industry reporting indicates that HBM4 volume availability is now expected no earlier than late Q1 2026, and possibly mid-2026, largely due to evolving customer specifications and the need to revise designs to meet higher performance targets. This delay extends the market’s near-term reliance on HBM3E-class products, which continue to fill the bandwidth gap for AI accelerators.
At the same time, memory expansion technologies are gaining traction as data centers look for ways to scale capacity beyond on-package DRAM. Compute Express Link (CXL) is increasingly discussed as a key memory expansion and composable resource solution, with the latest CXL 3.1 specification enabling fabric-attached memory and improved systems-level interoperability between CPUs and external memory pools. Early demonstrations and infrastructure planning guidance indicate that memory pooling via CXL is progressing toward production-ready use cases, particularly for AI inference and large cache sharing across nodes.
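The capacity argument for pooled memory can be sketched with a toy placement model: allocations land in local DRAM until it is exhausted, then spill into a larger but slower fabric-attached pool. The node sizes, latencies, and workload sizes below are illustrative assumptions, not CXL specification figures:

```python
# Toy model of a host spilling allocations from local DRAM into a
# CXL-attached memory pool once local capacity runs out.
# All capacities (GB) and latencies (ns) are illustrative assumptions.
LOCAL_DRAM_GB, LOCAL_LATENCY_NS = 512, 100
CXL_POOL_GB, CXL_LATENCY_NS = 2048, 250   # pooled memory: slower but larger

def place(alloc_gb, local_free_gb):
    """Place an allocation locally if it fits, otherwise in the CXL pool."""
    if alloc_gb <= local_free_gb:
        return "local", local_free_gb - alloc_gb
    return "cxl", local_free_gb

local_free = LOCAL_DRAM_GB
placements = []
for size in [200, 200, 200, 400]:   # GB requested by successive workloads
    tier, local_free = place(size, local_free)
    placements.append(tier)

print(placements)  # ['local', 'local', 'cxl', 'cxl']
```

The point of the sketch is that the third and fourth workloads run at all — without the pool they would fail to allocate — at the cost of higher access latency, which is why CXL discussion centers on capacity-bound cases like inference caches rather than latency-critical hot paths.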
Other emerging memory concepts such as Processing-in-Memory (PIM) and near-memory compute continue to advance in research and prototype phases. These approaches aim to reduce data movement overhead and improve performance efficiency by integrating computation closer to memory. While broad adoption remains limited today, academic and industry progress underscores their potential for future high-efficiency workloads.
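The data-movement argument behind PIM can be made concrete with back-of-the-envelope byte counts for a simple reduction: conventionally every element crosses the memory bus, while a near-memory design moves only per-bank partial sums. The array size and bank count below are illustrative assumptions:

```python
# Bytes crossing the memory bus when summing a large array
# conventionally vs. with near-memory (PIM-style) reduction.
# Sizes are illustrative assumptions.
ELEMENTS = 1_000_000_000   # one billion 4-byte elements resident in memory
ELEM_BYTES = 4
PIM_BANKS = 64             # each bank reduces its slice to one partial sum

conventional_bytes = ELEMENTS * ELEM_BYTES   # every element moves to the CPU
pim_bytes = PIM_BANKS * ELEM_BYTES           # only partial sums move

reduction = conventional_bytes / pim_bytes
print(f"Data moved shrinks by {reduction:,.0f}x")
```

Even allowing for real-world overheads the sketch ignores (command traffic, operand layout, limited in-bank operations), the gap is large enough to explain the sustained research interest.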
Vendor Updates
Samsung — Memory and Storage Innovation
Samsung continues to push deeper integration across memory, storage, and internal silicon. On the SoC side, the company is advancing the Exynos 2800 platform with in-house CPU and GPU development, aiming to reduce long-term reliance on external vendors such as Qualcomm and AMD. Current planning targets deployment in flagship Galaxy devices around the 2028 timeframe.
On the storage side, Samsung has demonstrated NVMe SSD optimizations using ATS/PRI mechanisms to improve data access efficiency. QEMU-based simulations show ATC-enabled designs reducing access latency, highlighting software–hardware co-optimization rather than raw bandwidth gains.
Samsung is also exploring AI-driven fraud detection systems using CXL-based memory expansion, with internal testing indicating up to a 4× improvement in total cost of ownership through shared memory pools.
From a market and partnership perspective, Samsung is increasingly viewed as a potential second-source manufacturing partner within Nvidia’s ecosystem, while also engaging in licensing discussions with AI-focused companies such as Groq. At the same time, Samsung has delayed portions of NAND delivery to non-priority channels, favoring data center customers. This has contributed to sharp price increases—estimated at 50–100%—in certain SSD segments.
SK hynix — HBM and AI Memory Leadership
SK hynix continues to lead the HBM and AI memory market. 2026 outlooks project the global memory market exceeding $440 billion, with the HBM segment reaching roughly $54–55 billion; Bank of America forecasts cited in industry reporting put the 2026 HBM market at approximately $54.6 billion, up ~58% year-over-year as AI infrastructure demand expands. SK hynix is estimated to control ~62% of HBM shipments and ~57% of HBM revenue as of Q2–Q3 2025, maintaining clear leadership.
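The cited forecast is internally easy to sanity-check: a ~$54.6 billion 2026 market growing ~58% year-over-year implies a 2025 base in the mid-$30 billion range.

```python
# Sanity check on the cited HBM forecast: what 2025 base does a
# ~$54.6B 2026 market at ~58% YoY growth imply?
hbm_2026_bn = 54.6
yoy_growth = 0.58

implied_2025_bn = hbm_2026_bn / (1 + yoy_growth)
print(f"Implied 2025 HBM market: ~${implied_2025_bn:.1f}B")  # ~$34.6B
```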
The company was first to mass-produce 12-high stacked HBM, including HBM3E and early HBM4-class devices (~36 GB per stack), with deployments closely tied to Nvidia and Google accelerator platforms. SK hynix has completed internal certification for HBM4 and prepared production systems, and analysts project the company will hold HBM market share in the low-60% range in 2026 supported by early HBM4 supply to key customers such as Nvidia (SK hynix newsroom).
Beyond HBM, SK hynix is expanding its AI memory portfolio across both DRAM and NAND. This includes AI-D DRAM technologies such as MRDIMM, SOCAMM2, LPDDR5R, CMM, and PIM, alongside AI-N NAND offerings like high-IOPS SSDs, HBF architectures, and QLC-based storage optimized for AI workloads.
On expansion and partnerships, SK hynix has announced plans for a large-scale 2.5D packaging facility in Indiana, targeting operations around 2028 with an investment approaching $3.87 billion. This move positions SK hynix to challenge more vertically integrated HBM supply models. In parallel, the company is collaborating with Nvidia and Kioxia on SLC-based AI SSDs targeting extreme IOPS ranges (25 million to 100 million).
At its AI-focused events, SK hynix has emphasized a strategic shift from component supplier to ecosystem creator, anchored around three pillars: customized HBM, AI-optimized DRAM, and AI-optimized NAND.
Application-level innovation is also expanding. Industry discussions increasingly highlight memory’s role in emerging fields such as humanoid robotics, where ACiM and PIM architectures enable low-power neuromorphic computation. Recent demonstrations at SC25 showcased advanced memory products including 256GB modules and ultra-high-capacity eSSDs reaching 245TB.
Micron — Financial Acceleration and Capacity Expansion
Micron entered FY2026 with strong momentum, reporting Q1 revenue of ~$13.6 billion, up roughly 57% year over year, driven primarily by AI-related DRAM and NAND demand. Data center revenue exceeded $7.6 billion (+55% YoY), while HBM and server DRAM revenue surpassed $5.6 billion, representing an approximately 5× increase from the prior year.
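Working backward from the reported growth rates gives a rough view of the year-ago baseline — a quick division, assuming the stated percentages apply to the stated totals:

```python
# Implied prior-year figures from Micron's reported Q1 FY2026 growth rates.
q1_fy26_rev_bn = 13.6   # total revenue, ~+57% YoY
rev_yoy = 0.57
dc_rev_bn = 7.6         # data center revenue, ~+55% YoY
dc_yoy = 0.55

implied_prior_rev = q1_fy26_rev_bn / (1 + rev_yoy)   # ~$8.7B
implied_prior_dc = dc_rev_bn / (1 + dc_yoy)          # ~$4.9B
print(f"Year-ago baseline: ~${implied_prior_rev:.1f}B total, "
      f"~${implied_prior_dc:.1f}B data center")
```

Notably, the implied year-ago data center share (~57% of total) has held roughly steady even as absolute figures surged, underscoring how broad-based the AI-driven growth is.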
To support continued growth, Micron plans ~$20 billion in FY2026 capital expenditures, focused on expanding advanced DRAM and HBM capacity. The company has also revised its HBM market outlook, projecting the HBM TAM could approach $100 billion by 2028, reflecting accelerating adoption in AI accelerators.
On the product front, Micron introduced the 3610 NVMe SSD, positioned as the industry’s first PCIe Gen5 QLC client SSD, offering 1–4 TB capacities and up to ~11,000 MB/s sequential reads. In parallel, Micron announced it will exit portions of the Crucial consumer SSD and memory business by early 2026, reallocating resources toward higher-margin enterprise and data center markets.
Capacity expansion remains a strategic priority. Micron plans to invest ~$9.6 billion in a new HBM facility in Japan, targeting production around 2028, as part of its global AI memory roadmap. In the United States, the company has begun construction on a New York megafab project, supported by federal and state incentives, forming part of a broader long-term manufacturing expansion strategy. The company is also evaluating strategic options, including a possible acquisition of PSMC’s Tongluo fab. At HPE Discover 2025, Micron-aligned systems appeared in AMD’s Helios AI rack platforms with expanded networking.
Sandisk — Brand Reset and SSD Positioning
Sandisk has formally rebranded its client SSD lineup under the new “SanDisk Optimus” family, replacing legacy WD_Black and WD Blue models with three tiers — Optimus, Optimus GX, and Optimus GX PRO — aimed at creators, gamers, and professionals, including AI PC workloads. The rebranding was announced at CES 2026 and reflects Sandisk’s initiative to unify its SSD offerings following its 2025 spin-off from Western Digital.
Sandisk’s Optimus GX PRO tier is highlighted as the flagship SSD line designed for high-performance workloads such as content creation, professional workstations, and AI-oriented systems.
There is also industry evidence of NAND flash supply tightness and pricing pressure affecting SSD makers. Transcend and other module suppliers have reported delayed NAND deliveries from major suppliers including SanDisk, contributing to sharp price increases of 50–100% in SSDs and flash products as capacity is prioritized for data centers and hyperscale customers.
From an architectural perspective, Sandisk is highlighting HBF capacity challenges and positioning stacked NAND designs as a complementary solution to HBM, enabling terabyte-scale memory pools. Academic commentary, including insights from KAIST researchers, continues to emphasize the importance of GPU optimization alongside memory hierarchy design.
Kioxia — Product Execution and Strategic Pressure
Kioxia continues to focus on enterprise and efficiency-driven storage. The BG7 SSD series offers capacities from 256GB to 2TB over PCIe 4.0, delivering up to 7,000 MB/s read speeds while improving power efficiency by approximately 67%. At CES 2026, Kioxia highlighted SSD solutions spanning AI, mobile, and automotive applications.
In the data center segment, Kioxia’s CM7, CD8P, and CD8 SSD families have achieved compatibility with Microchip SmartRAID controllers, supporting up to 32 NVMe devices per system. On the software side, Kioxia is advancing AI-oriented optimization through AiSAQ ANNS software, which integrates with the Milvus vector database to reduce DRAM usage. The project has been released as open source on GitHub.
However, financial pressure remains. Q2 FY2025 net profit declined by more than 60%, reflecting ongoing challenges in product mix and margin structure. Industry speculation continues around a potential US–Japan joint NAND fab, with Kioxia viewed as a possible participant. Meanwhile, progress in QLC NAND transition has been slower than expected, with attention increasingly centered on ultra-high-capacity enterprise SSDs, including a 245TB-class eSSD targeted for production around 2026.
What to Watch (Next 6–12 Months)
HBM Production Scaling:
Whether HBM4 and advanced HBM3E capacities can scale fast enough to meet AI accelerator demand while supporting broader server and enterprise memory needs.
Price Divergence:
The gap between AI‑optimized memory pricing and legacy memory pricing may widen further if supply constraints persist.
Inventory Dynamics:
With inventories at historic lows, even modest shifts in demand could trigger outsized price volatility.
Secondary Market Signals:
Tightening enterprise memory and storage supply is likely to boost activity in secondary markets, increasing opportunities and pricing for businesses looking to sell surplus RAM or SSDs.
Regional Supply Considerations:
Ongoing US–China tensions and China’s domestic memory expansion strategy may influence sourcing, pricing, and regional inventory planning, underscoring the value of diversifying supply channels.