Nvidia’s decision to switch the type of memory used in its AI servers could sharply increase global memory prices over the next two years, according to a new report from Counterpoint Research.
Chipmakers are already struggling with shortages of older memory chips because most production has shifted to the high-end chips needed for AI. Counterpoint says an even bigger problem is coming.
To cut power costs, Nvidia is moving from traditional server memory (DDR5) to low-power chips called LPDDR, the kind usually found in phones and tablets. An AI server, however, needs far more memory chips than a smartphone, so the switch could suddenly create massive demand for LPDDR that suppliers are not prepared to meet.
Because of this, chipmakers such as Samsung, SK Hynix, and Micron may have to redirect more factory capacity to LPDDR. If they do, shortages across the entire memory market could worsen.
Counterpoint warns that the shift turns Nvidia into a “smartphone-sized” memory customer overnight, a major shock to the supply chain. The firm predicts server-memory prices could double by late 2026 and says overall memory prices may rise 50% by mid-2026.
If that happens, cloud companies and AI developers will face higher costs at a time when they’re already spending heavily on GPUs and data-centre power upgrades.