Was Google "wrongly accused"? The storage supply chain pushes back: AI demand has not fallen, and structural shortages persist
Google has recently rolled out the TurboQuant compression algorithm, claiming it reduces the memory footprint of key model caches by at least 6x. The news briefly sent the storage industry into panic.
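The tension in this story is arithmetic: a large per-request compression factor can still be outweighed by growth in total usage. The sketch below illustrates this with a rough transformer KV-cache sizing formula and entirely hypothetical numbers (the model shape, request counts, and the mapping of "key model caches" to KV caches are illustrative assumptions, not Google's actual figures).

```python
# Illustrative arithmetic (hypothetical numbers, not Google's actual figures):
# even a ~6x compression of per-request cache memory can be outweighed by
# growth in the volume of inference traffic.

def kv_cache_bytes(layers, heads, head_dim, seq_len, bytes_per_value):
    """Rough size of a transformer KV cache for one request:
    2 tensors (K and V) per layer, each of shape [heads, seq_len, head_dim]."""
    return 2 * layers * heads * head_dim * seq_len * bytes_per_value

# A hypothetical large model serving a 4096-token context in fp16 (2 bytes/value).
baseline = kv_cache_bytes(layers=80, heads=8, head_dim=128,
                          seq_len=4096, bytes_per_value=2)
compressed = baseline / 6  # the claimed ~6x reduction

# If inference traffic grows 10x while per-request memory shrinks 6x,
# aggregate memory demand still rises.
aggregate_before = 1_000 * baseline    # 1,000 concurrent requests
aggregate_after = 10_000 * compressed  # 10,000 concurrent requests

print(f"per-request: {baseline / 2**30:.2f} GiB -> {compressed / 2**30:.2f} GiB")
print(f"aggregate:   {aggregate_before / 2**40:.2f} TiB -> {aggregate_after / 2**40:.2f} TiB")
```

Under these assumed numbers, per-request memory drops from roughly 1.25 GiB to about 0.21 GiB, yet aggregate demand grows from about 1.2 TiB to about 2.0 TiB, which is the dynamic the supply-chain executives quoted below are describing.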
But at the MemoryS 2026 summit, executives from multiple storage vendors, including Samsung Electronics, Yangtze Memory, Kioxia, and SanDisk, along with cloud computing and chip companies, pushed back in quick succession. They argue that as AI adoption accelerates, storage demand is still being pushed to new highs, and the shortage could persist.
A conference attendee told First Financial that although some technologies are improving data-processing efficiency, the overall scale of AI applications is expanding, so actual storage demand is still growing. "Over the past two days I've met with 20 groups of people, and every one of them asks me whether there's inventory. In the inference stage, storage demand is growing exponentially."
"AI is rapidly consuming storage production capacity," said Ta Wei, General Manager of CFM Flash Market, at the meeting. Data disclosed on-site show that in 2026, AI servers will account for more than 20% of total server shipments, further driving up per-server storage configurations.
He said AI inference is driving eSSD to become the largest application market for NAND in 2026. The smartphone market, by contrast, has been relatively flat, though on-device AI is expected to become a new growth engine. Automobiles, another important application for LPDDR, are also becoming a key scenario for AI deployment.
As AI applications move from model training to more frequent real-world use, companies' requirements for data read speed and responsiveness have risen markedly. "High-performance storage is no longer an optional nice-to-have; it is the core foundation that determines a system's decision-making efficiency and scale," said Zhang Wancan, Executive Vice President of Samsung Electronics. Samsung plans to introduce EDSFF drives with a thickness of only 1T between 2026 and 2027, an approach that can multiply both the total capacity and the bandwidth of a single rack, maximizing space efficiency.
Tan Hong, head of the solid-state drive business unit at Yangtze Memory, also pointed out that the current "availability of GPU clusters" is only about 50%, and that storage read efficiency has become a key factor preventing compute from reaching full performance. Improving data read and retrieval efficiency, he said, can reduce wasted compute.
Kioxia’s Chief Technology Officer, Koichi Fukuda, also said that as AI shifts from training to real applications, “storage has become the key bottleneck,” and the demand growth driven by inference scenarios is the most evident.
While demand is rising, supply-side adjustments are further tightening the market.
Ta Wei said that because the capacity-expansion cycle for storage runs 18 to 24 months, "supply shortages are unlikely to be alleviated in the short term; structural mismatches have become the norm." Manufacturers are prioritizing higher-margin AI-related products when allocating capacity, squeezing capacity for consumer products, and industry inventory has fallen to relatively low levels.
Pan Jiancheng, CEO of Phison Electronics, offered an even more direct assessment: “Flash will continue to be in short supply, and it will be short for a long time.”
Feedback from the system side also shows resources growing scarce. Teng Jingxiang, a senior technical expert on Tencent Cloud's OS kernel team, said that as AI growth eats into DRAM capacity, "memory resources are getting tighter and tighter."
"This is not a simple one-off cyclical rebound but a long-cycle paradigm shift. Storage technology is moving from incremental innovation to an architectural revolution, and concepts such as CXL, compute-in-memory, and near-memory computing are accelerating toward commercialization." Even so, Ta Wei cautioned that the industry should stay clear-headed amid the boom. He suggests rational capacity expansion on the supply side and, on the demand side, early planning and multi-source procurement: shifting from passively buying storage to proactively optimizing it.
(Intern Zhu Lingjie also contributed to this article)
(This article is from First Financial)