Mar 19 2026
Data Center

NVIDIA GTC 2026: The Rise of AI Requires a Rethink of Enterprise Storage Technology

Because inference will be king as more AI agents come online and take on more work, the enterprise storage industry is reimagining how it builds storage to keep up.

For a long time, the norm was corporate structured data stored in files that lived within operating systems. IT teams had to worry about storing and retrieving those files, and backing them up and restoring them if a worst-case scenario happened.

But the world we’re headed toward, where data is fluid and context-dependent and, most important, not always generated, stored or retrieved by humans, calls for a rethink of how IT teams and leaders approach their storage infrastructure and architecture.

"We used to have humans using the storage system, we used to have humans using SQL, now we're going to have AIs using these storage systems,” said NVIDIA founder and CEO Jensen Huang during his keynote address at NVIDIA GTC 2026.

One of the biggest differences to account for when AI agents access storage, rather than humans, is that agents have an even greater appetite for speed and agility than their human counterparts.

“Unlike humans, who are more tolerant of slower computers, AI wants the tools to be as fast as possible,” said Huang.

In announcing NVIDIA’s forthcoming BlueField-4 STX storage system, Huang underscored the need to rethink storage from the ground up.

“Agentic AI is redefining what software can do, and the computing infrastructure behind it must be reinvented to keep pace,” said Huang. “AI systems that reason across massive context and continuously learn require a new class of storage. NVIDIA STX reinvents the storage stack, providing a modular foundation for AI-native infrastructure that keeps AI factories operating at peak performance.”

Fluid, Dynamic Data Is Essential for AI, but Complicates Storage Needs

When people think about AI, they often think about the massive sets of training data needed to bring a model up. But data preparation, which includes data extraction, enrichment, classification, embedding, indexing and semantic search, is resource-intensive and requires high-performance storage infrastructure.
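To make those steps concrete, here is a minimal, hypothetical sketch of that preparation flow in Python. The classify() and embed_text() functions are placeholders standing in for whatever enrichment and embedding models an enterprise actually runs; a production pipeline would hand the resulting records to a real vector index rather than keep them in memory.

```python
# Minimal sketch of the preparation steps described above: extraction,
# chunking, enrichment/classification and embedding, producing records
# that are ready to be indexed for semantic search.
# classify() and embed_text() are placeholders, not a specific NVIDIA API.
import numpy as np

def classify(text: str) -> str:
    """Placeholder enrichment step: tag each chunk with a coarse category."""
    return "financial" if "revenue" in text.lower() else "general"

def embed_text(text: str) -> np.ndarray:
    """Placeholder: return a fixed-size, normalized embedding for a chunk."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

def prepare(document: str, chunk_size: int = 200) -> list[dict]:
    """Extract -> chunk -> enrich -> embed: one record per chunk, ready to index."""
    chunks = [document[i:i + chunk_size] for i in range(0, len(document), chunk_size)]
    return [
        {"text": c, "category": classify(c), "embedding": embed_text(c)}
        for c in chunks
    ]

records = prepare("Q3 revenue grew 12 percent on strong data center demand. " * 10)
print(len(records), records[0]["category"])
```

The point is the shape of the work: every document is repeatedly parsed, tagged and converted into vectors, which is exactly the kind of continuous, read-heavy workload that stresses storage.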

“Whether you're training a model, you're fine-tuning a model or you're retrieving context that the model doesn't know about to inform its decisions, all of those things require access — fast access — to accurate, recent data,” said Jacob Liberman, director of enterprise product for NVIDIA, while speaking during his session “Accelerating the Path to Production: The Evolution of Enterprise Storage to Deliver AI-Ready Data” at NVIDIA GTC 2026.

Prior to the AI era, the hottest conversations in data and storage were about amassing Big Data and pooling it into large storage systems for retrieval. These were called data lakes. But Liberman believes AI is shifting the analogy away from data lakes toward something with a lot more movement.

“The data is not only complex, but it's growing all the time. You know, people talk about data as a lake, but it's more like a river,” said Liberman. “It's flowing, it's changing — it's constantly changing. So, the first time that you prepare your data is not going to be the last time. You have to continuously prepare your data for AI as it changes.”

Part of the reason for this shift is that with AI, enterprises are dealing with unstructured data in ways that they haven’t before.

“The data itself is multimodal and heterogeneous. You're not just dealing with one type of data,” said Liberman. “Traditional enterprise applications have siloed data sources, specific data for each application, but an AI agent doesn't — isn't constrained by those boundaries. It needs to tap into all the data sources in order to make a good decision.”

To build storage technology better suited to agentic AI, NVIDIA is working with storage industry leaders such as IBM, Dell and NetApp, whose systems together account for an estimated 60%-70% of the world’s on-premises enterprise data, to make AI-ready storage infrastructure a reality.

Jacob Liberman shows a slide of NVIDIA's storage partners at NVIDIA GTC 2026

The messy, heterogeneous, unstructured data poised to make up much more of our enterprise data in the future is harder to parse and understand. But unlocking access to that data, with agents doing the heavy lifting, promises to give companies and the IT industry as a whole some seriously impressive new capabilities.

“The ability for us to do things like optimize on vector search functionality, the ability to optimize on how we integrate in and communicate with these vector databases, not only allows for faster inferencing, but also faster awareness of data,” said Jason Hardy, vice president of storage technology for NVIDIA, during the session with Liberman. “The agents that are being created in the enterprise allow for us to push that real-time awareness of data which becomes critical as we start to blend in and use it to drive how our businesses operate.”
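Hardy’s point about vector search and vector databases can be illustrated with a small sketch. The example below uses FAISS, an open-source similarity-search library chosen here purely for illustration (the session did not name a specific index or database), to show how newly prepared embeddings can be added continuously and queried at inference time.

```python
# Minimal sketch: keeping a vector index current as data changes, then
# querying it at inference time. FAISS is used only as an illustrative
# open-source index; dimensions and batch sizes are assumptions.
import numpy as np
import faiss

dim = 384                           # embedding dimensionality (assumed)
index = faiss.IndexFlatIP(dim)      # inner product == cosine on normalized vectors

def add_documents(embeddings: np.ndarray) -> None:
    """Ingest newly prepared data: embeddings must be float32; normalized in place."""
    faiss.normalize_L2(embeddings)
    index.add(embeddings)

def retrieve(query_embedding: np.ndarray, k: int = 5):
    """Fetch the k nearest chunks to ground an agent's next step."""
    q = query_embedding.astype(np.float32).reshape(1, -1)
    faiss.normalize_L2(q)
    scores, ids = index.search(q, k)
    return scores[0], ids[0]

# As the "river" of data flows, new batches are embedded and added continuously.
add_documents(np.random.rand(1000, dim).astype(np.float32))
print(retrieve(np.random.rand(dim).astype(np.float32)))
```

Keeping that ingest loop running as the underlying data changes is what turns a static index into the real-time awareness of data Hardy describes.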

Keep this page bookmarked for articles from NVIDIA GTC 2026.
