The Driving Force Behind Cloud Repatriation
Although only 8% to 9% of organizations intend to implement full workload repatriation, according to IDC’s recent Server and Storage Workloads Survey, the cost and regulatory challenges driving repatriation are real.
“What we hear a lot is that the unpredictability of the cost for some workloads in the cloud has become untenable,” Gordon says. Teams also need to make room in their budget for AI infrastructure.
Organizations using “private AI” applications are likely turning to on-premises infrastructure, says Rob Tiffany, research director in IDC’s worldwide infrastructure research group and part of the cloud and edge services practice.
“They're going to have large language models or small language models running on their own gear, and training or fine-tuning those AI models with their own private corporate data,” Tiffany says. These companies are hesitant to share their LLMs with AI vendors.
How Heavily Regulated Industries Are Moving Data Workloads
Strict compliance or data residency regulations governing certain data sets can lead companies to repatriate those workloads and move them back on-premises.
Compliance requirements are also a big factor, particularly in financial services. “Although there might be some experimenting in the cloud, some of these more regulated industries are thinking about how to keep production centrally in their data center, for security and availability,” Gordon says, which means the most valuable or most sensitive data could end up back in the data center.
Tiffany recommends keeping sensitive data on private, on-premises infrastructure while running large workloads in Microsoft Azure or Amazon Web Services. Application programming interfaces (APIs) make this integration between the private environment and the public cloud possible.