COMMENTARY: Upgrading legacy systems to modern infrastructure is critical, but migrating workloads and applications introduces potential vulnerabilities that must be addressed.
It’s no secret that bad actors are targeting American infrastructure. Ransomware attacks are up by about 73% from 2022 to 2023, according to the SANS Institute, while the 2024 ECI report found that 95% of respondents dealt with a ransomware attack over the last three years.
Cybercriminals aren’t just targeting the private sector. Over the last year, hackers have targeted multiple healthcare providers, including Vanderbilt University Medical Center; utilities, including the Aliquippa Water Plant; and federal agencies like the U.S. Department of Transportation, the U.S. Department of Agriculture, and the FBI. In late June, the Cybersecurity and Infrastructure Security Agency announced that hackers attacked its Chemical Security Assessment Tool, accessing sensitive data.
Workload and application migration may have played a role in those attacks. When moving from a legacy system to a more modern infrastructure, patchwork solutions can create gaps in your security coverage, inconsistent application of security policies, and an inability to gather apples-to-apples security data from different environments. All of that makes monitoring and management tricky at best.
In short, get workload and application migration wrong, and you’ve increased your attack surface for the next bad actor who comes along.
But it’s not something you can avoid, either. Workload and application migration is a constant in IT — that’s just the nature of the beast. Over the past year, 95% of organizations said they moved applications from one environment to another, with security and innovation as the top drivers for the transfer, according to the 2024 ECI report. Eighty-five percent of those organizations said it was a challenge. And more than half of organizations say they lack interoperability between their various infrastructure environments. That all adds up to a perfect storm of security vulnerability, because it’s during migration that security practices can lapse.
One reason data migration is so complex is that data is stored in any number of locations, from brick-and-mortar data centers to the cloud to a smaller edge location (a data center located physically near your users), or a mix of all three. Complicating matters, your IT infrastructure is, and needs to be, a constantly evolving landscape. Even the best-designed environment, one that’s ideal at the start of an enterprise workload venture, can quickly become suboptimal as the project’s parameters and scope evolve; it needs to adapt as new technologies, compliance requirements, customer needs, and threats emerge. So while it may be tempting to throw out your infrastructure and start from scratch (a “greenfield deployment”), even if you had the budget for it, not to mention signoff from the public, your environment wouldn’t stay secure for long. It must adapt.
We are seeing an explosion in AI-related workloads: IDC expects 750 million new applications to enter the market by 2026. With this exponential expansion in data and workloads, even the most well-funded private ventures are already struggling to keep up.
So it’s no wonder that organizations are hesitant to execute complex application migrations: 89% of IT decision makers consider moving workloads to a different cloud environment costly and time-consuming. They think it’s easier to secure siloed data and less trouble to keep up with the shifting sands of stringent data privacy regulations. But those who delay taking action may find themselves falling farther and farther behind.
Instead of trying to hold back the tide, smart organizations are learning how to manage it. With a data migration strategy that incorporates hybrid or even multicloud solutions, organizations can minimize operational change, ensure a smooth migration, and deliver the kind of performance, reliability, and support that modern enterprise demands.
People who regularly work with data know that it will inevitably become distributed, especially given the risk of centralizing all your data in one place. And that’s a good thing. Distributing workloads across multiple environments reduces the risk of losing data, since it can be stored redundantly. That makes it easier to ensure business continuity in case of disaster. Distributing workloads also means you can protect sensitive data by segregating it in a private cloud. In a multi-cloud setup, you can tailor security measures to each environment while keeping security policies consistent across all of them.
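One way to reason about "tailored measures, consistent policies" is policy as code: define a shared security baseline once, let each environment add its own stricter controls, and automatically flag any environment that drifts from the baseline. The sketch below is a minimal illustration only; the environment names and policy fields are hypothetical, not any particular vendor's API.

```python
# Hypothetical policy-as-code sketch: every environment must satisfy a
# shared security baseline, but may layer on its own tailored controls.

BASELINE = {
    "encryption_at_rest": True,
    "mfa_required": True,
    "logging_enabled": True,
}

# Illustrative per-environment policies (names and fields are made up).
ENVIRONMENTS = {
    "on_prem": {**BASELINE, "air_gapped_backups": True},
    "private_cloud": {**BASELINE, "data_residency": "us-only"},
    "public_cloud": {
        "encryption_at_rest": True,
        "mfa_required": True,
        "logging_enabled": False,  # has drifted from the baseline
    },
}

def find_policy_drift(envs, baseline):
    """Return {environment: [baseline settings it violates]}."""
    drift = {}
    for env, policy in envs.items():
        violations = [key for key, required in baseline.items()
                      if policy.get(key) != required]
        if violations:
            drift[env] = violations
    return drift

if __name__ == "__main__":
    print(find_policy_drift(ENVIRONMENTS, BASELINE))
    # prints {'public_cloud': ['logging_enabled']}
```

Running a check like this on every migration, rather than once at setup, is what keeps the baseline consistent even as individual environments evolve.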
Many IT leaders still think of on-prem, the cloud, and even individual clouds as separate, siloed repositories of information, but that’s not the reality. The truth is that modern tools can mitigate the risks of data decentralization, secure sensitive data consistently across environments, and make it simpler to collect, monitor, and manage distributed systems.
Application and data movement is only going to increase. Savvy IT leaders should emphasize flexibility and visibility when planning their infrastructure choices. Organizations must establish processes and operations to manage data and applications across environments, ideally as part of a larger IT modernization initiative. More specifically, organizations should design their IT environments to facilitate data and application portability. The growing pervasiveness of hybrid multi-cloud is proof that applications and data will continue to favor diversity and movement.