IT infrastructure: The role of the cloud
Richard Spires explains how agencies can plan for an orderly and forward-looking consolidation.
In my previous columns on IT infrastructure, I discussed the importance of moving to a modern, standardized and consolidated IT infrastructure, at least at an agency-by-agency level. Such infrastructure rationalization is foundational for enabling IT to be most effective in serving the government's mission and business customers.
In my columns, I presented four structural obstacles that have greatly inhibited the federal government’s ability to make significant progress on IT infrastructure rationalization, along with six approaches government can take to surmount those obstacles.
Before moving on to other topics, let’s address what I mean by a modern, standardized and consolidated IT infrastructure.
There are a number of viable models that can work for an agency, depending on the size and complexity of its IT systems. Perhaps ironically, I find it concerning that an agency would rely on only one vendor to provide its IT infrastructure capabilities, even though one would think that is the very definition of consolidated.
Instead, my experience shows that government is better served when there is regular competition for such services or at least the real threat of such competition. Too often, agencies get locked into long-term contracts that do not provide mechanisms for ensuring that an agency is keeping pace with ongoing improvements in technology. As a result, the agency falls further behind, making it exceedingly difficult to introduce new capabilities and reduce overall operating costs.
Agencies must take advantage of an approach that aligns with the mature commercial business models that serve large private-sector firms. A key component of an agency’s IT infrastructure strategy should be leveraging cloud computing capabilities -- private clouds hosted at government data centers or at a vendor’s facility, together with public cloud services.
The business model is compelling because it lowers overall capital costs and moves agencies to a consumption-based model. What is equally compelling is that cloud services are easy to benchmark in terms of service quality and cost, which enables agencies to measure whether they are getting service at competitive prices. That helps ensure a fair deal today, and as services evolve, agencies can continue to benchmark offerings to ensure that their cloud service providers are staying competitive.
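The benchmarking argument above can be made concrete with a little arithmetic. The sketch below compares a capital-expenditure model (buy hardware, pay to operate it) against a consumption-based cloud model over the same period. All figures, function names and rates are illustrative assumptions for the sake of the comparison, not real agency or vendor pricing.

```python
# Hypothetical cost-benchmarking sketch: agency-owned infrastructure (capex)
# versus a consumption-based cloud service. All numbers are illustrative.

def capital_model_cost(years, hardware_capex, annual_ops):
    """Total cost of owning infrastructure: upfront hardware plus yearly operations."""
    return hardware_capex + annual_ops * years

def consumption_model_cost(years, monthly_usage_hours, rate_per_hour):
    """Total cost of a pay-as-you-go cloud service over the same period."""
    return monthly_usage_hours * 12 * years * rate_per_hour

# Illustrative five-year comparison
owned = capital_model_cost(years=5, hardware_capex=2_000_000, annual_ops=400_000)
cloud = consumption_model_cost(years=5, monthly_usage_hours=5_000, rate_per_hour=10)

print(f"Owned infrastructure: ${owned:,}")
print(f"Consumption model:    ${cloud:,}")
```

The point is not the specific totals but that a consumption model produces line items that can be re-benchmarked against competing providers every year, whereas sunk capital cannot.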
So what might a modern, standard and consolidated infrastructure look like at a government agency?
In most cases, agencies will need an ongoing brick-and-mortar data center. Depending on its size and complexity, an agency might need multiple data centers to provide high-availability and disaster recovery capabilities. That might justify a small number of physical data centers but not the dozens or even hundreds that still exist at some of the large agencies. Some legacy applications cannot easily live in cloud architectures, and applications that house highly sensitive and even classified systems will need the physical security controls of a dedicated data center. However, agencies should have plans to modernize legacy systems to at least enable them to move to a highly virtualized environment.
Furthermore, agencies should be migrating most of their applications from stand-alone servers dedicated to individual systems to cloud services using production, development and test-as-a-service models. Enterprise commodity applications should be migrated to software-as-a-service models, with applications like email and SharePoint leading the way. Some agencies are now getting more creative and looking at other enterprise SaaS offerings, including business intelligence and customer relationship management. Cloud models are even being used for virtual desktops and mobile device management, enabling agencies to move away from buying and directly managing end-user devices.
A continuing key concern about cloud computing is security. Agencies should deploy a set of private cloud services in their brick-and-mortar data centers for applications and infrastructure that handle sensitive data. Applications that use non-sensitive data can rely on public cloud services.
I am drawn to that model for several reasons:
1. It will continue to spawn competition for data center and private cloud services. As the Federal Risk and Authorization Management Program (FedRAMP) matures and as public cloud service providers address security concerns, it will become safe to move more sensitive data to public cloud services. Vendors that provide existing data center services have ample reason to stay competitive -- namely, the looming threat posed by public cloud service providers.
2. By migrating applications to a private cloud capability now, agencies will have standardized and modernized those applications, making it easier to move them to public cloud providers in the future.
3. Cloud brokerage models are evolving that will foster even more competition. The models allow agencies, via a broker, to easily shop their applications among multiple public cloud service providers.
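The hybrid placement model described above -- sensitive workloads in a private cloud at an agency data center, non-sensitive workloads in public cloud services -- amounts to a simple routing rule. The sketch below illustrates it; the sensitivity categories and target names are hypothetical assumptions, not drawn from any agency's actual classification scheme.

```python
# Hypothetical workload-placement rule for the hybrid model:
# sensitive data stays in a private cloud, the rest can use public cloud.
# Category names and targets are illustrative assumptions.

def place_workload(sensitivity: str) -> str:
    """Return a hosting target based on a workload's data sensitivity label."""
    private_tiers = {"classified", "sensitive"}
    if sensitivity.lower() in private_tiers:
        return "private-cloud"   # agency data center, dedicated physical controls
    return "public-cloud"        # FedRAMP-authorized public service

print(place_workload("classified"))
print(place_workload("public"))
```

In practice the decision would weigh far more than a single label, but encoding even a coarse rule like this as policy is what lets a cloud broker shop the non-sensitive tier across multiple providers.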
Returning to the unifying theme of my columns on IT infrastructure, the key to making any of these efforts succeed is a commitment to consolidating IT infrastructure. Agencies need scale to deploy a sophisticated model of enterprise data centers, along with the use of private and public clouds that can drive significant efficiencies.
If an agency allows each of its programs or offices to go it alone, there might be some use of cloud computing on a program-by-program basis, but little efficiency will be gained and no standardization will happen. As a result, the agency will still face daunting complexity, inefficiency and inflexibility in its IT infrastructure.