Forecast: Mostly Cloudy
Government technologists want to deploy cloud computing, but agencies must learn to let go.
Since taking on the role of the first federal chief information officer, Vivek Kundra has been on a crusade promoting cloud computing, the relatively new business practice of buying technology services over the Internet from a contractor or agency. The advantages of the cloud, as those in the know call it, include significant savings and a faster way for agencies to obtain the latest technology.
But as appealing as it sounds, cloud computing will require a tectonic shift in thinking for agency information technology shops - and for federal computer users - if it has any hope of being adopted in a big way. And government managers aren't exactly known for embracing big changes in the way they do business.
Still, cloud computing has a good story to tell. Instead of agencies individually awarding massive hardware and software contracts, Kundra is encouraging CIOs to buy IT as a service, in which agencies would pay for Internet access to shared hardware and software that is housed in an off-site data center. In essence, an IT shop would function like a utility, outsourcing its technology applications.
"A new generation of federal decision-makers doesn't need to see the servers running the applications. They need to see the applications running on the client device,'' says Tom Simmons, vice president for government systems at Citrix Systems, a leading virtualization vendor. "That cultural shift is coming.''
That means managers won't be able to enter a data center and see the servers running the applications that support their programs. Instead, they'll have to trust a contractor or maybe another agency to deliver applications and data to them over the Internet. That's a huge leap for federal managers who typically like control. "The biggest challenge with cloud computing is cultural," says Aileen Black, vice president of public sector for VMware, a virtualization system supplier that counts every Cabinet-level agency among its customers. "They can't go down the hallway and hug their servers anymore."
Nevertheless, technology specialists predict agencies will migrate to cloud computing, albeit gradually. Federal IT managers already have begun consolidating data centers and virtualizing servers, storage and desktops. They've also started outsourcing applications such as e-mail alerts and customer service to software-as-a-service companies. That's the beginning of cloud computing. Next, they'll revamp their data centers to operate as private clouds that offer services to other agencies and charge based on use of the equipment and applications.
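How that kind of metered charge-back might work is easy to sketch. The Python fragment below is purely illustrative (the rates, usage figures and tenant are hypothetical, not drawn from any agency's actual pricing), but it captures the basic idea of billing program offices for what they consume rather than for the hardware they own:

    # Illustrative only: a private cloud meters each tenant office's usage and
    # bills by the CPU-hour and gigabyte-month. Rates are made up for the example.
    RATE_PER_CPU_HOUR = 0.12   # dollars per virtual CPU-hour (hypothetical)
    RATE_PER_GB_MONTH = 0.10   # dollars per gigabyte stored per month (hypothetical)

    def monthly_chargeback(cpu_hours: float, storage_gb: float) -> float:
        """Return one tenant's monthly bill from metered usage."""
        return cpu_hours * RATE_PER_CPU_HOUR + storage_gb * RATE_PER_GB_MONTH

    # A tenant that ran 4 virtual CPUs around the clock (about 2,880 CPU-hours in
    # a 30-day month) and kept 500 gigabytes of data would owe $395.60.
    print(f"${monthly_chargeback(2880, 500):,.2f}")

Because the bill tracks consumption, program offices have an incentive to size their requests realistically instead of over-buying hardware up front.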
The Environmental Protection Agency has been virtualizing the Unix servers in its National Computer Center in North Carolina since 2005, using IBM's logical partitioning. The agency has done the same with its Windows servers and storage systems, and it is in the early stage of evaluating virtual desktops. "We don't necessarily have to buy equipment every time we need to put up a new server," explains Vaughn Noga, acting director of technology operations and planning at EPA's Office of Environmental Information. "The senior IT people have embraced virtualization and see the benefits from an operational perspective."
Noga says the hurdle in migrating to cloud computing is in convincing EPA's program managers they no longer have to worry about the platforms that host their applications. "In the past, people defined their needs based on processors, hard drive space and memory. They defined what they needed from a purely hardware perspective. That's got to change," he says. "Now we need to define what we need based on uptime, availability and performance. That's where we see the cultural change and the technology change in getting folks to think in those terms."
But that change might not be so hard once managers understand the savings and efficiencies cloud computing can provide, says Casey Coleman, CIO at the General Services Administration and one of the leaders organizing the government's shift to the cloud. "Your basic traditional data center is probably operating at less than 30 percent capacity. It's hugely over-provisioned and using up unnecessary energy and computing power," she says. "That's where the cost advantages of cloud computing come into play."
Where to Start
Coleman is co-chairwoman of the federal Chief Information Officers Council's cloud computing working group, which is defining cloud computing terms and the services that the government will deliver via the cloud. The council is looking at what federal agencies need to make cloud computing workable, including user provisioning, service-level monitoring, analytics and reporting.
GSA will create a one-stop shop for cloud computing services on its GSA Advantage Web site before the end of the year, she says. The electronic storefront will feature software-as-a-service and infrastructure-as-a-service vendors that have contracts with GSA.
Coleman predicts agencies initially will use cloud computing for low-risk applications, those that don't rely on sensitive data. These applications typically are used to communicate with the public and include blogs, Web hosting and collaboration. Between 40 percent and 45 percent of all federal IT applications fall into this low-risk category, she says.
Before applications with higher risk can be placed in a cloud environment, the working group plans to analyze issues such as information security, architecture, portability and interoperability. Coleman says that within 18 months the group will have examples of moderate-risk applications being operated in a cloud.
The Cloud and Virtualization
Most agencies are ready to shift to cloud computing because they've spent the past several years consolidating and virtualizing their data centers. Virtualization allows for logical partitioning of physical IT resources such as servers, desktops and storage devices so they can be used by more than one application.
By allowing agencies to consolidate as many as 30 servers onto a single physical machine, virtualization frees up data center floor space and slashes electricity costs. Virtualization also makes it easier and faster for an agency to boot up a new server or to migrate an application from a failed hardware platform to another.
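The arithmetic behind those savings is straightforward. A back-of-the-envelope sketch (the workload count and per-server wattage below are assumptions for illustration, not figures reported by the agencies) shows how a 30-to-1 consolidation ratio shrinks both the server count and the power draw:

    # Rough illustration of consolidation savings. The workload count and
    # per-server wattage are assumed values, not measurements.
    import math

    WORKLOADS = 300              # assumed number of applications, one per server today
    CONSOLIDATION_RATIO = 30     # up to 30 virtual machines per physical host
    WATTS_PER_SERVER = 400       # assumed average draw of one rack server

    servers_before = WORKLOADS
    servers_after = math.ceil(WORKLOADS / CONSOLIDATION_RATIO)

    print(servers_before, "physical servers ->", servers_after)
    print(servers_before * WATTS_PER_SERVER, "watts ->",
          servers_after * WATTS_PER_SERVER, "watts")
    # 300 physical servers -> 10; 120,000 watts -> 4,000 watts. In practice the
    # consolidated hosts are larger and busier, so real savings are smaller.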
For the Veterans Affairs Department, the need for faster disaster recovery and better performance has driven it to virtualize about 15 percent of its servers. "The primary objective there is rapid recovery of the server because it is not tied to the hardware platform," says Jeffrey Lush, executive chief technology officer for enterprise infrastructure engineering at VA's Office of Information and Technology. "Plus, virtualization gives us the ability to dynamically manage the performance of that server and bring on additional resources as the business dictates."
Lush says that while cost savings and energy efficiency are benefits of server virtualization, what has pushed VA to the new business process is improving customer service, such as keeping computers up and running. A workflow technology analyzes performance and provisions additional virtual resources when certain parameters are met. "It's kind of like predicting the weather," he says. "If there's a 70 percent chance that a server will go down, we have it automated to spin up another server so we can move customer service to a whole new level."
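The automation Lush describes amounts to a threshold policy: when the predicted chance of a failure crosses a line, stand up a replacement and move the workload. A minimal sketch of that logic, using invented helper functions rather than VA's actual tooling, might look like this:

    # Sketch of a predictive-failover policy. The probability source and the
    # provisioning/migration helpers are hypothetical placeholders.
    FAILURE_THRESHOLD = 0.70   # the "70 percent chance" Lush cites

    def provision_virtual_server() -> str:
        return "vm-standby-01"     # placeholder; a real system would call its hypervisor

    def migrate_workload(source: str, target: str) -> None:
        pass                       # placeholder for a live migration or restart

    def check_and_remediate(server_id: str, failure_probability: float) -> str:
        """Spin up a standby and move the workload if failure looks likely."""
        if failure_probability >= FAILURE_THRESHOLD:
            replacement = provision_virtual_server()
            migrate_workload(server_id, replacement)
            return f"moved {server_id} to {replacement}"
        return f"{server_id} left in place"

    print(check_and_remediate("vm-app-17", 0.82))   # -> moved vm-app-17 to vm-standby-01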
VA's goal is to provide a persistent computing architecture that includes multiple technologies, not just virtualization. The department is expanding the effort into storage, eliminating duplicate data and improving backup efficiency. "We're talking about data recovery," he says.
Private Clouds
The federal government likely will rely on something called private clouds, rather than put many applications in public cloud environments delivered over the Internet, cloud computing specialists predict.
The agency that's furthest along at developing a private cloud computing architecture is the Defense Information Systems Agency. DISA already operates like an IT service provider, recovering all its costs from its military customers and benchmarking its services against commercial industry.
The agency has been virtualizing servers in its 13 data centers since 2006, when it awarded capacity contracts to APPTIS, Hewlett-Packard Co., Sun Microsystems and Vion Corp. The eight-year contracts allow DISA to purchase server capacity on an on-demand basis and to pay for it like a utility, says Shelley Madden, chief of availability management for computing services at DISA. "We no longer as an agency have to use capital dollars and have two-year procurement cycles for hardware," she says. "We have capacity as a service from our vendors."
DISA hosts 6,000 operating environments - the term it uses to describe servers - and has virtualized 20 percent of them during the last two years.
The biggest benefit of virtualization for DISA is that it speeds the process of standing up a new server. What used to take two years in some instances now takes two hours. "That is really good news," she says. Another measurable benefit of virtualization is the reduction of excess capacity. According to Madden, DISA saw 3 percent to 5 percent utilization of its servers prior to the award of the capacity contracts. "We had so much excess capacity in sunk costs," she says. "Now that we're more of a capacity-on-demand environment, we can size the environment very small. We are driving our utilization metrics over 50 percent in the server world for those in virtualized environments."
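Those utilization numbers translate directly into hardware. Assuming a fixed aggregate workload (the demand figure below is hypothetical; only the utilization rates come from Madden), raising average utilization from roughly 4 percent to 50 percent cuts the number of hosts needed by more than a factor of ten:

    # Illustration of why utilization drives footprint. The aggregate demand is
    # an assumed figure; only the utilization rates come from the article.
    import math

    AGGREGATE_DEMAND = 100.0   # workload, in "fully busy server" equivalents (assumed)

    def hosts_needed(average_utilization: float) -> int:
        """Physical hosts required to carry the demand at a given average utilization."""
        return math.ceil(AGGREGATE_DEMAND / average_utilization)

    print(hosts_needed(0.04))   # about 4 percent utilization -> 2,500 hosts
    print(hosts_needed(0.50))   # 50 percent utilization      ->   200 hosts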
Henry Sienkiewicz, technical program director for computing services at DISA, says cloud computing will be the next generation of data centers at the agency. "On the infrastructure side of the cloud, we do run a private instance of the Akamai network," he says. "On the services side, we're looking at how do we implement software-as-a-service for the applications and communications."
DISA has two software-as-a-service pilot projects. It offers CollabNet's SourceForge software development platform on a per-user basis, and it plans to offer a commercial customer relationship management platform to its Army and Air Force customers. The projects require changes to DISA's procurement methodologies and have raised security concerns. "One of the components of software-as-a-service is that it is generally multitenant applications," Sienkiewicz says. "We're still trying to make sure that we understand the security ramifications that go on with multitenant environments."
To enable cloud computing, DISA has migrated to what it calls the Rapid Access Computing Environment for IT service delivery. RACE allows users to provision a server within 24 hours inside one of DISA's data centers, using a charge card. The agency plans to offer RACE on its classified network by the end of the year. Sienkiewicz says DISA wants to drive more military customers to migrate to virtualization and software-as-a-service models by offering lower prices.
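Self-service provisioning of that kind boils down to capturing a configuration request and promising delivery inside a fixed window. The sketch below is only a stand-in for the idea; the request fields, order number and 24-hour promise are modeled on the description of RACE, not on DISA's actual interface:

    # Illustrative self-service provisioning request, in the spirit of RACE.
    # All names, fields and identifiers are invented for the example.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class ServerRequest:
        requester: str
        cpus: int
        memory_gb: int
        storage_gb: int
        network: str   # e.g. "unclassified" or "classified"

    def submit_request(req: ServerRequest) -> dict:
        """Accept a self-service order and commit to delivery within 24 hours."""
        return {
            "order_id": "RACE-0001",   # placeholder identifier
            "config": req,
            "ready_by": datetime.now() + timedelta(hours=24),
        }

    order = submit_request(ServerRequest("army_customer", cpus=2, memory_gb=8,
                                         storage_gb=100, network="unclassified"))
    print(order["order_id"], "ready by", order["ready_by"])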
Rapid Adoption Anticipated
Once users are comfortable with the idea of IT delivered as a service, adoption rates soar. That's what happened at the House of Representatives, which experienced significant demand for its IT-as-a-service offering. The House decided to put all the newly elected members of the 111th Congress on a virtual server architecture in its main data center. Costs dropped because members didn't have to buy hardware and software. The service immediately signed up 57 offices, and that number quickly grew to 85. Another 200 offices are waiting to be connected, says Rich Zanatta, director of facilities for the House.
Congressional offices are willing to give up control of their hardware and software because in return they receive lower prices. "There is an economic incentive for members to go to our virtual server architecture," says Jack Nichols, director of enterprise operations at the House. "Not only is it green, but it offers a better security posture for them and it saves money in their individual accounts. They're no longer buying a server infrastructure; they're buying a service."
This is the second major virtualization project at the House. In 2007, it consolidated 200 test and development servers to 20, using VMware's virtualization platform. The House was running out of floor space in its data center and had maxed out its power usage. About four years ago, the House's data center drew about 490,000 watts of electricity; today it draws about 125,000 watts.
Another benefit of virtualization was providing more uptime for applications that weren't deemed mission critical. "With virtualization, you can bring those applications that didn't rise to the business need for high availability to near high availability," Nichols says. In the case of a system board failure, "you can shorten that period of downtime from hours to minutes."
Despite the savings associated with virtualization, House administrators warn that there are drawbacks such as increased complexity. That's why they preach standardization. "We have fewer physical servers, but we have just as many, if not more, virtual servers. The complexity level has risen dramatically," Zanatta says. "We have stringent methodologies about what can be virtualized and when it can be virtualized and what hardware it can be virtualized on."
Next up for the House is a push toward virtualized desktops, which would improve information security at congressional offices and provide them with enhanced mobility. By virtualizing the desktop, administrators can more easily and quickly patch security holes. "Our customer base is somewhat transient in nature. They have an office in Washington, D.C., and an office in their district in every corner of the country," Nichols says. "If we can enhance their experience so they can access their own unique workstation whether they are in D.C. or a district office, that's a better user experience."
Nichols and Zanatta anticipate that some offices will worry about having their desktops stored in the House's data center. But they predict members eventually will get used to the idea because of better support. With virtualized desktops, "it's going to be far easier for us to deploy new applications . . . and to be able to deploy them more rapidly," Zanatta says. "We won't have to touch the close to 15,000 endpoints within our architecture."
In the long term, the House favors cloud computing, which will allow members to focus on legislative work as opposed to IT, Nichols says. "We're in the business of providing enterprise services," he says. "We want to be able to deliver them expeditiously and cost-effectively in a sustainable fashion using as little power and resources as we can. I think the concept of an internal cloud is the way to go."
Carolyn Duffy Marsan is a high-tech business reporter based in Indianapolis who has covered the federal IT market since 1987.