Does counting data centers miss the point?
The number of closures is an easy measure in the ongoing data center consolidation effort, but it's hardly the most important one.
Closing data centers is a means to an end, not the end itself, says U.S. CIO Steven VanRoekel. (FCW photo)
One of the key measurements of the Federal Data Center Consolidation Initiative is how many data centers have been closed. But treating that number as an end in itself is a mistake, said U.S. CIO Steven VanRoekel.
The closures are more a means to an end, a necessary first step toward optimizing the federal government’s IT systems, he said -- an argument echoed by others in both industry and government.
VanRoekel was one of several federal officials who discussed data center consolidation at a Jan. 22 hearing before the House Committee on Oversight and Government Reform.
At the hearing, Rep. Carolyn Maloney (D-N.Y.) suggested that the data centers were put in place originally for good reasons, and asked how the government decided which ones to close -- or even whether they should be closed at all.
The question gave VanRoekel the chance to explain that closing a data center means moving its information elsewhere, not losing it. The overall effect is to retain the data while lowering the cost of storing and using it.
"In essence, we are going to optimize and close data centers by shifting the resources of one to other ones – to more efficient data centers – not taking away certain services, not deprecating any service we provide," VanRoekel said. "And if anything, while we make that shift, we modernize those systems and provide an even better service. It’s a really nice opportunity to build efficiency and effectiveness at a much lower cost."
Agencies are attempting to realize long-term savings in different ways, with each creating its own plan of action, according to David Powner, director of IT management issues at the Government Accountability Office.
In theory, when an agency closes a data center, it realizes savings from reduced energy consumption, eliminated facility costs and reduced labor. But a big chunk of the potential savings depends on how the agency optimizes its existing IT infrastructure.
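The arithmetic behind that caveat is easy to sketch. The toy model below (all figures are hypothetical, chosen only for illustration, not drawn from any agency's reported numbers) compares a closure where workloads are simply lifted and shifted with one where they are also modernized onto cheaper shared infrastructure:

```python
# Hypothetical back-of-the-envelope model of data center closure savings.
# Every dollar figure here is an illustrative assumption.

def annual_savings(facility_cost, energy_cost, labor_cost,
                   migrated_run_cost, optimization_factor=1.0):
    """Estimate yearly savings from closing one data center.

    facility_cost, energy_cost, labor_cost: yearly costs eliminated
        outright by the closure.
    migrated_run_cost: yearly cost of running the moved workloads at the
        receiving site if they are lifted and shifted unchanged.
    optimization_factor: fraction of migrated_run_cost still paid after
        modernization (1.0 = no optimization; 0.4 = 60% cheaper to run).
    """
    eliminated = facility_cost + energy_cost + labor_cost
    return eliminated - migrated_run_cost * optimization_factor

# Lift-and-shift: the old, inefficient applications move as-is.
print(annual_savings(2.0e6, 1.5e6, 1.0e6, migrated_run_cost=3.0e6))
# -> 1,500,000

# Same closure, but the workloads are modernized in transit.
print(annual_savings(2.0e6, 1.5e6, 1.0e6, migrated_run_cost=3.0e6,
                     optimization_factor=0.4))
# -> 3,300,000
```

Under these made-up numbers, the closure itself saves money either way, but more than half of the potential savings comes from what happens to the workloads after they move.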
Getting rid of a few obsolete servers that each run a single application can certainly save money, but an agency that simply moves old, inefficient applications and systems to another facility is losing big savings in the long run, said Rob Stein, NetApp's public sector vice president. The key is to take the opportunity to make real changes in strategy.
"I think the cost benefits are huge if agencies go to a shared environment where multiple tenant applications use the same infrastructure, server, storage, networking and software in a secure manner," he said. "That way, they are reducing their real estate costs by closing centers and reducing the hardware and software infrastructure they need, too."
Mark Forman, former administrator for e-government and IT at the Office of Management and Budget, has expressed doubts about agencies’ willingness to "consolidate away the complexity" of client/server applications.
But Stein said some agencies are doing just that and saving large amounts of money in the process, considering that data storage and management is one of the biggest expenses in the federal government's $80 billion IT budget.
The National Institutes of Health, the Federal Aviation Administration, the Transportation Department's Enterprise Services Center and the State Department have cut data storage costs as much as 80 percent using NetApp’s software, Stein said.
EMC, IBM, HP and Oracle provide agencies with similar products.
Added benefits of the shared environment include non-disruptive upgrades, increased scalability and easy integration with numerous software and hardware platforms. As Stein put it, technology is out there that allows "agencies to do a lot more with a much smaller chunk of real estate."