Extreme makeover: Cutting data centers down to size
Under new pressure to curb data center spending, agencies must decide whether it's better to remodel old data centers or build new ones to keep up with growing IT demands.
Knowledge is power, as everyone in government knows. But to get and maintain power, you first need a place to store all that knowledge and then deliver it when and where it’s needed.
In government, just as in private industry today, that’s the job of the data center, a facility that houses computer systems and associated components such as telecommunications and storage systems. Data centers include everything from backup power supplies and redundant data communications connections to environmental controls, such as air conditioning and fire suppression, and — not least — high-end security devices. All of that equipment costs money and gobbles up precious energy resources.
So it should be no surprise that federal information technology managers are acting like audition candidates for “Extreme Makeover: Home Edition,” specifically that yet-to-be-aired episode on turning a 1990s McMansion into a 21st-century model of energy efficiency and cost savings. Agency leaders are under mounting pressure to reverse a decade-long building spree that has seen the number of federal data centers grow from 432 to 1,100.
“We’ve been building data center after data center, acquiring application after application and, frankly, what that’s done is it’s driven the cost and investments in technology across the board,” said federal CIO Vivek Kundra in a press conference last fall. “We cannot continue on that trajectory.”
The trend of unbridled data center growth began during the dot-com bubble, when agencies realized they needed fast Internet connectivity and nonstop operations to deploy systems and maintain a presence on the Internet. But a data center also uses 10 to 100 times more energy than a typical office space of the same size, according to the General Services Administration. Roughly half of a data center’s energy goes to the IT equipment itself; the other half goes to the building infrastructure, primarily to remove the heat the IT equipment produces.
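That 50/50 split maps onto the data center industry’s standard efficiency yardstick, power usage effectiveness (PUE): total facility energy divided by the energy consumed by IT equipment alone. Here is a minimal sketch of the arithmetic, using illustrative load figures rather than GSA data:

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# GSA's 50/50 split implies a PUE of roughly 2.0. The load figures below are
# illustrative assumptions, not measured values from any agency.

it_load_kwh = 1_000_000    # assumed annual energy used by servers, storage, network gear
overhead_kwh = 1_000_000   # assumed energy for cooling, power distribution, lighting

pue = (it_load_kwh + overhead_kwh) / it_load_kwh
print(f"PUE: {pue:.1f}")   # 2.0: every watt of computing drags a watt of overhead with it
```

By that measure, a typical facility of the era spends as much energy on overhead as on computing, which is why cooling and power infrastructure figure so prominently in the retrofit-or-rebuild debate.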
Governmentwide, federal servers now cost about $500 million annually to operate, according to the Environmental Protection Agency. If no action is taken, the government’s data center power requirements will double by 2011. But if all agencies implemented what EPA considers state-of-the-art practices, the government could bring overall data center power usage back down to 2001 levels by 2011, a net swing of 90 billion kilowatt-hours.
That’s the objective outlined in the Obama administration’s fiscal 2011 budget request, which makes reining in data center proliferation part of its overall deficit-cutting strategies.
Kundra signaled the move last fall when he publicly fretted over the cost and consequences of fragmented data caused by having too many data centers. That infrastructure serves more than 300 million customers with more than 10,000 systems. “And a lot of these investments are duplicative,” Kundra said. As examples, he pointed to GSA, which operates eight data centers, and the Homeland Security Department, which has 23 data centers.
But data center numbers have grown for a reason. Agency chief information officers are continually pressured to supply more computing power and data storage resources as the government puts more of its operations and services online. For example, the Defense Information Systems Agency estimated that its IT processing demands across its Defense Enterprise Computing Centers have increased sixfold since 2002.
The increased computing power has obvious benefits to the government and its citizens. At the same time, however, it can serve to increase the government’s vulnerability. “The more data centers you have, the more opportunities there are for points of failure,” said Norm Lorentz, former chief technology officer at the Office of Management and Budget and now a director at Grant Thornton’s global public-sector practice.
Building new data centers to keep up is also a costly strategy. In just two recent examples, the National Security Agency is spending $1.5 billion to build a new data center in Utah to support expanding cybersecurity operations, and GSA plans to spend $65 million to $80 million to build a new data center at the Denver Federal Center.
Kundra said he believes government can ultimately curb its need to own so many data centers by outsourcing more IT work to commercial cloud service providers. That is beginning to happen for select applications, but security and performance concerns are holding back wholesale shifts to the model for now.
It all leaves agency executives asking the ultimate makeover question: When is it better to squeeze more out of what you’ve got, and when should you, like a frustrated homeowner who finally faces reality, make the case for brand-new digs?
There are no easy answers, but veterans of the process say the decision goes way beyond the IT department.
Denser Centers
The Obama administration’s directives have come at an opportune time, at least from a technical perspective, because data center remodeling is more feasible than ever. Server virtualization can readily consolidate physical servers at ratios of 5-to-1 or better. And advances in multicore processors and modular blade servers relieve the constant pressure on CIOs to expand their real estate.
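To see what a 5-to-1 consolidation ratio means in practice, consider a minimal sketch of the arithmetic. Every number here is a hypothetical assumption for illustration; none comes from the agencies in this story:

```python
# Hypothetical 5-to-1 server consolidation through virtualization.
# All inputs are illustrative assumptions, chosen only to show the arithmetic.

physical_servers = 1_000    # assumed size of an existing, lightly utilized fleet
consolidation_ratio = 5     # five virtual machines per physical host
watts_per_legacy_box = 300  # assumed draw of a mostly idle legacy server
watts_per_busy_host = 450   # a host runs hotter once it carries five VMs

hosts_after = physical_servers // consolidation_ratio
power_before_kw = physical_servers * watts_per_legacy_box / 1_000
power_after_kw = hosts_after * watts_per_busy_host / 1_000

print(f"Servers: {physical_servers} -> {hosts_after}")                  # 1,000 -> 200
print(f"Power:   {power_before_kw:.0f} kW -> {power_after_kw:.0f} kW")  # 300 -> 90
```

Under those assumptions, the fleet shrinks by 80 percent and its direct power draw by roughly 70 percent, before counting the cooling overhead that disappears along with it.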
“I talk to a lot of federal agencies that first want to build some huge new facility, but once they realize they can get a lot more capacity out of their existing floor space, they completely scrap the project and decide to retrofit,” said David Cappuccio, chief of research for infrastructure teams at technology consulting firm Gartner.
If an agency decides to build a new facility, officials often scale back the square footage they thought they needed to achieve their computing requirements, Cappuccio said.
DISA is a case in point. After taking over responsibility for 192 Defense Department data centers in the early 1990s, DISA launched a series of streamlining projects that cut its facility count to 14 centers. The belt-tightening is paying off: DISA officials estimate they have seen less than 2 percent growth in IT costs since 2004.
“We have a three-year-out plan on how we make structural, long-term investments in our data centers,” said Henry Sienkiewicz, technical program director of computing services at DISA. The plan addresses everything from complete retrofits of power and cooling equipment to the energy needs of facilities.
“By smartly planning for growth, we can reuse our existing facilities,” Sienkiewicz said.
Advantages of Starting Fresh
But in some cases, remodeling can be penny-wise and kilowatt-foolish. An old energy-guzzling data center can’t always be retrofitted to take advantage of the latest power-saving technologies and building design techniques. When CIOs factor in those handicaps, a new facility might quickly become attractive.
For the first time in more than 15 years, the National Renewable Energy Laboratory (NREL) is expanding its data center real estate by constructing two new facilities. The lab acts as a clearinghouse for renewable technology research and works with other agencies and private industry on clean-technology commercialization projects.
Renewed interest in green technologies has driven a 40 percent staff expansion in the past year, increasing demand for physical space, processing capability and related IT services, said Chuck Powers, manager of NREL’s IT Infrastructure and Operations Group.
Lab officials decided to build instead of remodel for many reasons. The existing center is small, 30 years old and becoming obsolete. The space was originally intended to house mainframe computers, not racks of modern blade servers. But the biggest draw for NREL was to practice what it preaches by creating two new data centers in facilities that exceed the requirements for the Leadership in Energy and Environmental Design Platinum rating, the highest ranking by the U.S. Green Building Council.
Part of a 220,000-square-foot office building, the 3,000-square-foot primary data center will host business applications, network servers, storage and e-mail systems.
“We want to show that you can implement green IT in a building of this size,” Powers said.
The 10,000-square-foot secondary data center will expand NREL’s high-performance computing capabilities for research and development and will be housed in a new 130,000-square-foot building.
With new construction, the lab can incorporate building design techniques that increase efficiency. For example, the lab will cool the primary data center with outside air funneled through a series of tunnels in the foundation’s concrete. Concrete typically cools at night and stays relatively cool through the day, and officials believe that will keep enough cool air flowing into the data center to eliminate the need for traditional air conditioning for all but a few hours of the year.
“This wouldn’t have been an option in our existing data center,” Powers said.
State Pressures
The Washington State Department of Information Services pushed for a new flagship data center after first considering a makeover of its existing facility, which is housed in the basement of a 1970s-era building. But after estimates put the renovation at $110 million, half of it needed just to meet the latest earthquake-resistance standards, the state opted to spend $255 million on a new 394,000-square-foot data center and office building slated to open late next year.
“The numbers just didn’t pencil out” to stay in the current facility, said Jim Albert, the department’s deputy director of operations.
In addition to tremors, power was a major consideration. The existing center can handle only about 40 watts of power per square foot, which is quickly becoming impractical for blade server racks that can draw as much as 35 kilowatts apiece. And server consolidation is definitely in the state’s future: Officials expect to run 5,000 virtual and physical servers in the new facility, which is designed for 200 watts per square foot.
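The gap between those two numbers is easier to see with a little arithmetic: at the old density, one fully loaded rack would soak up the power budget of a small house’s worth of floor space. A quick sketch using the figures above:

```python
# Floor space whose power budget a single high-density blade rack consumes.
# The rack load (35 kW) and densities (40 and 200 W/sq ft) come from the story;
# the comparison itself is simple arithmetic.

rack_watts = 35_000   # one fully loaded blade rack
old_density = 40      # W/sq ft, existing basement facility
new_density = 200     # W/sq ft, design target for the new facility

print(f"Old facility: {rack_watts / old_density:,.0f} sq ft of power budget per rack")  # 875
print(f"New facility: {rack_watts / new_density:,.0f} sq ft of power budget per rack")  # 175
```

In other words, no rearrangement of equipment could make the old floor feed modern racks; the watts per square foot, not the raw square footage, forced the state’s hand.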
To cool all that capacity, officials will use a new ambient air-cooling system. It will chill hot server racks with air drawn from outside the building, which means the data center could avoid running its air conditioners except for a handful of the hottest days of the year. If everything works as planned, the savings could top $1 million a year, said Tony Tortorice, the state’s CIO. Ambient air cooling wasn’t a retrofit option for the existing basement-level data center.
But new data centers aren’t any more popular at the state level than they are at the federal level. State lawmakers have blasted Washington’s project and said they would prefer to see the money spent on upgrading out-of-date IT resources to improve citizen services rather than on a new physical facility. Critics have also argued that outsourcing or cloud computing would be better options.
But Tortorice said IT issues aren’t the only concern, and he’s not ready to rely on cloud computing at the expense of a state-run data center.
“We haven’t been able to find public cloud providers that can meet our security requirements just yet,” Tortorice said.
What's Next
Federal CIOs will be sorting out data center issues such as these for years to come. OMB’s first step will be to solicit agency input for governmentwide data center policies, which aren’t expected until the end of this fiscal year, an OMB spokesperson said.
At that point, CIOs might feel like they’re at the helm of a battleship that has just been ordered to change course. Forward movement might start in 2011, but given the size and complexity of federal data center operations, a significant reversal of the proliferation trend will take years.
“Wait until the end of the first four years of the administration to really measure results,” said Olga Grkavac, executive vice president of the public sector at TechAmerica, a technology industry association.
Much of the time will be spent convincing agency staffers to loosen their hold on servers and storage resources dedicated to their individual programs. It could be a hard sell.
Systems in some large federal agencies actually carry labels that mark their department or program affiliation, said Jay Lambke, executive vice president of Government Acquisitions, a systems integrator that specializes in federal data center projects. “The user community is very invested in ‘my stuff,’” he said.
OMB will need to marshal its managerial will and budgetary pressure to overcome those attitudes. “The biggest stick of all is budget,” Lambke said. “If I force someone’s budget to a place where they have to think about different options, people get creative and come up with good solutions.”
Another key will be OMB policies that promote IT modernization efforts aimed at the core management and security problems that made data center proliferation a problem in the first place.
“A plan that consolidates and virtualizes where appropriate is a full solution,” Lambke said. “Just ending proliferation in and of itself doesn’t get it done.”