High-end servers crack the glass house
High-End Servers
Status: High-end Unix servers and, to a lesser extent, Microsoft Corp. Windows NT servers are increasingly being used in roles traditionally filled by mainframe computers. While low price is a benefit, what makes high-end servers appropriate for these jobs are the mainframe-like features they incorporate.
Issues: While Unix-based servers have proven that they can scale up to meet a data center's processing needs, the technology that will enable Windows NT-based servers to do the same is still in the early adopter stage. Concerns about reliability also keep some high-end servers from gaining wider use in mission-critical applications.
Outlook: Good, but the switch from mainframes to high-end server alternatives will not happen overnight.
The federal government data center, once the exclusive domain of the mainframe, increasingly is inhabited by high-end Unix-based servers and Microsoft Corp. Windows NT-based servers. Sporting budget-friendly price tags, ever-increasing horsepower and mainframe-inspired designs, these small-package powerhouses are winning over federal chief information officers.
Although CIOs and industry observers concede that high-end servers don't always match the near-perfect performance guarantees that mainframes can make, agencies are adopting them nonetheless, carefully selecting the applications where the cost and performance benefits outweigh the risk of system downtime.
"The applications are driving the platforms rather than vice versa," said Dennis Gaughan, senior research analyst at AMR Research Inc. "The platform discussion is becoming less of an issue."
Some of the most popular uses for high-end servers include consolidating distributed departmental servers onto one easier-to-manage platform and supporting World Wide Web-based versions of legacy applications. While the low prices are a big incentive, what really makes high-end servers acceptable for many mission-critical applications is the way they are co-opting many of the attributes long associated with the glass-encased world of mainframes.
Unisys Corp., for example, this month began shipping beta versions of the ES 7000, an eight-way Windows NT server that can scale up to 32 Intel-based processors. It can be configured to have no single point of failure, which is a design quality typically available only in mainframe systems, said Kevin McHugh, vice president and general manager of Unisys' ClearPath initiative.
"You have redundancy throughout the configuration," McHugh said. "The physical cabinet...it's actually split in half. You have separate power, separate air, separate memory. If a component fails...the other side can get the job done."
McHugh said that federal IT shops that want to consolidate their existing server-based applications are among the target markets for the ES 7000. Some agencies are choosing high-end servers as the platforms for new applications. For example, the Coast Guard chose Windows NT as the foundation for a mission-critical computer network that is designed to feed real-time information from search-and-rescue missions back to headquarters.
Data General, now owned by EMC Corp., also is aggressively positioning its high-end servers as mainframe alternatives. David Flawn, Data General's vice president of Windows NT marketing, noted that building in the resiliency and reliability associated with mainframes is key to competing in this market.
"Mainframes are very large systems supporting very large populations of users and are hardened to perform in almost fault-tolerant fashion," Flawn said. "The way you achieve production-oriented computing environments is by building in redundancies. The whole idea here is to eliminate single points of failure."
For example, Data General offers clustering technology that links two servers so that if one fails, the other assumes the IP address of the downed server.
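To make the failover idea concrete, here is a minimal sketch of how a standby node might watch a primary server and claim its service address when the primary stops responding. It is an illustrative assumption, not Data General's clustering product: the addresses, port, interface name and the Linux-style "ip addr add" takeover command are hypothetical placeholders.

```python
# Minimal sketch of IP-address failover between two clustered servers.
# Illustrative only; real cluster software also fences the failed node
# and restarts the protected services on the surviving machine.

import socket
import subprocess
import time

PRIMARY_HOST = "10.0.0.10"      # address the primary answers on (assumed)
SERVICE_PORT = 80               # service being protected (assumed)
VIRTUAL_IP = "10.0.0.100/24"    # shared address clients connect to (assumed)
STANDBY_NIC = "eth0"            # interface the standby claims it on (assumed)
CHECK_INTERVAL = 5              # seconds between health checks
MAX_FAILURES = 3                # consecutive misses before failover

def primary_is_alive() -> bool:
    """Attempt a TCP connection to the primary's service port."""
    try:
        with socket.create_connection((PRIMARY_HOST, SERVICE_PORT), timeout=2):
            return True
    except OSError:
        return False

def take_over_address() -> None:
    """Claim the shared service address on the standby node (placeholder command)."""
    subprocess.run(
        ["ip", "addr", "add", VIRTUAL_IP, "dev", STANDBY_NIC],
        check=True,
    )

def monitor() -> None:
    """Poll the primary; take over its address after repeated failures."""
    failures = 0
    while True:
        if primary_is_alive():
            failures = 0
        else:
            failures += 1
            if failures >= MAX_FAILURES:
                take_over_address()
                break
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    monitor()
```

Because clients connect to the shared address rather than to either physical server, the takeover is largely transparent to them, which is what lets clustering approximate the availability expected of a mainframe.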
Flawn said Data General also offers support agreements that guarantee availability of the operating system seven days a week, 24 hours a day.
Although Flawn noted that there still are applications well-suited for the traditional mainframe environment, he said Windows NT provides a development environment that is attractive to data centers.
"The stellar advantage of NT is that the applications are so rich," he said. "The development environment is just awesome. When you start running [enterprise resource planning] applications on a very large computer or a combination of computers on a rack, that becomes a mainframe alternative."
Ousting the Mainframe
The U.S. Special Operations Command has eliminated "anything resembling a mainframe" to consolidate on Data General's high-end Windows NT server offerings, said Col. Ted Hengst, the command's automated information systems division chief. For e-mail and all other applications that run on Windows NT, the command is using high-end servers to support the enterprise.
"Instead of having multiple platforms from multiple vendors, we could go with a single solution, which made integration a lot easier," Hengst said.
He also noted that using the high-end servers makes it less expensive to keep up with technology changes. Instead of a "massive bulldozing" effort associated with a mainframe purchase, the command can afford to replace one-third of its 33-server fleet every year, he said.
"Whenever you invest in a large mainframe, you can't turn the technology as fast," he said.
Gary Newgaard, vice president of Compaq Corp.'s federal region, said his company has seen "tremendous growth" in the federal government market for high-end servers. Compaq began shipping its top-of-the-line Windows NT ProLiant server in August.
And the company's 400 AlphaServer, which runs either Unix or Linux, is being used by federal agencies, he said, including the Department of Veterans Affairs and the Federal Emergency Management Agency.
"The government continues to look for cost-effective ways to deploy commercially available products," Newgaard said. "They're looking for easy ways to buy them, install them and maintain them.
Even IBM Corp., which has dominated the traditional mainframe market, is offering high-end server solutions that target the data center. In September, IBM announced its System/390 Multiprise 3000 (MP 3000) servers, which are geared toward a new generation of mainframe customers who want to start small and work their way up.
"Disk can be included in the server frame, and the software can be pre-loaded," said Wendy Culberson, server brand manager for IBM's federal sales team. "This feature helps customers get started quickly with minimal system programmer installation."
Culberson emphasized that traditional mainframes still are the best systems to handle large-scale, mission-critical applications.
"Alternative mainframes have a different heritage, and so integrating these traditional [mainframe] strengths will take time," she said.
Product Limitations
Not all agencies are ready to shut down their traditional mainframes. John Garing, the Defense Information Systems Agency's commander of the Western Hemisphere, noted that while DISA has purchased high-end server products, it has not replaced a mainframe in a data center with one of the new server solutions.
One of the problems DISA faces is how a data center should allocate the cost of its processing services to customers when those services are delivered by an array of servers rather than by a single mainframe.
"[With the new servers] it's not as easy because the discipline is not there on the business side that has been developed over years and years with the mainframe," Garing said.
Mainframe alternatives have other drawbacks as well, according to industry observers. Mike Kahn, chairman of The Clipper Group Inc., said that CIOs must understand the differences among the competing platforms and accept the trade-offs.
"There are many enterprises today that are running mission-critical computing transaction processing...on a Unix system and are expecting...the reliability and availability they had hoped for in a traditional mainframe environment," he said. "They're still not mainframes. The customers are willing, because of perceived cost savings, to go with Unix servers instead of going with a [mainframe]."
Kahn pointed to the severe bottom-line impact system outages have had on some online retailers as a reminder that mission-critical applications must be operated in a high-availability environment.
In addition, Kahn said that Windows NT does not yet scale well enough to be suited for large enterprise applications. While NT-based servers eventually will have the horsepower needed for such applications, for now they are better suited to point solutions, he added.
AMR's Gaughan agreed that Unix has the edge over Windows NT in the mainframe-alternative arena. He said that while the high-end Unix servers all deliver the processing power and reliability promised by their vendors, the high-end Windows NT servers are still unproven.
"NT has always struggled, and I don't think it can really play in this market," he said. "If you're talking about bulletproof, mission-critical stuff...NT has had trouble gaining mindshare there."
When seeking a mainframe alternative, users must define what levels of failure are acceptable, Gaughan said.
"It's a question of gauging what the acceptable levels [of risk] are and matching those platforms to your needs," he said.
DISA's Garing added that the increasing value of data is shifting the data center's focus from that of a processing-centric environment to that of a storage-centric environment.
"It is pretty easy to use any kind of processor...if you manage the storage properly," he said. "Information is what's important here. The means of processing it is less important than the information itself."
-- Harreld is a free-lance writer based in Cary, N.C.