High-performance horse race
PC alternatives gain on Unix workstations
Although its roots and main focus are in the Unix-based high-performance computing market, NASA's latest iteration of the Scientific and Engineering Workstation Procurement contract (SEWP III) includes many less expensive PC products that run Microsoft Corp.'s Windows NT/2000 or Linux operating systems on Intel Corp. processors. As a reflection of the general market, SEWP III illustrates how the PC platform, thanks to steady improvements in performance and reliability, has made inroads into many markets once dominated by Unix. But it's a different story when it comes to the traditional high-performance products on SEWP III used for some of the government's biggest computational jobs, such as modeling global climate trends or nuclear weapons blasts.
For now, anyway, Unix is still the platform of choice for such heavy-duty tasks. In fact, NASA required that the main systems on SEWP III use Unix to ensure compatibility with the large installed base of Unix systems at the agency, according to Joanne Woytek, SEWP program manager at NASA.
However, PC products strung together in clusters are beginning to offer some interesting options for building inexpensive yet powerful systems. And as Intel's new 64-bit family of processors, led by the Itanium chip, begins to catch up with the 64-bit chips that Unix vendors have offered for several years, the differences between the two platforms are expected to narrow.
In the low end of the scientific and engineering market, the computers are referred to as workstations, whether they run Unix, Linux (a freely distributed Unix variant) or Windows NT. Workstations are distinguished from general desktop computers by their large amounts of memory, advanced graphics capabilities and fast processors.
In the past few years, Windows NT- and Linux-based workstations have had considerable success and have surpassed Unix workstations in terms of units shipped and total dollar value, according to Kara Yokley, senior analyst for workstations at IDC.
A key selling point of Windows NT and Linux workstations has been price. Unix workstations can cost up to 50 percent more than comparable Windows NT systems, even after Unix vendors dropped their prices to narrow the gap.
Besides the better price, the steady improvement of Windows NT has been another important factor in the growth of PCs in the workstation market, according to Barry Crume, the product line business manager for Itanium systems at Hewlett-Packard Co. "It was really the element we needed to make a high-performance computer out of commodity-type machines," he said.
HP sells both Windows NT and Unix workstations. Crume said HP's Windows NT-based systems outsell its Unix systems nearly three to one. In the past, NASA did not track transactions through SEWP at a level that indicated the type or volume of specific operating systems or platforms bought, according to Woytek. The agency plans to provide more of that type of data in the future.
The Unix vs. Windows NT picture begins to change dramatically as you move up the computing food chain. In the high-performance server market, Unix enjoys a sizable market advantage over Windows NT systems and, by most accounts, a significant technical edge as well.
Nowhere is that technical gap more evident than in processing jobs that involve enormous datasets. When large numbers of Unix systems are strung together in clusters to tackle those jobs, they run one instance of the operating system and share a giant memory pool to process a single set of data. Clusters of Windows NT-based systems, on the other hand, do not share memory. Instead, they divide the job into small pieces, work in parallel, then reassemble the results at the end of the task.
"If you've got a problem that breaks down and doesn't require lots of communication between processors, then you can use lots of less expensive computers in a cluster," said Lang Craighill, director for advanced programs for SGI Federal Inc., which has focused on Unix systems, but also sells Windows NT and Linux systems. "But for jobs with large datasets, like climate modeling or 3-D explosions, [PC] clusters have problems. The message passing required is much more complicated and performance can be very low."
But that challenge isn't stopping some government scientists from developing PC-based clusters, many of which use the Linux operating system. For example, the Energy Department's Accelerated Strategic Computing Initiative is testing the feasibility of using PC clusters to model and evaluate the reliability of the country's nuclear stockpile. For comparison purposes, it's developing a shared-memory Unix system from SGI to do the same work.
In a joint effort among the Sandia, Los Alamos and Lawrence Livermore national laboratories, scientists have created a cluster of 16 Linux-based PCs. Meanwhile, scientists at Los Alamos National Laboratory are using an eight-processor SGI Onyx2 Unix system to run similar modeling software.
Although the Linux cluster will involve more custom programming to operate, the difference in hardware acquisition costs is significant. The Linux cluster costs $35,000 compared with the $400,000 SGI Unix system, according to Jeff Jortner, a principal member of the technical staff at Sandia National Laboratories.
The national laboratories' project is ongoing, so final assessments are not available. But it's clear from this project and others in the federal government that alternatives to Unix for high-performance computing are being taken much more seriously.