Opening the 64-bit data pipeline

In most aspects of computing, bigger, faster and more is better: Faster clock and network speeds, more RAM and bigger hard disk drives are all benchmarks by which we measure the pace of computing progress.

The same thinking, on the face of it, could be applied to 64-bit computing. But the reality is much hazier.

The term 64-bit refers to the size of "words" or chunks of data that a computer processes at one time. Most computers today are 32-bit, so common sense says 64-bit computing must be twice as good.

Not necessarily. When a microprocessor and its associated chipset are 64-bit instead of 32-bit, data is processed and moved between the processor and memory in 64-bit chunks instead of 32-bit chunks. Moving more chunks makes a machine process information faster—but not necessarily twice as fast, because many other variables affect system performance. To fully exploit the design feature, the computer also must run 64-bit software, or it will process data in 32-bit pieces even if it has a 64-bit design.
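
To make the word-size idea concrete, here is a minimal C sketch (an illustration, not something from the article): it performs one 64-bit addition natively, as a 64-bit processor does in a single operation, and again the way a 32-bit processor must do it, as two 32-bit operations linked by a carry.

#include <stdint.h>
#include <stdio.h>

/* Native 64-bit addition: one operation on a 64-bit processor. */
static uint64_t add_native(uint64_t a, uint64_t b)
{
    return a + b;
}

/* The same addition as a 32-bit processor must perform it: add the
 * low halves, detect the carry, then add the high halves. */
static uint64_t add_split(uint64_t a, uint64_t b)
{
    uint32_t a_lo = (uint32_t)a, a_hi = (uint32_t)(a >> 32);
    uint32_t b_lo = (uint32_t)b, b_hi = (uint32_t)(b >> 32);

    uint32_t lo = a_lo + b_lo;
    uint32_t carry = (lo < a_lo);      /* wrapped around, so carry one */
    uint32_t hi = a_hi + b_hi + carry;

    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t x = 0x00000001FFFFFFFFULL, y = 1;
    printf("native: %llx\n", (unsigned long long)add_native(x, y));
    printf("split:  %llx\n", (unsigned long long)add_split(x, y));
    return 0;
}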

"Not all applications today take advantage of 64-bit hardware," said Pia Rieppo, principal analyst for workstations at Dataquest, a Gartner Group marketing firm. Today, most of the applications that are 64-bit are electronic design and simulation applications, she said. Later on, it will be on the digital content creation side."

Applications that especially benefit from 64-bit design are digital signal processing, image processing, digital content creation and very high-precision calculation, according to International Data Corp. Wolfram Research Inc., the company that makes the Mathematica application for mathematical calculations, found that 64-bit systems cut the number of operations for complex calculations in half and improved performance nearly threefold compared with 32-bit systems.

Numerous federal agencies involved in defense, nuclear and seismic research need 64-bit processing to test weapons, theories or earthquake activity. The National Weather Service also needs 64-bit systems to conduct complicated calculations for weather pattern models. "There are very large numbers of calculations of numbers that are very close to one," said Stephen Lord, acting director of NWS' Environmental Modeling Center. The center still is learning how to best exploit its 64-bit IBM Corp. SP2 computer, he said.

NWS has learned that some parts of the computer's software benefit more than others from being written in 64-bit code. "It is an experimental thing right now," Lord said. "We will work through the code to find where the sensitivities are."

Another significant aspect of 64-bit processing is the ability of the processor to track information in memory. The processor's number of bits determines how much information a computer can keep track of in RAM. Users want to hold data in RAM because data there is retrieved at a rate that is 10,000 times faster than data from disk drives, according to an abstract on 64-bit computing from Hewlett-Packard Co.

The amount of memory a processor can track—or address, in technical terms—is determined by the number two (because computers are binary devices) raised to the 32nd power, if the computer is 32-bit. That equals 4G of addressable memory, which is more than enough for common computing tasks but increasingly is insufficient for large, technical modeling chores.

Also, Internet database servers can work faster if they can store their entire database in RAM, which boosts the speed of World Wide Web service.

The solution for memory-hungry users is 64-bit computers, which can address 2 to the 64th power—18 exabytes. This limit is theoretical, because, at today's prices, 18 exabytes of RAM would cost $400 billion, according to HP.
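
The arithmetic behind those limits is easy to verify. The short C program below (illustrative only) computes both address-space ceilings; note it builds 2 to the 64th without overflowing a 64-bit integer.

#include <stdio.h>

int main(void)
{
    /* A 32-bit address reaches 2^32 bytes; a 64-bit address, 2^64. */
    double limit32 = (double)(1ULL << 32);        /* 4,294,967,296 bytes */
    double limit64 = 2.0 * (double)(1ULL << 63);  /* 2^64, without overflow */

    printf("32-bit limit: %.0f bytes = %.0fG\n",
           limit32, limit32 / (1024.0 * 1024.0 * 1024.0));
    printf("64-bit limit: %.1e bytes, about 18 exabytes\n", limit64);
    return 0;
}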

To visualize the difference in addressable memory between different word sizes, HP describes it this way: If the memory accessible by an 8-bit machine, such as an early IBM PC, is represented by an area the size of a business card, then the memory limit of 16-bit systems such as a 286 PC would be the size of a desktop. The limit for 32-bit systems, in comparison, would be the size of a city block. For 64-bit systems, the area would be the size of the entire surface of the earth.

No one needs that amount of memory today, but the 32-bit system's 4G limit is too confining for many complex jobs, such as those performed by the Army Research Lab, which operates Silicon Graphics Inc., Sun Microsystems Inc. and Cray Research Inc. machines loaded with as much as 128G of RAM. The lab operates 64-bit systems to support Defense Department research and development, testing and evaluation. The work involves computational fluid dynamics, solid dynamics and computational chemistry.

"The advantage 64-bit computing provides these applications is that we can run larger problem sizes," said Thomas Kendall, chief engineer of the ARL's major shared resource center. "Thirty-two bit systems are limited to relatively small problem sizes." Kendall said some of the lab's customers use all of its 128G of memory.

Energy Department labs use 64-bit systems to design components for weapons systems, said one lab engineer, who did not want to be identified. "We design parts we need for experiments," he said. "Sometimes we design systems, and some of them have been complex, but usually we just design parts of a system."

Another benefit of 64-bit designs is that they can address larger files stored on hard disk drives. Like the limit on addressable memory, hard-disk files are limited to 4G on 32-bit systems. For larger jobs, 32-bit systems can access several file segments that total more than 4G. But having a single, contiguous file increases disk performance and simplifies management.
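
On POSIX systems, for example, large-file support widens the file offset to 64 bits so that even a 32-bit program can reach past the 4G mark. The C sketch below assumes a glibc-style _FILE_OFFSET_BITS macro and a hypothetical file named big.dat:

/* Ask for a 64-bit off_t even when compiling on a 32-bit system;
 * this macro must appear before any system header is included. */
#define _FILE_OFFSET_BITS 64

#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *f = fopen("big.dat", "rb");   /* hypothetical large file */
    if (f == NULL)
        return 1;

    /* Seek past 4G, which a 32-bit file offset cannot represent. */
    off_t five_gigs = (off_t)5 * 1024 * 1024 * 1024;
    if (fseeko(f, five_gigs, SEEK_SET) != 0) {
        perror("fseeko");
        fclose(f);
        return 1;
    }

    printf("now at byte %lld\n", (long long)ftello(f));
    fclose(f);
    return 0;
}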

An example of a huge database file that will benefit from running on a 64-bit system is that of the Human Genome Project. According to IDC, the resulting database will be 80 terabytes in size, or 80,000G, which would be 20,000 separate 4G file segments on a 32-bit system.

Digital Equipment Corp. was the first company to offer 64-bit hardware in reasonably affordable technical workstations when it introduced its Alpha systems in 1992. But the company was slow to exploit its advantage and few customers had applications that would benefit from it. "Over the years, they've had a secret weapon that unfortunately has remained secret," said Terry Shannon, a longtime DEC expert, consultant and publisher of the Shannon Knows Compaq newsletter. The company has installed more than half a million 64-bit systems, he said.

Alpha systems are popular with DOD for wartime simulations and with the National Oceanic and Atmospheric Administration for weather modeling, said Gary Newgaard, vice president of Compaq Federal Systems. "It has a lot of appeal in the government sector," he said. "We're seeing this in engineering and prototyping."

Today, IBM, Sun, HP, SGI and others offer 64-bit desktop workstations. Intel Corp.-based workstations still rely on PC-style 32-bit microprocessors, but that will change when Intel introduces its 64-bit Itanium processor for workstations and servers next year.

IBM arrived on the scene later, but federal customers are starting to look at the company's 64-bit workstations. "Initially, we didn't see a lot of interest, but over the last months there has been a lot more interest," said Mladen Karcic, IBM's RS/6000 brand manager for the federal market.

"I would attribute that [interest] to our success with the supercomputers," Karcic said. For example, NWS scientists who use an IBM supercomputer want to be able to develop applications for the supercomputer on their own desktop workstations, he said. "These engineers can develop code right on their desktop and run it on the supercomputer," Karcic said.

Intel collaborated on the Itanium's design with HP, which based the design on its experience with the 64-bit reduced instruction-set computing Precision Architecture chip. "[Itanium] will be the industry standard," said Jim Carlson, HP's director of marketing for Intel Architecture-64 (IA-64). "Eventually, it will work its way into PCs," he said. "Initially, it will be in workstations and servers."

How close is Itanium to HP's Precision Architecture? Close enough that today's HP customers will be able to run existing applications directly on the Intel-based machines, he said. Likewise, software that runs on today's Intel-based systems also will run on the new 64-bit machines.

But running existing software won't exploit the 64-bit capabilities of the new hardware, so users probably will want to recompile their applications. "You may eventually want to recompile for better performance," Carlson said.
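
A common reason a recompile matters: under the usual LP64 convention, long and pointer types grow from 4 to 8 bytes while int stays at 4, and code that hard-codes the old sizes misbehaves. Here is a small C sketch (illustrative, not vendor code) of the pitfall and the portable habit:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Under the common LP64 model, long and pointers grow from 4 bytes
     * to 8 when code is recompiled for a 64-bit system; int stays 4. */
    printf("sizeof(int)   = %zu\n", sizeof(int));
    printf("sizeof(long)  = %zu\n", sizeof(long));
    printf("sizeof(void*) = %zu\n", sizeof(void *));

    /* Storing a pointer in a plain int truncates it under LP64. The
     * pointer-sized integer type uintptr_t is always wide enough. */
    int x = 42;
    uintptr_t p = (uintptr_t)&x;
    int *q = (int *)p;
    printf("round-tripped value: %d\n", *q);
    return 0;
}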

It will be a while before desktop PCs have 64-bit supercomputers inside them, said Ron Curry, Intel's marketing director for the IA-64 architecture. "Most people don't have 4G of RAM on their desktop," he said.

The early versions of Itanium will appear in multiprocessor computers ranging from two to 32 processors, with some supercomputer-class multiprocessors supporting as many as 512 processors, Curry said. Available 64-bit operating systems will include Windows 2000, Linux, HP-UX, Sun Solaris and IBM's 64-bit version of its AIX operating system.

Itanium systems probably will be available in the second half of 2000, running at clock speeds of between 800 MHz and 1 GHz. However, these early chips will not be able to access the theoretical 18 exabytes of memory with 64-bit computing. Instead, the computers will be limited to a maximum of 16 terabytes, or 16,000G, of memory, according to Curry.

These machines will outperform today's HP PA-8500 systems by almost two-to-one, Carlson said. While today's PA-8500 can churn through 3.5 billion floating-point operations per second, Itanium machines will be capable of 6 Gflops, he said.

MORE INFO

Status: IBM Corp., Sun Microsystems Inc., Hewlett-Packard Co., Silicon Graphics Inc. and others offer 64-bit desktop workstations. Intel Corp.-based workstations still rely on PC-style 32-bit microprocessors, but that will change when Intel's Itanium chip is released next year.

Issues: Not all applications can take advantage of 64-bit hardware. Those that do usually are electronic design and simulation applications.

Outlook: Good. Although 64-bit supercomputer desktops are a ways off, early Itanium versions will appear in multiprocessor computers with two to 32 processors, with some supercomputer multiprocessors supporting up to 512 processors. Available 64-bit operating systems will include Windows 2000, Linux, HP-UX, Sun Solaris and IBM's 64-bit version of its AIX operating system.

BY Dan Carney
Nov. 22, 1999
