Computer, heal thyself

Autonomic computing aims to ease complexity, cut costs

The phrase "autonomic computing" smacks of science fiction, conjuring images of self-sufficient machines. But it's real and numerous vendors are developing this technology. IBM Corp. has its autonomic computing effort, Hewlett-Packard Co. pursues the concept with its Adaptive Enterprise Initiative and Sun Microsystems Inc. is developing the N1Grid. Specialty firms such Cassatt Corp. and Stottler Henke Associates Inc. also figure in the mix.

The industry aims to develop solutions that let computers configure, heal and optimize themselves. The technology is important for organizations that deploy highly distributed computing models such as information grids. For the rest of us, autonomic computing promises reduced hands-on maintenance, improved reliability and greater resiliency.

Elements of autonomic computing are already available in systems management products and utilities. But the full blossoming of the technology is five or more years away, according to analysts.

In the meantime, some government agencies are exploring autonomic computing. Scientific and technical organizations, such as Energy Department labs, NASA and the Defense Advanced Research Projects Agency, have expressed the most interest. But agencies such as the Internal Revenue Service are also looking into self-managing technology for business computing.

Some industry and government executives believe now is the time to prepare the information technology infrastructure for the autonomic future. They emphasize the need for well-defined security and incident response policies before embracing the new feature set.

"We have to grease the skids now," said Peter Hughes, assistant chief for technology at NASA Goddard Space Flight Center's Information Systems Division.

Autonomic benefits

Autonomic computing aims to take human intervention — and therefore cost — out of systems management. "It addresses the out-of-control costs of doing basic monitoring of operations and maintenance of IT systems," said Ric Telford, director of architecture and technology for Autonomic Computing at IBM. "And a lot of that cost is tied up in mundane tasks of monitoring and optimizing and tuning IT components."

IBM's Autonomic Computing initiative seeks end-to-end self-regulation of IT environments — hardware, middleware and applications.

Besides mere cost-cutting, some officials see the potential for enhanced security in autonomic computing. DARPA's Self-Regenerative Systems program, for example, seeks to develop systems that can respond automatically to cyberattacks. "Desired capabilities include self-optimization, self-diagnosis and self-healing," according to DARPA's request for proposals.

But the autonomic model also is about managing complexity. "The strategy behind N1, and also behind some of the other technologies, is to try to manage [multiple] computers as if there were one computer," said Dennis Govoni, chief technologist at Sun's government division.

Government customers seem most interested in the technology's ability to handle complex scientific and technical computing. Science projects increasingly involve distributed data analysis as researchers turn to clusters of systems to handle computational chores. The self-managing aspects of autonomic computing could help jobs flow smoothly through numerous, potentially geographically dispersed computing nodes, according to those familiar with the technology.

"In order to really be cost-effective, reliable and meet customer needs
