There's plenty of time before you'll have to make any decisions about P2P.
At first blush, the peer-to-peer (P2P) architecture appears antithetical to everything the government expects in information technology systems. Requirements such as security, network capacity planning and reliable data backup seem to demand centralized servers locked in climate-controlled closets and connected to uninterruptible power supplies.
PCs, on the other hand, are unpredictable. You never know where they've been or where they're going. And relying on users to back up files can be problematic.
Adding to the fear of P2P is the fact that the most famous P2P service, Napster, was found by the courts to have facilitated an illegal activity: the piracy of copyrighted material.
Despite this, P2P will come to government agencies. And the barbarians won't tear down the walls, security won't be compromised and life will go on pretty much as it always has. Here's why.
First, it's important to understand that there are two distinct P2P models. The unstructured model, exemplified by Napster, enables uncontrolled access to a centralized directory and to resources on a network of desktop computers. The structured model — represented by the collaboration tools from NextPage Inc. and Groove Networks Inc., for example — offers security features such as peer-to-peer authentication, remote authorization, secure transport and predetermined workflows. Obviously, government agencies will implement only the structured model.
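To make the distinction concrete, here is a toy Python sketch of what peer-to-peer authentication amounts to in a structured system. The challenge-response scheme shown (an HMAC over a random nonce using a pre-shared key) is a generic textbook construction assumed for illustration; it is not the actual protocol inside NextPage's or Groove's products.

```python
# A toy illustration of the "structured" idea: peers refuse to exchange
# anything until each side proves its identity. The HMAC challenge-response
# scheme and the pre-shared key are generic stand-ins, not any vendor's
# real protocol.
import hashlib
import hmac
import os

SHARED_KEY = b"provisioned-out-of-band"  # hypothetical pre-shared secret

def make_challenge() -> bytes:
    """Peer A sends a fresh random nonce to the peer requesting access."""
    return os.urandom(16)

def respond(challenge: bytes, key: bytes) -> bytes:
    """Peer B proves it holds the key by keying an HMAC over the nonce."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes) -> bool:
    """Peer A recomputes the HMAC and compares in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Peer A challenges peer B before authorizing any file transfer.
challenge = make_challenge()
response = respond(challenge, SHARED_KEY)  # computed on peer B's machine
print("peer authenticated:", verify(challenge, response, SHARED_KEY))
```

The point of the sketch is simply that authorization happens between the peers themselves, with no central login server in the loop.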
Second, P2P doesn't necessarily mean desktop-to-desktop file sharing. The architecture can also be used to connect servers. The Federal Interagency Council on Statistical Policy is taking that approach with its FedStats.net test-bed Web site, an exploration into the use of P2P technology to improve access to up-to-date statistical data by enabling file sharing among many agency servers. As of this writing, the future of the project, described as "a proof-of-concept demonstration," is unclear.
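For the sake of illustration, here is a bare-bones Python sketch of that server-to-server arrangement: every node runs identical code, serving its own files over HTTP while pulling files directly from its peers. The port number and peer address are invented for this example and have nothing to do with FedStats.net's actual implementation.

```python
# A bare-bones sketch of server-to-server P2P file sharing: each agency
# node is simultaneously a file server and a client of every other node,
# with no central server in between. Ports and peer names are hypothetical.
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve(port: int) -> None:
    """Expose this node's current directory to its peers over HTTP."""
    HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler).serve_forever()

def fetch(peer: str, filename: str) -> bytes:
    """Pull a file directly from another node in the mesh."""
    with urllib.request.urlopen(f"http://{peer}/{filename}") as resp:
        return resp.read()

if __name__ == "__main__":
    # Each node serves its own data in the background...
    threading.Thread(target=serve, args=(8001,), daemon=True).start()
    # ...and fetches from peers on demand (hypothetical peer address):
    # data = fetch("stats.agency.example:8001", "latest.csv")
    # A real node would stay running here to keep serving its peers.
```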
Third, a potentially disruptive P2P application, distributed processing, will probably never be implemented on a large scale by either government or business. The very few current examples — such as the SETI@home project, which uses a network of millions of PCs to search for evidence of extraterrestrial intelligence by analyzing radio-signal data — are those in which the computation can easily be broken into discrete parts. There are no business applications that make use of this model, and chances are there never will be.
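To see why distributed processing fits only certain problems, consider this minimal Python sketch. The data and the scoring function are invented for illustration; a local process pool stands in for a network of volunteer PCs. The job parallelizes only because each work unit can be analyzed entirely on its own.

```python
# A minimal sketch of the idea behind projects such as SETI@home: a big job
# is split into independent "work units," each analyzed in isolation, so
# results can be combined in any order. The signal data and the scoring
# function below are made up for illustration.
from multiprocessing import Pool

def analyze_work_unit(samples):
    """Score one chunk of (fake) radio-signal samples.

    Real projects ship a chunk like this to an idle PC; here a local
    process pool stands in for the network of volunteer machines.
    """
    # Stand-in computation: sum of squares as a crude "signal power."
    return sum(x * x for x in samples)

if __name__ == "__main__":
    # Pretend this is one long recording, split into discrete work units.
    recording = list(range(1_000_000))
    chunk = 10_000
    work_units = [recording[i:i + chunk]
                  for i in range(0, len(recording), chunk)]

    with Pool() as pool:
        scores = pool.map(analyze_work_unit, work_units)

    # Combining results is trivial because the units never depend on
    # one another; that independence is exactly what most business
    # workloads lack.
    print("strongest work unit:", max(scores))
```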
Finally, there's plenty of time before you'll have to make any serious decisions about P2P. Consultants at Gartner Inc. estimate that by 2003, only 30 percent of corporations will have experimented with P2P applications.
Smart managers will spend the next few years familiarizing themselves with P2P and the technologies that emerge to support it. But it won't be until the middle of the decade that you'll be called upon to act on it.
Stevens is a freelance journalist who has written about information technology since 1982.