Tools demystify Web site performance
New vendor wares increasingly blend two main types of system monitoring
As agencies conduct more of their affairs using Web-based technologies, government Webmasters and network administrators are under greater pressure to keep those Web sites humming.
Fortunately, developers of Web performance management tools have not been idle and instead offer agency staff members a blend of solutions to give them a leg up on maintaining their increasingly important — and complex — systems.
A far cry from simple products a few years ago that did little more than log the number of visitors to a site, the current crop of tools monitors a Web infrastructure from end to end. They isolate and flag problems with specific components — databases, application servers, network gear and storage — and can even drill down into data to track the performance of individual applications.
In fact, for many agencies there's increasingly little difference between what they expect performance management tools to do for Web applications and for the network overall. That's because Web-based architectures have become mainstream for many internal- and external-facing applications.
"Our customer is the warfighter," said a spokesman for the Defense Information Systems Agency (DISA). "And, due to this, performance monitoring tools utilized within the technology insertion division must have a wide range of features [that] enable monitoring and testing of all IP traffic types utilized on [Defense Department] networks."
The focus for performance monitoring tools has shifted dramatically in just the past year, according to Mike Baglietto, senior product manager for Keynote Systems Inc.
"Even as recently as two or three years ago, people were basically just measuring one Web site against another to give them an idea of where they should be" on Web performance, he said. "Since then, [business objectives] have driven the move away from measuring the performance of Web sites to actually improving the user experience with the application itself."
At the same time, tools have evolved from measuring how long it took Web site visitors to execute a certain number of steps to asking whether visitors can quickly get what they need from the site, such as finding information or completing a transaction, he said.
"You are trying to connect end-user behavior to application performance," Baglietto said.
For Computer Associates International Inc. (CA), that objective means adding components to its Unicenter family of integrated enterprise management solutions to follow a user through a Web-based transaction, according to Bob Ure, the company's Unicenter brand manager.
"To the end user, their Web experience is cumulative across a series of smaller tasks, such as searching or adding items to a shopping cart," he said. "We monitor the response times of each of those interactions."
Mirroring a trend in the industry, CA's solutions now employ both active and passive approaches to performance monitoring.
With the active approach, the CA system simulates a typical user transaction and periodically sends that through the Web site to check the system's response. Meanwhile, another part of the CA system uses passive monitoring to watch in real time how activity such as requests to a Web application server affects the system and, by extension, the user experience.
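In practice, the active half of that approach boils down to replaying a scripted transaction on a schedule and timing each step, much as Ure describes. The Python sketch below illustrates the general pattern only; the URLs, five-minute interval and two-second threshold are hypothetical assumptions, not details of CA's Unicenter products.

# Minimal sketch of active (synthetic) monitoring: replay a scripted,
# multi-step transaction on a schedule and record the response time of
# each step. The URLs, interval and threshold are illustrative only.
import time
import urllib.request

TRANSACTION_STEPS = [                      # hypothetical user journey
    ("search",   "https://www.example.gov/search?q=permits"),
    ("details",  "https://www.example.gov/permits/1234"),
    ("download", "https://www.example.gov/permits/1234/form.pdf"),
]
INTERVAL_SECONDS = 300                     # probe every five minutes
SLOW_THRESHOLD = 2.0                       # seconds before a step is flagged

def time_step(url: str) -> float:
    """Fetch one page as a visitor would and return the elapsed seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()                    # pull the full body, like a real visitor
    return time.monotonic() - start

while True:
    for name, url in TRANSACTION_STEPS:
        try:
            elapsed = time_step(url)
            status = "SLOW" if elapsed > SLOW_THRESHOLD else "ok"
            print(f"{time.ctime()}  {name:<8} {elapsed:.2f}s  {status}")
        except Exception as err:           # an unreachable step is itself a finding
            print(f"{time.ctime()}  {name:<8} FAILED: {err}")
    time.sleep(INTERVAL_SECONDS)

A passive monitor, by contrast, would derive the same per-step timings from traffic observed on the network or in server logs rather than from requests it generates itself.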
"The industry has finally realized that it's not enough just to know whether the end-user experience is fast or slow," Ure said. "If the transaction response time is slow, they need to know why."
Some agencies have come to the same conclusion. NASA uses both active and passive approaches to monitor latencies on its internal networks, an important measure of performance given the Web-like integration of applications used by its various centers. It also measures the "hit rate" at its public Web sites.
That provides NASA with an integrated view of the performance of its information systems, said Al Settell, vice president of NASA programs at NCI Information Systems Inc., a contractor that oversees performance management at NASA sites.
Officials at Brix Networks Inc. believe the active approach is necessary if you want to get an in-depth view of how applications behave from one end of the network to another. The company's Verifier solution involves installing a number of tightly coupled hardware boxes and management software throughout a network and then generating application traffic every few minutes that flows among this mesh of devices.
By monitoring the performance of this traffic and seeing where any degradation occurs, the company claims it can proactively flag problems before they can seriously affect the performance of Web-based applications.
"If those degradations are not caught in time, they can seriously disrupt the performance of applications," said Jamie Warter, the company's vice president of marketing and business development. "With higher bandwidth, real-time applications being developed, even small jitters in the 10 millisecond range can create problems."
Brix's solution, which also includes passive monitoring capabilities, has been deployed in many telecommunications carrier and service provider networks. DISA is one of its first federal government customers.
Candle Corp. officials also believe that both active and passive approaches must be employed to get a "granular" view of what's going on with Web-based applications. But many organizations have still not installed much passive monitoring, which means they are getting an incomplete look at their Web performance, said Jim Hamm, Candle's senior director of service-level management business operations.
"A problem for people enlightened enough to look [at their Web applications] from the user experience is that, if they want a certain quality of data for each user transaction measured against each [piece of infrastructure] on each site, they can quickly end up with large quantities of data, and they just don't want to handle that," he said. "Unless you as a vendor can provide the whole solution to dealing with this, they don't want to know."
Candle's eBusiness Assurance Network managed service measures how applications are performing from an organization's external and internal users' perspectives for any geographic site, at any time and for any server. The eBA ServiceMonitor measures how long users wait at a particular Web site and reports on navigation problems. The CandleNet ETEWatch measures each Web-based transaction from when it is initiated on a user workstation until it is completed.
The results can be delivered to the customer through a secure portal, or packaged surveys and reports can be provided according to the organization's specific requirements.
The goal is to allow systems administrators to address problems before they get out of hand. "Everyone wants to be as proactive as possible," said Russ Currie, director of product marketing for NetScout Systems Inc.
NetScout's nGenius performance management suite uses a system of probes to gather information throughout a network. It looks at traffic according to the URL it is coming from or going to and estimates what the traffic capacity should be based on that information.
"Because we can establish thresholds [for traffic flows], if those are threatened, we can send alarms and [e-mail messages] to various places warning them of that, so they can intervene before something happens," Currie said.
Be aware that choosing Web applications management tools will likely not get easier, at least in the short term. Market watcher META Group Inc. sees applications management as a still-maturing process because the number and complexity of Web applications are increasing and the need for those tools is expanding into the operational side of organizations.
With this lack of tool maturity and a continuing churn in the more than 60 vendors currently offering some aspect of Web management capability, "users should view any investment made in the next year as a [one- or] two-year investment, expecting the technology to be obsolete after that time," META analyst Corey Ferengul said.
Robinson is a freelance journalist based in Portland, Ore. He can be reached at hullite@mindspring.com.
***
Web services pose new challenges

When Web applications using the still-emerging Web services family of standards become more commonplace, they will likely require a major change in performance management tools. Right now, those applications, which can be highly distributed and reliant on interactions among multiple components, are still in pilot testing, but they will eventually move to the public Internet.

"You'll need a distributed set of tools to monitor the behavior and response [of Web services] because you don't [necessarily] know where any particular service is being originated from," said Mike Baglietto, senior product manager for Keynote Systems Inc.

This will "drive the need for performance management tools that include a lot more intelligence" than the current crop has, according to Bob Ure, Unicenter brand manager for Computer Associates International Inc.