Software that keeps score
Managers fine-tune mission performance with balanced score cards, but success takes much more than technology.
Managing government like a business has become a trend in the federal government, symbolized by initiatives such as the President's Management Agenda, the Government Performance and Results Act and many agency directives.
In keeping with that trend, many government officials have adopted a new mantra: "You can't change what you can't measure." And the measurement tool du jour for many agency executives is balanced score cards, a framework for establishing and tracking an organization's strategic goals.
Some federal agencies are finding that their score card investments are paying off.
"People like me who've been [working] for 30 years have often been frustrated when accusations start flying back and forth" about performance problems, said Michael Jackson, chief of the Business Operations Flight for the Aircraft Supply Chain Manager at Hill Air Force Base's Air Logistics Center, Ogden, Utah. The center is responsible for maintaining F-16 fighter planes and older aircraft.
"For the first time, score cards give us empirical data to show exactly where we stand and how to support warfighters," Jackson said. "The information has always been there, but there hasn't been a tool for digging it out."
Hearing the message, many top software companies now offer feature-laden score card programs designed for public and private organizations. Those products usually include management dashboards, which display easy-to-read summaries of performance metrics that executives can use to keep tabs on their organizations. But while score cards are gaining traction, not everyone is willing to pay for high-end, shrink-wrapped solutions.
"The software is really nice, but it just doesn't make sense for us to spend the money on it," said Stephen Logan, program manager of balanced score card performance management at the Energy Department's Office of Procurement and Assistance Management (OPAM).
For the past decade, Logan's office has used score cards built with a basic spreadsheet program. Despite its modesty, the solution has helped the office dramatically improve service satisfaction among its internal DOE customers.
Perhaps that's because score card success depends on far more than the information technology department. "If you don't know what your mission is or you don't know what competencies you need to get the organization moving in the right direction, buying all the IT products in the world isn't going to help," said Jason Fieger, senior associate at Booz Allen Hamilton.
Underlying concept
More than a decade ago, Robert Kaplan, a Harvard Business School professor, and David Norton, a consultant, introduced balanced score cards to the management world. The approach defined a framework for measuring four key indicators covering financial performance, customer knowledge, internal business processes, and learning and growth. Score card variations with alternative categories have since surfaced, but the goals remain the same: provide a tool for setting strategic goals and managing performance.
Federal adherents of score cards and dashboards span a wide range of departments, including Commerce, Defense, Energy, Health and Human Services, and Transportation.
Although different versions exist, the basic concepts are similar from implementation to implementation. Score card projects determine the relevant performance data points for a particular agency, and manual or automated processes then collect operational data and create performance summaries.
Different managers may see different views of the organization. For example, chief financial officers track financial performance, while chief information officers closely monitor IT system reliability. Operations people watch order fulfillments related to the agency's mission. If performance begins to dip in any of those categories, managers can proactively take corrective action, according to the score card theory.
"We refer to it as 'corporate therapy,'" Fieger said. "It's a management and decision support tool that allows an organization to figure out what it needs to do and how to do it more efficiently."
The score card concept is straightforward, although successfully launching a score card program isn't always simple. Veterans say several technical and cultural challenges can plague the efforts, resulting in projects that never get off the ground or crash and burn because of neglect after initial success.
"Score cards can fall in and out of grace depending on how well the organization understands their usage and how well score cards are bought in as a performance-reporting mechanism," said Dave McClure, research director of the government sector at Gartner.
In the federal government, many agencies and departments began using score cards years ago, "but whether they currently are or are not using them is always a good question," McClure added.
Simplicity reigns
At OPAM, Logan uses his simple score card process in an annual review of procurement activities. "We don't really need to see how we're doing in particular measurements each day, like commercial organizations do," he said. "A lot of what we do depends on annual funds."
Various field offices funnel year-end data to Logan, and he manually enters the information into a spreadsheet. A data reduction model, originally developed by the Labor Department, then boils down the data into performance summaries. One important category is the results from surveys of internal DOE users who evaluate OPAM's services, such as negotiating agreements between DOE departments and outside contractors.
"The program calculates the results and processes the percentage score for each of the key measures that are programmed into it," Logan said.
An important component to serving internal customers is timeliness. "Since Day One, they have said, 'You folks in procurement take too long to get a contract in place,'" Logan said.
Score cards have helped change that. In the 1990s, a contract worth more than $1 million took an average of 280 days to formalize. "We've got that down to 100 days on average," Logan said. In turn, customer satisfaction levels increased. One OPAM assessment of its internal quality control systems rose from 85 percent in 2001 to 90 percent in 2004, Logan said.
But data alone doesn't bring about change. OPAM executives use the score card summaries to develop action plans for the coming year. After a review of the core measures with headquarters, each procurement office submits an annual report to Logan with a management plan for improvement in problem areas.
"In an area where the department is not doing that well, we'll develop strategies for what we want to do about it," Logan said. "We'll also look at our balanced score card program nationally and determine what we can do here at headquarters that will help the field offices or help the department do better in a particular area. We'll develop an action plan for the accomplishment of that initiative during the fiscal year."
Homegrown success
Like OPAM, the Naval Criminal Investigative Service (NCIS) at the Washington, D.C., Navy Yard relies on a homegrown, spreadsheet-based score card application, although that's about to change. For the past year and a half, NCIS has used a Microsoft Access database to feed information into summaries that track the type, volume and status of investigations. Other summaries monitor field offices' financial performance.
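NCIS did not describe its database schema, but the kind of summary it tracks, caseload by type and by status, amounts to a simple aggregation. Here is a minimal Python sketch; the table, column names and sample data are hypothetical, and sqlite3 stands in for the Access database.

```python
# Illustrative only: hypothetical table, columns and sample cases;
# sqlite3 stands in for the legacy Access data source.
import sqlite3
from collections import Counter

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE investigations (case_type TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO investigations VALUES (?, ?)",
    [("fraud", "open"), ("fraud", "closed"), ("theft", "open"), ("assault", "open")],
)

rows = conn.execute("SELECT case_type, status FROM investigations").fetchall()
volume_by_type = Counter(t for t, _ in rows)      # caseload by type
volume_by_status = Counter(s for _, s in rows)    # caseload by status

for case_type, count in volume_by_type.most_common():
    print(f"{case_type}: {count} cases")
for status, count in volume_by_status.most_common():
    print(f"{status}: {count} cases")
```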
Although the system achieved some success, it's running out of headroom, said Rick Holgate, NCIS' assistant director of IT and command information officer.
"One of the long-term issues we're dealing with is a lot of our data is in older systems that in the near future are going to be replaced with more modern systems," he said. That makes moving the data into the dashboard summaries a manual process.
The modernization will include an Oracle database system that underpins dashboard technology from Business Objects. The project should be complete in the next six months. Holgate expects the new dashboard tool to be easier for managers, most of whom aren't IT experts.
"Today, if someone wants to see the information in a different form, they can't readily do that as an end user," he said. "The dashboard tool will give them that flexibility."
A second refinement will be a Web interface that will make it easier to distribute dashboard information via NCIS networks.
Although the homegrown system now needs a replacement, Holgate said, he believes it played a critical role in the organization's dashboard efforts.
"It really helped to generate a significant level of internal interest in dashboards," he said. "It provided a capability that end users could see, interact with and understand the value of dashboards. And generating user interest and demand makes the leap to an enterprise solution much easier because there's already that spark there in the organization. I'd be much more leery of jumping into an enterprise solution otherwise."
Keeping jets flying
At Hill Air Force Base's Air Logistics Center, score cards are helping solve longtime supply problems that inhibit proper maintenance of fighter planes.
"In 1999 we looked at all of our support metrics for warfighters, and some of them were pretty sad," Jackson said. "We had people out there fighting for the country, and we weren't supporting them as well as we would have liked."
Problems included poor management of the parts inventory and excessive back orders. "We began an aggressive campaign to focus on warfighter support," Jackson said. "We came up with several metrics using balanced score cards to establish where we were vs. where the warfighters expected us to be."
The score card application is a Java-based Web program that pulls data from various sources and presents summary charts and graphs to managers and team leaders. The center began using an early version of the tool in 2001, then launched it as an enhanced Web application in 2004. Corda Technologies provided the score card dashboard foundation, and an outside contractor completed further customization.
While deploying the system, the center encountered a few glitches, but problems were more procedural than technical. Developers noticed early on that some metrics weren't relevant to the service's goals.
"People were using them just to have a metric," Jackson said. The data "was working for the metric instead of working for the warfighter."
For example, data showed that the base's parts shops were producing 110 percent of the volumes requested by the center. But reports from the field complained of frequent shortages. The score cards helped Jackson determine that although aggregate parts production exceeded orders, much of the work went to low-priority parts, which created shortages for many critical items.
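The arithmetic shows how an aggregate number can mask the problem. In the hypothetical figures below (the parts and quantities are invented), total production runs at 110 percent of orders even though every critical item comes up short.

```python
# Hypothetical numbers showing how an aggregate fill rate above 100 percent
# can coexist with shortages on critical items.
orders = {  # part: (quantity ordered, quantity produced)
    "critical_engine_seal": (100, 60),
    "critical_hydraulic_pump": (50, 30),
    "low_priority_bracket": (200, 295),
}

ordered = sum(o for o, _ in orders.values())
produced = sum(p for _, p in orders.values())
print(f"Aggregate production: {produced / ordered:.0%} of orders")  # 110%

for part, (o, p) in orders.items():
    if p < o:
        print(f"Shortage: {part} filled at {p / o:.0%}")
```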
Representatives from the various departments gathered to choose data points that would "tell us the story of the warfighter," Jackson said. "We made sure that people are not covering up a problem just to make the metric look good."
The results have been impressive. Jackson said parts deliveries to the field improved 20 percent while overall back orders dropped almost 65 percent. Inventory effectiveness improved about 45 percent. The statistical improvements translated into cost savings.
By identifying budget problems, including payments made to contractors who billed but failed to perform work, the organization recouped or avoided paying about $25 million.
"This tool isn't responsible for every bit of that improvement," he said. "There's been a lot of management attention given to these areas. But with this tool, we are able to focus in on improving areas that are more cost-effective and attack the worst problems that are eating our lunch."
Joch is a business and technology writer based in New England. He can be reached at ajoch@monad.net.
Nine steps to success
1. Plan for initial skepticism.
Any organizational change can lead to resistance and foot-dragging. When Michael Jackson, chief of the Business Operations Flight for the Aircraft Supply Chain Manager at Hill Air Force Base's Air Logistics Center, hit a speed bump among internal team members, he took corrective action. "I went over their heads, to my boss, who is also their boss, and gave him the score card visibility," he said. That gave them "a reason to use score cards -- to keep the boss off their backs."
2. Create realistic goals.
Each year, Stephen Logan, program manager for balanced score card performance management at the Energy Department's Office of Procurement and Assistance Management, adjusts performance targets and invites feedback from those who will be held accountable. "They know they can discuss whether the goals are reasonable in an open fashion without any fear of retribution from the folks here at headquarters," he said. "We've developed a successful partnership, but it took a couple of years to get to that point."
3. Consider the big picture.
Some organizations undertake superficial score card implementations by saying, "'OK, we've got four perspectives. We've got some metrics that measure each of those perspectives. Let's just throw them in a spreadsheet and we'll have a balanced score card,'" said Craig Symons, principal analyst for Forrester Research. "When that happens, you miss the linkages, such as the fact that score cards are tied to strategic objectives. You miss the balance of leading and lagging indicators. So you really don't get the results."
4. Make the stakeholders create the score cards.
"We never come in and present them with a score card," said Jason Fieger, senior associate at Booz Allen Hamilton. "Stakeholders have to do the work of developing the organization's mission, deciding what the core competencies are, determining what they need to do to get their jobs done," he said. "That buy-in is really the key to making it work."
5. Dedicate adequate time.
Organizations unwilling or unable to devote the amount of time necessary to the process court score card failure, Fieger said. "Ideally, we'd like to get people together twice a week for three- or four-hour sessions, maybe over a four- to six-week period, to work through the various parts of the score card," he said.
6. Keep it simple.
Score cards can get bogged down in information overload, said Terence Atkinson, industry director for the public sector at software vendor Cognos. "Typically, when you get a bunch of executives in a room and they look at their business drivers, they start naming dozens of them," he said. "So if you have 15 people in a room, you could end up with hundreds of drivers, which is too many. There couldn't possibly be hundreds of key drivers, and you couldn't manage that many anyway." The best score cards pare the list down to perhaps six essential data points, he added.
7. Plan for the long term.
Developing and maintaining score cards is an ongoing process, Fieger said. "Organizations are dynamic and their missions change, so the process is iterative," he said. "You have to come back from time to time and run the process again, to perhaps revise your metrics."
8. Be flexible with the dashboard views.
Realize that operational staff, midlevel managers and C-level executives all need unique views of data, said Dave McClure, research director for the government sector at Gartner. Don't make the mistake of displaying information that's too granular for senior executives. "They often don't want to see pages and pages of performance data on every individual project," he said. Instead, they typically prefer aggregate data about customer satisfaction, success at meeting internal operational improvement targets, financial health and mission goals.
9. Consider creating a score card data repository.
A centralized data store can eliminate data redundancy and overlap, McClure said. "Security and quality control are simpler -- not easy, but simpler -- if you manage one central repository of information," he added. "Having high-quality data means people won't go into meetings arguing over the validity and timeliness of the information, which is a side-rail discussion that takes the whole approach off-line very fast."
-- Alan Joch