Metrics take the guesswork out of Gov 2.0
Agencies need the right mix of best practices, standard tools and custom benchmarks to demonstrate the legitimacy of their Gov 2.0 efforts.
Call it Gov 2.1. After a rush to get social media sites up and running in the name of openness and collaboration, agencies are now facing a hard reality: proving the worth of those efforts under the looming shadow of budget hawks. To do that, they’ll need to answer a fundamental question: Does social media truly create value for agency missions, or is it just another cool and unusual technology?
The reality is that social media assessments still involve some subjective interpretation. But intuition should be far from the only measure and never the primary basis for justifying a project.
“I’m a huge proponent of not doing things just because they are cool,” said Jeffrey Levy, director of Web communications and a social media strategist at the Environmental Protection Agency. EPA is an aggressive user of social media and relies on multiple platforms to achieve its goals. It is also taking a more systematic approach to those efforts.
But even those who promote the game-changing potential of wikis, social networks, data-distribution platforms and idea contests acknowledge that it’s difficult to clearly show a return on investment. Unlike traditional IT, Gov 2.0 success isn’t defined only with hard numbers.
For example, Web traffic tools are important for gauging how many unique visitors come to a site and how traffic volume is trending. But those numbers alone don’t always speak to an initiative’s mission value.
“Those statistics really tell me nothing,” said Emerson Keslar, CIO and director of MilTech Solutions, a portfolio of Gov 2.0 applications at the Army's Program Executive Office for Command, Control and Communications–Tactical (PEO-C3T).
Nevertheless, he said he believes the Defense Department’s internal wiki — milWiki — might be the knowledge-management solution he’s been seeking for the past decade.
“When we fell into Web 2.0, it was ‘Aha!’” he said. “Web 2.0 is a savior when it comes to knowledge management.”
Intuition has its place, but many agencies are starting to adopt more structured Gov 2.0 game plans using steps similar to those outlined below. Within that framework, agencies can find their own combinations of emerging best practices, standard tools and custom benchmarks to demonstrate the legitimacy of their Gov 2.0 efforts. In the end, that’s what will help them find common ground with budget hawks.
Step 1: Define Gov 2.0 goals
Before agencies can measure the effectiveness of their Gov 2.0 initiatives, they need to clearly identify the projects’ goals. Under the best circumstances, that work should be done well before the social site goes live. But in the build-it-and-they-will-come atmosphere that surrounds some Gov 2.0 projects, goal setting can easily get rushed or bypassed. Even after the fact, it remains a crucial step not only for benchmarking success but for helping agencies make sure they’re tailoring applications to real-world needs.
Managers of internal social sites should start by interviewing target users to understand their day-to-day problems. Emma Antunes, Web manager at NASA’s Goddard Space Flight Center, did that in the early stages of Spacebook, an internal site inspired by Facebook. She discovered ways Spacebook could help the human resources department meet diversity goals and help procurement professionals address some regulatory audit challenges.
“Each community had its own goals, so my matrix included the ‘what’s in it for me’ for each of these different groups,” Antunes said.
The site’s success has been short-lived: Spacebook is now dormant because of funding cutbacks.
Social networkers might take a different approach to public-facing sites, for which officials don’t need to overthink the goal-setting stage. For example, EPA simply wanted to augment its traditional press releases and website content with less formal communications when it created its Facebook and Twitter accounts.
“There wasn’t anything more highfalutin than that,” Levy said.
Step 2: Identify meaningful metrics
Tougher challenges crop up when it’s time to determine exactly which data points are most appropriate for measuring Gov 2.0 success. The problem is twofold. First, agencies must balance quantitative and qualitative measures. Second, the content and delivery mechanisms for each social media variation require their own metrics.
Some benchmarks are obvious. For example, sponsors of Facebook pages, Facebook knockoffs, Twitter feeds, wikis and discussion forums should start by reviewing the raw page-visit numbers each day.
But because social media is about collaboration and community building, agencies will need to probe deeper. Engagement is a key indicator, and organizers can begin to assess that by graphing how many visitors choose to become fans or click to show whether they like or dislike specific content.
The value of wikis, blogs and idea sites will partially lie in the number and diversity of contributors they attract and how often visitors link the discussions to other forums.
Another barometer is the conversion rate — the percentage of unique visitors to a site who decide to register or become engaged in a discussion.
“Essentially, it’s a measure of how many ‘browsers’ became ‘buyers’ for any particular initiative an agency may be conducting,” said Daniel Honker, an analyst at the National Academy of Public Administration.
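The conversion rate Honker describes is simple arithmetic. As a rough sketch — the visitor and registration figures below are hypothetical, not drawn from any agency cited in the article — it might look like this:

```python
def conversion_rate(unique_visitors: int, engaged_visitors: int) -> float:
    """Share of unique visitors who registered or joined a discussion."""
    if unique_visitors == 0:
        return 0.0
    return engaged_visitors / unique_visitors

# Hypothetical month: 12,400 unique visitors, 310 of whom registered
# or posted a comment -- the "browsers" who became "buyers."
rate = conversion_rate(12_400, 310)
print(f"{rate:.1%}")  # 2.5%
```

The same calculation works at any scale, which matters because, as Antunes notes below, a healthy community can be very small in absolute terms.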
The numbers needn’t be large for communities to take off. “Our key number was 10 full-time, really active people who used Spacebook to the exclusion of other tools,” Antunes said.
Some Gov 2.0 veterans see senior managers’ involvement as a key indicator. “It becomes powerful the minute you get a leader participating in the conversation or when people become aware an executive is watching to see what’s happening in the organization,” said Douglas Palmer, a principal who leads the Social Media and Collaboration practice at Deloitte Consulting.
Some agencies probe even deeper. A chat forum designed to augment help-desk services might track how many problems were resolved and how long it took for the community to resolve them. Other sites will apply standard business metrics to determine, for example, the procurement department’s ability to supply equipment in a timely fashion to support agency projects.
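For a help-desk forum of the kind described above, the deeper metrics boil down to a resolution rate and a time-to-resolve figure. This minimal sketch assumes a hypothetical thread log with opened and resolved timestamps; the data and field layout are illustrative, not any agency’s actual schema:

```python
from datetime import datetime
from statistics import median

# Hypothetical forum threads: (opened, resolved-or-None) timestamps.
threads = [
    (datetime(2011, 3, 1, 9, 0), datetime(2011, 3, 1, 14, 30)),
    (datetime(2011, 3, 2, 10, 0), datetime(2011, 3, 4, 10, 0)),
    (datetime(2011, 3, 3, 8, 0), None),  # still unresolved
]

# How many problems the community resolved, and how quickly.
resolved = [(opened, closed) for opened, closed in threads if closed]
resolution_rate = len(resolved) / len(threads)
median_hours = median(
    (closed - opened).total_seconds() / 3600 for opened, closed in resolved
)

print(f"resolved {resolution_rate:.0%} of threads; "
      f"median time to resolve: {median_hours:.1f} hours")
```

A median is used rather than a mean so that one marathon thread doesn’t skew the picture of typical community response time.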
“One of the metrics that is dear to my heart is how we support weapons systems once we field them,” Keslar said. MilWiki now aggregates information about service issues, and the Army links that information to the Facebook-like milBook, which serves as a discussion forum.
“We get some very frank feedback about how the equipment is working and how good the training is from the guys who are actually using it,” Keslar said. “That is just absolutely critical because we weren’t getting that level of granularity before.”
Step 3: Choose measurement tools carefully
When it comes to tracking standard quantitative measures, agencies have a wide variety of tools to choose from. Some are free, but others require upfront purchases or subscriptions. And some products offer highly automated data-collection capabilities while others require hands-on work to gather traffic numbers.
Social site managers should evaluate tools carefully to avoid products that ultimately drain time and money. “We found they can be costly and tend to overdeliver with features you don’t really need,” Levy said.
They should also be skeptical of products that purport to capture community sentiment — indicators beyond “like” buttons that show how much visitors appreciate the content.
“Machines can’t read sarcasm, and a lot of snide commentary on the Web is in the form of sarcasm,” Levy said.
Instead, collecting qualitative information will require heavy lifting in the form of monitoring comments, directly soliciting feedback from visitors and perhaps conducting online opinion surveys.
Step 4: Analyze the results
With raw numbers and qualitative observations in hand, managers will need to correlate the data into a measure of mission effectiveness. But don’t look for hard-and-fast guidelines.
For example, San Francisco launched SFFire App, a mobile application that helps citizens locate automated external defibrillators in public places during cardiac emergencies. The initiative’s collaborative component asks the public to help conduct on-site inspections, confirming that a working device is indeed at every location the application identifies.
How many visitors are needed for the city to claim success with the SFFire App?
“It doesn’t actually matter how many people look at the site,” said Adriel Hampton, co-founder of Gov 2.0 Radio and a social media consultant for the city. “What matters is how many are encouraged to actually do bystander CPR.”
Interpretation is also required for sites designed to solicit ideas about how to run an agency more effectively. A niche site with a few dozen highly engaged participants could produce as many new and actionable ideas as a public-facing site with hundreds of thousands of visitors.
“What matters is if you get three or five ideas that actually reduce costs or improve efficiency,” Palmer said.
Those aren’t abstract discussions. Quality over quantity could ultimately determine the fate of Data.gov, a clearinghouse for data routinely collected by the federal government. The site is supported by the E-Government Fund, which now faces the specter of deep budget cuts. Some critics point to declining site-visitor rates to argue that Data.gov is a prime candidate for cutbacks.
But to evaluate Data.gov’s success, people should consider site engagement, which is reflected in the number of times visitors download data files, rather than overall traffic, said James Hendler, a professor at Rensselaer Polytechnic Institute and a Web consultant for Data.gov. Using that measure, interest in the site is growing: There were almost 190,000 downloads in March, up from 103,000 a year ago.
However, Hendler cautioned that even that statistic might not tell the whole story.
“If I download something and use it in a product, it’s only one download,” he said. “But I may have done something significant with it.”
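The download trend Hendler cites works out to a straightforward year-over-year growth figure. This sketch uses only the two monthly counts reported in the article; no other Data.gov figures are assumed:

```python
def yoy_growth(current: int, year_ago: int) -> float:
    """Year-over-year growth as a fraction of the earlier period."""
    return (current - year_ago) / year_ago

# Figures reported for Data.gov: ~190,000 downloads in March
# versus 103,000 in the same month a year earlier.
growth = yoy_growth(190_000, 103_000)
print(f"{growth:.0%}")  # 84% -- downloads up by more than four-fifths
```

As Hendler’s caveat suggests, even an 84 percent jump understates impact if a single download feeds a widely used product.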
Step 5: Turn analysis into action
Justifying the existence of a social media site isn’t the only reason to sweat the numbers. The results will also tell site administrators how to meet the community’s larger needs.
“It’s demoralizing if you stand up a site and say, ‘You are going to help us choose how to cut the budget,’ and at the end of the day not actually implement any of the solutions,” Hampton said.
After PEO-C3T officials comb milWiki and milBook for discussions about problems with weapons systems, they decide when to revise or expand training or when a systemic problem requires a technical redesign of equipment.
Eventually, organizations might decide that such assessments are important and time-consuming enough to warrant a full-time position, a kind of chief innovation officer, said Sean Rhody, chief technology officer at Capgemini Government Solutions. To be effective, that person would need enough clout to cross departmental boundaries and make real changes, he added.
With or without an innovation czar, social network managers will need to keep promoting the viability of their communities as budgets get tighter.
“This is a public good, and it’s always hard to measure a public good,” Hendler said about successful Gov 2.0 sites. “It’s important that the agency people and politicians who make funding decisions realize the importance of this public good.”
EPA’s social media mantra
The Environmental Protection Agency isn’t a social media neophyte. Among other activities, it cultivates a presence on Facebook, a Twitter following and interactive discussion forums when seeking public comment on evolving regulations.
Jeffrey Levy, director of Web communications and a leading force in the agency’s social media efforts, sums up EPA’s strategy as a four-part process he calls his social media mantra.
- Define the mission goals. Before launching a new social media initiative, identify the goals the agency is trying to accomplish. For projects initiated by the public affairs department, the goal might be as simple as finding a new way to communicate with the public at large and “be where the public is,” Levy said.
- Pick the right platform. Next, match the social media tool with the project goal. To communicate with the public, a Facebook page or Twitter account might work best. When soliciting input about policy, a better choice might be creating a variation on a blog, where each entry is a question designed to generate comments on a specific policy issue.
- Measure what’s most important. For public communications sites, such as Facebook and Twitter, look for statistics about the number of fans and followers. For blogs and discussion forums, consider the number and quality of the comments being generated. “To the extent possible, evaluate the sentiments of participants,” Levy said. “Are they happy or not happy with what you are doing?”
- Share your lessons. The process doesn’t stop with linking metrics to goals. Because Gov 2.0 is ubiquitous across government — and at its core is focused on the benefits of information-sharing communities — Levy said he believes agencies have a responsibility to share the lessons they’ve learned with peers both within and across agencies.