Forum moderator: Friend, not foe
The old adage “moderation in all things” takes on new meaning when it comes to national online dialogues.
For this new breed of Internet forum to be successful, moderators must actively define and enforce ground rules and keep the dialogue on track from beginning to end, all without censoring any point of view.
This is a far cry from the simple housekeeping role in traditional forums, where moderators only have to keep an eye out for miscreants using abusive language or otherwise disrupting conversations.
But a national online dialogue, as the Obama administration envisions it, is not your typical forum. The administration has launched several dialogues in the past six months, the most prominent being the Open Government Dialogue, which the National Academy for Public Administration hosts.
The initiative, which enters its third phase this week, seeks to get public input on how to make government operations and information more transparent. The result will be an official open-government policy, administration officials say.
An initiative such as the Open Government Dialogue is a community with an organizing purpose, said Barry Libert, social-media business consultant and author of “Barack, Inc.: Winning Business Lessons of the Obama Campaign.” “You are trying to achieve a purpose,” he said. “It is not just community for the sake of community.”
Public policy and social-media experts say moderators will be essential to the success of such initiatives, and federal officials must ensure that moderators have the policies and technologies they need to carry out their jobs. Otherwise, they are putting the credibility of the forum at risk.
Libert and other experts caution against asking a moderator to serve as a policeman or, worse yet, a censor. Nonetheless, the community needs to have clear parameters and someone responsible for enforcing them.
Poorly run dialogues “inhibit public participation in the same way that citizens often avoid an unruly meeting,” said Kim Patrick Kobza, chief executive officer of Neighborhood America, which helps organizations build online communities. “If the dialogue is not constructive, it will not be credible.”
Experts say the ideal is to provide communities with the tools needed to moderate themselves. That is the approach NAPA took with the Open Government Dialogue, encouraging participants to rate one another’s comments — thumbs up or down — and flag off-topic or duplicated entries.
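A rough sketch of what that kind of community self-moderation might look like in code follows; the Comment structure, the flag list and the three-flag review threshold are illustrative assumptions, not details of NAPA's actual platform.

```python
from dataclasses import dataclass, field


@dataclass
class Comment:
    """A participant's entry, with the community ratings and flags described above."""
    author: str
    text: str
    up_votes: int = 0
    down_votes: int = 0
    flags: list = field(default_factory=list)  # e.g. "off-topic", "duplicate"

    @property
    def score(self) -> int:
        # Thumbs up minus thumbs down drives how prominently a comment is shown.
        return self.up_votes - self.down_votes


def needs_review(comment: Comment, flag_threshold: int = 3) -> bool:
    """Surface a comment to human moderators once enough participants flag it."""
    return len(comment.flags) >= flag_threshold
```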
That worked until a number of individuals tried to game the system, flooding it with comments questioning the legitimacy of Barack Obama’s presidency and voting on their comments to ensure they all had positive ratings. Eventually, NAPA moderators stepped in and began removing many of the entries.
Some participants cried foul, calling it censorship. But experts agree that such charges were baseless because the administration and NAPA had defined a clear moderation policy before the dialogue began. The policy specified that moderators could remove redundant comments.
Moderators should “ensure the submission guidelines are prominent, simple and displayed on comment forms,” said Martin Reed, who runs CommunitySpark.com, which provides best practices on managing online communities.
They might even consider reviewing all submissions before posting them, he said. “This need not result in claims of censorship, as long as all comments are simply judged by the existing guidelines.”
The idea of moderating public comments predates the Internet, Kobza noted.
“To meet the legal standard of ‘public comment,’ comments must be on point, attributable, relevant and not offensive in language,” he said. “Virtually every agency, whether federal, state or local, has the ability to filter for each characteristic and usually does whether the comments are in written form, in person at public hearings, or electronically.”
It is important that all comments be preserved as part of the public record, experts agreed. But that does not require allowing the forum to be cluttered with junk.
One option is to create an online “junk drawer,” where moderators can send comments that do not comply with their guidelines. The idea is “not to make them inaccessible but to make them a click away,” said Steven Clift, executive director of E-Democracy.org, which promotes online civic engagement.
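A minimal sketch of that junk-drawer idea, assuming a simple pair of lists: non-compliant comments are set aside with a note rather than deleted, so they remain part of the record and stay a click away. The names and structure are hypothetical, not any agency's actual system.

```python
main_thread: list[dict] = []   # comments shown in the primary discussion view
junk_drawer: list[dict] = []   # non-compliant comments, still public, one click away


def set_aside(comment: dict, reason: str) -> None:
    """Move a non-compliant comment out of the main thread while preserving it."""
    if comment in main_thread:
        main_thread.remove(comment)
    comment["moderation_note"] = reason
    junk_drawer.append(comment)
```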
But government agencies also could take steps to curb abuses upfront, Clift said. For example, they could apply frequency controls, limiting the number of comments participants can submit or rate in a given time frame. That would make it more difficult for a group of individuals to overwhelm the community and require moderators to step in.
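One plausible way to implement such frequency controls is a sliding-window rate limiter like the sketch below; the five-actions-per-hour limit and the class name are assumptions for illustration, not anything Clift or the agencies specified.

```python
import time
from collections import defaultdict, deque


class FrequencyControl:
    """Sliding-window cap on how often one participant may comment or vote."""

    def __init__(self, max_actions: int = 5, window_seconds: int = 3600):
        self.max_actions = max_actions
        self.window = window_seconds
        self.history = defaultdict(deque)  # participant id -> action timestamps

    def allow(self, participant_id: str) -> bool:
        now = time.time()
        timestamps = self.history[participant_id]
        # Drop actions that have aged out of the current window.
        while timestamps and now - timestamps[0] > self.window:
            timestamps.popleft()
        if len(timestamps) >= self.max_actions:
            return False
        timestamps.append(now)
        return True


# Usage: the sixth comment within an hour from the same participant is held back.
limiter = FrequencyControl(max_actions=5, window_seconds=3600)
if not limiter.allow("participant-42"):
    print("Submission limit reached; please try again later.")
```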
This type of nuts-and-bolts moderation is essential, Clift said. “This is a government-hosted environment with taxpayer dollars on the block,” he said. “They have every right to attempt to bring up the level of decorum and improve the public input.”
Beyond etiquette
Such moderation might be essential for online public engagement, but it is not sufficient, experts say. The moderator also must be a facilitator.
The Obama administration has heightened public expectations by inviting input on policy development. But the public’s expectations will be dashed if government officials do not spell out their own hopes, experts said.
“We can all send in a thousand comments, but then what happens?” asked Nancy Tate, executive director of the League of Women Voters. Any time government officials invite public participation, whether online or in person, “they must be clear about what they want the input for and how they are going to use it,” she said.
Precision is the key. An agency might ask the public to identify best practices for public participation, Tate said. “But public participation to do what?” An open-ended question might work early in a brainstorming exercise, but the sooner the moderator focuses the question, the sooner the public can provide concrete suggestions.
William Leonard, principal of the Leonard Consulting Group, agreed, saying it is important to differentiate between ideas that are aspirational and those that are actionable.
Leonard retired in January 2008 as director of the Information Security Oversight Office at the National Archives and Records Administration. “I was a professional rules writer,” he said. “When writing rules, there is a particular form or format in order for it to be meaningful or workable.”
For a dialogue to be productive, the government needs to help the public frame ideas that can be easily translated into policy, he said.
One solution is to develop a form for the input, highlighting the type of information that might be useful to incorporate. “It would be very useful, even if it’s just a template, in order to get people thinking along those lines,” Leonard said.
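One way to read that template idea is as a structured submission form that prompts participants for the pieces a rules writer needs; the field names and prompts below are hypothetical, not an official format.

```python
# Hypothetical submission template: each field nudges participants toward
# input that can be translated into policy language.
SUBMISSION_TEMPLATE = {
    "problem": "What specific practice or rule should change?",
    "proposed_action": "What concrete step should the agency take?",
    "affected_parties": "Who would the change affect, and how?",
    "expected_benefit": "What improvement do you expect, and how would you measure it?",
}


def missing_fields(submission: dict) -> list[str]:
    """Return the template fields a participant left blank."""
    return [name for name in SUBMISSION_TEMPLATE
            if not submission.get(name, "").strip()]
```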
Evolving dialogues
Libert said he believes that as more dialogues develop, the community will adapt and learn to provide more useful information.
Online communities will always seem unruly, unorganized and unstructured, but they tend to develop organizing principles as the participants interact, he said.
The communities can build on basic tools, such as the ability to rate comments and sort for the best-rated and most-recent comments. But in the future, Libert said, “I think you will see even more complex contextual engines that surface the stuff you are looking for out of those conversations.”
And unlike physical communities, online communities can scale exponentially.
“They are scaling their relationships across borders and boundaries, across age groups and time zones, faster than any of us could imagine,” Libert said. “You might call it chaos, but I would argue it’s self-organizing.”
For an example, you need look no further than Obama’s run for the presidency. “If you think about Barack Obama, he organized 13 million people having conversations about Barack Obama,” Libert said.