Improve your visibility

Agency practices often hinder information access, but search engine optimization helps.

The government maintains copious amounts of data online, but getting to it through commercial search engines can be a hit-or-miss proposition.

The issue of data visibility came to the forefront late last year. Sen. Joseph Lieberman (I-Conn.) noted during a hearing that 2,000-plus federal Web sites fail to turn up in search engine results. In December, the Center for Democracy and Technology (CDT) and OMB Watch published a report that describes key government information as “hidden in plain sight” from commonly used search engines. The discussion about data visibility is occurring as Congress considers legislation to reauthorize the E-Government Act, a 2002 law intended to make agency data easier to locate.

The problem of data visibility is multifaceted, and the workarounds vary. Some fixes are technical, while others are a matter of vocabulary and word choice. Efforts to boost search engine standing generally fall under the heading of search engine optimization, or SEO. Agencies can achieve improved visibility through SEO without a significant investment of time or resources, according to government and industry sources.

“The most cost-effective thing agencies can do to improve services to the public is SEO,” said Sheila Campbell, team leader for USA.gov’s Web best practices team. The General Services Administration’s Office of Citizen Services and Communications operates the USA.gov Web portal.

The problem

The CDT/OMB Watch report noted that “many federal agencies operate Web sites that are simply not configured to enable access through popular search engines.” The report said dynamic databases and specialized interfaces impede the automated programs, or Web crawlers, that search engines use to index online content.

J.L. Needham, manager of public-sector content partnerships at Google, said the use of online databases is the principal barrier. At a typical Web site, the online database sits behind a search form. Users enter their queries and receive specialized Web pages drawn from the database. Google officials said their crawlers generally can’t access and index such dynamically presented pages, which lack static URLs that crawlers can follow.

Technical decisions may inadvertently keep content out of search engines, but agencies can also purposefully block Web crawlers. A robots.txt file lists locations and directories that crawlers are asked to ignore. The CDT/OMB Watch report observed that agencies may have legitimate reasons to keep some content out of search engines to prevent wide distribution. However, the organizations challenged the overuse of robots.txt files. “The widespread use of robots.txt on federal government Web sites is a questionable practice that serves to limit the availability of information, as shown in our previous examples,” the report states.
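The mechanism the report criticizes is easy to see in a small example. The following is a hypothetical robots.txt file, with directory names that are illustrative rather than drawn from any agency site, showing how a site asks crawlers to skip certain directories while leaving the rest open to indexing:

    # Hypothetical robots.txt placed at the root of an agency Web site.
    # Directory names below are examples only.
    User-agent: *
    # Ask all crawlers to skip these two directories;
    # everything else on the site remains available for indexing.
    Disallow: /intranet/
    Disallow: /drafts/

A single line reading “Disallow: /” under “User-agent: *” asks compliant crawlers to skip the entire site, which is the kind of blanket exclusion the report questions.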
The government sector faces a larger problem with content visibility and search engine optimization: culturally, government entities aren’t geared to think in those terms.

“The government is unique in that when they post content up on the Web, they aren’t trying to make money off of it,” said Erik Arnold, director of Adhere Solutions, a search engine consulting firm in Chicago. “They do not do the very basic things that…the commercial sector would do from the get-go.”

Arnold said the key question for private-sector sites is, “How are people going to see this content? Once you start with that question, even the most basic Web site will think about the right words to use, where to get links to get traffic, how to get up into Google.”

Solutions

Agencies can take many steps to get up to speed on search engine optimization. One place to start is Web Manager University, a service of GSA’s Webcontent.gov resource that provides webinars, seminars, and courses for federal, state, and local government Web managers.

“SEO is a core competency that every Web manager should know about,” Campbell said. “More and more people are seeing that there isn’t a huge mystery to [SEO]. There are a lot of straightforward best practices that they can apply.”

Campbell suggested building SEO into the content creation process. “Part of the challenge now is content is created and after the fact an agency thinks, ‘How do we make this available on commercial search engines?’ ” Campbell said.

Using the correct keywords is paramount when creating content. Web managers can review the site’s search logs to see which terms users employ, Campbell said. They can also get a feel for global search terms by using products such as Wordtracker. Governments tend to use organizational buzzwords rather than the terms users employ in searches, Campbell said. For example, a government agency may use “surplus property” rather than “real estate.” Campbell said agency employees should review search logs at least quarterly, adding that some agencies perform the task monthly or weekly. “The key thing…is to focus on the words,” she said.

Once an agency has identified its keywords, it can focus on placing them strategically. Campbell said agencies should eradicate Click Here links and instead use descriptive link titles that include keywords, as in the example below.
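In practice, that means putting the keywords in the link text itself. The snippets below are a hypothetical illustration, not taken from an actual agency page, and the URL is a placeholder:

    Vague:        <a href="/page123.html">Click here</a> for property information.

    Descriptive:  <a href="/page123.html">Surplus real estate listings</a>

The descriptive version gives search engines keyword-rich link text to associate with the destination page, while “Click here” tells them nothing about it.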
The use of the Sitemap protocol can also increase an agency’s content visibility. For example, a sitemap helps address the search limitations of online databases, Needham said. Sitemap is an open protocol supported by Google, Microsoft, Yahoo and other search engines. It allows an agency to create a list of a Web site’s URLs, making content easier for search engines to locate. The effort involved in building a sitemap depends on the scale of the Web site and the online database, Needham said. “The Sitemap piece is not terribly difficult,” Campbell said. She added that the creation of the USA.gov sitemap took about 15 minutes.
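A sitemap is an XML file listing the URLs a site wants indexed. The following is a minimal, hypothetical example in the sitemaps.org format; the example.gov addresses are placeholders, not real pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Each url entry gives crawlers a static address to fetch. -->
        <loc>http://www.example.gov/records/report-2007.html</loc>
        <lastmod>2008-01-15</lastmod>
      </url>
      <url>
        <loc>http://www.example.gov/forms/application.html</loc>
      </url>
    </urlset>

Agencies can submit the file through the search engines’ webmaster tools or point crawlers to it with a Sitemap: line in robots.txt.
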
The savvy executive

Here are some questions executives can ask to jump-start efforts that can make Web content more accessible.

About the problem


  • Do our Web sites employ online database applications that hinder content visibility?

  • Is our organization mired in agency terminology and acronyms that frustrate Web searches?

  • Are we using robots.txt files unnecessarily?

  • Are content creators aware of the need for search engine optimization?


About solutions

  • Have we considered using the Sitemap protocol to improve content visibility?

  • Do we regularly review the Web site’s search logs for keywords and monitor global search terms?

  • Do we seek reciprocal links with agencies that have related content?

  • Do we have a strategy for obtaining sponsored links?

  • Are we distributing content through channels beyond the Web site — RSS feeds, for example?


— John Moore