EPA hopes to deliver better search results
The agency plans to ask industry experts for help in making information more accessible to the public.
The Environmental Protection Agency will seek to make public information more easily accessible through the Web in 2008, according to the agency's chief information officer. EPA plans to address some basic technological problems that have made it difficult for people to find information through search engines. However, agency officials are open to other ideas for improving accessibility. They will solicit input from industry experts as part of a program named the National Dialogue to Enhance Information Access, which will begin in 2008. “We seek to further understand what information audiences need and how they would prefer that EPA provide that information,” Molly O’Neill, EPA's CIO, said in e-mail comments.

During a recent hearing on the E-Government Reauthorization Act, some lawmakers criticized agencies for not making public data easier for people to find. At the session, Sen. Joseph Lieberman (I-Conn.), one of the bill's sponsors, pointed out that more than 2,000 federal government Web sites are not included in commercial search engine results.

EPA is seeking to provide users with improved results, whether they use commercial engines or the search engine on the agency's Web site. One way to provide faster and less-cluttered results is to implement an agencywide content management system that uses meta tags, O'Neill said. A meta tag is a piece of HTML code that provides basic information about the makeup of a Web page. Search engines can use meta tags to build search indexes.

Many agency Web sites lack the basic tools employed by many commercial sites, said John Needham, Google’s manager for public-sector content partnerships, speaking to Federal Computer Week earlier this month. One of the key tools is a Sitemap, a file on a Web server that lists the URLs of its Web pages, along with other information about them (see www.sitemaps.org).
If agencies do not provide Sitemaps, Web crawlers, which create search indexes, will have trouble navigating their Web sites, Needham said. The problem, Needham said, is that agencies typically are more interested in revamping Web sites and making more information available than in making that information easy to find. O’Neill said the Sitemap is on the agency's priority list.
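The Sitemap mechanism described above is simple in practice: a site publishes an XML file enumerating its page URLs so crawlers do not have to discover them by following links. A minimal sketch of generating such a file, using Python's standard library and hypothetical page URLs chosen purely for illustration (they are not real EPA addresses), might look like this:

```python
# Sketch of building a Sitemap file per the www.sitemaps.org protocol.
# The URLs below are illustrative placeholders, not real agency pages;
# a production system would enumerate pages from its content management system.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Return a Sitemap XML string listing the given pages.

    Each page is a dict with a required "loc" (the page URL) and an
    optional "lastmod" (last-modified date, YYYY-MM-DD).
    """
    ET.register_namespace("", SITEMAP_NS)  # emit the default Sitemap namespace
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page in pages:
        url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        loc_el = ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc")
        loc_el.text = page["loc"]
        if "lastmod" in page:
            lastmod_el = ET.SubElement(url_el, f"{{{SITEMAP_NS}}}lastmod")
            lastmod_el.text = page["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages, for illustration only.
pages = [
    {"loc": "https://www.example.gov/", "lastmod": "2008-01-15"},
    {"loc": "https://www.example.gov/reports/air-quality.html"},
]
print(build_sitemap(pages))
```

The resulting file would be served at a well-known location (conventionally the site root) and, per the protocol, can also be advertised to crawlers through a `Sitemap:` line in robots.txt.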