Federal IT program failures: It's the content, stupid
Many federal IT programs fail because agencies focus on systems rather than content, writes consultant Barry Schaeffer.
Barry Schaeffer is principal consultant at Content Life-cycle Consulting and senior consultant at the Gilbane Group.
In a world increasingly dependent on automation, it's little wonder that IT spending is exploding, along with interest in what is spent and what we're getting for it. No community spends more for less in return than the federal government, a record that has sparked numerous studies, including one currently under way, of reasons and remedies for the historically abysmal success rate of federal IT projects.
If history holds serve, those efforts will focus on the acquisition process, procurement rules, contract types and budgeting vehicles but ignore what is arguably the most important factor: failure to recognize the critical role of content in many programs. In effect, the designation of many major programs as information “technology” projects can help doom them to failure.
That is counterintuitive for many IT people. How can the content be the problem when technology presents the most complex challenges and accounts for most of the wasted funds? Yet in government, complex, trusted content and its life cycle from creation to delivery are the foundation variables that, if ignored in the design phase, can compromise even the most expansive — and expensive — program.
An agency can spend millions trying to integrate disparate or even incompatible systems — often with marginal success — and end up missing the point. The goal is not to make systems connect but to enable content to be managed, accessed and shared. To design a system without thinking about the content is like building the locomotives and railcars before deciding on the track gauge.
When engineers start by focusing on the technology, they often end up making a range of architectural, hardware and software design decisions that saddle the project with excessive costs and with tools and vendors poorly suited to the real problems.
It doesn't have to be that way. A more content-centric approach allows a project to define the content architecture, interchange formats, process requirements and logical transfer pathways, and then lets participating organizations meet those requirements in their own ways, within the limits of their technology resources. A growing number of successful efforts are completed this way, with technology supporting, but not dictating, the content life cycle.
For example, the Defense Intelligence Agency, under a 2002 mandate from Vice Adm. Lowell Jacoby to “standardize at the content instead of system level,” built a Library of National Intelligence on Extensible Markup Language (XML) assessments from across the agency worldwide. Likewise, the Office of Management and Budget has significantly improved the federal budgeting process at modest cost by adopting an XML content architecture.
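To make the idea of standardizing at the content level concrete, here is a minimal sketch, assuming a hypothetical interchange format: the root element, field names and the validate_assessment helper below are illustrative, not an actual DIA or OMB schema. Each contributing office can produce an assessment with whatever tools it prefers, as long as the XML it hands off carries the agreed-upon elements, which the receiving system can verify with a simple check.

```python
# Minimal sketch of checking content-level conformance on receipt.
# The root element and field names below are illustrative assumptions,
# not an actual agency schema.
import xml.etree.ElementTree as ET

REQUIRED_ELEMENTS = ["title", "classification", "produced", "body"]

def validate_assessment(xml_text):
    """Return a list of problems; an empty list means the document
    satisfies the shared content contract, regardless of which
    system or vendor produced it."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as err:
        return [f"not well-formed XML: {err}"]
    problems = []
    if root.tag != "assessment":
        problems.append(f"unexpected root element: {root.tag}")
    for name in REQUIRED_ELEMENTS:
        if root.find(name) is None:
            problems.append(f"missing required element: {name}")
    return problems

# A document from any contributing system is accepted as long as it
# honors the shared content model.
sample = """<assessment>
  <title>Regional logistics assessment</title>
  <classification>UNCLASSIFIED</classification>
  <produced>2010-09-20</produced>
  <body>Narrative text of the assessment goes here.</body>
</assessment>"""

print(validate_assessment(sample))  # prints [] because the contract is met
```

In practice, schema languages such as XML Schema or RELAX NG would normally carry this kind of contract; the point of the sketch is simply that the agreement lives in the content itself, and any technology able to produce conforming XML can participate.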
The transition will not be easy. Technology professionals are most comfortable with what they know, and many CIOs are actually CTOs with little experience in or respect for the world of content.
Likewise, the IT industry, ruled by huge firms still mired in 1980s thinking, when technology was the major challenge and the major revenue producer, will not willingly relinquish center stage and the huge revenues that go with it. That is why the change must originate on the buying side. Bidders are unlikely to propose content-based solutions, no matter how appropriate, in competitions in which the evaluation team is thinking in terms of technology approaches and will favor them.
But change must come. Until we learn that, in an increasingly content-centric world, we cannot keep basing automation efforts exclusively on a technology-centric model, agencies will continue to invest millions of dollars in programs and have little to show for it.