Modernizing Government IT Starts with Data
When thinking about IT modernization, organizations too often think at an application level.
Critical issues with legacy technology, combined with the need to address mission opportunities at an accelerated pace, are forcing federal agencies to develop new IT strategies.
The federal government has responded with the Modernizing Government Technology Act and the President’s Management Agenda, which calls for an integrated data strategy across all departments. More recently, the General Services Administration and the Office of American Innovation’s Centers of Excellence have issued a request for proposals focused on data center optimization, and the Office of the Federal CIO released a draft of its new Federal Cloud Computing Strategy, known as the “Cloud Smart” policy.
Many initiatives have been launched to move agencies to multi-cloud environments or services-based architectures. Still others are pursuing advanced artificial intelligence strategies that require a significant “leapfrog” maneuver. But these strategies can be interrupted or halted by the most challenging aspect of any IT modernization initiative: the data layer.
A “Data-First” Approach to IT Modernization
It’s essential that agencies consider information privacy and security, but what often goes unnoticed is the data layer itself. The petabytes upon petabytes of data that agencies generate, collect and retain are typically scattered across IT silos. Inside this data are the insights that can alert agencies to failing systems, malicious hacker activity and other problems before they become a full-scale crisis, as well as the information government needs to make more accessible to its citizens.
Organizations too often approach IT modernization at the application level when the conversation needs to start at the data layer. Public sector IT teams, data owners and program managers can revitalize their legacy IT architectures without painful “rip-and-replace” efforts and, more importantly, without having to move their data.
The most effective IT modernization strategies are driven by a data management strategy. To adopt a “data-first” strategy, government IT groups will need the ability to analyze all of their information (text, logs, machine data, etc.) for full visibility, transparency and analysis. They will also need to understand the profiles of both the producers and consumers of their data, along with its downstream uses. And they will need to improve their ability to manage, search and analyze that data in real time, in effect enabling a “speed layer” that returns millisecond responses across terabytes of diverse data types.
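To make the “speed layer” idea concrete, here is a minimal sketch. It assumes an Elasticsearch cluster running locally and the elasticsearch Python client, neither of which the article prescribes; any engine that indexes documents at write time would illustrate the same pattern of ingesting heterogeneous records into one searchable store and querying them in near real time.

```python
# Minimal sketch: index heterogeneous records (logs, metrics, free text) and
# query them in near real time. Assumes a hypothetical local Elasticsearch
# cluster and the `elasticsearch` Python client (8.x).
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical local cluster

now = datetime.now(timezone.utc).isoformat()
records = [
    {"type": "log", "message": "disk latency spike on node-12", "timestamp": now},
    {"type": "metric", "host": "node-12", "cpu_pct": 97.4, "timestamp": now},
    {"type": "text", "message": "ticket: users report intermittent login failures", "timestamp": now},
]

for doc in records:
    es.index(index="agency-operational-data", document=doc)

es.indices.refresh(index="agency-operational-data")  # make new writes searchable

# One full-text query spans everything that was just ingested.
hits = es.search(
    index="agency-operational-data",
    query={"match": {"message": "latency failures"}},
)
for hit in hits["hits"]["hits"]:
    print(hit["_source"])
```

The point of the pattern is that text, logs and machine data land in a single searchable layer as they arrive, rather than waiting on batch consolidation across silos.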
Assessing a Data-First Initiative
A key advantage of a data-first approach is the ability to work with any given IT infrastructure. Typically, new IT projects require an assessment of the history and current state of IT systems. While that understanding is important, a data-first project isn’t constrained by “Who built this system? When?” Instead, these are the most valuable questions agencies can ask:
- What are your future information needs? Consider new or pending initiatives that could increase the volume of data.
- How are you going to acquire and govern the data to meet those needs?
- What is the best architecture to support these needs?
- How do you provide access to those who need it, while maintaining security and privacy controls? Defense Department Impact Levels and FedRAMP authorization levels, for example, are driven by data sensitivity requirements (see the sketch following this list).
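As one illustration of that last question, the hypothetical sketch below tags records with a sensitivity label and filters query results against the requester’s clearance. The labels and ordering are stand-ins for illustration only, not an actual DoD Impact Level or FedRAMP scheme.

```python
# Illustrative sketch only: filter records by sensitivity tags at query time.
# The labels below are hypothetical stand-ins, not a real classification scheme.
from dataclasses import dataclass

# Ordered from least to most sensitive (hypothetical scale).
SENSITIVITY_ORDER = ["public", "controlled", "high"]

@dataclass
class Record:
    title: str
    sensitivity: str  # one of SENSITIVITY_ORDER

def visible_records(records: list[Record], user_clearance: str) -> list[Record]:
    """Return only records at or below the requester's clearance level."""
    allowed = SENSITIVITY_ORDER[: SENSITIVITY_ORDER.index(user_clearance) + 1]
    return [r for r in records if r.sensitivity in allowed]

catalog = [
    Record("Published budget summary", "public"),
    Record("Procurement pipeline detail", "controlled"),
    Record("Incident response telemetry", "high"),
]

print([r.title for r in visible_records(catalog, "controlled")])
# -> ['Published budget summary', 'Procurement pipeline detail']
```

Encoding sensitivity as an attribute of the data itself, rather than of each application, is what lets access decisions follow the data wherever it is consumed.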
Agencies also need to consider implementation strategies best suited to their unique needs. Data-first initiatives should be incremental and agile, not “big bang,” and should stay consistent at the data layer. For example, agencies can first settle the data layer’s approach to metadata management and indexing, then iteratively establish data pipelines over time.
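As a rough sketch of what an early increment might look like, the example below (with made-up source systems and field names) normalizes metadata from two silos into a common schema so the records can be indexed together; later increments would add more sources and downstream pipeline stages.

```python
# Sketch of an early, incremental step in a data-first effort: normalize
# metadata from two hypothetical source systems into one common schema.
def normalize_finance(row: dict) -> dict:
    # Hypothetical finance-system field names.
    return {
        "source": "finance",
        "record_id": row["TXN_ID"],
        "created": row["POST_DATE"],
        "summary": row["DESC"],
    }

def normalize_hr(row: dict) -> dict:
    # Hypothetical HR-system field names.
    return {
        "source": "hr",
        "record_id": row["case_number"],
        "created": row["opened_on"],
        "summary": row["subject"],
    }

finance_rows = [{"TXN_ID": "F-1001", "POST_DATE": "2019-02-01", "DESC": "Server refresh purchase"}]
hr_rows = [{"case_number": "HR-77", "opened_on": "2019-02-03", "subject": "Onboarding request"}]

unified = [normalize_finance(r) for r in finance_rows] + [normalize_hr(r) for r in hr_rows]
for rec in unified:
    print(rec)  # ready to hand off to a shared index or catalog
```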
Data-First Success Stories
As part of the Digital Accountability and Transparency Act, the U.S. Treasury and Office of Management and Budget embarked on making government spending information more transparent, accessible and usable for anyone. They used agile development methods and open source technologies to link data from the budget, accounting, procurement and financial assistance databases into a single format, allowing for comparison across the entire government. Using a familiar search engine interface on USAspending.gov, citizens can quickly see exactly where their tax dollars are being spent.
The Center for Enterprise Dissemination Services and Consumer Innovation (CEDSCI) has also used a data-first strategy to drive new advancements for the 2020 U.S. Census. CEDSCI has revitalized its legacy systems, replacing a previously siloed approach with an integrated, shared-services platform that will deliver mapping, visualizations and mash-ups across all of its data sets. The most visible example will be a Google-like interface to search across more than one trillion Census estimates and combine census and geospatial data.
While government IT modernization efforts are often framed around the challenge of aging systems, data-first initiatives let agencies shift the focus to their most valuable asset: their data. By putting data at the center of modernization efforts, agencies can gain a whole new perspective that can substantially improve their operations and performance.
Dan Tucker is a vice president of Digital Solutions at Booz Allen Hamilton, and George Young is the vice president of U.S. Public Sector at Elastic.