4 Data Analytics Trends to Watch in 2021
Expect more live data dashboards, with APIs as a key enabler.
From entity resolution to supply chain logistics, spend analytics, geospatial data conflation, and more, the applications of big data analytics in the federal sphere are varied and extremely beneficial in meeting mission requirements. Government agencies have made tremendous progress to date, but what can we expect to see in 2021 as data analytics initiatives in federal agencies continue to advance?
Our clients include the Air Force, the intelligence community, Customs and Border Protection, and other federal agencies. Here are a few of the trends we’re seeing for 2021.
1. Federal agencies will see the rise of live data dashboards, moving from static reports to dynamic, real-time data.
Until now, most federal reports have been built by hand, and most briefings with executives and decision-makers have been delivered using Microsoft PowerPoint presentations packed with static data. In 2021, you’ll see more interactive dashboards powered by real-time data. The infrastructure for this shift has been coming online for years: dashboarding software, data lakes with accessible data, and data curation software that makes that data usable. This shift will allow executives to obtain more up-to-date information from operational systems and to see data through different lenses such as time periods, populations, or geographies, all in real time. But beware: agencies must help executives make timely, smart decisions by curating information up front rather than letting them get buried in raw data, which would be more of a hindrance than a help.
Imagine being charged with leading the COVID-19 pandemic response, or with distributing and administering vaccines. Decision-makers need the most up-to-date information, such as which segments of the population or geographic locations are most affected, as well as trends. They also need to be able to orchestrate the response by getting curated data out to response teams and others working in distributed groups on the issue.
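To make that concrete, here is a minimal sketch of the kind of curated, real-time view a live dashboard might sit on top of. It is hypothetical: the feed URL, the field names (report_date, state, cases) and the one-minute refresh interval are placeholders, and a real dashboard tool would render the result rather than print it.

    import time

    import pandas as pd
    import requests

    FEED_URL = "https://example.agency.gov/api/case-reports"  # hypothetical live-data endpoint

    def refresh_view() -> pd.DataFrame:
        # Pull the latest records from the operational system's API.
        resp = requests.get(FEED_URL, timeout=10)
        resp.raise_for_status()
        df = pd.DataFrame(resp.json())  # assumed fields: report_date, state, cases

        # Curate up front: roll raw rows up to the lenses decision-makers need
        # (geography and time period) instead of handing them the raw data.
        df["report_date"] = pd.to_datetime(df["report_date"])
        return (
            df.groupby(["state", pd.Grouper(key="report_date", freq="W")])["cases"]
            .sum()
            .reset_index()
        )

    if __name__ == "__main__":
        while True:
            print(refresh_view().tail(10))  # a dashboard would render this view instead
            time.sleep(60)  # re-query every minute to keep the picture current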
2. APIs will be a key enabler for public services, as well as for bridging disparate and legacy systems.
Data silos in the federal government aren’t going away any time soon, if ever. As a result, neither will the APIs that allow data to be extracted from, or written to, those disparate sources.
Several areas of the federal government have been leveraging APIs for years to enable public services. One prominent example is the U.S. Digital Service, the organization that grew out of the rescue of HealthCare.gov, the federal website meant to allow consumers to shop for private health insurance. The work of making the site more functional required extensive use of APIs to pull data from a variety of disparate systems.
By using APIs, technical decision-makers have more flexibility in what is created, whether it’s done in-house, or by IT services contractors or vendors. USDS is at the vanguard of leveraging APIs, but the trend will become more widespread, especially for agencies that need to create citizen-facing websites.
APIs will also continue to extend the functionality of legacy systems, helping organizations take better advantage of modern technologies such as cloud and microservices. This will allow agencies to stay lighter on their feet by creating collections of loosely coupled, more tailored services that still leverage legacy systems, rather than developing rigid, monolithic software programs. Finally, APIs will continue to help agencies migrate to the cloud, giving them more flexibility and letting them pay only for what they use. Regardless of the application, it is essential to ensure that the data coming from all the different systems, cloud-based or not, is curated and mastered in order to avoid the age-old “garbage in, garbage out” problem.
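As a rough illustration of that loose coupling, the sketch below puts a thin API facade in front of an existing system of record (stood in for here by a local SQLite file). The route, table, and field names are invented for this example; the point is only that newer services consume a stable JSON contract while the legacy schema stays untouched.

    import sqlite3

    from flask import Flask, jsonify

    app = Flask(__name__)
    LEGACY_DB = "legacy_claims.db"  # hypothetical stand-in for an existing system of record

    @app.route("/v1/claims/<claim_id>")
    def get_claim(claim_id):
        # Thin, loosely coupled facade: query the legacy store as-is and expose
        # a stable JSON contract that newer, cloud-based services can rely on.
        conn = sqlite3.connect(LEGACY_DB)
        conn.row_factory = sqlite3.Row
        row = conn.execute(
            "SELECT claim_id, status, updated_at FROM claims WHERE claim_id = ?",
            (claim_id,),
        ).fetchone()
        conn.close()
        if row is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(dict(row))

    if __name__ == "__main__":
        app.run(port=8080)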
3. Cloud services will continue to proliferate throughout the federal government, thereby increasing the expectations for data retention and modeling.
As federal agencies increasingly move to the cloud, they will be more reliant than ever on cloud providers to help them with data retention and modeling. Cloud providers will have to step up to ensure that federal agencies can stay compliant with policies and meet legal requirements for data retention. There are also increased risks, because cloud providers in many instances can now be considered information custodians. Cloud providers now offer advanced capabilities that make safeguarding data inherent to the cloud infrastructure itself, including backup and disaster recovery. That will become even more important in 2021.
Cloud providers will also have to treat data modeling capabilities as table stakes on their menu of service offerings. Again, a major consideration is ensuring that data, regardless of where it resides, is well-curated and mastered so that data modeling and analytics efforts pay off.
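As one concrete example of retention built into the infrastructure itself, the sketch below applies an object-lifecycle rule with AWS’s boto3 SDK. The bucket name, prefix, and seven-year schedule are placeholders; an agency’s actual rules would have to mirror its records-retention policy and its provider’s equivalent controls.

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "agency-analytics-archive"  # hypothetical bucket name

    # Placeholder schedule: move curated records to archival storage after a year
    # and expire them after roughly seven years.
    s3.put_bucket_lifecycle_configuration(
        Bucket=BUCKET,
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "seven-year-retention",
                    "Filter": {"Prefix": "curated/"},
                    "Status": "Enabled",
                    "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
                    "Expiration": {"Days": 2555},
                }
            ]
        },
    )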
4. There will be increased interest in and use of analytics among diverse constituents.
From data scientists and CDOs to data analysts, agency team members will have a common focus on data analytics. Although the term is overused, the “democratization” of data will continue in 2021. More people will want to analyze live data directly, rather than going to a centralized, managed service for answers. Business users will be more empowered to create and deliver analytical solutions than they were in the past. As a result, data analytics will continue to expand and require a greater percentage of technical funding and resources.
A final word
In recent years, advances in technology and the growing volume of information have dramatically transformed how business is conducted in the federal government. With the increased adoption of live data dashboards, APIs, cloud and microservices, as well as an expansion of data analytics across broader teams, federal agencies will be able to accomplish their missions more successfully, but only if they ensure that their data is well-curated to begin with.
Michael Gormley is the head of public sector for Tamr.