Why the CDM program needs an overhaul
By focusing on legacy attack vectors, DHS’ cybersecurity program fails to address rapidly evolving new threats.
The Department of Homeland Security has described the Continuous Diagnostics and Mitigation (CDM) program as “a dynamic approach to fortifying the cybersecurity of government networks and systems.” That fortification is meant to be achieved through three distinct phases.
Phase 1 covers the asset management process for categorizing hardware and software assets across the IT landscape at federal agencies. Phase 2 is designed to enhance the management processes for monitoring and auditing users’ activities based on credentials, privileges and expected behavior. Phase 3 defines requirements for management aspects of the security life cycle, including planning for events, documentation requirements and quality management.
However, despite being well-intentioned, the CDM program has been plagued by slow delivery cycles, inadequately defined requirements and an overall ineffective approach to cross-agency security improvements. Notably, only one line item in the program is intended to thwart threats: Phase 3’s “Boundary Protection (Network, Physical, Virtual),” which encompasses IT capabilities that federal agencies already have in place at their network boundaries.
Given ongoing breaches and hacks, it’s clear that deficiencies still exist at the federal level, and CDM is missing the point by failing to adequately define phases that help government entities identify and mitigate threats to improve their overall cybersecurity posture.
By attempting to address the issue via an archaic set of processes, CDM is fixing the wrong problem. The program looks backward at legacy attack vectors and does not address emerging threats. A primary limitation of CDM’s efficacy is that the program does not focus on application programming interfaces (APIs), which have become the epicenter of industry innovation and the primary driver of mobility, cloud, the internet of things and enhanced protection of data and services.
The emergence of API security gateway technology has profoundly changed IT security’s ability to mitigate modern threats. The technology, much like the web application firewall evolution of nearly a decade ago, has become an essential architectural capability for preventing modern attacks. It embodies the fundamental tenets of the CDM initiative by combining identification and enforcement as architecture capabilities rather than relying on disparate processes and event monitors.
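To make that distinction concrete, the sketch below is a hypothetical illustration in Python (not a depiction of any particular gateway product or of CDM tooling) of the core pattern: a single enforcement point that first identifies the caller by verifying a signed token and then enforces a scope-based policy before any request reaches a backend API. The token format, scope names and shared secret are invented for the example.

```python
# Minimal sketch of the identify-then-enforce pattern an API security gateway
# applies at a single choke point, rather than relying on separate monitoring
# processes to catch misuse after the fact. Illustrative only.

import base64
import hashlib
import hmac
import json


def verify_token(token, secret):
    """Identification: validate the HMAC signature on a compact token of the
    form base64(payload).signature and return the caller's claims, or None."""
    try:
        payload_b64, signature = token.rsplit(".", 1)
        expected = hmac.new(secret, payload_b64.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, signature):
            return None
        return json.loads(base64.urlsafe_b64decode(payload_b64))
    except ValueError:
        return None


def enforce(claims, required_scope):
    """Enforcement: the identified caller must hold the scope for this API."""
    return claims is not None and required_scope in claims.get("scopes", [])


if __name__ == "__main__":
    SECRET = b"demo-shared-secret"  # illustrative; a real gateway uses managed keys

    # Build a sample token for a hypothetical agency application.
    payload = base64.urlsafe_b64encode(
        json.dumps({"sub": "agency-app-42", "scopes": ["records:read"]}).encode()
    ).decode()
    token = payload + "." + hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()

    claims = verify_token(token, SECRET)
    print("caller:", claims.get("sub") if claims else None)          # agency-app-42
    print("read allowed:", enforce(claims, "records:read"))          # True
    print("delete allowed:", enforce(claims, "records:delete"))      # False
```

The point of the example is architectural: identification and policy enforcement happen together, in line with every request, which is the capability the CDM framework leaves unaddressed.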
In other words, although the CDM initiative aims to streamline processes and simplify risk management, the framework fails to address a technology that already prevents many of today’s threats.
The other significant hindrance to CDM success is that it only provides access to technologies and companies that are included in the blanket purchase agreement. That approach restricts the ability of viable vendors to offer their services by limiting participation to selective BPA partnerships. It leads to discrete, non-repeatable and hand-coded solutions to problems rather than best-of-breed industry products that have been proven in the commercial sector and are substantially more cost-efficient to deploy and maintain.
Further, the CDM program creates silos of agency groupings that only the limited set of BPA integrators is able to bid on, thus stifling the ability to realize economies of scale and establish consistent solutions that can be applied across the entire federal government.
DHS touts CDM as achieving a “consistent application of best practices.” Applying best practices is indeed a laudable approach, but if you are practicing for the wrong game, what’s the point? Best practices need a strategy, and CDM needs to be augmented to focus on existing industry technologies such as API security gateways and others that are already solving cybersecurity problems but will likely take years for the CDM program to discover.