Focusing on security: What matters?

Federal Computer Week held a roundtable discussion with seven experts on information security.

What is the state of federal information security? That was the overarching question posed by Federal Computer Week last month during a roundtable discussion with seven experts from the public and private sectors.

FCW asked the experts about the federal government's cybersecurity priorities. Overall, they said that current information security policies put disproportionate emphasis on system security and not enough on network security. Furthermore, they said money for information security is the first to get squeezed when budgets are tight. One person suggested that federal officials set security standards for the software industry to follow. And nearly all agreed that officials at federal agencies need to use more automated methods and fewer manual means for managing information security.

Around the table were Kenneth Ammon, president and co-founder of NetSec Inc.; Bruce Brody, associate chief information officer for cybersecurity at the Energy Department; Bob Dix, staff director of the House Government Reform Committee's Technology, Information Policy, Intergovernmental Relations and the Census Subcommittee; Dennis McCallam, technical fellow for Defense enterprise solutions at Northrop Grumman Information Technology; Edward Schwartz, senior architect at netForensics Inc.; David Thomason, director of security engineering at Sourcefire Inc.; and Amit Yoran, who at the time was director of the Homeland Security Department's National Cyber Security Division.

FCW's technology editor Rutrell Yasin and assistant editor Florence Olsen asked the questions. To read an uncondensed transcript of the discussion, go to FCW.com Download's Data Call at www.fcw.com/download.

From where you view information security, do you think the right priorities are being emphasized?

BRODY: It's kind of a complicated question from a government perspective because, in many cases, our priorities are ever changing. In addition, those priorities are married to a budget, and on top of that, they're driven by congressional legislation and the oversight community that we have to respond to.

So the question of whether or not we're focused on the right priorities changes almost from department to department and agency to agency. Generally, we're focused on fixing systems. However, there are certain infrastructure-wide improvements that could occur if [CIOs] had the authority to effect infrastructure-level changes.

Those departments that are able to do that have found they have strengthened their security. But many departments don't have the authority to focus on the infrastructure — at least the CIO doesn't.

DIX: The evidence would suggest that [the priorities] haven't been [correct]. Notwithstanding legislation, notwithstanding congressional oversight, the federal government, in far too many instances, still views the whole discussion of information security as a technology issue.

It is our continuing view, as it is in the private sector, that it is a management, governance and business process issue, and it needs to be incorporated in all decision-making about IT investing.

We're trying to make the protection of networks and information assets a priority for agencies at the highest level.

SCHWARTZ: My experience in the government sector is that there's way too much emphasis on certain aspects of compliance with the Federal Information Security Management Act. The rush to perform certification and accreditation on all systems has been an enormous money drain for agencies and really hasn't addressed the core issue of real-time security.

The certification and accreditation process is only as good as the day it was finished. I've been arguing for the agencies I work with to develop a continuous monitoring architecture. If you're going to say you're FISMA-compliant or compliant with any government regulation, once you're there you want to know whether you are getting better at it and whether you are performing more efficiently from year to year.
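As a loose illustration of the continuous monitoring Schwartz argues for (the specific checks, file name and schedule below are hypothetical, not drawn from his remarks), the idea is to rerun security checks on a schedule and keep a history so year-over-year progress can actually be measured:

```python
"""Rough sketch of continuous monitoring: rerun checks and keep a history.

The individual checks and the output file are assumptions made for this
illustration; the point is that compliance is measured repeatedly rather
than once, at accreditation time.
"""
import csv
import datetime

def run_checks() -> dict[str, bool]:
    """Placeholder checks; a real deployment would query scanners, patch data, etc."""
    return {
        "critical_patches_applied": True,
        "password_policy_enforced": True,
        "unused_services_disabled": False,
    }

def record(results: dict[str, bool], path: str = "monitoring_history.csv") -> None:
    """Append a timestamped row so results can be compared from year to year."""
    row = [datetime.datetime.now().isoformat()]
    row += [f"{name}={ok}" for name, ok in results.items()]
    with open(path, "a", newline="") as fh:
        csv.writer(fh).writerow(row)

if __name__ == "__main__":
    # In practice this script would run on a schedule (for example, from cron),
    # so the history file accumulates one snapshot per run.
    record(run_checks())
    print("Recorded one monitoring snapshot.")
```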

THOMASON: It seems there's been an awful lot of effort spent on perimeter security, a lot on firewalls and virtual private networks — technologies to protect a perimeter that doesn't exist anymore.

Because technology has dissolved that perimeter so significantly, we have to focus on the entire infrastructure. Getting into the infrastructure is important, not only looking at security system by system but at all of the infrastructure components that pull those systems together.

Is there an internal vs. an external threat, and where are the borders? Those don't exist anymore.

YORAN: Through legislation, through paperwork processes and through the risk management practices of agencies and departments, we've done a reasonably competent job of identifying what work needs to get done.

The gap here is in taking some of the good technologies that are out there and driving them into action. My esteemed colleague from netForensics said that it's gotten to the point where there's as much emphasis on the paperwork drills as there is on actually fixing problems and improving our security practices, or more.

We need to flip that. We need to look at technology as offering great economy of scale here. Unless we have technical enforcement of the policies we've defined, and technical enforcement of the objectives that we're trying to achieve, we're never going to achieve meaningful progress.

I'm not suggesting that the certification and accreditation process is a bad one or that asset inventory doesn't need to get done. But the idea of going through a paperwork exercise to identify assets is fundamentally flawed.

DIX: I agree with that. It should be an automated process.

But a lot of the focus now is on getting to green [on the Office of Management and Budget's score card]. Whatever the risk, it's acceptable because it's a way to check the box saying, "We've completed certification and accreditation and can move on to what's next." But if all we're doing is checking the box so we can get to green in somebody's process, then that fails to address the fundamental issue of what we're trying to achieve.

BRODY: At the risk of agreeing with Bob Dix — which is a rather common practice, I might add — let me just say that when responsibility for the certification and accreditation process, the approval process or the acceptance-of-risk process is decentralized and given to individual system owners, there is a tendency — under time pressure, under budget pressure — to accept a whole lot of risk that's not well understood.

There is a tendency to complete a paperwork drill and say, "OK, the system has been certified and accredited," and then not to manage residual risk and not to redo the process every time significant changes to the system occur.

Unfortunately, centralizing that to a competent and qualified individual who may be the senior agency information security officer or the CIO is very difficult to do. The FISMA legislation didn't give enough authority to that central person. And so we end up with this decentralized paperwork drill that is a weakness in the process. Maybe we could look at ways of strengthening the authority of the CIO.

Is there a primary culprit to blame for software vulnerabilities such as buffer overflows? Is it mainly the fault of colleges and universities for not adequately training their students in good software engineering practices, or are software companies primarily to blame for doing a poor job of managing their programmers?

YORAN: It's a faulty belief that computers are deterministic, finite state machines. They're not. General-purpose computing has become so complex and the development of software is so complex and the interactions [among] protocols and software and hardware [are] so complex that you can't point to a single culprit and say, "This is your fault."

This is a problem for which education will help, improved software development practices will help, better software engineering disciplines will help. But fundamentally, it can only be solved through the use of technology.

By that, I mean technology that prevents programmers from making well-known and well-understood mistakes. You get people developing code who are English majors or have studied journalism or physics and haven't been through the software development disciplines.

We need to find the technologies that can help enforce better coding practices. Driving down that road will return results of much more dramatic significance than isolated investments in training, education and awareness, or legal mandates and liabilities.
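One hedged example of the kind of technical enforcement Yoran describes is an automated check that flags calls long associated with buffer overflows. The sketch below is illustrative only; the function list and file handling are assumptions, and real static analysis tools are far more thorough:

```python
"""Minimal sketch of an automated coding-policy check (illustrative only).

Flags C source lines that call functions commonly linked to buffer overflows.
The function list and file handling here are assumptions, not a description
of any product mentioned in the discussion.
"""
import re
import sys
from pathlib import Path

# Calls widely documented as unsafe because they perform no bounds checking.
RISKY_CALLS = ("gets", "strcpy", "strcat", "sprintf")
PATTERN = re.compile(r"\b(" + "|".join(RISKY_CALLS) + r")\s*\(")

def scan_file(path: Path) -> list[str]:
    """Return a warning for each line that uses a risky call."""
    warnings = []
    for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
        if PATTERN.search(line):
            warnings.append(f"{path}:{lineno}: possible unbounded copy: {line.strip()}")
    return warnings

if __name__ == "__main__":
    findings = []
    for name in sys.argv[1:]:
        findings.extend(scan_file(Path(name)))
    print("\n".join(findings) or "No risky calls found.")
    # A nonzero exit code lets a build or check-in process fail automatically.
    sys.exit(1 if findings else 0)
```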

Do you think the federal government is doing enough in this area to solve the problem of coding or software vulnerabilities?

McCALLAM: Twenty years ago the Defense Department did a very smart thing. They said, "Thou shalt have Ada." Ada was not just a language; it was a process. You were drilled in that process, and you would be put through the wringer: "Did you have the code review? Have you got every test case set?"

That process never migrated outside DOD, although people who left took that discipline with them and tried to instill it elsewhere. I would love to see colleges and universities offer some kind of disciplined software engineering course, so students can see what it's really like to put something together in which all the test cases are considered.

Why does a piece of software fail? Well, it's because you didn't check for divide by zero. That's a simple check, right? So I think it's an issue of software management. We've seen way too many software programs fail because of poor software management. But it may not be simply a management issue. It's also a requirements-creep issue. It's a tough thing.
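To make the divide-by-zero point concrete, here is a minimal, purely illustrative sketch of the simple guard and the test case that a disciplined review process would insist on:

```python
"""Illustrative divide-by-zero guard and the test case that exercises it."""

def safe_divide(numerator: float, denominator: float) -> float:
    """Divide, refusing the one input that is known to fail."""
    if denominator == 0:
        raise ValueError("denominator must not be zero")
    return numerator / denominator

def test_safe_divide() -> None:
    """The test case a code review would ask for: the zero path is covered."""
    assert safe_divide(10, 4) == 2.5
    try:
        safe_divide(1, 0)
    except ValueError:
        pass  # expected: the guard caught the bad input
    else:
        raise AssertionError("divide by zero was not caught")

if __name__ == "__main__":
    test_safe_divide()
    print("All test cases passed.")
```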

I think the government has tried to do some good stuff with it, and I'd like to see some of that continue. Some kind of discipline, a set of standards that are uniform — that would really be helpful.

For commercial companies to adhere to?

McCALLAM: Yes. Why not?

How are federal agencies paying for information security? Are they stealing money from other people's pet projects?

BRODY: Well, in an ideal world, every system would have a certain percentage of funds dedicated to information security. That's a discrete number that's rolled up every year into the documentation that we provide to the Office of Management and Budget.

The reality is system owners tend to squeeze security more than anything else. Unless there's a very strong security champion somewhere high in the department who can direct traffic and provide refereeing, you'll find that security usually ends up as a lower priority.

YORAN: I'll probably be beat up by my CIO compatriots later, but I don't believe there is a funding shortfall. I think enough funding exists to accomplish far better security than we've achieved.

What's missing is the cultural importance of security within some of the departments and agencies. And that will continue to be the case, and we will continue to get the paperwork drills and the lack of attention, focus and emphasis on security until someone is held accountable.

Flaws in security are tolerated and business goes on. I would love to see department executives recruiting the best possible security champion they can to help them understand this complex issue. And the ones [who] don't do that and put taxpayer systems at risk, at some point they need to be held accountable for doing that.

I would imagine that most organizations take a multilayered approach to security: having firewalls, some type of authentication and access control, antivirus at the desktop and gateway, intrusion prevention. What's out there that officials might not be implementing in their agencies or organizations that they should be implementing?

BRODY: Vulnerability management covers all of that. An effective, centralized vulnerability management program [that] includes patch management and distribution, vulnerability [scanning] and intrusion prevention, and settings and configuration management would probably be the most important seat-by-seat approach.

Then, you still have the whole problem of the entire infrastructure that you have to deal with. So you're talking about a defense-in-depth approach that absolutely needs to occur.

AMMON: There's an area [in which] I haven't seen a lot of technology applied. Say you have encryption technology, you've made sure your operating system is patched and secure, and you have intrusion prevention. Then somebody connects to your interactive Web-based form, and because your credentials are not managed strongly, an intruder writes himself a million-dollar check.

You haven't found a bug in the operating system. What you've found is a systemic issue; it's more of a solution vulnerability than a vulnerability in any piece of technology. I think with the push to Web-enable many of the capabilities of government, there's less urgency around analyzing that problem and dealing with the risk that we've found for commercial organizations.

We have companies that we support that won't let one Web-based application go live without a test against the application end-to-end rather than just an assurance that the components alone are secure.
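A rough sketch of what such an end-to-end test might look like follows. The URL, endpoints and accounts are hypothetical; the point is that the deployed application is exercised as a whole, with the test confirming that one user's credentials cannot be used to act on another user's account:

```python
"""Sketch of an end-to-end authorization test against a deployed Web application.

Everything here (base URL, endpoint paths, test accounts) is hypothetical; the
point is that the whole solution is exercised, not its components in isolation.
"""
import requests  # assumes the third-party 'requests' package is installed

BASE_URL = "https://payments.example.gov"  # hypothetical test deployment

def login(username: str, password: str) -> requests.Session:
    """Authenticate and return a session carrying the user's credentials."""
    session = requests.Session()
    resp = session.post(f"{BASE_URL}/login", data={"user": username, "password": password})
    resp.raise_for_status()
    return session

def test_user_cannot_write_check_from_another_account() -> None:
    """A logged-in user must not be able to issue a check against someone else's account."""
    alice = login("alice", "alice-test-password")
    resp = alice.post(
        f"{BASE_URL}/accounts/bob/checks",  # Bob's account, not Alice's
        data={"amount": "1000000", "payee": "alice"},
    )
    # The application as a whole, not just its components, must reject this.
    assert resp.status_code in (401, 403), (
        f"expected the request to be denied, got HTTP {resp.status_code}"
    )

if __name__ == "__main__":
    test_user_cannot_write_check_from_another_account()
    print("End-to-end authorization check passed.")
```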

SCHWARTZ: If we're going to try to manage security in an operational environment, we have to get all the data, such as how the solution is behaving at the application layer. There are technologies now, such as network behavior anomaly-detection technologies, that give you incredible visibility into what's going on on your network.

All of that data needs to be gathered in some way in real time based on the criticality of the asset, based on certain prioritizations. And then that data needs to be correlated and displayed to someone who can take action on something that comes up as a blip, as an alert of some sort.

Also, you need to identify the assets, and there needs to be an automated approach to that.

One approach we're seeing is that a lot of companies are coming out with technologies where you can't even get an IP address anymore if your system doesn't meet certain security requirements.

And maybe those requirements mean that they report that they're an asset on the network; they report their configuration; they have certain things like network-based anomaly-detection systems and so on.

Once those things are established and they're at the right levels, they're able to get network resources. But then, they're also able to be monitored.
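As a loose sketch of the correlation and prioritization Schwartz describes (the event fields and criticality scale are assumptions made for this example), each alert can be weighted by the criticality of the asset it concerns so an operator sees the most important blips first:

```python
"""Illustrative sketch: weight security events by the criticality of the asset.

The event fields and the 1-to-5 criticality scale are assumptions made for
this example, not a description of any product mentioned in the discussion.
"""
from dataclasses import dataclass

# Hypothetical asset inventory: host name -> criticality (5 = most critical).
ASSET_CRITICALITY = {"payroll-db": 5, "public-web": 3, "test-lab-pc": 1}

@dataclass
class Event:
    host: str
    severity: int  # 1 (low) to 5 (high), as reported by the sensor
    description: str

def prioritize(events: list[Event]) -> list[tuple[int, Event]]:
    """Rank events by sensor severity weighted by how critical the affected asset is."""
    scored = [(e.severity * ASSET_CRITICALITY.get(e.host, 1), e) for e in events]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

if __name__ == "__main__":
    feed = [
        Event("test-lab-pc", 5, "new service started"),
        Event("payroll-db", 3, "repeated failed logins"),
        Event("public-web", 2, "anomalous outbound traffic"),
    ]
    for score, event in prioritize(feed):
        print(f"priority {score:2d}: {event.host}: {event.description}")
```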

THOMASON: When you don't know what you have in your inventory and can't control and watch the changes in those kinds of things, then you have a real problem. If you've got a systemic problem that allows someone to start a new service on a box, and you have no way to detect that that new service has been started, how can you protect it? How can you identify the vulnerabilities associated with it and then take some mitigating action?
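A minimal sketch of the kind of change detection Thomason describes follows; it compares the ports currently listening on a host against a recorded baseline. The baseline file name is an assumption, and a production tool would watch far more than open ports:

```python
"""Sketch: flag listening services that were not present in a recorded baseline.

Uses the third-party 'psutil' package; the baseline file name is an assumption.
Listing connections may require elevated privileges on some systems.
"""
import json
from pathlib import Path

import psutil

BASELINE = Path("listening_ports_baseline.json")

def listening_ports() -> set[int]:
    """Return the local TCP ports currently in the LISTEN state."""
    return {
        conn.laddr.port
        for conn in psutil.net_connections(kind="tcp")
        if conn.status == psutil.CONN_LISTEN
    }

if __name__ == "__main__":
    current = listening_ports()
    if BASELINE.exists():
        known = set(json.loads(BASELINE.read_text()))
        for port in sorted(current - known):
            print(f"ALERT: port {port} is listening but is not in the baseline")
    else:
        BASELINE.write_text(json.dumps(sorted(current)))
        print(f"Baseline recorded: {len(current)} listening ports")
```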

To get this visibility and collect the assets so you know what's on your network, are we talking about using the system management tools that are out there for IT asset management? Or are we talking about a new type of technology to give you this kind of visibility?

AMMON: I think it's a collection of technologies. I think the issue is that you have to correlate those various data points into one system so that you can normalize it.

One of the things that I've found valuable in enterprise architecture is this idea that you had to have a naming convention. And I think once you have a naming convention, then you can take disparate data, normalize it and put it into a management tool.

So I think you need various technologies. You need to find a standard language and then correlate the data into one repository.
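As a rough illustration of Ammon's point about naming conventions and normalization (the source formats, field names and convention below are assumptions), records from disparate tools can be mapped onto one canonical form before they are loaded into a shared repository:

```python
"""Illustrative sketch: normalize asset records from disparate tools.

The source formats, field names and naming convention are assumptions made
for this example.
"""

# Hypothetical naming convention for the shared repository.
CANONICAL_FIELDS = ("asset_id", "hostname", "ip_address", "owner")

def from_scanner(record: dict) -> dict:
    """Map a hypothetical vulnerability scanner's fields onto the convention."""
    return {
        "asset_id": f"scan-{record['id']}",
        "hostname": record["host"].lower(),
        "ip_address": record["addr"],
        "owner": record.get("contact", "unknown"),
    }

def from_inventory(record: dict) -> dict:
    """Map a hypothetical IT asset-management tool's fields onto the convention."""
    return {
        "asset_id": f"inv-{record['tag']}",
        "hostname": record["name"].lower(),
        "ip_address": record["ip"],
        "owner": record.get("custodian", "unknown"),
    }

if __name__ == "__main__":
    repository = [
        from_scanner({"id": 101, "host": "PAYROLL-DB", "addr": "10.1.2.3"}),
        from_inventory({"tag": "A-42", "name": "payroll-db", "ip": "10.1.2.3",
                        "custodian": "finance"}),
    ]
    # Once every source speaks the same convention, correlation is a simple join.
    for asset in repository:
        print({field: asset[field] for field in CANONICAL_FIELDS})
```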

McCALLAM: You could even extend that a little bit and say, "What I really want is to move from defense-in-depth to resiliency-in-depth." I mean, I not only want to defend, I want to be able to still operate.

YORAN: It's important to get the terminology down, and that can help evolve how we think about our defensive practices. But increasingly, as David pointed out earlier, the [network] perimeters are gone. If your organization believes you have a solid perimeter, I'll all but guarantee that you're mistaken.

And the thing that I will absolutely guarantee is that within five years you won't be able to whiteboard or, even with the largest network mapping application you have, draw what your perimeter looks like.

And not only won't you be able to draw what your perimeter looks like, you won't be able to identify where your data resides and where it travels to. All of these technologies, all of these data sources, all of these Web-enabled service infrastructures are becoming completely meshed and intertwined.

And the technology we use — and the way we think about our network defense and our information security — needs to continually be revisited and reinvigorated through investing in new types of applications and services to protect ourselves.

Managed security services are one fundamental advantage [that] we can [maximize]. Enterprise security information collection and analysis tools are very important tools [that] we can use. Intrusion-prevention systems, both host-based and network-based, are an important part of the puzzle. Asset inventory, configuration management and patching technologies are another part of that.

And [officials at] organizations have to stop thinking, "I can ignore these systems because they're not a critical part of my infrastructure."

And they need to stop whining about producing so much paperwork and start making investments in technology, start making investments in operations to improve their practices.

We've developed a cottage industry here in the Washington, [D.C.,] area that doesn't exist in the real world. If you look at the practices of the largest corporations of the world, their networks are every bit as complex as the government's.

They have every possible system; they have every antiquated system; they have different business units performing different missions in different countries of the world and under different regulations very much the way the U.S. government IT infrastructure looks.

But they don't let that get in the way of protecting their systems and conducting business. And we need to stop focusing on this cottage industry of paperwork production and start focusing on investing in technologies and taking action to protect ourselves. We need to continually evolve those actions and the technologies we use.

SCHWARTZ: I think you made a great point. If I worked in a senator's office and somebody from an agency came to me and said, "Well, we've got a couple of choices here. We could spend $50 million on paperwork in the next two years to comply with FISMA, or we could spend $5 million on that and $45 million on technology investment," I would obviously push them toward the $45 million.

But I've seen in some agencies that they're perfectly content doing the $50 million of paperwork and then have nothing left to do the technology evolution, if you will, that they need to do. I think we need to take a harder look at that within the government.

DIX: With all due respect, I think that's a gross misrepresentation of the current situation. First of all, a law [states], "Thou shalt do certain things." It is not discretionary, or should not be, and agencies are evaluated on whether or not they comply with the requirements of the law.

The mechanism by which they achieve that, I think that's a valid observation. And I think automating those processes is absolutely the right direction to go.

But it needs to be all part of the IT investment decision-making and the strategic planning process on the front end, and we have to quit making excuses and kidding ourselves about these artificial attempts to comply with the requirements of the law.

We need to start actually doing something. We need to actually start investing in security and changing the culture from one simply focused on functionality and features and performance and reliability, which are all important. But absent a focus on security, none of that may matter.

There are some who would hypothesize right now that we have an awful lot of systems in the federal government sitting in an infected state as they operate today because they've been left unpatched or vulnerable or not attended to. And there's not a lot of focus being put toward addressing that issue right now either, which is of great concern to us.

So I don't mean to be argumentative, but I think there are certain things that the law requires and there's an expectation by Congress and the legislative branch that people comply with. And if they're not going to comply, they're going to be held accountable, and that may mean loss of funds. That may mean any number of things.
