An explainer on containers

If containers continue to advance into federal enterprise IT, they could spell the end of monolithic legacy applications.

Containers enable developers and systems administrators to build, distribute and run self-contained applications. Experts say their adoption could usher in the age of microservices, a software architecture approach in which agencies develop small, lightweight applications that operate in system-level isolation.

“I think containers are part of a general trend in the government toward more modern application development and architecture,” said Mark Ryland, chief solutions architect at Amazon Web Services’ Worldwide Public Sector.

David Messina, vice president of enterprise marketing at Docker, said the developer workflows that containers enable could become the foundation for a major application transformation — from giant monolithic applications to more distributed models.

The fundamentals

“Given that it’s a developer-led movement, enabling your development team to use Docker in their developer pipeline has instant productivity benefits,” said Messina, whose company is a leading open-source container developer. “What Docker enables is the ability to make decisions at any point to migrate your applications from one infrastructure to the next,” which is especially useful as agencies move to the cloud.

“Your application can be developed within the four walls of your agency, and then you can have the flexibility of running your Dockerized applications in any cloud and also have a model where it’s hybrid across your own private cloud, in a public cloud or across public clouds,” he added. That portability brings freedom of choice for infrastructure and operations.

“At a certain level, you can think of containers as a way to package the applications and then to deploy them in a predictable and consistent and fast way,” said Kurt Milne, vice president of product marketing at CliQr Technologies, an application-defined cloud management vendor. Containers eliminate “some of the dependencies [and] configuration issues that can often sideline or slow down a new project.”
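To make that packaging concrete, a container image is typically defined in a short text file that declares the application and everything it depends on. Below is a minimal, hypothetical Dockerfile for a small Python web service; the file names, base image and port are illustrative, not drawn from any agency system.

    # Illustrative Dockerfile: the application and its dependencies
    # are declared once and travel with the image.
    FROM python:2.7                            # base image with the language runtime
    COPY requirements.txt /app/
    RUN pip install -r /app/requirements.txt   # bake dependencies into the image
    COPY app.py /app/
    EXPOSE 8080                                # port the service listens on
    CMD ["python", "/app/app.py"]              # command run when the container starts

Running "docker build -t agency/app ." and then "docker run -p 8080:8080 agency/app" produces the same result on a developer laptop, an on-premises server or a cloud host, which is the predictability Milne describes.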

The hurdles

Jared Rosoff, senior director of product management and architecture at VMware, said that right now container companies offer “a code base that is very new and changing very rapidly. [The technology] doesn’t lend itself to an environment where you want to push some code and let it sit for a year. You [must] plan on upgrading the base in a container infrastructure pretty frequently until the technology is stabilized. That’s going to be a problem for everybody.”

He also cautioned that the industry is only at the beginning of the maturity curve for containers. An ecosystem of tools for managing fault diagnostics, configuration, performance and security does not yet exist, which means federal agencies might have to cobble together their own container management systems from various parts.

Christian Heiter, chief technology officer at Hitachi Data Systems Federal, stressed the key role an agency’s infrastructure will play. “There are a number of management tools because you will be launching all of these virtualized applications, [so] you need to have a good infrastructure...to be able to keep your costs under control,” he said. “Otherwise, you’re building tools, [but] you’re not really monitoring what’s going on in all these virtualized environments, and the costs can rise.”

There is also the struggle for control between developers and operations teams. “Because you are going to be using the base operating system for the application, you have to make sure that base operating system is set up correctly for the application or range of applications so that they do not imperil any data or compromise any data or cause problems due to incompatibilities,” Heiter said.

Experts also raised concerns about who would maintain governance over containers. Rosoff summed it up this way: “I can’t just willy-nilly give complete control of that environment back to an application team unless I’m sure that I’ve got the right levels of controls and governance to ensure that same level of data protection, for example. I’m thinking about data protection, but you could pick almost anything.”

“There are multiple approaches to management and multiple approaches to security [and to] the implications of the launch of the containers themselves,” Heiter said. Therefore, agencies should take a close look at their requirements and then find the best fit for current and future needs.

“A lot of applications have key dependencies that can’t be containerized,” Milne said. “Your containers have to coexist with your existing generation of servers and the reality of multiple OS versions.... These containers are going to end up in a mixed” environment.

In other words, “containers need to run on an OS,” he said. “Basically they’re sharing part of the kernel, so you can’t create a container in one operating system, then run it on another.”
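A quick way to see that kernel sharing in action is to compare kernel versions inside and outside a container. The terminal session below is a hypothetical sketch and the version string is illustrative; the point is that both commands return the same answer because the container boots no kernel of its own.

    $ uname -r                          # kernel version on the Linux host
    3.13.0-24-generic
    $ docker run --rm ubuntu uname -r   # the same query inside a container
    3.13.0-24-generic

That is also why a Linux container image cannot run directly on a Windows kernel, and vice versa.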

He sees similarities to the early days of private clouds, with containers running on a bare OS until agencies figure out how to integrate them into their existing processes.

Ryland had a slightly different take. “Moving toward a DevOps model itself [is] the bigger challenge — getting your head around how to meet compliance and other kinds of requirements in a world in which developers are constantly building and deploying new code on a daily basis,” he said. “I think historically people have thought that that super agile model is somehow not compatible with the more controlled style of government application development and deployment.”

The future of containers at federal agencies

Ryland said customers are already recognizing the benefits of containers. “Lately we’ve seen more cutting-edge kinds of customers who say, ‘Look, I can actually do better in meeting the government requirements because it’s completely automated. I’m removing all sorts of human decision-making from my application development-into-deployment process,’” he said.

And those customers have the metrics they need because they have conducted compliance and security tests to ensure that the code they release meets their requirements, he added.

When asked for his predictions, Milne said, “Containers will be found in traditional agency production environments. It’s going to happen, or it’s happening.”

However, “those uses, those containers won’t be 100 percent containerized.” Instead, he foresees a mix of containers and non-containerized components or tiers, which means agencies will need management platforms that can handle such mixed environments.

“I think as new applications come online, or as that development process starts, and as people begin to really internalize the DevOps model, and the [continuous integration/continuous delivery] model, then they’ll really look to containers as an important tool,” Ryland said.

Furthermore, containers give developers the ability to create applications wherever they are. “You can literally be writing a federal application on a plane, without Wi-Fi, because you’ve got all the pieces you need there, containerized on your laptop,” Ryland said. “Then as soon as you land, you can deploy [your application] into a cloud...because the container provides that compatibility layer...between local off-line development and the production environment.”
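In practice, that offline-to-cloud handoff amounts to a build followed by a push once connectivity returns. The session below is a hypothetical sketch; the image tag and registry name are illustrative.

    $ docker build -t registry.agency.gov/myapp:1.0 .         # build offline on the laptop
    $ docker run -p 8080:8080 registry.agency.gov/myapp:1.0   # test locally, no network needed
    $ docker push registry.agency.gov/myapp:1.0               # after landing, push to a registry

Any host that can reach the registry can then pull and run the identical image, which is the compatibility layer Ryland describes.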

Heiter was equally enthusiastic about the future of containers for federal agencies. “This is exciting technology,” he said. “I love where the industry is going with this. I love where the open-source community is taking it. It will be highly beneficial to the federal sector moving forward, if not already.”