4 key trends that every CIO should watch
The hot topics of 2014 are evolving quickly, and they present challenges and opportunities for federal agencies.
The dominant public-sector technology trends of 2013 -- which include cloud, big data, cybersecurity and software as a service -- are here to stay, and they continue to evolve in ways that open up new challenges and opportunities for federal CIOs.
Four technology trends in particular are shaping up to occupy growing mindshare among government CIOs in the months ahead, and they are worth watching as the year progresses.
1. The CIO's role will expand to being a 'broker of things'
A key benefit of accessing IT services such as software, storage and security "on demand" is that it frees agency CIOs from time-consuming implementation issues. Instead of being mired in buyer/builder obligations, CIOs can use their significant product knowledge to evaluate technology options as "brokers of things."
CIOs can play that role because on-demand services let agencies acquire the best available technology at the lowest investment cost. The CIO becomes a broker who mixes and matches options based on cost, available internal resources and other critical factors.
Examples of the broker role will be increasingly evident as agencies adopt multiple cloud models. CIOs will sort their IT portfolios into applications that they must control entirely (in on-premise private clouds), applications they must control partially (in enterprise-grade public clouds), workloads that are more transient (in hyperscale public clouds) and those best purchased as SaaS. Then CIOs and other IT decision-makers will act as brokers across those diverse cloud models to arrive at the optimal posture.
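As a rough illustration of that sorting exercise, here is a minimal sketch in Python. The workload attributes, thresholds and portfolio entries are hypothetical examples, not any agency's actual policy or tooling.

```python
# Illustrative only: a toy classifier that sorts workloads into the four
# deployment targets described above. Attribute names and sample workloads
# are made up for the sake of the example.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    full_control_required: bool     # must the agency control it entirely?
    partial_control_required: bool  # is an enterprise-grade public cloud acceptable?
    transient: bool                 # short-lived or bursty workload?

def deployment_target(w: Workload) -> str:
    if w.full_control_required:
        return "on-premise private cloud"
    if w.partial_control_required:
        return "enterprise-grade public cloud"
    if w.transient:
        return "hyperscale public cloud"
    return "SaaS"

portfolio = [
    Workload("case-management", True, False, False),
    Workload("citizen-portal", False, True, False),
    Workload("annual-report-batch", False, False, True),
    Workload("email", False, False, False),
]

for w in portfolio:
    print(f"{w.name}: {deployment_target(w)}")
```

The point is not the code itself but the brokering posture: the decision logic lives with the CIO, while the implementation lives with the service providers.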
2. Hybrid clouds will grow in appeal
To date, agencies assembling their cloud strategies have tended to focus on moving to a single public or private cloud architecture as dictated by their specific requirements.
Public clouds tempt agencies with cost savings through pay-as-you-go pricing and the flexibility to scale up and down, while private clouds allow agencies to more directly control data and infrastructure and offer more security assurance. Deltek's Federal Cloud Computing Market Outlook predicts that the federal cloud computing market will grow at a compound annual growth rate of 32 percent over the next three years.
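For context, a 32 percent compound annual growth rate implies the market would more than double over three years. A quick back-of-the-envelope calculation, using only the CAGR cited above:

```python
# Compound growth at the 32 percent CAGR cited in the Deltek outlook.
cagr = 0.32
years = 3
growth_factor = (1 + cagr) ** years
print(f"Growth over {years} years: {growth_factor:.2f}x")  # about 2.30x
```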
There is little doubt that cloud adoption in the public sector will continue to grow, but every CIO must wrestle with whether the optimal path is a public cloud, a private cloud or the emerging hybrid cloud approach.
The hybrid cloud model has been the least tapped in the public sector, but it offers the most potential. Agencies seeking a balance between scalability and security will gravitate toward public computing power coupled with private cloud storage that preserves their control over their most sensitive asset: data.
In other words, the hybrid cloud model allows agencies to maintain control of their data while fully maximizing cloud computing economics.
3. There will be no flash (storage) in the pan
For government agencies facing exploding data storage requirements, cost and scalability are important. However, for many critical applications, the chief pain point is performance: adding disk drives or even deploying hybrid disk/flash solutions cannot deliver the sub-millisecond response times those mission-critical workloads require.
Last year saw dozens of flash storage startups nudge into the public sector and onto the radars of CIOs. Those startups recognized that federal agencies could boost performance with all-flash arrays built on solid-state drives rather than spinning hard-disk drives. CIOs will find the competitive landscape for flash storage expanding in 2014 as larger, industry-leading vendors launch compelling all-flash storage solutions alongside the startups.
The battle will be won by providers that can demonstrate to CIOs that their solutions deliver reliability, scalability and non-disruptive operations.
4. Software will define more government IT
Software-defined networking (SDN) and software-defined storage (SDS) will move further to the forefront in the public sector. SDS is an architectural approach in which data storage is provisioned and managed by intelligent software rather than by the storage hardware itself. SDS virtualizes and encapsulates the entire infrastructure into a container that can be logically partitioned, so the pooled storage resources in an SDS environment can be allocated automatically and efficiently to match an enterprise's application needs.
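To make the pooling idea concrete, here is a minimal sketch of software handing out capacity from a shared pool to applications on request. The class and names are hypothetical and do not reflect any vendor's API.

```python
# Illustrative sketch of the pooled-allocation idea behind SDS: capacity is
# tracked in software and allocated to applications on demand, independent
# of which physical arrays back the pool. All names are hypothetical.
class StoragePool:
    def __init__(self, capacity_tb: float):
        self.capacity_tb = capacity_tb
        self.allocations = {}  # application name -> TB allocated

    def allocate(self, app: str, tb: float) -> bool:
        free = self.capacity_tb - sum(self.allocations.values())
        if tb > free:
            return False  # not enough pooled capacity left
        self.allocations[app] = self.allocations.get(app, 0) + tb
        return True

pool = StoragePool(capacity_tb=500)
pool.allocate("records-archive", 200)
pool.allocate("analytics", 150)
print(pool.allocations)
```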
The multitenant environment of SDS allows agencies to optimize management and control of data and the environment through separation of the control and data planes. Inefficient data storage typically represents as much as 15 to 20 percent of IT infrastructure budgets. SDS prevents overspending on data storage by:
- Including efficiency technologies such as deduplication and compression (illustrated in the sketch after this list).
- Extending backup technology to take efficient volume-based backups over the network.
- Preserving compression and deduplication over the network and at the remote site.
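As a rough illustration of the first item, here is a minimal sketch of block-level deduplication over hypothetical data. It shows only the general technique, not any particular SDS product's implementation.

```python
# Minimal illustration of block-level deduplication: identical blocks are
# stored once and referenced by their hash. Block contents are hypothetical.
import hashlib

def deduplicate(blocks):
    store = {}       # hash -> unique block contents
    references = []  # per-block hash references (the "logical" volume)
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        references.append(digest)
    return store, references

data = [b"header", b"payload", b"payload", b"payload", b"footer"]
store, refs = deduplicate(data)
print(f"{len(data)} logical blocks stored as {len(store)} unique blocks")
```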
Although SDN and SDS made strides in the public sector last year, there remains significant room for improvement, innovation and education. In a NetApp survey released in January, public-sector respondents cited reducing storage costs as the top benefit of SDS, yet 33 percent were not familiar with SDS and only 7 percent described themselves as very familiar. Translation: More education and awareness of the benefits of SDS will spur broader adoption.
Those four trends are being driven by agency pain points and a consistent stream of innovation that is expanding opportunities for government CIOs to succeed with their missions in 2014 and beyond.