Einstein approved, GSA nominees move forward and more
News and notes from around the federal IT community.
Senate panel approves Einstein authorization
The Senate Homeland Security and Governmental Affairs Committee approved a bill on July 29 that would codify the Department of Homeland Security's Einstein intrusion-detection program.
The bill would require all federal agencies to implement stronger protections and state-of-the-art technologies to defend against cyberattacks, and it would address shortcomings in the deployment and adoption of DHS's Einstein and continuous diagnostics and mitigation cybersecurity programs.
The bill, sponsored by the committee's ranking Democrat, Tom Carper of Delaware, was approved 9-0.
GSA IG Ochoa confirmed; Roth moves forward
Senators confirmed one General Services Administration nominee on July 29, and moved a second closer to the finish line.
The full Senate confirmed Carol Fortine Ochoa to be GSA inspector general. And the Senate Homeland Security and Governmental Affairs Committee approved by voice vote the nomination of Denise Turner Roth to be GSA administrator.
Executive order calls for computing strategy
President Barack Obama issued an executive order July 29 that laid the groundwork for a cross-agency high-performance computing strategy.
The directive established the National Strategic Computing Initiative, a “whole-of-government” initiative to deliver an investment strategy for making the most of HPC, which is essentially computing at scale. The departments of Energy and Defense, as well as the National Science Foundation, will be the lead agencies for the new initiative, the order stated. A council co-chaired by the directors of the Office of Management and Budget and the Office of Science and Technology Policy will coordinate R&D and deployment related to the initiative.
“Maximizing the benefits of HPC in the coming decades will require an effective national response to increasing demands for computing power, emerging technological challenges and opportunities, and growing economic dependency on and competition with other nations,” the order said.
Agencies participating in the initiative should accelerate “delivery of a capable exascale computing system that [integrates] hardware and software capability to deliver approximately 100 times the performance of current 10 petaflop systems,” the directive stated.
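In plain numbers, that target is an exaflop -- a billion billion calculations per second:

```latex
100 \times 10\,\text{petaflops} = 1{,}000\,\text{petaflops} = 1\,\text{exaflop} = 10^{18}\ \text{floating-point operations per second}
```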
Cisco engineer floats 'fog computing' for federal agencies
Organizations have long focused on bringing data to a central repository for analysis, but according to a Cisco engineer, the Internet of Things (IoT) has changed the game.
The explosion of Web-connected devices -- more than 29.5 billion devices will be connected in 2020, according to IDC -- means "a lot of data has been created at the edge of the network, so the network edge analytics is extremely important to consider as part of the entire technology strategy," Kapil Bakshi, a distinguished engineer at Cisco Systems, said in an interview.
To analyze IoT data at the network edge, Bakshi floated the idea of "fog computing," which he described as giving edge nodes enough compute capability to handle processing, network storage and analysis locally.
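To make that pattern concrete, here is a minimal, hypothetical sketch of the kind of edge-side processing Bakshi describes: a fog node summarizes raw sensor readings locally and forwards only the compact result upstream. The function names, threshold and print-based uplink are illustrative assumptions, not a Cisco design.

```python
import statistics

# Hypothetical fog-computing sketch: process raw IoT readings at the
# network edge and send only a small summary to the central repository.
# Names and thresholds here are illustrative assumptions.

def summarize_at_edge(readings, alert_threshold=90.0):
    """Reduce a window of raw sensor readings to a compact summary."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "alerts": [r for r in readings if r > alert_threshold],
    }

def forward_to_cloud(summary):
    # Stand-in for the real uplink; in practice this might be an
    # HTTPS or MQTT call to a central analytics platform.
    print("uplink:", summary)

if __name__ == "__main__":
    window = [71.2, 68.9, 93.4, 70.1, 72.8]  # one window of sensor data
    forward_to_cloud(summarize_at_edge(window))
```

The point of the design is bandwidth and latency: five readings go in, one small summary comes out, and anomalies are flagged before anything leaves the edge.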
When asked which federal agencies have a good handle on IoT devices, Bakshi gave three examples. NASA and the National Oceanic and Atmospheric Administration, with all their ongoing scientific experiments, are well aware of the data they are collecting, he said. And the sheer number of devices the Defense Department deploys with its soldiers makes that agency well attuned to the IoT.
Big data analytics needn't be costly. The software and hardware for a traditional enterprise-class database typically cost at least $20,000 per terabyte, but new technologies such as MapReduce and NoSQL have cut that figure to below $1,000, according to Bakshi.
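For readers unfamiliar with the model, the sketch below is a toy, single-process illustration of MapReduce's two steps -- a map step that emits key-value pairs and a reduce step that aggregates them per key. Frameworks such as Hadoop run the same pattern across clusters of inexpensive commodity machines, which is central to the cost savings Bakshi describes.

```python
from collections import defaultdict

# Toy MapReduce-style word count, run in a single process for clarity.

def map_step(record):
    """Emit (word, 1) for every word in one input record."""
    for word in record.split():
        yield (word.lower(), 1)

def reduce_step(pairs):
    """Sum the emitted counts for each distinct key."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

if __name__ == "__main__":
    records = ["fog computing at the edge", "computing at scale"]
    pairs = (pair for record in records for pair in map_step(record))
    print(reduce_step(pairs))  # {'fog': 1, 'computing': 2, 'at': 2, ...}
```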