NSF to issue framework addressing national security implications of sensitive research

National Science Foundation Director Dr. Sethuraman Panchanathan speaks at the 2022 SXSW Conference in Austin, Texas. He told Congress this week that his agency is developing a new framework to reduce security risks in NSF research. Hutton Supancic/Getty Images for SXSW

The agency’s director noted that the TRUST framework — based on recommendations made in a recent advisory group report — will be unveiled “in the coming months.”

The National Science Foundation will release a new risk management framework to help guide agency decision-making regarding the potential national security implications of research projects working with sensitive technologies. 

The risk rubric process, dubbed Trusted Research Using Safeguards and Transparency, responds to a recent report from the MITRE Corporation’s JASON scientific advisory group recommending that NSF develop approaches to mitigate national security risks stemming from research efforts.

During a House Science, Space and Technology subcommittee hearing on Thursday, NSF Director Sethuraman Panchanathan confirmed the new TRUST process is slated to be unveiled and piloted in “the coming months.”

“In the near future, we will begin a risk rubric that will guide the agency in making determinations about the national security implications of projects in sensitive technologies,” Panchanathan said. “We have prohibited funding for researchers that participate in malign foreign talent programs and developed analytical capabilities to assess risks.”

TRUST responds to the recommendations of the JASON report, which NSF itself requested and which was released in March 2024. The report asks the agency to differentiate scientific research projects by the sensitivity of their potential applications and to apply specific mitigation measures to prevent lapses in security.

The push to create an effective framework for gauging research efforts’ risk to national security stems from provisions in the CHIPS and Science Act that direct NSF to identify and control research that may expose controlled unclassified or classified information.

NSF has been expanding its research portfolio, particularly following President Joe Biden’s Executive Order on AI, which tasked the agency with multiple agenda items related to safely developing AI systems. 

Panchanathan noted during the Thursday hearing that funding constraints threaten NSF’s continued innovation in critical and emerging technologies, namely AI and quantum information sciences. He pointed to the steady investments of U.S. geopolitical rivals, notably China, which are allocating significant funding to domestic research, as a signal that the U.S. government should respond in kind.

“The more we cut, the more the ideas that are being proposed to NSF in quantum [and] AI will not be funded, and guess who’s funding them? It is our competitor,” he said. “Our competitor is now funding those ideas that we don’t fund because we don’t have the resources.”