Spy Agencies Must Work Through AI’s Ethical Issues, Former Leader Says
Sue Gordon, the former principal deputy director of national intelligence, also pushed the intelligence community to put more trust in public information and outside organizations.
Spy agencies need to fully embrace artificial intelligence to maintain the country’s geopolitical strength in the years ahead, but for that to happen, officials must focus on building trust, according to the intelligence community’s former second-in-command.
Former Principal Deputy Director of National Intelligence Sue Gordon said that mission is multi-faceted: Agencies must learn to trust organizations and information they haven’t relied on in the past while proving to the public that the government can be trusted with unprecedented insight into their digital lives.
Gordon, who resigned last month amid a shakeup in the Trump administration’s intelligence leadership, has long promoted AI as an indispensable tool in the national security arena. The tech allows officials to quickly sift through the data flooding into intelligence agencies, but because the tools and data are available to everyone, “technology isn’t the same strategic advantage it once was,” Gordon said Wednesday at the Kalaris Intelligence Conference.
Going forward, the countries that can act on information the fastest will come out on top, she said, and the intelligence community must rethink its operations to succeed in a world of “data abundance.”
For starters, it’s important for the government to recognize it trails the private sector in most fields of AI, and that reality isn’t likely to change anytime soon, she said. Instead, agencies must become “fast followers,” quickly adopting new tools as they roll out onto the market and reforming the procurement process to allow for the necessary speed.
But beyond bringing industry tech into their agencies, intelligence officials will need to open their doors to the outside world if they want to stay ahead of global threats, according to Gordon. Agencies must learn “to partner with people [they] don’t know how to trust yet” and also give credence to information collected by industry, civilian agencies and other groups outside the national security apparatus, she said.
“If the intelligence community just sticks with the data that we can curate, that we can understand … we will not be able to get to the future,” Gordon said. “We cannot have the world knowing more than we know.”
Gordon also advocated tearing down data silos within the intelligence community itself. Historically, agencies have closely guarded the information they collect, but leaders will need easy access to those disparate pools of information to make informed decisions faster, she said. Building speedy information-sharing pipelines will also require a significant overhaul of the agencies’ outdated IT infrastructure, which isn’t equipped to analyze and manage such enormous quantities of data, Gordon added.
And as agencies begin standing up new artificial intelligence systems, she said, it’s critical that officials consider potential impacts on privacy and civil liberties throughout the process. The government is better poised to take on AI’s ethical issues than the private sector, she said, and the intelligence community is already working to establish a framework for the responsible use of the tech.
As spy agencies collect and analyze increasing amounts of data, “we’re going to have to do so in a way that our society believes that we are being trustworthy and protective of that information,” Gordon said. “This notion of AI and ethics is perhaps even more important for the national security sector to take on than anybody else because of the responsibility and the opportunity the American people give us in order to apply our trade.”