Pilots, Bias Important Considerations Ahead of Deploying AI
Cloud technology makes piloting artificial intelligence efforts easier than ever, but hard decisions around bias need to be confronted.
Pilots can serve as a critical resource for agency insiders as they begin to deploy artificial intelligence, and considerations around built-in bias should be addressed from the earliest of efforts, federal officials with firsthand experience leveraging the technology said in Washington Tuesday.
“One beauty of the cloud is you can do a pilot, and if it fails just turn it off—it’s not like buying millions in servers and then find that you are going to go in a different direction,” Small Business Administration Deputy Chief Information Officer Guy Cavallo said at Nextgov’s Tech Talks.
“We have actually done three 90-day sprint pilots that have revolutionized the way we work.”
Cavallo explained how SBA effectively implemented artificial intelligence and the cloud to enhance its cybersecurity capabilities and supplement what human employees are able to do. Elaborating on the success he’s seen through various piloting projects related to that work, Cavallo said the agency participated in one with the General Services Administration and the Homeland Security Department for the Trusted Internet Connections initiative.
He said the pilot ultimately led DHS to issue an exception for specific use cases, as SBA proved its cloud cyber tools approach was a viable alternative to TIC.
Now the agency is in the process of writing a final report on a more recent pilot through which it sought to prove that its approach can meet the same intent as the Continuous Diagnostics and Mitigation program. Cavallo expects that work to benefit all agencies.
“I want you to realize that you do not need an army of people to do this,” Cavallo said. “You get a handful of people that believe in it and they can move forward with it.”
“I would agree with that,” Veterans Affairs Department’s inaugural Director of Artificial Intelligence Gil Alterovitz said. “I think some of the best pilots don’t have as many people, but they have the right people—that is the key.”
Through his role in academia, Alterovitz led teams in exercises run much like pilot projects. In the beginning, he found that participating doctors wanted to work on teams with other doctors, engineers latched on to other engineers, and the same happened across other professions, which ultimately produced a variety of great ideas that didn’t really work in execution.
“So in future years we forced people to have one of each person in the group, and then we started seeing things,” Alterovitz said.
By diversifying the backgrounds of people in each group, the teams were able to think up new designs and projects that could actually be used in real-world environments.
“And that is the kind of approach we are thinking of now in research and development [at VA],” Alterovitz said. “To have people—not an isolated researcher—but to combine them with others in a veteran-centric design approach.”
The federal executives also discussed how their agencies are approaching the issue of bias in AI algorithms and datasets. Amid congressional hearings, proposed legislation and warnings from lawmakers, the issue is moving to the forefront as more people consider the impacts it can have at the individual level.
Cavallo and Alterovitz said it’s something their agencies are facing head on.
“Any data set will have some biases, so you need to take that into account, it’s just in different ways,” Alterovitz explained.
He said that, just as in academia, bringing people with many different backgrounds to the table ahead of deploying the tech can itself help root out bias from the outset. The agency also recognizes that VA data differs from data on the general population. So Alterovitz and his team are looking at processes like transfer learning to apply lessons from studies of the general population and transfer some of that knowledge onto their own datasets.
SBA offers loans to homeowners and small businesses when disaster strikes, and Cavallo said the agency is now considering ways to leverage its AI practices to combat loan fraud, which he noted opens the potential for algorithmic or data bias to unintentionally be baked in.
“Unfortunately we don’t have any data scientists or any AI experts on staff at SBA, so it is something we are going to have to work through, most likely through contractors with government leadership,” he said. “But it's definitely something we consider as we move forward.”
Alterovitz added that others from multiple agencies have reached out to him for recommendations around their own needs for technical experts and how to go about creating new tech-driven positions. The two agreed to have a broader discussion about it in the near future.
Editor's note: This story's headline was updated to better reflect comments by officials.