Audits of CBP’s Facial Recognition Partners ‘Vital to Building Trust,’ Lawmaker Says
The agency has “a ways to go” in auditing facial recognition partners for privacy and security, a watchdog noted.
Customs and Border Protection has taken steps to enhance the privacy and transparency of its biometric entry-exit program, according to the U.S. Government Accountability Office, but more needs to be done to ensure that the partners, contractors and vendors involved in deploying the facial recognition technology are complying with federal privacy and security requirements.
Rebecca Gambler, director of GAO’s homeland security and justice team, said during a House Homeland Security subcommittee hearing last week that CBP has conducted “five assessments of its partners in the air environment” to ensure they are complying with the agency’s security and privacy policies, and is conducting three additional assessments. A 2020 GAO report found that the agency had audited only one of its then-27 airline partners for compliance with its privacy and security policies, even though its partners had been using facial recognition technologies in the air environment since 2017.
Gambler said in her testimony that, as of last month, CBP has deployed facial recognition technology to at least one gate at 32 airports for travelers leaving the country, at all airports for travelers entering the U.S., at 26 seaports for travelers entering the U.S. and at all 159 land ports of entry. The biometric entry-exit program uses facial recognition technology to confirm the identities of travelers arriving at and departing from ports of entry by comparing their biometric data with government-issued IDs, such as passports.
Rep. Nanette Barragán, D-Calif.—chair of the House Homeland Security Subcommittee on Border Security, Facilitation and Operation—noted during her opening statement that CBP does not have “a robust system for conducting audits” to ensure that the agency’s partners, vendors and contractors are abiding by the agency’s privacy requirements.
“These audits are vital to building public trust,” Barragán said. “Proper oversight ensures that biometric data gathered in airports is not monetized by private industry or kept in industry databases.”
In 2019, a data breach involving a CBP subcontractor exposed roughly 184,000 images of travelers from a biometric entry-exit pilot program. Prior to the breach—which CBP officials said an audit may not have identified because security protocols were already in place—the agency had not conducted any audits of its contractors.
Barragán questioned the pace of CBP’s audits, saying that the agency’s progress in auditing its partners “seems like a small sample to me.”
Gambler said it was a positive sign that CBP is moving to implement these security and privacy audits, but added that “they do have a ways to go.”
“To fully implement our recommendation, they need to audit partners not just in the air environment, but also in the land and sea environment, and they need to ensure they’re conducting those audits on their contractors and vendors as well,” Gambler said. “So they are taking some positive steps, but they still need to take more action to really implement our recommendations.”
These audits, according to Gambler, should examine both the privacy and security requirements that partners have adopted and how those requirements are being implemented.
Gambler also said that CBP is making progress on efforts to post privacy notices and additional public information at all ports of entry where facial recognition technologies are in use, including notices informing U.S. travelers that they can opt out of the facial recognition screening process.
A report last month from the Department of Homeland Security’s Office of Inspector General found that CBP “complied with facial recognition policies to identify international travelers” and took additional steps to strengthen its facial recognition policy guidelines, including mandating referrals to a secondary inspection when a facial mismatch occurs and implementing system controls to remove CBP officers’ ability to override facial mismatches.
Privacy and civil rights advocates, however, remain concerned about CBP’s use of facial recognition and about algorithmic biases in the underlying technologies that have disproportionately affected minorities. As Barragán noted during the hearing, a 2019 report from the National Institute of Standards and Technology found that “Asian and African American faces were 10 to 100 times more likely to be misidentified than white faces.”