IBM calls for stricter regulatory posture on deepfakes
The multinational tech company endorsed legislation that would outlaw harmful AI-generated synthetic content
IBM is urging lawmakers and federal agencies to police deepfakes and other synthetic content generated by artificial intelligence ahead of the 2024 presidential election, citing election security, protections for artists and personal privacy.
In an announcement on Wednesday, the tech behemoth called for a combination of technical and legal solutions to prevent AI technologies and their products from undermining U.S. elections, publicly endorsing the bipartisan Preventing Deepfakes of Intimate Images Act for the first time.
“Democracy depends on a population’s ability to participate in free and fair elections,” the company said in a statement. “Unfortunately, bad actors can use deepfakes to impersonate public officials and candidates to deceive voters in a variety of ways that would undermine this key principle.”
The company advocated protections for artists and creators to defend their work and likenesses, saying lawmakers must “hold platforms accountable if they knowingly disseminate such unauthorized content.”
“We need regulations to protect elections, to protect creators and to protect people's privacy,” said Chris Padilla, IBM's vice president for government and regulatory affairs, during a press briefing.
Padilla also confirmed IBM has supported penalizing the misuse of AI tools and their products since last September, when leadership met with Senate Majority Leader Chuck Schumer, D-N.Y., during his series of AI Insight Forums. On the transatlantic front, the company also voiced its support for the European Union’s AI Act, which imposes stricter disclosure requirements for synthetic content.
He said that supporting legislation to penalize and regulate deepfakes is part of a larger aim to bring accountability to AI technologies without stifling innovation in the field.
“We have said generally that we think existing copyright law is pretty good to protect creators — Fair Use provisions can be used to protect the copyrights of individuals — but that there are certain gaps that need to be addressed,” Padilla said. “It's all part of a bigger picture: regulate risk, promote open source and have accountability. And that's where we're coming from.”