Democrats and Republicans United in Big Tech Frustration
But what to do with policies like Section 230 remains unclear.
In their first appearance before Congress since the Jan. 6 insurrection at the Capitol, the chief executives of Facebook, Google and Twitter faced frustrated lawmakers in a hearing on the role of social media in the spread of disinformation and extremism.
Lawmakers on the House Energy and Commerce Committee spent much of the Thursday hearing pressing Google and Alphabet CEO Sundar Pichai, Facebook’s Mark Zuckerberg, and Twitter’s Jack Dorsey to answer yes-or-no questions: whether the platforms bore any responsibility for the attack on the Capitol, whether the CEOs themselves believe the COVID-19 vaccines work, and why the platforms still allow harmful hashtags associating Asian people with the coronavirus.
The CEOs aimed for a diplomatic posture, shrouding their views in nuance. Partway through her questioning, Rep. Anna Eshoo, D-Calif., likened their refusal to answer yes-or-no questions to a filibuster.
“We don’t do filibuster in the House,” she said.
While what to do about disinformation is not a simple binary, the conversation between lawmakers and Big Tech continues to resemble a three-way stalemate, with Big Tech, Republicans and Democrats each holding a corner. Dorsey dryly summed up the impasse in his opening remarks.
“Some of you will say we’re doing too much in removing free speech rights. Some of you will say we’re not doing enough and then end up causing more harm,” Dorsey said. “Both are reasonable and worth exploring.”
Though the CEOs were willing to elaborate on the actions they have taken, such as setting up fact-checking programs, promoting reliable COVID-19 information, and taking down content that violates company policies, they were less forthcoming when asked about specific areas where those efforts have fallen short of addressing misinformation.
The hearing came less than a week after the Office of the Director of National Intelligence released an assessment that found domestic violent extremists “exploit a variety of popular social media platforms, smaller websites with targeted audiences, and encrypted chat applications to recruit new adherents, plan and rally support for in-person actions, and disseminate materials that contribute to radicalization and mobilization to violence.”
Left unacknowledged is how the lobbying power of these tech companies influences the debate. The non-profit consumer rights advocacy group Public Citizen published a report Wednesday showing that Facebook is now the top individual corporate lobbying spender. The same report found 94% of lawmakers with jurisdiction over large tech firms received financial contributions from political action committees or lobbyists associated with those companies.
Democrats on the House Energy and Commerce Committee received nearly $620,000, while Republicans on the committee received more than $420,000, from big tech PACs or lobbyists, according to the report.
“Importantly, the mere fact of a corporate contribution does not automatically compromise a legislator,” the Public Citizen report reads. “Some legislators and committees who have received Big Tech PAC and lobbyist funds have conducted the most thorough investigations and hearings on Big Tech in decades, and have introduced the boldest legislation to stifle the corporation’s unfettered growth to date. At the same time, there is no doubt companies direct their campaign funds in order to gain access and influence.”
Lawmakers’ questions ranged widely, touching on the recent mass shooting outside Atlanta that killed eight people, six of whom were Asian American, as well as the COVID-19 vaccine, bullying and sex trafficking. But the primary policy issue at hand is Section 230 of the Communications Decency Act. Reviled by former President Donald Trump and some Democrats, the 26-word provision protects free speech online by shielding platforms from liability for speech they host or republish, according to an explanation from the Electronic Frontier Foundation.
Zuckerberg in his prepared testimony proposed several changes to Section 230. Rather than granting platforms blanket immunity, Section 230 should require platforms to demonstrate they have systems in place for identifying and removing unlawful content, Zuckerberg said. But those platforms should still not be held liable “if a particular piece of content evades detection.”
Civil society organizations such as EFF and Fight for the Future say Zuckerberg’s proposal is problematic on several fronts.
“Of course Facebook wants to see changes to Section 230,” Evan Greer, director of Fight for the Future, said during a livestream ahead of the hearing. “Because they know it will simply serve to solidify their monopoly power and crush competition from smaller and more decentralized platforms.”
Instead, Greer said, lawmakers should pass federal data privacy legislation and enforce antitrust laws, particularly those that target practices like the nontransparent manipulation of algorithms. Zuckerberg said during the hearing he believes Congress should establish national privacy legislation.
In a post on its website, EFF called Zuckerberg’s proposal an “explicit plea to create a legal regime that only Facebook, and perhaps a few other dominant online services, could meet.” Ultimately, the proposal would lead to increased censorship while still failing to address problems with online misinformation because of the narrow definition of what content is actually illegal, according to EFF.
During the hearing, Zuckerberg clarified that he does not want his proposed Section 230 reforms to apply to startups and small companies right away.
“I want to be clear that the recommendations that I'm making for Section 230 I would only have applied to larger platforms,” Zuckerberg told Rep. John Curtis, R-Utah. “I think it's really critical that a small platform, you know the next student in a dorm room or garage needs to have a relatively low as possible regulatory burden in order to be able to innovate and then get to the scale where they can afford to put those kinds of systems in place.”
Dorsey said the real issue is algorithms, and he used his testimony to call for more algorithmic choice. Fixing algorithms and giving individual users more power over them would be a “tough” change, Dorsey said, “but it’s the most impactful.”