Google's Mass-Shooting Misinformation Problem
When no reputable information is available, the search engine promotes fake news.
It happened again.
After a horrifying mass shooting, searching for the shooter’s name on Google surfaced tweets from an editor of the conspiracy site InfoWars, a claim by Julian Assange that the shooter had converted to Islam, and a “news” Twitter feed that has tweeted only a few dozen times since it was created last month.
All of these links appeared high up in the search results, just below the “Top Stories” module, in the “Trending on Twitter” box. To Google’s credit, as the hours have gone by, the less reliable information has been replaced by reputable sites doing actual journalism.
But the damage has been done. Despite the lack of any real evidence about the ideology behind the attack, a search for the shooter’s name now suggests you might want to append “antifa” to your search.
And when you do that, you get a mix of the Russian-backed news organization RT, small conservative sites, YouTube videos purporting to prove Devin Patrick Kelley was an anti-fascist, and a few debunks.
This is all significant because Google got off easy in last week’s congressional hearings on Russian meddling via social media in the 2016 election. YouTube was barely mentioned, despite suffering from fake-news problems as bad as, if not worse than, those of Facebook and Twitter. And crucially, Google can give a temporary fake-news phenomenon a much longer life: as the example above shows, it becomes the repository of the unreliable information that people spew into social media.
Congress appears to have missed a key point in its questioning last week. It’s clear that fake news and outright lies are, in fact, a small portion of the total content on any of the big tech platforms. But what matters are the routes that these companies provide to unreliable sources of information. You don’t have to silence Julian Assange or some random Twitter account that’s set up to look like a real news outfit, but you also don’t have to inject them into a legitimate news discussion.
For Google, the problem is the information marketplace around the previously unknown actors in major news events. In the immediate aftermath of the shooting, there just isn’t a lot of content to serve up for the search “Devin Patrick Kelley,” so Google reaches for less authoritative sources so that it can show something, anything.
This need for content has not been lost on the hoaxers, trolls, bullshitters, and information profiteers. They know when the attention economy is likely to need their contributions: while the real journalists, news gatherers, and people on the ground are still doing the work of getting the facts. And while those people are working, the know-nothings can surf the attentional wave.
How many oblivious Google users will end up in some strange corner of the internet? How many users will go on to search “devin patrick kelley antifa” in hopes of educating themselves, and in so doing, teach the machines to spread the unsubstantiated information?
This is the Shooter-Name Problem.
We see this problem come up again and again around major national news events. Google has to come to see it as a problem that deserves a solution. (We reached out to Google; the company did not respond by publication time.) And one might hope that it leads them to another conclusion, too: When there is nothing reputable to show users, it’s better to show nothing at all.