YouTube Is Turning to Wikipedia to Help It Fight Conspiracy Theories
The company also wants to include other “information cues.”
As the public grows more concerned about the heaps of harmful or misleading videos on YouTube, the company is looking for ways to clean itself up.
Speaking at the South by Southwest conference in Austin, Texas, on the evening of March 13, YouTube CEO Susan Wojcicki said the video streamer will launch features intended to curb the spread of misinformation. On videos promoting conspiracy theories, the site will place linked Wikipedia text boxes discussing the event in question.
“When there are videos that are focused around something that’s a conspiracy—and we’re using a list of well-known internet conspiracies from Wikipedia—then we will show a companion unit of information from Wikipedia showing that here is information about the event,” Wojcicki said, according to The Verge.
The company provided the following statement to Quartz:
We’re always exploring new ways to battle misinformation on YouTube. At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing. These features will be rolling out in the coming months, but beyond that we don’t have any additional information to share at this time.
It’s unclear when exactly the features will launch, or what the other “information cues” will be.
Wojcicki’s comments come as social media platforms wrestle with how to balance a commitment to free speech and editorial neutrality with the public’s desire to stamp out misinformation and harassment online. In February, the video site announced it would label videos from state-funded broadcasters. Last week, Twitter CEO Jack Dorsey said his company would soon allow anyone to receive the blue checkmark of verification, to reduce the impression that verified status implies an endorsement from the company.
Measures such as these, however, arguably don’t address a deeper problem. As the University of North Carolina’s Zeynep Tufekci points out in a recent piece for the New York Times (paywall), recommendation algorithms on social networks, designed to show users what they want to see, often end up serving increasingly “extreme” videos, exacerbating political polarization and radicalization. As long as those algorithms remain in place, Wikipedia text boxes on videos are unlikely to do much to slow this trend.
This isn’t going to address the core issue at all. https://t.co/XFCuXpE7id
— zeynep tufekci (@zeynep) March 14, 2018
YouTube including Wikipedia links, Facebook putting an asterisk next to propaganda, & related approaches focused on perceived neutrality feel a lot like Silicon Valley’s version of thoughts and prayers.
— parker (@pt) March 14, 2018