To Protect Research Subjects, Account for the Internet
When today’s scientists design experiments, they most often refer to ethical guidelines written decades ago.
The old ethics rules no longer offer adequate protection to field research subjects, say social scientists.
As a result, individual people and even entire societies are left vulnerable to financial ruin, emotional manipulation, and more.
In their peer-reviewed essay, Rose McDermott, a professor of international relations at Brown University, and Peter K. Hatemi, a professor of political science at Penn State, argue that the advent of computers, the internet, and social media has yielded massive change in the design and execution of certain types of large field experiments, a change that traditional ethics guidelines couldn't have anticipated.
Without widely adopted formal guidelines on securing voluntary consent in the internet age, scientists are designing big experiments that can, and often do, cause harm, McDermott and Hatemi say. But if research institutions, leading journals, and scientific professional organizations were to publish and enforce updated ethical standards, scientists might better understand how to gather important insights without unintentionally damaging people and societies.
“The concern we’re voicing is that early ethical guidance doesn’t account for field experiments on huge numbers of people, because these experiments weren’t common or even possible before the 1990s,” McDermott says. “There’s evidence that some of these recent experiments have stoked racial resentment, changed election outcomes, and caused huge societal divisions. We’re not saying these kinds of big field experiments aren’t valuable—we’re saying we need to come up with ways to do it ethically.”
McDermott and Hatemi’s essay is a “Perspective” piece in the Proceedings of the National Academy of Sciences. Perspective pieces undergo the same submission and review processes as research reports, but rather than describing the results of original research, they present a balanced, objective, and thoroughly researched viewpoint on a specific field.
McDermott says that when today’s scientists design experiments, they most often refer to ethical guidelines written decades ago, such as the Declaration of Helsinki—a much-revised medical ethics guide first written in 1964—and the 1979 report “Ethical Principles and Guidelines for the Protection of Human Subjects of Research,” now commonly known as the Belmont Report. But those guidelines, she says, were not created with computers, the internet, and social media in mind.
“You’ve had a rise in the ability to do massive computing projects and analyze lots of data very quickly,” McDermott says. “With social media platforms, you can have all kinds of access to huge populations. Combine with that the immense pressure on academics to publish high-impact research quickly and frequently, and you’ve got a world that looks very different than it did when the Belmont Report was published.”
How Studies Have Caused Harm
As a result, McDermott says, researchers have recently undertaken studies that made important discoveries but also changed people’s behaviors, caused them trauma, or even financially endangered them. For example:
- Several studies that have sought to identify what increases or depresses voter turnout have unintentionally altered election outcomes by influencing voters with racially charged mailers, phone calls, and door-to-door visits from fake political candidates.
- Scientists seeking to understand how social media alters people’s moods and political affiliations have inadvertently manipulated the emotions of hundreds of thousands of people by pushing certain types of posts into their feeds.
- And many studies investigating the benefits and drawbacks of financial assistance have purposely given money to, or withheld it from, research subjects, causing some to become homeless or to suffer increased domestic violence.
Times Have Changed
“Science is a process of trial and error, and I think in the early days of large field experiments, people couldn’t anticipate what might happen to the subjects,” McDermott says. “But now we do know what can happen if we’re not careful. We need to stop, take a breath, and take stock of the damage some of these experiments have done so that we can learn from those mistakes and implement changes.”
Those changes, McDermott and Hatemi argue, must come primarily from the top down. In their essay, the two scholars call on academic professional associations, journals, and research institutions to update their policies not only to adhere to existing ethical norms but also to formulate new requirements that address potential harm in large-scale field experiments affecting entire populations.
McDermott says she hopes the PNAS essay helps spur the kind of systemic change Henry Knowles Beecher kick-started with his famous 1966 essay exposing unethical practices in the field of medical experimentation. Beecher’s investigation eventually led to the passage of federal rules requiring scientists to obtain informed consent from study participants.
“While we would be happy to see individual scholars and individual universities addressing issues of informed consent in their field research, real change can only happen if it’s systemic,” McDermott says. “It’s just like reversing climate change: Yes, it’s good that you bought a Prius, but really what we need is for governments to shut down coal-fired plants. All of the institutions that hold power in science need to work in concert.”
This article was originally published in Futurity. It has been republished under the Attribution 4.0 International license.