The EU Is Exploring Legal Personhood for Robots
“We don’t serve their kind here.”
For the time being, robots don’t need civil rights—they have a hard enough time walking, let alone marching—but the European Union doesn’t expect that to be the case forever. The European Parliament’s committee on legal affairs is considering a draft report, written by Luxembourg member Mady Delvaux, that would give legal status to “electronic persons.”
Delvaux’s report explores the growing prevalence of autonomous machines in our daily lives, as well as who should be responsible for their actions. It’s not intended to be a science-fiction thought experiment, Delvaux told The Verge, but rather an outline of what the European Commission should establish: what robots are, legally; the ethics of building them; and the liability of the companies that do so.
“Robots are not humans, and will never be humans,” Delvaux said. But she is recommending they have a degree of personhood—much in the same way that corporations are legally regarded as persons—so companies can be held accountable for the machines they create, and whatever actions those machines take on their own.
This may not seem pressing: Most of today’s “robots” can only vacuum things, or tell us how many teaspoons are in a tablespoon. But the urgency behind Delvaux’s report comes into sharper focus when you consider something like autonomous vehicles.
Uber is already testing self-driving cars in Pennsylvania and Arizona, as is Alphabet in Silicon Valley (and elsewhere). Soon, Volvo will put driverless cars on the road in Sweden, and Ford will do the same in Germany. It’s unclear which, if any, of these companies have established ethics boards for the decisions their robot cars will make—Uber, at least, told Quartz it has not—but regulations that put blame on autonomous-vehicle manufacturers could well convince more of them to consider their legal standing before putting those cars on the road.
Delvaux’s report does suggest that the more autonomy a machine has, the more blame should fall on the machine itself rather than on its human operators. But robots are generally only as smart as the data they learn from, so it might be difficult to determine what a robot is responsible for and what is the result of its programming: a sort of robot version of the “nature versus nurture” argument.
The draft is, of course, just a draft. The European Commission could choose to ignore its suggestions when crafting any future legislation. But Delvaux does make a good point: It’s worth thinking about these sorts of problems before we have rampaging robots that require more robots to put down. If sci-fi movies are any guide, that doesn’t tend to turn out well.