White House techies explore the intersection of big data and ethics
A new report examines how big-data applications can uncover discrimination but also encode hidden biases.
Big-data applications might seem remote and impersonal, but the software and algorithms behind them are written by humans and can therefore reflect human error and bias. A new White House report weighs both the risks and the opportunities the emerging technology presents.
"Properly harnessed, big data can be a tool for overcoming long-standing bias and rooting out discrimination," U.S. CTO Megan Smith, Deputy CTO DJ Patil and Domestic Policy Council Director Cecilia Munoz wrote in a blog post announcing the report, which is the second in a series on big data.
Citing case studies on lending, employment, college admissions and criminal justice, the report offers detailed recommendations for advancing the relatively new field of data ethics. Recommendations include more research into mitigating algorithmic discrimination, building systems that promote fairness and creating strong data ethics frameworks. The Networking and Information Technology Research and Development Program and the National Science Foundation are exploring ways to encourage researchers to delve into those issues.
The report also recommends that designers build transparency and accountability mechanisms into algorithmic systems so people can correct inaccurate data and appeal data-based decisions. And it calls for more research and development into algorithmic auditing and testing.
In an earlier report, the Federal Trade Commission concluded that existing non-discrimination law applies to complaints about bias in lending and credit, college admissions and other activities that use algorithmic models for decision-making.
Although data can be thought of as neutral, coders must decide how much weight to give each data input in an algorithmic system, and those choices can lead to biased results, according to the report.
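To see how weighting choices alone can tilt outcomes, consider a minimal, hypothetical sketch (not drawn from the report): a toy scoring function in which the weight a developer places on a neighborhood-level input changes approval rates between two invented groups, even though their individual scores are nearly identical.

```python
# Hypothetical illustration only: the applicants, weights and threshold are invented.
applicants = [
    # (group, individual_credit_score, neighborhood_income_index)
    ("A", 0.70, 0.80),
    ("A", 0.65, 0.85),
    ("B", 0.70, 0.40),
    ("B", 0.72, 0.35),
]

def approve(credit, neighborhood, neighborhood_weight):
    """Toy decision: a weighted blend of an individual input and a neighborhood input."""
    score = (1 - neighborhood_weight) * credit + neighborhood_weight * neighborhood
    return score >= 0.6

for w in (0.0, 0.5):
    print(f"neighborhood weight = {w}")
    for group, credit, hood in applicants:
        print(f"  group {group}: approved = {approve(credit, hood, w)}")
```

With the neighborhood input weighted at zero, every applicant clears the threshold; raising that weight to 0.5 rejects both members of group B despite comparable individual scores, purely because of where they live.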
For example, if a person is searching for the fastest route via a GPS app, the results "might favor routes for cars, discourage use of public transport and create transit deserts," the report states. And if speed data is collected only from people who own smartphones, the system's results might be more accurate in places with the highest concentration of smartphones and "less accurate in poorer areas where smartphone concentrations are lower."
Furthermore, an app deployed in Boston used accelerometer and GPS technology on users' smartphones to locate potholes. A Harvard Business Review report notes, however, that older people and those in lower-income groups often don't have smartphones, which meant the app was not recording information from significant parts of the population.
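A small, hypothetical simulation (not based on the Boston deployment's actual data) makes the coverage problem concrete: if potholes occur at the same rate everywhere but are reported only by smartphone owners, neighborhoods with lower ownership appear to have fewer problems than they really do.

```python
import random

random.seed(0)

# Invented figures: assumed share of residents with smartphones in each area.
neighborhoods = {
    "higher-income": 0.90,
    "lower-income": 0.45,
}

TRUE_POTHOLES_PER_AREA = 100  # same underlying road condition everywhere

for name, ownership in neighborhoods.items():
    # Each pothole is logged only if the passing driver happens to carry a smartphone.
    reported = sum(
        1 for _ in range(TRUE_POTHOLES_PER_AREA) if random.random() < ownership
    )
    print(f"{name}: true potholes = {TRUE_POTHOLES_PER_AREA}, reported = {reported}")
```

The underlying conditions are identical, yet the lower-ownership area reports roughly half as many potholes, so a repair schedule driven by the reports would systematically favor the better-covered neighborhood.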
Despite the drawbacks, "big data is here to stay," the White House blog post states. "The question is how it will be used: to advance civil rights and opportunity, or to undermine them."