White House: If Unchecked, Big Data Could Lead to Discrimination
A new report warns that algorithms might inadvertently be biased against certain populations.
More data isn’t necessarily a good thing, a White House report suggests.
As big data techniques creep into hiring, lending and other common processes, businesses need to ensure their algorithms don’t discriminate against certain populations, according to a White House report linking the technology with civil rights.
Yes, the report says, an algorithm could pull alternative data sources -- phone bills, educational background, and social media connections -- to establish a credit score for someone who doesn’t have an extensive credit history. But tapping into these new data sources could also reinforce credit disparities between separate communities, as a new applicant might be linked to others “largely disconnected from everyday lending.”
The report was meant to be a glimpse into big data applications that could potentially help marginalized populations, but also a snapshot of how they can go wrong. It builds on a 2014 White House report that broadly concluded algorithms could, often inadvertently, discriminate against the very people they are intended to help.
“[W]e need to develop a principle of ‘equal opportunity by design,’” the report said.
As alternative credit score algorithms begin polling more dynamic data sources -- potentially including information as granular as GPS location information or social media use -- the likelihood of error increases, the report said. And consumers not used to dealing with large institutions likely won’t know how to fix inaccuracies in the complex calculations delivering their credit scores, the report warned.
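To make that concern concrete, here is a minimal, purely illustrative sketch, not drawn from the report: a toy score built from hypothetical alternative data points, where a single erroneous input quietly shifts the result and the applicant has no obvious field to dispute. The feature names and weights are invented for illustration.

```python
# Illustrative sketch (not from the report): a toy "alternative data" credit score.
# Feature names and weights are hypothetical; real scoring models are far more complex.

ALT_WEIGHTS = {
    "on_time_phone_payments_pct": 3.0,   # share of phone bills paid on time (0-100)
    "years_of_education": 5.0,
    "connections_with_credit_history": 2.0,
}
BASE_SCORE = 300

def alt_credit_score(applicant: dict) -> float:
    """Combine alternative data points into a single score."""
    score = BASE_SCORE
    for feature, weight in ALT_WEIGHTS.items():
        score += weight * applicant.get(feature, 0)
    return score

# A single wrong input -- say, a phone bill misattributed to the applicant --
# changes the score, and the consumer may have no idea which field to dispute.
correct = {"on_time_phone_payments_pct": 95, "years_of_education": 14,
           "connections_with_credit_history": 3}
garbled = dict(correct, on_time_phone_payments_pct=60)

print(alt_credit_score(correct))  # 661.0
print(alt_credit_score(garbled))  # 556.0 -- same person, worse score
```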
Algorithmic systems used in hiring can help recruiters sort through large volumes of applicants, and they can even search specifically for underrepresented candidates who have the skills a job requires. But the same algorithm can get derailed as it hunts for the ideal candidate: if told that employees who live closer to the office generally stay at the company longer, the system might screen out people who live farther away, skewing results in favor of particular social, racial or economic groups.
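As a rough illustration of that failure mode, and not the report's own example, the sketch below applies a hypothetical "distance from the office" screen to made-up candidates. Because where people live often tracks social and economic group, the neutral-looking rule produces very different pass rates by group.

```python
# Illustrative sketch: how a seemingly neutral feature -- distance from the office --
# can act as a proxy for group membership. Candidates, the 10-mile cutoff and the
# group labels are all hypothetical.
from collections import defaultdict

candidates = [
    {"name": "A", "miles_from_office": 3,  "neighborhood_group": "X"},
    {"name": "B", "miles_from_office": 5,  "neighborhood_group": "X"},
    {"name": "C", "miles_from_office": 18, "neighborhood_group": "Y"},
    {"name": "D", "miles_from_office": 22, "neighborhood_group": "Y"},
    {"name": "E", "miles_from_office": 7,  "neighborhood_group": "X"},
]

def passes_screen(candidate: dict, max_miles: float = 10) -> bool:
    """Screening rule derived from 'employees who live closer stay longer'."""
    return candidate["miles_from_office"] <= max_miles

# Tally pass rates by group: if housing patterns differ by group, the distance
# rule yields very different outcomes even though it never mentions the group.
totals, passed = defaultdict(int), defaultdict(int)
for c in candidates:
    g = c["neighborhood_group"]
    totals[g] += 1
    passed[g] += passes_screen(c)

for g in totals:
    print(f"group {g}: {passed[g]}/{totals[g]} pass the distance screen")
# group X: 3/3, group Y: 0/2 -- a disparity driven entirely by where people live
```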
Groups relying on big data techniques must be aware that, if left unchecked, algorithms might treat a correlation between unrelated factors -- income level and ethnicity, for example -- as if one caused the other. And even well-constructed algorithms will falter when fed poorly selected or incorrect data, the report said.
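A small, hypothetical illustration of the correlation-versus-causation point: in the invented sample below, income differences happen to track group membership, so any scoring rule based on income alone will reproduce the group disparity even though it never looks at the group.

```python
# Illustrative sketch (hypothetical data): two variables can be strongly correlated
# without either causing the other, and a rule that treats the correlation as causal
# simply reproduces an existing disparity.
from statistics import correlation  # Pearson correlation, available in Python 3.10+

# Invented sample in which income differences track group membership for
# historical reasons unrelated to creditworthiness.
group_indicator  = [0, 0, 0, 0, 1, 1, 1, 1]             # 0 = group A, 1 = group B
income_thousands = [38, 42, 40, 45, 62, 58, 65, 60]

print(correlation(group_indicator, income_thousands))   # ~0.97, despite no causal link

# A rule like "score = f(income)" never mentions group membership, yet because
# income and group are correlated here, scores will differ by group just the same.
```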
Another recent report, from the Federal Trade Commission, came to a similar conclusion: targeting ads for certain products, especially financial ones, to consumers with particular characteristics could mean that low-income consumers who are eligible for those products never see the ads.