September 19, 2014

Is Big Data the Next Big Civil Rights Issue?

There’s no doubt that the big data phenomenon is having a major impact on our lives. Algorithms are affecting what we buy, who we date, and how we work and play. But big data also takes a toll, including a loss of personal privacy and an increased risk of becoming victims of discrimination or unscrupulous vendors. Do these concerns make big data a civil rights issue? One group argues that they most definitely do.

In a new report entitled “Civil Rights, Big Data, and Our Algorithmic Future,” researchers at Robinson + Yu analyzed how people, companies, and the government are using big data, with a special focus on the civil rights implications of data analytics.

The outfit recognizes the benefits that big data analytics has brought to businesses. Auto insurance companies, for instance, have long relied on actuarial tables to help them assess risks of drivers, including how old they are, how many miles they drive, and what ZIP code they live in. But the deluge of big data, the researchers write, allows for a new level of specificity in underwriting that is changing the nature of insurance.

For example, the “Snapshot” device that Progressive encourages customers to install on their cars lets the insurer monitor customers’ driving patterns, including where they drive and at what times, which the company extrapolates into risk exposure. This personalized approach works great for some people, but hurts others, such as those who have long commutes or drive mostly at night. By shrinking the shared risk pool, it effectively discriminates against those who may be good drivers but who drive at the “wrong” times, including the graveyard shift, which is disproportionately staffed by minorities.

Data brokers drew the ire of the researchers. David Robinson, Harlan Yu, and Aaron Rieke took the brokers to task for the unregulated manner in which they enable “precision-marketing of consumer products to financially vulnerable” people. “Data brokers vacuum up data from wherever they can, including from public records, social media sites, online tracking, and retail loyalty card programs,” the researchers write. “Using these data, brokers build ‘modeled’ profiles about individuals, which include inferences and predictions about them.”

The researchers cite recent Senate and FTC reports that slammed data brokers and the creation of marketing lists with names like “Rural and Barely Making It,” “Ethnic Second-City Strugglers,” “Retiring on Empty: Singles,” “Tough Start: Young Single Parents,” and “Credit Crunched: City Families.” Targeted marketing is nothing new. “But precision targeting of vulnerable groups also carries a risk of harm,” the researchers say.

Other areas where big data could lead to an increase in discrimination against some groups of people include: the use of “alternative data” to create credit histories from utility and phone bills when an individual doesn’t have a traditional credit history; erroneous results generated by the E-Verify database run by the Department of Homeland Security (DHS) to verify employment status; and the use of “hiring algorithms” to help employers select job candidates (including the predictive approach used by the big data firm Evolv, which Datanami profiled earlier this year).

The criminal justice system gets its own chapter in the report. The researchers cited the Chicago Police Department’s controversial use of predictive technologies to preemptively contact about 400 people it deemed most likely to commit crimes. While the city claims the “Heat List” may have already reduced crimes, the lack of transparency into what data goes into the algorithms has prompted concern among organizations that seek to bolster civil rights.

Other big data tactics used by law enforcement that made the researchers’ naughty list include license plate scanning, facial recognition software, and snooping on mobile phones. “Transparency is vital. It allows for the incorporation of safeguards and policies that could pave the way for fair uses of these new technologies,” they write. It shouldn’t be surprising, then, that the new generation of body-mounted cameras that police departments are experimenting with gained the researchers’ nod of approval.

The government’s widespread surveillance programs, first revealed last year by former NSA contractor Edward Snowden, have their own chapter in the paper. The “warrantless wiretapping” of Americans’ phone calls may not give the government insight into the contents of our conversations. But the valuable metadata collected on each call does enable the government to create detailed maps, or graph databases, of the communication patterns of all Americans.
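How much a relationship map falls out of bare metadata is easy to demonstrate with a toy sketch. The records below are invented for illustration; the point is that caller, callee, and timestamp alone, with no call contents, already yield a weighted graph of who talks to whom and how often.

```python
from collections import defaultdict

# Hypothetical call-metadata records: (caller, callee, timestamp) only.
# All numbers and times here are invented for illustration.
call_records = [
    ("555-0101", "555-0202", "2014-09-01T08:05"),
    ("555-0101", "555-0202", "2014-09-01T21:40"),
    ("555-0202", "555-0303", "2014-09-02T09:15"),
    ("555-0303", "555-0101", "2014-09-03T18:00"),
]

# Build an adjacency map: edge weight = number of calls between a pair.
graph = defaultdict(lambda: defaultdict(int))
for caller, callee, _ts in call_records:
    graph[caller][callee] += 1

# Even this tiny graph exposes relationships and their intensity.
for caller, callees in sorted(graph.items()):
    for callee, count in sorted(callees.items()):
        print(f"{caller} -> {callee}: {count} call(s)")
```

Aggregated over billions of records, the same structure supports the kind of population-scale communication mapping the report describes.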

The invasion of privacy that warrantless wiretapping constitutes is a civil rights issue in its own right. But as the Robinson + Yu researchers pointed out, these big data tactics also threaten to undermine our judicial process. They cited cases in which major phone companies worked hand-in-hand with federal law enforcement officials to gather incriminating evidence against suspects without following due process. When DEA officials then concealed the source of the evidence, it amounted to “phonying up the course of the investigation,” the researchers quote former federal judge Nancy Gertner as saying.

“As big data increasingly informs decisions from national security to public school budgets, from employment to law enforcement, vulnerable communities need policies that limit the risk of super-sized discrimination,” says Malkia Cyril, executive director of the Center for Media Justice. “We applaud this incredible report from Robinson + Yu, and are excited that it offers such an important case for civil rights protections against the unmitigated use of 21st century technologies to surveil communities of color, low-income families, migrants, and others.”

The entire report is available here.

Related Items:

Concerns About Big Data Abuses Grow

Police Push the Limits of Big Data Technology

Big Data for the Common Good