March 19, 2014

How Big Data Could Be Harming Borrowers

Tiffany Trader

Should big data play a part in determining creditworthiness? Many big data proponents would argue in the affirmative; a number of projects, both public and private, are successfully using large data sets to predict future behavior. Now the trend has moved into the consumer finance space. A bevy of startups, including LendUp, ZestFinance Inc., Think Finance Inc., and others, are leveraging big data technologies to analyze the risk of credit applicants.

Considering how critical affordable credit has become to the well-being of society, this new breed of entrepreneurs is taking an innovative approach to traditional credit assessments. The alternative consumer reporting model looks at a wide variety of data points, which may include an applicant's Internet searches, social media feeds, and even how much time they spend completing a loan application. These sources are combined and analyzed to arrive at a consumer credit score without requiring external credit bureau information.
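None of these lenders publish their models, but the general shape of such a pipeline is easy to illustrate. The Python sketch below shows a deliberately simplified, hypothetical scoring function; the feature names, weights, and approval threshold are illustrative assumptions, not any company's actual model.

```python
import math

# Hypothetical alternative-data features for one applicant.
# Names and values are illustrative only; real lenders use
# proprietary (and far larger) feature sets.
applicant = {
    "minutes_on_application": 14.0,   # time spent filling out the form
    "social_accounts_linked": 2,      # number of social profiles provided
    "months_at_current_email": 36,    # rough stability proxy
    "device_is_mobile": 1,            # 1 if the application came from a phone
}

# Illustrative weights; a real model would learn these from repayment data.
weights = {
    "minutes_on_application": 0.04,
    "social_accounts_linked": 0.30,
    "months_at_current_email": 0.02,
    "device_is_mobile": -0.25,
}
bias = -1.0

def score(features, weights, bias):
    """Return an estimated repayment probability via a logistic model."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

probability = score(applicant, weights, bias)
print(f"Estimated repayment probability: {probability:.2f}")
print("Approve" if probability >= 0.5 else "Decline")
```

Even in this toy form, the concerns raised later in the article are visible: the inputs are behavioral proxies rather than repayment history, and an applicant has no obvious way to inspect or dispute them.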

The stakes are high. Some 64 million consumers struggle to get loans due to a lack of credit history. While some would argue that enhanced data analysis is saving the day by opening up credit to underserved populations, others are wary that this could give banks more leverage to charge vulnerable communities higher interest rates.

So how do these data-driven credit risk models compare to conventional methods? A consumer group studied the issue and found that big data does not make a meaningful difference when it comes to creating better loan products.

In a recent report, "Big Data: A Big Disappointment for Scoring Consumer Credit Risk," the National Consumer Law Center (NCLC) investigates several major data brokers and seven loan products that use big data underwriting.

“[The report] raises several red flags for consumers, policymakers, and federal regulators,” says co-author and National Consumer Law Center attorney Persis Yu. “Transparency and accuracy are some of the core tenets of consumer protection laws and our analysis finds those requirements are nearly impossible to meet with big data.”

As part of the investigation, fifteen volunteers obtained reports from data brokers eBureau, ID Analytics, Intelius, and Spokeo. The report's two co-authors obtained reports from Acxiom as well.

The report's authors note being "astonished by the scope of inaccuracies among the data brokers" they examined. Wrong email addresses and incorrect phone numbers were among the minor flaws, while more serious errors included income and educational discrepancies and even mixed identities. Reports from Acxiom, eBureau, and ID Analytics contained very little information. (Apparently some companies provide full reports only to corporate clients.)

The NCLC also evaluated seven big data loan products: Great Plains Lending, MySalaryLine, Plain Green, Presta, and RISE, all of which use Think Finance technology; Spotloan, which relies on ZestFinance technology; and LendUp. Six of these bill themselves as payday loan alternatives. The consumer group found that these lenders are in some cases "less bad" than traditional payday loan schemes, yet they still feature three-digit annual interest rates ranging from 134 percent to 749 percent.
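To make those rates concrete, the snippet below works out the approximate finance charge on a hypothetical $500, 30-day loan at both ends of that range, using simple annualized interest prorated by term. The loan amount, term, and interest method are assumptions for illustration, not figures from the NCLC report.

```python
# Approximate cost of a short-term loan at the APRs cited by the NCLC.
# Principal and term are illustrative; real products may compute fees differently.
principal = 500.00      # dollars borrowed
term_days = 30          # length of the loan

for apr in (1.34, 7.49):  # 134% and 749% annual rates from the report
    charge = principal * apr * (term_days / 365)
    total_due = principal + charge
    print(f"APR {apr:>6.0%}: finance charge ${charge:,.2f}, "
          f"total due ${total_due:,.2f}")
```

Under these assumptions, the same one-month loan costs roughly $55 at the low end of the range and over $300 at the high end, which is why the report calls the products only "less bad" than traditional payday loans.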

The consumer group was also concerned with whether the algorithm-based models run afoul of consumer protection laws. The Fair Credit Reporting Act stipulates that consumers have the right to dispute items on their reports.

The report’s authors maintain that “it is highly unlikely, given the size of the data set and the sources of information, that the companies that provide big data analytics and the users of that data are meeting these FCRA obligations.”

The study also raises concerns about the potential discriminatory impact of big data-backed loans. The algorithms themselves are proprietary, but the companies list Internet usage, mobile applications, and social media as common information sources. Since Internet access and usage patterns vary by race and socioeconomic status, the potential exists for race-based profiling.

The NCLC concludes that these big data loans are not living up to their promise. The group recommends that the Federal Trade Commission (FTC) ascertain whether big data-backed lenders are adhering to fair credit and equal opportunity guidelines. Another suggestion is that the Consumer Financial Protection Bureau create a mandatory registry for consumer reporting agencies so consumers can know who has their data.

Reporting from the Wall Street Journal suggests the companies are operating in a legal gray area. The firms maintain that the algorithms and data points are used mainly to screen out fraud, not to determine creditworthiness, and as such are not subject to the same reporting laws.

The Federal Trade Commission is meeting today in Washington to discuss how companies are using these so-called alternative scoring products and their impact on consumers. "Consumers are largely unaware of these scores, and have little to no access to the underlying data that comprises the scores," notes the conference description. "As a result, these predictive scores raise a variety of potential privacy concerns and questions."

Related Items:

Mining Twitter Data for Disease Risk

Feds Strive to Balance Data Sharing and Risk

Discriminating Aspects of Big Data
