Discriminating Aspects of Big Data
Technology can be a blessing and a curse, and big data is no exception. While the ability to mine vast stores of data can promote greater border security and drive intelligent business decisions, there is also great potential for misuse.
Harvard Business Review blogger and author Michael Schrage explores this topic in a recent blog entry. Knowledge is power, observes Schrage, a research fellow at MIT Sloan School’s Center for Digital Business, but what kind of power does analytics-derived knowledge buy?
We live in an era where companies increasingly have the ability to mine massive amounts of user data to cull the most personal details of their customers’ private lives, including their likes, dislikes, hobbies, political interests, and even their personality types. Based on this unique data profile, the vendor can customize products and services to improve the user experience and bolster profits. It’s all in the name of value-added personalization; what could go wrong?
Aside from the privacy issues, big data analytics and the ability to micro-segment markets in this way can set the stage for discrimination.
“Big Data creates Big Dilemmas. Greater knowledge of customers creates new potential and power to discriminate,” writes Schrage. “Big Data – and its associated analytics – dramatically increase both the dimensionality and degrees of freedom for detailed discrimination. So where, in your corporate culture and strategy, does value-added personalization and segmentation end and harmful discrimination begin?”
Big data goes granular, yielding insights based on ethnicity, income, geography, and a multitude of other factors. Schrage provides these statistics culled from big data analysis:
- Single Asian, Hispanic, and African-American women with urban postal codes are most likely to complain about product and service quality to the company. Asian and Hispanic complainers happy with resolution/refund tend to be in the top quintile of profitability. African-American women do not.
- Suburban Caucasian mothers are most likely to use social media to share their complaints, followed closely by Asian and Hispanic mothers. But if resolved early, they’ll promote the firm’s responsiveness online.
- Gay urban males receiving special discounts and promotions are the most effective at driving traffic to your sites.
“These data are explicit, compelling and undeniable,” asserts Schrage, “but how should sophisticated marketers and merchandisers use them?”
Regulated industries, such as healthcare and finance, already forbid certain kinds of discrimination. But society as a whole, and many areas of the law, remain unclear about where “value-added” personalization ends and harmful discrimination begins. When companies extend privileged offers to certain groups based on ethnicity, gender, race, or other demographics, the flip side is that they withhold those offers from other groups, because statistical and probabilistic models show them to be less profitable.
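To make the mechanics concrete, here is a minimal, hypothetical sketch of how such a rule can work in practice. All segment names, profit figures, and the threshold are invented for illustration; the point is that a model never has to mention a protected attribute to reproduce its effects, so long as the segments correlate with one.

```python
# Hypothetical: average historical profit per customer by marketing segment.
# In a real pipeline these figures would come from analytics, not a literal.
segment_profit = {
    "segment_a": 120.0,
    "segment_b": 95.0,
    "segment_c": 40.0,
}

OFFER_THRESHOLD = 80.0  # arbitrary cutoff for "worth a privileged offer"

def gets_offer(segment: str) -> bool:
    """Return True if the segment's expected profit clears the cutoff.

    The rule is facially neutral, but if segment membership correlates
    with ethnicity, gender, or other demographics, the effect is the
    same as withholding offers from those groups.
    """
    return segment_profit.get(segment, 0.0) >= OFFER_THRESHOLD

offers = {seg: gets_offer(seg) for seg in segment_profit}
```

In this toy example, customers in "segment_c" never see the offer, purely because past data marked the segment as less profitable, which is exactly the quiet exclusion described above.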
“Big Data analytics renders these questions less hypothetical than tactical, practical and strategic,” writes Schrage. “In theory and practice, Big Data digitally transmutes cultural clichés and stereotypes into empirically verifiable data sets. Combine those data with the computational protocols of ‘Nate Silver-ian’ predictive analytics and organizations worldwide have the ability – the obligation? – to innovatively, cost-effectively and profitably segment/discriminate their customers and clients.”
Once again, we are reminded, as at so many other points in history, that technology is a tool, neither good nor bad. It is not an all-encompassing solution to our human problems, any more than it is responsible for our social ills.