February 28, 2019

Surviving the Coming Data Governance Wave


A data governance wave is building in the United States. If you put your ear to the ground, you can hear it rumbling. It’s inevitable that new regulations eventually will emerge that dictate how companies must ensure the integrity, security, and privacy of the personal data they gather, store, and process. Those who get ahead of the wave have the best odds of success, while those who delay investment run a risk of getting swamped.

There are many reasons why data governance has climbed so high on the list of priorities for business executives and lawmakers alike. Data breaches seem to occur almost every day, damaging reputations and claiming millions in lost market capitalization.

The latest victim is Dow Jones, the storied financial services firm whose “watch list” database of 2.4 million high-risk individuals and business entities was left unprotected on an Amazon Web Services server, according to a story today in TechCrunch. A Dow Jones customer with access to the Elasticsearch database that housed the data apparently did not implement password protection, leaving it exposed on the Internet, according to reports.

Data breaches are perhaps the most glaring examples of the sub-par implementation of data governance. But incidents of data abuse echo in the public domain almost as loudly as the breaches.

Facebook is still coming to grips with the implications of the Cambridge Analytica scandal, in which the private data of 87 million people was misappropriated for highly targeted campaign outreach. Facebook has taken steps to quell the outrage, but the fire still smolders, and the social media giant this month came under fire again after the Wall Street Journal (a subsidiary of Dow Jones, which in turn is owned by News Corp.) reported that smartphone applications are sharing sensitive user data, including weight, blood pressure, and ovulation status, with Facebook without the users’ knowledge.

Barbara Lawler, who was recently appointed Looker‘s chief data privacy and ethics officer, has been watching the data breaches and abuse pile up over the years, and is convinced that the Federal Government will step in at some point to establish governance rules of the road.


The big question is when.

“I thought several times we were at the big breaking point with some of the big data breaches over the last several years,” Lawler told Datanami in an interview earlier this month. “It’s like the news cycle. It rises and then it seems to modulate, then the next thing takes over, and it sort of becomes the new normal.”

Lawler isn’t quite ready to go on the record and say 2019 will definitely be the year that the United States gets its version of the GDPR, the far-reaching data governance law that went into effect in the European Union last May. There have been a number of bills and frameworks proposed over the past few months, but there’s still much work to be done.

“I think what we’re going to see this year is a lot of hearings, a lot of discussion, a lot of debate,” she says. “And if you ask many of my peers who have been in this space for a long time, they’d say, ‘Okay, this is the year. It’s finally going to happen.’

“But I’ve been in the space long enough to see, ‘We thought for sure it was going to happen after the latest issue,'” Lawler continues. “I don’t think there’s going to be something passed that goes to the President’s desk this year. That’s my opinion.”

Privacy By Design

Nobody knows whether the United States Government will overhaul the federal rules around data security and privacy in the year 2019 or 2029. But that shouldn’t stop forward-looking organizations from preparing their data collection and analytics activities to comply with some sort of governance standard.

A consensus is starting to coalesce around the sorts of data-oriented activities and processes that will be acceptable, and what will not be acceptable. In addition to the GDPR in the European Union, the governments of Canada, New Zealand, and Australia have made some headway. And the United Kingdom will have its own version of the GDPR once Brexit is complete.

Barbara Lawler, Looker VP, Chief Privacy and Data Ethics Officer; FIP, CIPP/US, CIPM

Lawler, who has worked in data privacy for 20 years, is personally involved with organizations like the Future of Privacy Forum and the Information Accountability Foundation to get a better idea of what those future data requirements will look like, and also to provide ground-level feedback into legislation.

One of her current concerns is around the California Consumer Privacy Act (CCPA), which was passed in 2018 and goes into effect in 2020. The CCPA was modeled in some ways after GDPR, but there are some important differences between the two laws that CTOs and CDOs should be aware of.

For instance, the definition of personal information in the CCPA is much broader than GDPR’s, she says. That could trip up unsuspecting companies that think they can piggyback their CCPA efforts on GDPR.

“One of the open questions for any company doing business in California is, does that broad definition of consumers include employees and B2B contacts? What do these great consumer-oriented rights mean in a business context?” she says. “I’m hoping the California legislature will fine-tune some of these definitions so we’ll have clarity on that.”

Lawler, who served as the chief privacy officer of both Hewlett-Packard and Intuit, brings an enthusiasm for privacy, ethics, and governance issues that is hard to come by. “There’s a lot going on,” she says. “This is a fun space.”

One of the things she’s adamant about – and likely one of the reasons Looker hired her – is treating privacy and governance as a first-order development principle. “A fundamental aspect of privacy is privacy by design — building privacy in, and not bolting it on,” she says. “That means you’re adopting privacy engineering techniques. That’s something we’re working on now.”
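Lawler doesn’t spell out which privacy engineering techniques Looker is adopting, but one common building-privacy-in pattern is to pseudonymize direct identifiers before records ever reach the analytics layer. The sketch below is a hypothetical illustration of that idea, not a description of Looker’s implementation; the field names and secret key are assumptions made for the example.

    import hmac
    import hashlib

    # Hypothetical example: tokenize direct identifiers before a record
    # ever reaches the analytics layer, so downstream queries never see raw PII.
    SECRET_KEY = b"rotate-me-and-store-in-a-secrets-manager"  # illustrative only

    def pseudonymize(value: str) -> str:
        """Return a keyed, one-way token for a raw identifier."""
        return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

    def scrub_record(record: dict) -> dict:
        """Keep analytical fields, tokenize identifiers, drop free-form PII."""
        return {
            "user_token": pseudonymize(record["email"]),  # join key, not an identity
            "country": record["country"],                 # coarse attribute is kept
            "plan": record["plan"],
            # fields like name and phone are intentionally never copied
        }

    raw = {"email": "jane@example.com", "country": "US", "plan": "pro",
           "name": "Jane Doe", "phone": "555-0100"}
    print(scrub_record(raw))

Because the token is keyed and one-way, analysts can still count and join on users without ever handling the underlying email address.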

Ethics of AI

We are emerging from big data’s Wild West period, when questions mostly centered around whether something was technically feasible to do, not whether it was legal or ethical. It will take some time for the culture to change, and it will also take time for the tools to evolve. Lawler hopes that she can help Looker be on the frontline of that change.

“If you think about Looker as this new generation of data platform that offers BI and other solutions, we want to be thinking about our role in that space as essentially a leadership position,” she says, “that we’re thinking about data privacy and data ethics strategically, and not just check-the-box compliance.”


The fact that the Looker tool can reach into existing databases and data warehouses, as opposed to requiring the data to be copied and loaded into yet another location, gives Looker an architectural advantage. Looking forward, Lawler is working with the company’s product and engineering teams to create new guardrails that help customers conduct analytics in a compliant and ethical manner.

“Let’s describe what privacy by design criteria are. What are the guardrails on what’s inbound and what’s outbound as we’re developing new capabilities for the Looker platform,” she says. “Questions like how do we create a safe test environment and not use production data, but use appropriately scrubbed data that’s not from our customers that we can use to actually develop new capabilities.”
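She doesn’t describe how Looker builds those test environments, but one common answer to the problem she raises is to generate synthetic records that mimic the shape of production data without containing any customer information at all. The following is a minimal, hypothetical sketch of that approach; the field names and value ranges are invented for the example.

    import random
    import string

    # Hypothetical sketch: generate synthetic records with the same shape as
    # production data, so test and development environments never touch
    # customer data.
    random.seed(42)  # deterministic fixtures for repeatable tests

    def fake_email() -> str:
        user = "".join(random.choices(string.ascii_lowercase, k=8))
        return f"{user}@example.test"

    def synthetic_record() -> dict:
        return {
            "email": fake_email(),
            "country": random.choice(["US", "DE", "JP", "BR"]),
            "plan": random.choice(["free", "pro", "enterprise"]),
            "monthly_spend": round(random.uniform(0, 500), 2),
        }

    test_fixture = [synthetic_record() for _ in range(1000)]
    print(test_fixture[0])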

Discussions around ethics go to the heart of a company’s data-driven decision-making processes. Lawler, who touched on some of these issues in a recent blog post, says the next wave of privacy investments is going to revolve around ethics.

“Data ethics gets at [questions like] what are the unexpected or surprising uses, and how should we be thinking about that?” she says. “Should we curate an algorithm that could have potential downstream effects to minorities or individuals or groups who may have concern about discrimination?

“For Looker, our products are helping companies analyze their data, and where I’d love to see us be able to go in the future is we build out our ethics program…where we can actually provide, not the advice, but here are some of the frameworks you can use, here are some of the questions you can be asking.  How are your business models helping individuals or potentially creating challenges or risk for individuals.”
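Lawler doesn’t name specific frameworks, but one example of the kind of question such an ethics program might prompt customers to ask is whether a model or business rule produces very different outcomes across groups. Below is a hypothetical sketch of a basic disparate-impact check (the so-called 80% rule heuristic); the sample data and threshold are assumptions for illustration only.

    from collections import defaultdict

    # Hypothetical guardrail: compare positive-outcome rates across groups.
    def positive_rates(decisions):
        """decisions: list of (group, approved) pairs -> approval rate per group."""
        totals, approved = defaultdict(int), defaultdict(int)
        for group, ok in decisions:
            totals[group] += 1
            approved[group] += int(ok)
        return {g: approved[g] / totals[g] for g in totals}

    def disparate_impact(decisions):
        rates = positive_rates(decisions)
        return min(rates.values()) / max(rates.values())  # below ~0.8 warrants a closer look

    sample = [("A", True), ("A", True), ("A", False),
              ("B", True), ("B", False), ("B", False)]
    print(positive_rates(sample), round(disparate_impact(sample), 2))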

There currently are no laws that guarantee the privacy of people’s data, or ensure their data is not being used unethically. But there likely will be soon, and the sooner companies realize that momentum is building and respond to it, the better positioned they’ll be when it becomes law and consumers expect their rights to be upheld.

Related Items:

Building a Successful Data Governance Strategy

California’s New Data Privacy Law Takes Effect in 2020

The Wild West and Last Frontier of Big Data
