GDPR and the Transparency Revolution
It’s been over a year since GDPR went into effect, and in that time the regulation has sparked meaningful conversations about consumer privacy and enterprise data management policies. In an age where “data is the new oil,” fueling innovation and growth, enterprises face a host of new challenges when it comes to handling consumer data.
Large data breaches are constantly putting personal data at risk, and today, consumers are demanding greater transparency when it comes to the collection and use of their data, marking a shift from the days when everyone blindly accepted terms of service contracts from tech companies.
GDPR has forced companies to be honest and clear about how they leverage data now that consumers are paying more attention to the fine print. Individuals are better informed than ever about their privacy rights and are exercising them accordingly, and organizations are finally beginning to understand the importance of demonstrating accountability and transparency in how they collect, handle, and transfer personal data. One misstep, and enterprises run the risk of hefty, crippling fines or worse: losing customers.
Thanks to GDPR, companies are finally realizing that while data is a hugely valuable asset, it is also a significant liability. Companies that fail to prioritize transparency around data privacy and protection aren’t just risking financial loss and damage to brand reputation; they’re also risking non-compliance with the most important overarching obligation of the GDPR. To get ahead of this, proactive organizations have already taken steps to ensure transparency and compliance in the age of GDPR.
Capturing Consent at Every Step of the Customer Journey
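One way to make consent auditable is to record each grant or withdrawal as an immutable event, tied to the specific purpose and the version of the privacy notice the individual actually saw. The sketch below is purely illustrative; the field names and `record_consent` helper are assumptions, not part of any particular compliance product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    """An auditable record of a single consent event (illustrative sketch)."""
    subject_id: str      # pseudonymous identifier for the data subject
    purpose: str         # the specific processing purpose consented to
    policy_version: str  # version of the privacy notice shown at the time
    granted: bool        # True for opt-in, False for withdrawal
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def record_consent(log: list, subject_id: str, purpose: str,
                   policy_version: str, granted: bool) -> ConsentRecord:
    """Append a consent event to an append-only log and return it."""
    rec = ConsentRecord(subject_id, purpose, policy_version, granted)
    log.append(rec)
    return rec
```

Because each record is frozen and appended rather than overwritten, a withdrawal doesn’t erase the history of the original grant, which is what makes the trail demonstrable to a regulator.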
Adopt Data Minimization Practices
For a while, professional data scientists were in high demand, and companies sought to amass and subsequently analyze as much data as possible to drive business innovation and growth. This mentality has changed in the wake of GDPR.
Businesses are adopting data minimization practices at collection or at retention so they can handle less but more meaningful data, and they’re putting policies in place to dispose of it when it’s no longer needed or useful. Ultimately, an organization’s best cybersecurity measure is to collect less data and encrypt it so that it’s safe in the event of a breach.
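In code, minimization at collection often amounts to an allowlist of fields per processing purpose, and minimization at retention amounts to a cutoff check. The sketch below assumes a purpose-to-fields mapping and a 365-day retention window; both are illustrative choices, not prescribed by GDPR.

```python
from datetime import datetime, timedelta, timezone

# Fields permitted per processing purpose (illustrative allowlist)
ALLOWED_FIELDS = {
    "shipping": {"name", "address", "postal_code"},
    "newsletter": {"email"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Keep only the fields needed for the stated purpose; drop the rest."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

def expired(stored_at: datetime, retention_days: int = 365) -> bool:
    """True when a record has outlived its retention period and should be disposed of."""
    return datetime.now(timezone.utc) - stored_at > timedelta(days=retention_days)
```

The point of the allowlist approach is that anything not explicitly justified by a purpose is never stored in the first place, which is far easier to explain to a data subject than a blanket collection policy.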
Consider Encryption and Data Distribution
Encryption is what makes personal data indecipherable to anyone who isn’t authorized to see it, and data distribution is what splits information across multiple locations so it isn’t held all in one place, where it would be more susceptible to a cyber attack.
Companies that care about their customers, employees, and vendors as much as they care about being GDPR-compliant should adopt data minimization practices and leverage technology to encrypt and distribute the small amount of data they do collect.
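The intuition behind data distribution can be shown with a toy XOR-based secret-sharing sketch: the data is split into shares that are individually indistinguishable from random noise, so a breach of any single store reveals nothing. This is a teaching sketch, not production cryptography; real deployments would use an audited library and key management.

```python
import secrets

def _xor(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split(data: bytes, n: int = 3) -> list[bytes]:
    """Split data into n shares; any n-1 shares alone reveal nothing."""
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = data
    for s in shares:          # XOR the random shares into the final share
        last = _xor(last, s)
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """Reconstruct the original data by XOR-ing all shares together."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = _xor(out, s)
    return out
```

Storing each share with a different provider means an attacker must compromise every location at once, which is the property the article’s “data distribution” is driving at.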
Don’t Skimp on Quality Cybersecurity
We live in a world where companies are collecting and handling more personal data than ever before, but holding any amount of data is risky if an organization isn’t doing enough to protect it.
The benefit of having less data to manage is that there’s less data to steal, but that shouldn’t exempt companies from implementing cybersecurity strategies and technologies that don’t just “tick the box” for GDPR compliance, but that are actually effective at thwarting cyber crime. New technologies can provide strong data protection while still enabling companies to offer compelling, customized digital experiences that give customers confidence their data isn’t being carelessly collected or processed.
It’s time for global organizations to proactively address the fact that data is both an asset and a liability by being more thoughtful about how much data they’re capturing instead of storing everything, unfiltered, for some unforeseen future use.
Companies need data to function; it’s an asset to every organization, and it would be unrealistic (and unfair) to ask a company to stop collecting it entirely. However, businesses that understand the risks associated with negligent data management practices will be better positioned to comply with GDPR and offer clarity to data subjects. Collecting only the most relevant information for a very specific purpose is not only easier to explain to individuals, but it also gives them confidence that you won’t misuse or abuse their data.
About the author: David Thomas is the CEO of Evident ID, a provider of online identity verification solutions. David has held key leadership roles at Motorola, AirDefense, VeriSign, and SecureIT. Since being recruited at a young age by the Department of Defense, David has been at the forefront of cybersecurity: firewalls as corporations began connecting to the Internet, web security as online shopping emerged, wireless security as Wi-Fi and smartphones became ubiquitous, and security sensing networks as analytic technology became mainstream. He has been featured on CNN, in The Wall Street Journal, and in other leading publications.
October 16, 2019