Social Cooling: Living in a Big Data Society
“Data is not the new gold, it is the new oil, and it damages the social environment,” stated Tijmen Schep. Strong words, indeed. But if, over the last few years, you’ve been inundated with articles espousing the wonders of Big Data, then it’s high time for a wake-up call. To take Schep’s thought further: just as oil leads to global warming, data leads to social cooling. So why should any of us be worried about social cooling?
Big Data (and the big companies and industries behind it) is largely getting out of control, with everything now being turned into data. Too much centralised power, and responsibility, rests in the hands of a few private companies, with little accountability for how the data they collect and hold is processed. The notion that Big Brother is watching you has never felt so absolute, and the fact that many of us are changing our behaviour under this intense scrutiny is worrying, to say the least. Make no mistake: Big Data is supercharging this effect.
And we can’t talk about Big Data without illustrating the part that algorithms play in all of this mayhem. Essentially, whenever your data is collected and scored, so-called data brokers use algorithms to uncover all kinds of private details about you: friends and acquaintances, religious and political beliefs, even sexual orientation and economic stability.
And this is a good time to introduce the notion of “mathwashing”: the belief that because algorithms are built from mathematics, they must be neutral. Right? Wrong. People design algorithms (and the computers that run them) and, importantly, decide which data to use and how to weight it. Still think algorithms are neutral? Algorithms work on the data that humans provide, and that data can be untidy, often incomplete, sometimes fake, and coloured by the full range of human meaning. Again, to quote Schep, “bias in, bias out.”
Algorithms are not neutral; they make mistakes because people make mistakes, and they introduce bias because people introduce bias. We have arrived in an era in which computers do far more than expected, yet even those who created them aren’t fully aware of the consequences.
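“Bias in, bias out” can be made concrete with a toy sketch. The data, field names, and scoring rule below are entirely hypothetical, invented for illustration only; the point is that a perfectly “neutral” algorithm which learns from biased historical decisions will faithfully reproduce that bias.

```python
# Hypothetical historical loan decisions. The past decisions are biased:
# applicants from postcode "B" were rejected regardless of income.
history = [
    {"postcode": "A", "income": 30, "approved": True},
    {"postcode": "A", "income": 25, "approved": True},
    {"postcode": "B", "income": 30, "approved": False},
    {"postcode": "B", "income": 40, "approved": False},
]

def approval_rate(postcode):
    # Fraction of past applicants from this postcode who were approved.
    rows = [r for r in history if r["postcode"] == postcode]
    return sum(r["approved"] for r in rows) / len(rows)

def score(applicant):
    # A mathematically "neutral" scorer that simply learns past approval
    # rates per postcode inherits whatever bias those decisions contained.
    return approval_rate(applicant["postcode"])

print(score({"postcode": "A", "income": 20}))  # 1.0 -- favoured group
print(score({"postcode": "B", "income": 50}))  # 0.0 -- penalised despite higher income
```

The maths here is faultless, and yet the output is discriminatory, because the discrimination was already in the data it was fed.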
So how does any of this affect you or me? Well, for one thing, your digital footprint is always there: a record of all your historical online behaviour. And this ‘digital reputation’ can affect you in a myriad of ways. For example, what you post on Facebook might influence your chances of landing that job you applied for, or even cause you to lose a job. You could find that the rate on that loan you applied for is adversely affected because you have ‘bad’ friends. And don’t forget our earlier observations on the quality of the data that drives these decisions – these effects occur whether the data is good or bad.
The awareness many of us now have of being watched is already changing our online behaviour. Are you not clicking on a particular link because you think the visit will be logged and could negatively impact you? Are rating systems failing because people want to conform to an average? Digital reputation systems also serve to hamper free speech: under an authoritarian regime, who dares to say what they really think about those in power when it could cause problems in applying for jobs, loans or even travel visas?
Hopefully the message that personal data is being monetised by big business is also spreading. But how many people know that this is usually being done without their knowledge or consent? Remember, everything that’s shared online can be considered a public record – social media doesn’t offer any privacy. Perhaps this is why Facebook usage among young people in Germany is dropping rapidly?
Overall though, public awareness of social cooling remains low. Air pollution and climate change were once relatively unknown – now look at where we are, with the likes of David Attenborough and Greta Thunberg shouting from the rooftops. So we must raise awareness, though a big job lies ahead in changing how data and privacy are perceived. Schep posits that when algorithms judge everything we do, we need to protect the right to make mistakes. When everything is remembered as big data, we need the right to have our mistakes forgotten.
Which leads us to the notion of privacy. Privacy is the right to be imperfect, even when judged by algorithms. We should be able to click on that link without fear, make comments without reprisal, and befriend whom we want without affecting our ability to get a job or a loan. Privacy and anonymity are the cornerstones of any decent VPN, which allows you to overcome blockages, geo-restrictions and ad tracking (a massive data compiler). I passionately believe in protecting those who wish to maintain their privacy and anonymity online, and I look forward to helping raise awareness as data mining, and the information it exposes about all of us, grows ever more aggressive.
About the author: Sebastian Schaub has been working in the internet security industry for over a decade. He started the hide.me VPN eight years ago to make internet security and privacy accessible to everybody.