

The U.S. National Security Agency (NSA) continues to collect vast amounts of personal data, allowing the agency and other actors to put that data to questionable use, according to a new study on the “trolling” of personal communications and its impact on big data.
Market research firm Frost & Sullivan found in a study of the NSA and the big data market that the spy agency sweeps up information from 99 percent of calls placed within and outside the U.S. Even when calls originate outside the U.S., they are frequently routed over network equipment owned by American carriers, and the NSA has ready access to the data carried by these commercial networks.
That ability creates a host of vulnerabilities that affect the big data market, said the report’s author, Jeff Cotrupe, Frost & Sullivan’s industry director for big data and analytics. “By figuratively placing all relevant communications in the U.S. on a dashboard for at-a-glance monitoring, the NSA is creating a scenario where an outside entity that gained control of NSA systems could conceivably and swiftly do a great deal of damage,” the report warns.
The best hope for plugging the security holes is a combination of research advances and legislation, including a bill in the U.S. Senate that would limit the NSA’s data collection mandate. Sen. Patrick Leahy, D-Vt., chairman of the Senate Judiciary Committee, introduced the legislation aimed at reining in NSA surveillance on July 28, just before Congress left for a five-week recess.
Meanwhile, Harvard University’s Center for Research on Computation & Society has launched a “differential privacy” initiative that would probe political and ethical issues that extend beyond technical questions. One goal is to develop software-driven big data privacy solutions.
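The core idea behind differential privacy is releasing aggregate statistics with calibrated random noise, so that no individual's presence in the data can be inferred from the output. A minimal sketch of the standard Laplace mechanism follows; the function names and numbers are illustrative only and do not reflect the Harvard project's actual software:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a zero-mean Laplace distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise scaled to sensitivity/epsilon.

    Smaller epsilon means more noise and stronger privacy; a count query
    has sensitivity 1 because adding or removing one person changes it by 1.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# Example: release roughly how many users placed an international call,
# without revealing whether any single user is in the tally.
noisy = private_count(true_count=1342, epsilon=0.5)
```

The released value stays useful in aggregate (the noise averages out over many queries or large counts) while any one record's contribution is masked.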
“If successful, this could persuade data hunter-gatherers across law enforcement, public policy, and private commerce to use applied technology to support things like a healthier population–while ensuring things like the U.S. Constitution are still breathing, too,” Cotrupe added.
Much of the concern focuses on NSA programs exposed by former agency contractor Edward Snowden, particularly the agency’s PRISM program, which reportedly focuses on obtaining direct access to most U.S. electronic communications and credit card transactions. The extension of these eavesdropping programs to smartphones and other mobile devices threatens to have a “chilling effect on the big data market, and many industries that rely on big data,” Frost & Sullivan warned.
The stakes are high for the estimated $25 billion global big data market. Frost & Sullivan forecast that the market would grow at a 12.7 percent compound annual rate through 2017, with revenues surging from about $28 billion in 2014 to about $40 billion by 2017.
The market researcher said key drivers of the big data market include mobile, retail and location analytics, social media and site analytics, marketing and sales data along with business process and strategic analytics.
While the NSA claims its interest is limited to collecting metadata, the agency has been working with its U.K. counterpart, according to Frost & Sullivan, on techniques that could be used to crack the Secure Sockets Layer (SSL) encryption used to protect web-based communications and transactions.
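The distinction between metadata and content is thinner than it sounds: even a handful of call records, with no audio at all, supports link analysis. A toy sketch with invented names (nothing here reflects actual NSA tooling):

```python
from collections import Counter

# Hypothetical call-detail records: (caller, callee, timestamp) only -- no content.
records = [
    ("alice", "bob",   "2014-07-01T09:00"),
    ("alice", "bob",   "2014-07-02T09:05"),
    ("alice", "carol", "2014-07-02T18:30"),
    ("dave",  "bob",   "2014-07-03T11:15"),
]

# Tally who talks to whom; frequent pairs expose relationships
# even though no call was ever listened to.
link_counts = Counter((caller, callee) for caller, callee, _ in records)
strongest_link, calls = link_counts.most_common(1)[0]
# strongest_link == ("alice", "bob"), calls == 2
```

Scaled to billions of records, the same tally reveals social graphs, daily routines, and associations, which is why privacy researchers treat metadata collection as surveillance in its own right.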
Along with targeting content encryption, the market researcher reported that the NSA is also probing for weaknesses in the Advanced Encryption Standard (AES), an encryption algorithm used to secure sensitive but unclassified government information.
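Probing for weaknesses is the only practical avenue of attack, because brute-forcing AES keys is hopeless. Back-of-envelope arithmetic, under a deliberately generous assumption about attacker hardware, makes the point:

```python
# AES-128 key space: why attacks look for flaws in algorithms and
# implementations rather than trying every key.
keys = 2 ** 128
guesses_per_second = 10 ** 12          # assume a trillion guesses/sec (generous)
seconds_per_year = 60 * 60 * 24 * 365
# On average an attacker must try half the key space.
expected_years = (keys // 2) // (guesses_per_second * seconds_per_year)
# expected_years is on the order of 10**18 -- far beyond the age of the universe
```

So any real-world compromise comes from weakened standards, side channels, or stolen keys, not exhaustive search.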
Recent items:
Big Data Meets Big Brother: Inside the Utah Data Center
Feds Attempt Big Data Rehab After Snowden Revelations