August 12, 2013

Big Data, the Thought Police, and the Decision Problem

Alex Woodie

In George Orwell’s famous book 1984, citizens of Oceania live in constant fear of the Thought Police, a group of government agents who prosecute individuals for having thoughts that are contrary to the dogma laid down by Big Brother. Now, some prominent observers are saying that the U.S. Government has the Big Brother-esque capability to monitor people’s thoughts, as expressed over the Internet, via surveillance programs, such as the NSA’s recently disclosed PRISM program.

The idea that the U.S. Government can read people’s thoughts was recently put forth by George Dyson, the author and historian who wrote the acclaimed 2012 book Turing’s Cathedral. Dyson, who is the son of theoretical physicist Freeman Dyson, recently expressed his views on government thought monitoring in an essay he wrote for Edge, titled “NSA: The Decision Problem.”

“The ultimate goal of signals intelligence and analysis,” Dyson writes, “is to learn not only what is being said, and what is being done, but what is being thought. With the proliferation of search engines that directly track the links between individual human minds and the words, images, and ideas that both characterize and increasingly constitute their thoughts, this goal appears within reach at last.”

Of course, the machines can’t know exactly what people are thinking from moment to moment. Not only is that probably impossible, it is also unnecessary. “A reasonable guess at what you are thinking is good enough,” Dyson writes.

Under the banner of preventing terrorist acts, the U.S. Government mines data gathered from the Internet and uses supercomputers to run algorithms that generate educated guesses at what people are thinking. Dyson is clearly disturbed at the prospect of an automated thought-monitoring system connected to a fleet of government drones equipped with missiles.

“This is only a primitive first step toward something else,” Dyson writes. “Why kill possibly dangerous individuals (and the inevitable innocent bystanders) when it will soon become technically irresistible to exterminate the dangerous ideas themselves?”

However, this approach won’t work, because it’s impossible to differentiate the dangerous ideas from good ones, Dyson writes, citing Alan Turing’s 1936 paper, “On Computable Numbers, with an Application to the Entscheidungsproblem.”

The Entscheidungsproblem, German for “decision problem,” asks whether there is any systematic mechanical procedure that can determine, in a finite number of steps, whether a given string of symbols represents a provable statement. Turing showed that no such procedure can exist. “In modern computational terms,” Dyson writes, “no matter how much digital horsepower you have at your disposal, there is no systematic way to determine, in advance, what every given string of code is going to do except to let the codes run, and find out.”
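Dyson is invoking what is now usually framed as the halting problem. A minimal sketch in Python (the halts() and paradox() names below are illustrative, not anything from Dyson’s essay or Turing’s paper) shows why no universal decider can exist:

def halts(program_source, input_data):
    """Hypothetical oracle: True if the program halts on the given input.
    Turing's 1936 result is that no such general decider can be written."""
    raise NotImplementedError("no universal decision procedure exists")

def paradox(program_source):
    """Do the opposite of whatever the oracle predicts about this
    program when it is fed its own source code."""
    if halts(program_source, program_source):
        while True:      # oracle says "halts" -- so loop forever
            pass
    else:
        return           # oracle says "loops forever" -- so halt at once

If halts() could actually be implemented, running paradox() on its own source code would have to both halt and loop forever, a contradiction. The same barrier applies to any machine that claims to classify, in advance, what a given string of code, or a given idea, is going to do.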

Dyson doesn’t dispute the need to conduct surveillance, but is clearly worried about the direction of the U.S.’s security state. The big problem is the cloak of secrecy that the government casts over its most powerful programs, such as the NSA’s PRISM program. He says he prefers the system employed by the UK, where surveillance cameras are a ubiquitous part of British society, and the rules under which they can be used are understood by all.

“We are facing a fundamental decision (as Turing anticipated) between whether human intelligence or machine intelligence is given the upper hand,” Dyson writes. “The NSA has defended wholesale data capture and analysis with the argument that the data (and metadata) are not being viewed by people, but by machines, and are therefore, legally, not being read. This alone should be cause for alarm.”

President Barack Obama on Friday pledged greater transparency of the government’s surveillance programs. “It makes sense for us to go ahead and lay out what exactly we’re doing…and see if we can do this better,” the President said in a press conference.

“I don’t have an interest and the people at the NSA don’t have an interest in doing anything other than making sure that where we can prevent a terrorist attack, where we can get information ahead of time, that we’re able to carry out that critical task,” he continued.

“I’m comfortable that the program currently is not being abused. I’m comfortable that if the American people examined exactly what’s taking place, how it was being used, what the safeguards were, that they would say, you know what, these folks are following the law and doing exactly what they say they’re doing.”

Related Items:


Big Data Meets Big Brother: Inside the NSA’s Utah Data Center 

Data Athletes and Performance Enhancing Algorithms 

Interplay between Research, Big Data & HPC 
