March 14, 2013

CIA Prophet Pointed to Big Data Future

Nicole Hemsoth

A report written in 1962 and recently released by the CIA reveals that in the early 1960s, forward-thinking minds in the intelligence community were already considering the implications of the tech trend we now call “big data.”

“What does the size of the next coffee crop, bullfight attendance figures, local newspaper coverage of UN matters, the birth rate, the mean daily temperatures or refrigerator sales across the country have to do with who will next be elected president of Guatemala?” asks Orrin Clotworthy in the report, which he styled “a Jules Verne look at intelligence processes in a coming generation.”

“Perhaps nothing,” he answers, but he notes that there is a cause behind each vote cast in an election, and that many quantitative factors may exist to help shape that decision. “To learn just what the factors are, how to measure them, how to weight them, and how to keep them flowing into a computing center for continual analysis will some day be a matter of great concern to all of us in the intelligence community,” prophesied Clotworthy, describing the challenges that organizations around the globe face fifty years after the report was authored.

In his report, Clotworthy describes a day when certain data points can be isolated to provide a type of temperature reading on a given populace. “Once we had succeeded in isolating these factors, could we not then begin to watch the key phenomena continuously, gathering them in and collating them so that at any instant we could read from them the temper of the populace under study?”

The report also provides an interesting history lesson on computing, with some of the challenges it describes ringing familiar to ones faced in environments today, such as data governance. Clotworthy writes: “IBM has developed for public use a computer-based system called the ‘Selective Disseminator of Information.’ Intended for large organizations dealing with heterogeneous masses of information, it scans all incoming material and delivers those items that are of interest to specific offices in accordance with ‘profiles’ of their needs which are continuously updated by a feed-back device.”
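The idea Clotworthy quotes, matching incoming items against per-office interest profiles that a feedback device keeps current, maps neatly onto a few lines of modern code. Below is a minimal sketch of that pattern; the office names, keyword sets, and simple word-overlap matching rule are illustrative assumptions, not a description of the actual IBM system.

```python
def disseminate(items, profiles):
    """Route each incoming item to every office whose profile keywords it mentions."""
    deliveries = {office: [] for office in profiles}
    for item in items:
        words = set(item.lower().split())
        for office, keywords in profiles.items():
            if words & keywords:  # any overlap between item words and the profile
                deliveries[office].append(item)
    return deliveries

def feedback(profiles, office, relevant_words):
    """A crude 'feed-back device': grow an office's profile with terms it found relevant."""
    profiles[office] |= {w.lower() for w in relevant_words}

# Hypothetical offices and traffic, echoing Clotworthy's examples
profiles = {
    "economics": {"coffee", "exports", "refrigerator"},
    "political": {"election", "president"},
}
items = [
    "Coffee crop forecast revised upward",
    "President announces election date",
    "Mean daily temperatures fall",
]
routed = disseminate(items, profiles)
# routed["economics"] gets the coffee item; routed["political"] gets the election item;
# the temperatures item matches no profile until feedback broadens one.
feedback(profiles, "economics", ["temperatures"])
```

After the feedback call, re-running `disseminate` would deliver the temperatures item to the economics office as well, which is the continuous-updating loop the quote describes.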

“Why do we need the computer?” concludes Clotworthy. “Partly because of the staggering tasks and the shrinking time limits imposed on us by the space-age cold war, we need to delegate to it routine, repetitive arithmetical and logical calculations, thereby permitting fuller application of human skills to problems of judgement.”

You can view a copy of the prophetic report here.

Related Items:

The NSA, Big Data, and “Total Information Awareness” 

Facial Analytics Take on New Expressions 

Targeting Big Data Ethics 
