May 21, 2015

5 Ways Big Geospatial Data Is Driving Analytics In the Real World


Amid the flood of data we collect and contend with on a daily basis, geospatial data occupies a unique place. Thanks to the networks of GPS satellites and cell towers and the emerging Internet of Things, we’re able to track and correlate the location of people and objects in very precise ways that were not possible until recently. But putting this geospatial data to use is easier said than done. Here are five ways organizations can use geospatial data to fuel analytics in the real world.

  1. Dynamic Insurance Pricing

One of the leaders in geospatial data is Pitney Bowes, which maintains an extensive catalog of geospatial data, as well as geospatial encoding engines that run on their own or plug into high-performance databases such as SAP HANA. James Buckley, senior vice president and general manager of customer data and location intelligence in Pitney Bowes’ software division, recently discussed several use cases for geospatial intelligence with Datanami.


Given any location in the world (in lat/long format), Pitney Bowes can tell you exactly where that is in terms of a street address, and overlay that point with other useful data provided by partners. It’s been delivering that sort of geocoding capability for years, and counts some of the world’s biggest companies as customers. But with the flood of data coming from sensors and phones these days, the potential for geocoding is only getting bigger.
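
Pitney Bowes’ engines themselves are proprietary, but the core of reverse geocoding, snapping a lat/long pair to the nearest known address point, is easy to sketch. The Python below is a minimal illustration with invented addresses and a brute-force nearest-neighbor lookup; a production engine would search hundreds of millions of address points through a spatial index.

    import math

    # Tiny illustrative reference set (invented addresses).
    ADDRESSES = [
        (40.7484, -73.9857, "350 5th Ave, New York, NY"),
        (40.7580, -73.9855, "1560 Broadway, New York, NY"),
        (40.7527, -73.9772, "89 E 42nd St, New York, NY"),
    ]

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two lat/long points, in kilometers.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * 6371.0 * math.asin(math.sqrt(a))

    def reverse_geocode(lat, lon):
        # Brute force for clarity; real engines use R-trees or geohash indexes.
        return min(ADDRESSES, key=lambda a: haversine_km(lat, lon, a[0], a[1]))[2]

    print(reverse_geocode(40.7490, -73.9860))  # -> 350 5th Ave, New York, NY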

One of the new use cases that Pitney Bowes is exploring with its clients involves dynamic automobile insurance pricing. “If I’m an underwriter with a global data set and a portfolio distributed around the world, I want to look, in real time, at what sorts of risk exposure I might be dealing with,” Buckley says. “I need to be able to see, from a spatial perspective, what’s the potential impact of an event. I may want to do scenario modeling, or I may want to look at something in real time that requires considerable spatial horsepower. And you also need a lot of reference data sets, including geocoding, maps, weather data, etc.”

This type of streaming data analytics problem combines various real-time and historical data sources. “You’re effectively tracking where people are driving and how fast they’re driving, and my reference data [of historical traffic data] sits over the road network,” he says. “You need to quickly bring that together and run some analytics on it to give me a score.”
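
Buckley didn’t publish any scoring logic, so the sketch below only illustrates the pattern he describes: join a live telemetry reading against historical reference data for the same road segment and return a single risk score. The segment names, weights, and numbers are all invented for illustration.

    # Historical reference data keyed by road segment (invented numbers).
    SEGMENT_HISTORY = {
        "I-85_exit_34": {"speed_limit": 65, "accident_rate": 1.8},
        "peachtree_st": {"speed_limit": 35, "accident_rate": 4.2},
    }

    def risk_score(segment_id, observed_speed_mph, raining=False):
        # Blend real-time behavior with historical segment risk.
        # Weights are arbitrary placeholders; the score is capped at 100.
        ref = SEGMENT_HISTORY[segment_id]
        speeding = max(0.0, observed_speed_mph - ref["speed_limit"]) / ref["speed_limit"]
        base = 10.0 * ref["accident_rate"]   # historical road risk
        live = 50.0 * speeding               # real-time driving behavior
        weather = 15.0 if raining else 0.0   # contextual reference data
        return min(100.0, base + live + weather)

    print(risk_score("peachtree_st", 48, raining=True))  # ~75.6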

While such a big data solution requires a lot of computing horsepower, what’s really needed is the right spatial context. “Everyone talks about big data. It’s become a bit of a cliché. But how do you actually give it context so you can draw some sort of sensible conclusion from it? Spatial is critical to that,” Buckley says. “You’ve got all this data, but you need to be able to give it some kind of context.”

  2. Positioning Emergency Responders

When a big storm hits and the power goes out, citizens count on their governmental agencies and public utilities to work together to get the lights back on. While the location and timing of storms remains unpredictable to a large degree, agencies and utilities can leverage the power of geospatial analytics to narrow the gap.


One company at the forefront of this sort of work is Space-Time Insight, a Silicon Valley firm that helps public utilities, logistics companies, oil and gas firms, and federal agencies combine and interpret various types of data to maximize the effectiveness of physical assets in the field.

Space-Time’s clients are often already storing data generated by field sensors in a big data platform like Hadoop, HANA, or Greenplum, but they lack the specialized software needed to spot the patterns or anomalies that have spatial or temporal aspects to them. Steve Ehrlich, the company’s senior vice president of marketing and product management, recently explained to Datanami how it all works.

“We help the user identify, out of the mountain of big data, the key pieces of information that they have to pay attention to,” he says. “We’re very good at analyzing data spatially, and understanding what’s nearby, and analyzing it temporally, including what happened in the past, what’s happening now, and what’s going to happen in the future.”

The company’s product, called Situation Intelligence (SI) Server, helps decision makers see through the data clutter, often by displaying data overlaid on mapping software from Google or ESRI. “We’re trying to help the user understand ‘This needs attention’ or ‘You need to look at that because it’s going to fail’ or ‘Hey, there’s a storm coming. Here’s where you should place your crews in advance of that storm,'” Ehrlich says. “We’re pulling all of that data together, processing it in memory…and deciding, based on the analysis, what to present to the user.”
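
SI Server’s internals aren’t public, but the “what’s nearby, and what happened recently” filtering Ehrlich describes can be approximated with a spatial-radius and time-window query. Everything below, from the sensor readings to the thresholds to the crude degree-box standing in for a real spatial index, is invented for illustration.

    from datetime import datetime, timedelta

    # Invented field-sensor readings: (sensor_id, lat, lon, timestamp, load_pct).
    READINGS = [
        ("xfmr-17", 37.770, -122.410, datetime(2015, 5, 21, 9, 50), 98.0),
        ("xfmr-22", 37.780, -122.420, datetime(2015, 5, 21, 9, 55), 61.0),
        ("xfmr-31", 37.900, -122.300, datetime(2015, 5, 21, 8, 10), 99.0),
    ]

    def nearby_recent_alerts(lat, lon, now, radius_deg=0.05,
                             window=timedelta(hours=1), threshold=95.0):
        # Flag readings that are spatially close, recent, and over threshold.
        return [r for r in READINGS
                if abs(r[1] - lat) < radius_deg and abs(r[2] - lon) < radius_deg
                and now - r[3] <= window and r[4] >= threshold]

    now = datetime(2015, 5, 21, 10, 0)
    print(nearby_recent_alerts(37.775, -122.415, now))
    # Only xfmr-17 qualifies: xfmr-22 is under threshold; xfmr-31 is far and stale.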

  3. Drones!

Drones, or unmanned aerial vehicles (UAVs) as the industry calls them, have been all over the news lately. And as you might expect, there’s a big data angle to them, especially in the context of location intelligence and geographic information systems (GIS) products, such as those from ESRI.


UAVs are emerging as a terrific way to gather image data from the air. According to the Flightline Geographics subsidiary of ESRI partner Waypoint Mapping, UAVs can capture images with resolutions down to one inch, and deliver that data within hours, compared to the days typically required by manned aircraft.

Agriculture is poised to be the single largest beneficiary of the combination of UAV and GIS, according to Flightline Geo’s Devon Humphrey, who was featured in a story last year by ESRI News. “Farmers are already purchasing their own UAVs and inspecting their fields,” he says.

But because of the restrictions on UAVs imposed by the Federal Aviation Administration, the US is lagging behind other countries. “Places like Canada, Australia, New Zealand, Mexico, South Africa, China, and other locations, have already adopted the tool, and it is following the adoption curve of GPS in the 1990s,” Humphrey says. “In other words, explosive growth.”

  4. Boosting Food Production

We currently have about 7 billion people on the planet, and many of them don’t get enough to eat. With another billion people expected over the next 10 years, food insecurity is expected to be an even bigger problem than it is today. Luckily, farmers are starting to use big data techniques to ramp up food production.


In the U.S., IBM is at the forefront of applying big data analytics in agriculture. Deciding when and where to water, and by how much, is a big part of a farmer’s job, and now Big Blue is bringing big data and location analytics to bear on that problem.

Down in Georgia, along the Flint River, IBM worked with the University of Georgia and local farmers to test a new setup that combines a network of field sensors and atmospheric observations with a supercomputer to create hyperlocal forecasts at super fine resolutions. The idea was to get good at identifying where “pop up” thunderstorms are most likely to happen, and thereby save water for the fields that don’t get rain.

Getting the forecast resolution required to accurately predict those events is really hard, according to Lloyd Treinish, an IBM Distinguished Engineer and Chief Scientist for its Deep Thunder supercomputer. NOAA forecasters typically predict weather in 12-kilometer blocks on an hourly basis. But to predict pop-up thunderstorms along the Flint, IBM dialed the resolution down to 1.5-kilometer blocks at 10-minute intervals.

“Even if you’re looking at something with a four to five kilometer resolution every hour, you’ll miss the thunderstorm cell because it’s smaller than that,” Treinish told Datanami in 2014. “The resolution in time and in geography is really very important. It’s not just pop-up thunderstorms, but also if you have a strong cold front that creates a large amount of thunderstorm cells, the amount of the rainfall can still be very localized based on the topography, and then the soil and vegetation will determine how much moisture actually gets into the soil from the rain. You have to incorporate all those local features to get that information correct.”
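
The arithmetic behind Treinish’s point is straightforward: a storm cell a couple of kilometers across falls between the samples of a coarse grid, and tightening the grid multiplies the workload quickly. The quick calculation below uses the resolutions cited above; the 100-by-100-kilometer coverage area is an assumed example.

    # Compare a coarse hourly forecast grid with IBM's hyperlocal setup
    # over an assumed 100 km x 100 km area and a 24-hour horizon.
    AREA_KM = 100.0

    def cell_steps_per_day(resolution_km, minutes_per_step):
        grid_cells = (AREA_KM / resolution_km) ** 2
        time_steps = 24 * 60 / minutes_per_step
        return grid_cells * time_steps

    coarse = cell_steps_per_day(12.0, 60)  # 12 km blocks, hourly
    fine = cell_steps_per_day(1.5, 10)     # 1.5 km blocks, every 10 minutes

    print(round(coarse))         # ~1,667 cell-steps per day
    print(round(fine))           # 640,000 cell-steps per day
    print(round(fine / coarse))  # 384x more work
    # A ~2 km storm cell spans less than one 12 km block, so a coarse grid
    # can sample right past it; at 1.5 km resolution it covers several cells.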

  5. Retailer Analytics

There’s no shortage of ways that technologists have attempted to track customer sentiment. But in the retail market, adding a geographic element can make all the difference in the world.


One vendor that’s aiming to apply a geospatial element to the retail space is SpaceCurve. Founded by an engineer who worked on the first version of Google Earth, SpaceCurve delivers what it considers to be the industry’s most scalable database for geospatial and temporal data. “It’s a general purpose big data platform,” says founder Andrew Rogers, “but it was fundamentally built…to be able to analyze the relationships across all of those data sources, whether satellite imaging, social media, cell phone telemetry, retail sales,…whatever you like.”

Retailers are adopting the SpaceCurve product to do real-time analysis of time-stamped and location-stamped consumer data, says CEO Dane Coyer. “Retailers are trying to figure out what the demographics are of the people going past their locations. And did they stop at my competitor on the way to my store, or did they stop at my competitor after they leave my store? Those are the sorts of questions they’re looking to answer.”

This type of highly segmented and targeted marketing is commonplace on the Web today, Coyer says. “But not so much in the physical world, so one of the use cases is bringing that capability out into the physical world,” he says.
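
SpaceCurve hasn’t published its query interface, so the snippet below is only a schematic of the question Coyer poses: given a shopper’s time-ordered location pings, did the competitor visit come before or after the visit to your store? The ping data and venue names are invented.

    # A shopper's time-ordered location pings, already matched to venues
    # (invented data; real systems snap raw lat/long telemetry to places).
    pings = ["parking_lot", "competitor_A", "my_store", "coffee_shop"]

    def competitor_visit_order(pings, my_store="my_store", competitor="competitor_A"):
        # Report whether the competitor visit preceded or followed my store.
        if my_store not in pings or competitor not in pings:
            return "no comparison possible"
        return "before" if pings.index(competitor) < pings.index(my_store) else "after"

    print(competitor_visit_order(pings))  # -> before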
