August 28, 2015

Cast Your Models Into Data Streams and Take Advantage of the IoT

Evan Guarnaccia

Using data in real time used to be a challenge. Lengthy extraction, transformation and loading processes made analyzing data an activity more akin to glancing in the rearview mirror than reading road signs as you pass. And with excitement building around the Internet of Things, many simply assume that harnessing sensor technology is an IT budget breaker. That doesn't have to be the case, however. Event stream processing combines high throughput (millions of events per second) and low latency (delays undetectable by humans) to make it easier than ever to work with vast amounts of data.

Event stream processing technology can sort data streams into three categories: information that needs immediate attention; information that can be ignored; and information that needs to be stored for later analysis–giving users, and the systems and applications that depend on them, what they need to take quick action.
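
Here is a rough sketch in Python of that three-way triage. The event fields, severity scores and cutoffs are illustrative assumptions made for the sake of the example, not any particular product's API:

    # Minimal triage sketch; "severity" and its cutoffs are assumed
    # fields, not part of a real event stream processing API.
    def triage(events):
        for event in events:
            severity = event.get("severity", 0)
            if severity >= 8:
                yield ("act_now", event)    # needs immediate attention
            elif severity <= 2:
                continue                    # safe to ignore
            else:
                yield ("store", event)      # keep for later analysis

    sample = [{"severity": 9}, {"severity": 1}, {"severity": 5}]
    for action, event in triage(sample):
        print(action, event)                # the low-severity event is dropped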

For example, a telco provider has access to real-time, detailed phone usage activity, such as the knowledge that a customer is on the verge of running out of data. This information can be used immediately: the provider can send the customer a personalized offer to extend their data plan and avoid overage fees. Meanwhile, minute-by-minute information streaming in from other customers who are nowhere near busting their data plans can be ignored. In addition, more general information about who regularly comes close to, or goes over, their data plan would be stored to help model the best plans to offer those individuals in the future.

The healthcare industry can also benefit greatly from event stream processing technology. The continuing development of wearable medical devices will provide vital information that hospitals and care providers can use to reduce costs. Because keeping a patient in the hospital is more expensive than ever, being able to analyze streams of vital-sign data in real time would allow more patients to be discharged while still being monitored and connected to the hospital. As patient vitals stream in, hospitals, first responders and patients can be notified instantly if there is an urgent need for care.

When using event stream processing technology, it is helpful to have a graphical way to develop and examine models, making it easy for the model designer and others to interpret the data.

In addition, text analytics is increasingly important as the interaction between companies and their customers through social media continues to grow. Event stream processing products should offer a way to analyze unstructured text from social media and other sources for entity mentions, sentiment and context. The ability to process social media feeds, weather alerts and other notification systems in real time allows a company to respond quickly, and with appropriate messages and actions, when important events happen. Applications include addressing problems, orchestrating logistics, advertising, marketing, and more.
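
To make that concrete, here is a toy Python sketch of sentiment tagging on a stream of posts. The keyword lists and routing labels are stand-ins of my own; real text analytics engines use far richer models:

    # Toy sentiment tagging; keyword lists are illustrative assumptions.
    POSITIVE = {"love", "great", "fast"}
    NEGATIVE = {"outage", "slow", "broken"}

    def tag_posts(posts):
        for post in posts:
            words = set(post.lower().split())
            score = len(words & POSITIVE) - len(words & NEGATIVE)
            if score < 0:
                yield ("alert_support", post)   # respond quickly to problems
            elif score > 0:
                yield ("amplify", post)         # marketing opportunity
            else:
                yield ("store", post)           # keep for later analysis

    stream = ["Service outage and so slow today", "Love the great new plan"]
    for action, post in tag_posts(stream):
        print(action, "->", post)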

The dawn of the Internet of Things will bring with it the need to embed event stream processing in a number of devices. Virtually all event stream processing products can run in a distributed environment, which is useful when events can wait to be processed until they reach a cluster–but a distributed deployment is not always the ideal architecture. Lightweight, embeddable event stream processing software will allow sensor data to be analyzed and acted on before it ever leaves the device.
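
As a sketch of what that embedded filtering might look like, the Python below forwards only out-of-band readings off the device. The band limits and the transmit() callback are assumptions for illustration:

    # On-device filter sketch: in-band readings never leave the device.
    LOW, HIGH = 10.0, 90.0          # assumed normal operating band

    def on_reading(value, transmit):
        """Called once per sensor sample; transmit() sends data off-device."""
        if value < LOW or value > HIGH:
            transmit({"value": value, "anomaly": True})
        # in-band readings are dropped locally, saving bandwidth

    on_reading(97.3, transmit=print)    # forwarded as an anomaly
    on_reading(42.0, transmit=print)    # silently dropped on the device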

Here are three ways for businesses to derive value using event stream processing technology:

1. Streaming Analytics – The Ability to Dip a Model into the Data Stream

In order to act on streaming data in real time, it's important to first make sense of that data. By dipping models (also known as continuous queries) into the data stream, the ingested data can be sent through a series of windows customized to apply the desired logic. In retail, for example, real-time scoring is what helps marketers know what kind of customized offer to suggest to a customer walking by an iBeacon. Without dipping into the data stream, the iBeacon can only suggest generic offers, which are far less effective.
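
A minimal Python sketch of such a continuous query follows. The window size and the dwell-time scoring rule are assumptions chosen for illustration, not how any specific iBeacon product works:

    # Continuous-query sketch: a sliding window of recent pings feeds a
    # scoring step on every arrival. Window size and rule are assumed.
    from collections import deque

    WINDOW = deque(maxlen=20)           # last 20 beacon pings

    def on_beacon_ping(ping):
        WINDOW.append(ping)
        dwell = sum(1 for p in WINDOW if p["shopper"] == ping["shopper"])
        if dwell >= 5:                  # shopper lingering near the display
            return "offer:aisle_%d" % ping["aisle"]
        return None

    offer = None
    for _ in range(6):
        offer = on_beacon_ping({"shopper": "s1", "aisle": 7})
    print(offer)                        # offer:aisle_7 once dwell builds up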

2. A Fully Integrated Approach

Many organizations think, “I just need something to work with the iBeacons; I don’t need event data streaming for all our data.” However, many innovations fail to achieve their full potential if they’re used in a discrete, standalone fashion. It is advantageous to be able to integrate event stream processing with other technologies, such as fraud management or CRM solutions. Robust event stream processing allows all downstream technologies to use resources efficiently and operate more effectively. For instance, if an organization is using an asset performance solution, historical data can be used to identify patterns for analytical models that are ‘dipped’ into the data stream to predict when a critical piece of equipment needs maintenance before it fails.
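
The sketch below shows the shape of that pattern in Python. The threshold “model” here stands in for one actually fit offline on historical asset data, and the field names are assumptions:

    # 'Dipping' a pre-trained model into the stream; the threshold model
    # is a stand-in for one trained offline on historical failures.
    class VibrationModel:
        def __init__(self, threshold):
            self.threshold = threshold          # assumed learned parameter

        def predict_failure(self, event):
            return event["vibration_rms"] > self.threshold

    model = VibrationModel(threshold=4.2)

    def on_sensor_event(event, notify):
        if model.predict_failure(event):
            notify("schedule maintenance for %s" % event["asset_id"])

    on_sensor_event({"asset_id": "pump-7", "vibration_rms": 5.1}, notify=print)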

3. A Real-Time Organizer

Not all bits of data are equal. Split-second decisions have to be made about what’s most relevant.

Event stream processing is designed to help make those decisions. Before landing data on disk, event stream processing can distinguish between data that’s critical for immediate action, data that should be stored and analyzed later, and data that is of little to no value. It doesn’t matter how wide and deep a data lake is or how savvy an analyst is at using a technology like Hadoop–at some point the data lake is going to reach flood stage. That makes it increasingly important to know what to store in the first place. Equipment armed with sensors, for instance, might send a bit (or byte) of data every second. If you have a predictive model dipped into the data stream, you can use it to foretell catastrophic failure. But for longer-term analysis of performance, keeping only the data from every fifth second might be more than enough.
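
Here is a small Python sketch of that retention rule, with the timestamp field and the score()/store() callbacks as illustrative assumptions: score every event in real time, but persist only one sample per five seconds:

    # Score everything in real time; store roughly one sample per 5 seconds.
    last_stored = None

    def on_reading(reading, score, store):
        global last_stored
        score(reading)                          # real-time path: every event
        if last_stored is None or reading["ts"] - last_stored >= 5:
            store(reading)                      # long-term path: 1-in-5 seconds
            last_stored = reading["ts"]

    for ts in range(12):
        on_reading({"ts": ts, "value": ts * 0.1},
                   score=lambda r: None,
                   store=lambda r: print("stored", r))   # stored at ts 0, 5, 10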

When streaming data is first collected, it’s in raw form. Event stream processing can also perform data management tasks such as data cleansing, normalization, standardization, missing value substitution and other common refinement activities. Having the technology to perform these functions on live data streams means that by the time data lands in storage, it is already in a useful form.
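
For instance, a cleansing step might look like the Python sketch below. The field names, default value and Fahrenheit-to-Celsius standardization are assumptions for illustration:

    # In-flight cleansing sketch: substitute missing values and
    # standardize units before the record lands in storage.
    def clean(record, default_temp_c=20.0):
        temp = record.get("temp")
        if temp is None:
            temp_c = default_temp_c             # missing value substitution
        elif record.get("unit") == "F":
            temp_c = (temp - 32) * 5.0 / 9.0    # standardize to Celsius
        else:
            temp_c = float(temp)
        return {"device": record.get("device", "unknown"),
                "temp_c": round(temp_c, 2)}

    print(clean({"device": "th-1", "temp": 98.6, "unit": "F"}))  # temp_c: 37.0
    print(clean({"device": "th-2"}))                             # default applied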

Modeling, integration and organizational components make event stream processing a must-have as the Internet of Things becomes less talk and more reality.

About the author: Evan Guarnaccia is a SAS Solutions Architect specializing in event stream processing and Internet of Things applications. Using these technologies, he helps customers realize their real-time business goals. He holds a PhD in physics from Virginia Tech.

 

Related Items:

Hortonworks Boosts Streaming Analytics, IoT Plays with NiFi Deal

The Secret to Generating Value from IoT Data

Streaming Analytics Ready for Prime Time, Forrester Says
