Datawatch’s Big Data Visualization Strategy
One of the data analytics software vendors to keep an eye on in the coming years is Datawatch, which just spent $31 million to buy Panopticon Software, the Swedish developer of data visualization tools. The company is now integrating Panopticon’s software with its own, with the goal of delivering a single platform for visualizing all of a customer’s data assets, from both historical and real-time perspectives.
Prior to its acquisition of Panopticon, Datawatch’s strength lay in the area of processing semi-structured and unstructured data. Its flagship Monarch offering gave customers the capability to ingest all types of data and documents (including reports in plain text or PDF formats), create a model based on the metadata contained in those reports, and yield more structured data as output.
Monarch gave organizations an effective and low-cost alternative to developing big data warehouses to re-house information that was already being generated from critical business applications running on mainframes, Unix servers, and AS/400s. This approach enabled Monarch to attract more than 35,000 customers over the last 20-plus years.
As the big data revolution unfolded, Datawatch’s approach to business intelligence needed an update. Of particular interest to Datawatch was what vendors like Tableau Software and QlikTech were doing with data visualization. With new leadership in place under CEO Michael Morrison, who joined Datawatch two years ago, the company decided to go shopping.
“We started evaluating companies we felt could add the capability to visualize the vast majority of data we were looking for,” says Ben Plummer, a BI industry veteran who joined Datawatch recently as its chief marketing officer and senior vice president of strategic alliances. “We looked at eight to 10 companies, to be honest. Michael and I had both looked at Panopticon in the past. I looked at it when I was at IBM, and he looked at it while he was at Applix.”
The company agreed to buy the Swedish firm in June and finalized the deal three weeks ago, ahead of the close of Datawatch’s current quarter today. Datawatch has already completed the first phase of integrating the Windows-based desktop interfaces for the two products. Full integration of the desktops and server components will take one to two years.
The eventual goal is to deliver a single product that combines Datawatch’s capability to ingest and understand less-structured data sources with Panopticon’s data visualization and exploration capability, and to bring the resulting technology to bear on both historical and real-time streaming data. Few vendors, if any, can offer this capability today, Plummer says.
The vision of Datawatch 2.0 is to deliver a soup-to-nuts product that gives customers the capability to analyze all of their data assets. “We want to give customers the capability to combine traditional structured data, unstructured data, and real-time data to deliver both historical perspective and operational intelligence simultaneously,” Plummer says. “Right now, there’s no other technology in the industry doing this.”
The company will likely have to build, buy, or partner for some of the additional technology required to deliver on this promise. Datawatch already has hooks into Hadoop and Hive, and is working with NoSQL database vendor MarkLogic to expose its data store to the Datawatch-Panopticon combo. The combined product has also been integrated with SAP’s in-memory HANA database and application platform. “We’ll add connectors that allow you to leverage any [data] in your visualizations,” Plummer says.
In the near future, Datawatch will announce a new in-memory database technology for the Panopticon piece, Plummer says. This will give customers the capability to visualize larger data sets than they currently can, he says. This will be useful for exploring historical data, but it won’t be of much use for the real-time component, where data is in and out of the Datawatch software very quickly.
It will be interesting to see which direction Datawatch takes the technology. The company will not succeed if it seeks to be all things to all people. The Monarch technology gives Datawatch a good grasp on what happened, while the Panopticon technology is good at ingesting stream data. Maybe there will be some forecasting added to the mix in the future? It’s hard to say.
For now, the focus is on competing with the likes of Tableau, and maximizing the present, or what Plummer calls “real real time.”
“We’re literally able to take streaming data directly from its origin, whether it’s coming from a trading system or machine sensor or a message bus, ingest it and visualize that immediately,” he says. “The other vendors out there ingest it, put it in a database, and then visualize it. We’re real real time, which makes us different. And we can do the historical stuff too. And then the question is, do we want to go and predict the future with that? It’s always a possibility.”