April 9, 2012

Goodnight to BI Performance Claims

Datanami Staff

Jim Goodnight started small, tackling agricultural data at a software startup that he and a group of colleagues at North Carolina State University founded.

Over the last thirty years, however, the data those first agricultural users generated and crunched has multiplied at a staggering rate—right in line with nearly every other industry on the planet.

That small company went on to become one of the top analytics software firms in the world. Today, SAS provides the analytical backbone for countless Fortune 500 insurance, retail, finance and healthcare companies, and it continues to grow.

The SAS CEO says that there is an important difference between simple BI and true high performance analytics that vendors are not making clear. In fact, he says, much of the talk about “high performance” doesn’t account for the hardware and software foundations that separate more traditional BI tools of the trade from more advanced analytics capabilities.

As Goodnight said of the lack of distinction between high performance analytics and run-of-the-mill BI:

“We’re seeing a lot of other vendors talking about high performance, but what they’re talking about is simply moving an SQL database into memory along with the data. That does serve up data a lot faster, but when it comes to things like computing regressions and logistic models, they can’t do it because that type of computational ability is not built into databases and probably never will be.”
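
To make the distinction concrete, the sketch below (ordinary NumPy, not SAS code, and with illustrative names) fits a logistic regression by iteratively reweighted least squares. Every pass recomputes weights and solves a dense linear system over the full data set, which is the kind of iterative numerical work Goodnight argues is not built into a SQL engine, in-memory or otherwise.

```python
import numpy as np

def fit_logistic(X, y, iterations=25, ridge=1e-6):
    """Fit a logistic regression with iteratively reweighted least squares (IRLS).

    Each pass recomputes per-row weights and solves a dense linear system
    over the full data set -- iterative numerical work, not a set-oriented
    SQL query.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iterations):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))            # predicted probabilities
        w = mu * (1.0 - mu)                        # IRLS weights
        # Newton step: solve (X' W X) delta = X' (y - mu)
        hessian = X.T @ (X * w[:, None]) + ridge * np.eye(p)
        gradient = X.T @ (y - mu)
        delta = np.linalg.solve(hessian, gradient)
        beta += delta
        if np.max(np.abs(delta)) < 1e-8:           # converged
            break
    return beta

# Tiny synthetic example: 1,000 rows, intercept plus three predictors.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(1000), rng.normal(size=(1000, 3))])
true_beta = np.array([-0.5, 1.0, -2.0, 0.75])
y = (rng.random(1000) < 1.0 / (1.0 + np.exp(-(X @ true_beta)))).astype(float)
print(fit_logistic(X, y))
```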

According to Goodnight, in-memory technologies, new concepts in database technology and parallel computing have reshaped the playing field, and even some users are still catching up to what is now possible.

He points back to a time when companies simply had to accept that time to solution would be slow; the waiting was built into the process. However, he says:

“By using more than one processor, we are able to harness the power of hundreds or even thousands of processors in parallel and are able to do the computations much faster.” What this means is that those same companies are rethinking what is possible now that near real-time results on massive data sets are within reach.
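
The approach Goodnight describes is, at its heart, data parallelism. As a rough illustration (a minimal Python sketch, not SAS's implementation, assuming a regression whose sufficient statistics can be summed across data shards), each worker below computes its shard's contribution to the normal equations X'X and X'y, and the combined sums are solved once at the end:

```python
import numpy as np
from multiprocessing import Pool

def partial_normal_equations(shard):
    """Compute this shard's contribution to the normal equations.

    X'X and X'y decompose into per-shard sums, so each processor can work
    on its own slice of the data independently.
    """
    X, y = shard
    return X.T @ X, X.T @ y

def parallel_least_squares(shards, workers=4):
    with Pool(workers) as pool:
        parts = pool.map(partial_normal_equations, shards)
    xtx = sum(p[0] for p in parts)
    xty = sum(p[1] for p in parts)
    return np.linalg.solve(xtx, xty)   # solve (X'X) beta = X'y once, centrally

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_beta = np.array([2.0, -1.0, 0.5])
    # Split one large data set into shards, one per worker.
    shards = []
    for _ in range(4):
        X = rng.normal(size=(250_000, 3))
        y = X @ true_beta + rng.normal(scale=0.1, size=250_000)
        shards.append((X, y))
    print(parallel_least_squares(shards))
```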

A good example of this can be found in financial services. As he notes:

“In a banking environment where banks have to compute hundreds and hundreds of models every year, and they’re changing almost every month or every week to be able to do better prediction or better forecasting. If these things can be done a hundred times faster, then they’re going to be able to do probably 10-20 times more models in the same amount of time than they would have before. That speed and the importance of changing models rapidly, is incredibly important in the banking industry.”

