January 15, 2020

AtScale Tackles Data Engineering with Virtualization Software


It’s often the prep work in analytics that kills you. Before any analysis can be run, engineers must assemble, transform, and standardize the data. In a world where data silos are proliferating in the cloud, on premises, and everywhere in between, that becomes a big challenge. And that’s basically what AtScale is tackling with today’s update of its data virtualization software.

AtScale ostensibly is an analytics software company, one that was founded by Hadoop veterans to target the Hadoop platform. But in response to the popping of the Hadoop bubble and the rise of cloud data warehouses, the company repositioned its online analytical processing (OLAP) middleware as a data virtualization layer that insulates users from underlying platform changes while speeding up analytic workloads, wherever they happen to run.

According to AtScale’s Chief Product Officer Scott Howser, the goal is to give customers a single virtual view of all their data, while simultaneously optimizing data workloads for performance, cost, or compliance, whether they’re running in one cloud, multiple clouds, on premises, or any combination thereof.

“When a customer creates a data model with AtScale, that effectively becomes the data model for any BI or AI application, for any tool,” Howser tells Datanami. “Our customers have moved away from building data models in individual tools and are creating what we like to refer to as a universal semantic layer. Whether they’re doing application development or BI reporting, all of those applications consume the same business definitions, relationships, measures, dimensions, etc.”
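In rough terms, a universal semantic layer means the business definitions live in one shared model rather than in each tool. Here is a minimal Python sketch of the concept (the class and names are illustrative assumptions, not AtScale’s actual interface):

```python
# Toy model of a "universal semantic layer": measures and dimensions are
# declared once, and every BI or AI tool consumes the same definitions.
# These names are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class SemanticModel:
    name: str
    measures: dict = field(default_factory=dict)    # name -> aggregate expression
    dimensions: dict = field(default_factory=dict)  # name -> column expression

model = SemanticModel(name="sales")
model.measures["net_revenue"] = "SUM(amount - discount)"
model.dimensions["order_month"] = "DATE_TRUNC('month', order_date)"

# Whether the consumer is Tableau, Excel/MDX, or a notebook, "net_revenue"
# resolves to the same expression, so every tool shares one definition.
```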

With previous releases of its software, AtScale would generate a single data model, or what’s commonly considered an OLAP cube, for a given data store. So each customer implementation of Teradata or Hadoop or Snowflake got its own data model, which gave analysts and downstream developers much-needed consistency in the data.

AtScale’s universal semantic layer streamlines user access to data wherever it’s stored

With today’s launch of Adaptive Analytics 2020.1, AtScale’s software can now create a single OLAP cube that represents data residing in all of those silos simultaneously.

“It is, by and large, what the customers want,” says Howser, who previously ran distributed systems for Discover and worked for Vertica before it was acquired by HP. “They don’t want to go from one vertically integrated topology that has lock-in to another. They want the freedom of choice. So with intelligent data virtualization, the autonomous data engineering provides that capability to transcend cloud, transcend platform, and really offer customers as much flexibility as they desire, irrespective of how they want to physically materialize data.”
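A minimal sketch of the contrast, using invented names rather than AtScale’s real internals: where earlier releases bound one cube to one store, a single virtual model can now map each logical table to whichever platform physically holds it.

```python
# Hypothetical virtual cube spanning several silos at once; the stores and
# tables below are placeholders, not a real deployment.
VIRTUAL_CUBE = {
    "orders":    {"store": "snowflake", "table": "analytics.orders"},
    "customers": {"store": "teradata",  "table": "crm.customers"},
    "weblogs":   {"store": "hadoop",    "table": "raw.weblogs"},
}

def resolve(logical_table: str) -> str:
    """Map a logical table in the cube to its physical location, so a query
    against the model is indifferent to which platform holds the data."""
    entry = VIRTUAL_CUBE[logical_table]
    return f"{entry['store']}://{entry['table']}"

print(resolve("orders"))  # snowflake://analytics.orders
```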

That’s the first big change with Adaptive Analytics 2020.1. The second big change is the addition of support for enterprise data catalogs, such as those developed by Collibra, Waterline Data, and Alation. The company developed connectors that enable its data modeling software and those data catalogs to work better together.

Howser explains:

“The technology around the virtual cube catalog enables us to plug into the enterprise data catalogs that the customers have, to publish or make available resources and context that are behind AtScale, and vice versa,” he says. “So now if customers are building out their own data management orchestration strategy with things like Waterline or Collibra, etc., they can push the definitions at us, so there’s complete visibility between what’s been defined in the catalogs, so people can see these assets and understand where these resources are.”
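As a hedged illustration of what such a connector does, the sketch below pushes semantic-layer definitions to a catalog over HTTP; the endpoint path and payload shape are assumptions for illustration, not the actual Collibra, Waterline, or Alation APIs.

```python
# Illustrative one-way push of definitions to an enterprise data catalog.
# The /assets endpoint and JSON shape are invented for this sketch.
import json
import urllib.request

def publish_definitions(catalog_url: str, definitions: dict) -> None:
    """Publish semantic-layer definitions to a catalog so both systems share
    one view of which assets exist and where they live."""
    req = urllib.request.Request(
        f"{catalog_url}/assets",
        data=json.dumps(definitions).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # a real connector adds auth, retries, errors

# publish_definitions("https://catalog.example.com",
#                     {"net_revenue": "SUM(amount - discount)"})
```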

At a high level, AtScale is making a big push into data engineering, which remains one of the biggest bottlenecks in big data analytics. The company touts how its “autonomous” data engineering capabilities can alleviate much of the tedious, time-consuming engineering work that typically must go into preparing data for analytics.

By automating some of this data modeling work, AtScale can help get engineers off the hamster wheel of death and working on bigger and more valuable problems.

“If I have to extract, transform, manipulate, and create that logical table, in most cases, organizations that are doing data engineering for performance optimization, by the time they get those done, you start all over again,” Howser says. “It’s like you’re on a treadmill for life. And it’s because those folks don’t have access to perfect information. And they don’t have access to perfect information because there’s no way they can consume or watch all the different application workloads and get all the business requirements.”

The whole point of analytics is to iterate, but the shifting nature of data and business requirements makes that difficult. Customers could avoid this complexity if they had standardized all of their data sources up front, but that’s simply not realistic in today’s exploding data environment. In lieu of perfect source data, a data virtualization layer that can automatically do some of that standardization work in response to user activity, while presenting a familiar face to traditional BI tools, could be a good compromise. That’s AtScale’s approach, at least.

“The whole point of autonomous data engineering with AtScale is that we have the visibility because we see the users’ intent, because it’s well-expressed in SQL,” Howser says. “So we see that SQL or MDX, and based upon that SQL or MDX, we’re able to infer the intent, and based on that intent, we can take action. We can build or automatically create these aggregation structures, and if the user’s intent changes, the strategy automatically changes.”
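The mechanism can be pictured with a toy sketch: observe the SQL users issue, count recurring GROUP BY patterns as a proxy for intent, and recommend aggregation structures once a pattern recurs often enough. The parsing below is deliberately naive and the function names are invented; it shows the shape of the idea, not AtScale’s implementation.

```python
# Naive intent inference from observed SQL: recurring GROUP BY patterns
# are treated as a signal that an aggregate table would pay off.
import re
from collections import Counter

group_by_counts = Counter()

def observe(sql: str) -> None:
    """Record the GROUP BY pattern of an incoming query."""
    m = re.search(r"GROUP BY\s+(.+?)(?:\s+ORDER BY|\s*;|$)", sql, re.IGNORECASE)
    if m:
        group_by_counts[m.group(1).strip().lower()] += 1

def recommend_aggregates(threshold: int = 3) -> list:
    """Suggest aggregates for groupings seen at least `threshold` times."""
    return [g for g, n in group_by_counts.items() if n >= threshold]

for _ in range(3):
    observe("SELECT region, SUM(amount) FROM orders GROUP BY region")
print(recommend_aggregates())  # ['region']
```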

This approach can also pay dividends for customers who are moving to the cloud. As long as customers have invested in AtScale as the holder of their version of the truth, the AtScale software gives customers more flexibility to run their workloads where they desire, while enabling customers to continue to use familiar BI tools, be it Excel or Tableau (now a part of Salesforce).

The company has multiple Fortune 50 clients and is targeting the entire Fortune 2000 with this message. One of those clients is Rakuten, the online coupon company. “Thanks to AtScale, adopting Snowflake was the easiest part of our cloud transformation,” says Mark Stange-Tregear, the vice president of analytics for Rakuten, according to a blurb on the AtScale website. “We just repointed AtScale from our old Hadoop environment to Snowflake, and it was seamless for the users.”

Adaptive Analytics 2020.1 is available now.

Related Items:

Combating the High Cost of Cloud Analytics

Former Vertica CEO Takes Helm at AtScale

AtScale Revs ‘Universal Layer’ to Keep Data ‘Big’
