October 4, 2017

AtScale Revs ‘Universal Layer’ to Keep Data ‘Big’

George Leopold and Alex Woodie


Keeping the “big” in big data is the mission of a four-year-old startup launched by Hadoop and business intelligence veterans with the aim of bridging the gap between users, their favorite tools and underlying Hadoop data platforms that are no longer scaling.

AtScale, which announced a $25 million funding round on Tuesday (Oct. 3), offers what it calls a “universal semantic layer” designed to manage data definitions and security while allowing analysts to retain familiar business intelligence tools. That, the startup asserts, helps eliminate the disparate data stacks that make big data small.

“The whole point for AtScale is a universal semantic layer,” CEO and co-founder Dave Mariani stressed in an interview. “We connect with BI tools—whatever customers are using, whether that be Tableau or Qlik or Excel or PowerBI—and we connect those business users to these new data platforms, regardless of size or structure.”

The startup, based in San Mateo, Calif., focused on supporting Hadoop as its first data lake. The latest funding round, a Series C, was about scaling its operations and bringing its catchall semantic layer to market. Hence, AtScale is moving beyond Hadoop to platforms like Google (NASDAQ: GOOGL) BigQuery while also targeting relational databases, said Mariani, a former Yahoo executive.

The strategy attempts to provide business users with access to big data—all of it—rather than delivering only a slice. The company cites industry statistics estimating that only 5 percent of enterprise data is available to decision makers.

While the startup provides the standard online analytical processing that underlies many business intelligence applications, it is also delivering a data virtualization engine as part of its big data push into the cloud as on-premises Hadoop loses steam. “We’re intercepting queries from the front-end and redirecting them to whatever data platform that we’re talking to,” Mariani explained.
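The mechanics Mariani describes can be pictured as a thin routing layer between the BI tool and the data platforms. The sketch below is purely illustrative and not AtScale's actual implementation; the table-to-platform mapping and function names are hypothetical, assuming queries arrive as SQL text and each logical table lives on one backend.

```python
import re

# Hypothetical mapping of logical tables to the physical platforms
# that hold them; a real semantic layer would manage this metadata.
TABLE_TO_PLATFORM = {
    "sales": "hadoop_cluster",
    "web_events": "bigquery",
    "customers": "postgres",
}

def route_query(sql: str) -> str:
    """Intercept a BI query and return the backend that should run it."""
    match = re.search(r"\bFROM\s+(\w+)", sql, re.IGNORECASE)
    if not match:
        raise ValueError("could not find a table reference in the query")
    table = match.group(1).lower()
    # Fall back to a default warehouse for tables the layer doesn't know.
    return TABLE_TO_PLATFORM.get(table, "default_warehouse")

print(route_query("SELECT region, SUM(amount) FROM sales GROUP BY region"))
```

In this toy version the query against `sales` would be redirected to the Hadoop cluster; the BI tool never needs to know where the data physically lives, which is the point of the virtualization engine.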

AtScale CEO Dave Mariani

“You see a lot of customers wanting to move to the cloud, and that introduces a new layer of complexity because now they’re having to worry about data in two places, or three places, depending on how many clouds they have,” the AtScale CEO said. That’s where the startup’s data virtualization engine comes into play.

The company’s proprietary approach responds to Mariani’s experience at Yahoo where his team managed more than 8,000 licenses covering Tableau (NYSE: DATA) and MicroStrategy (NASDAQ: MSTR) tools. “Every new tool needed its own support system, its own data pipeline, business logic management layer and its own team,” he said. These and other platforms arose “when data was small and when data was organized in star schema,” Mariani asserted.

The company also identifies the transition from data warehouses to data lakes as a key big data innovation, with Hadoop being among the first implementations of a data lake, followed by Google BigQuery and others. In a “multi-data platform world,” Mariani said, IT managers want “the ability to store data in the most efficient, cost-effective manner possible.”

While acknowledging that Hadoop has “matured,” Mariani added, “We still see clusters getting bigger and a broader range of customers moving to Hadoop. We don’t see that Hadoop is dead.”

The fundamental big data problem is scaling: existing BI platforms won’t scale as data volumes soar. On the platform side, enterprise users want to move to less expensive architectures but are finding those don’t scale either.

“So we fill that gap and allow business users to kind of come along with us on that journey to the new architecture,” Mariani claimed.

The startup’s most recent funding round was led by Atlantic Bridge, and included Wells Fargo Securities and Industry Ventures along with Storm Ventures, UMC Capital, Comcast Ventures and XSeed Capital.
