May 22, 2023

Where Tableau Draws the Line on AI Adoption

(AI-generated image/Shutterstock)

Large language models (LLMs) like ChatGPT are all the rage at the moment, and software vendors of all stripes are rushing to adopt them. Tableau, the global analytics leader, is also moving to adopt LLMs, but the ways in which it’s adopting them may surprise you.

AI has been on a collision course with BI for years, thanks to the slow addition of data science capabilities into the everyday visualization tools and dashboards used by millions of data analysts and product managers around the world.

But by all accounts, the past six months have been different. Since the launch of ChatGPT in late November 2022, BI vendors have been adopting language AI capabilities at an accelerated pace. Tableau, which held its annual user conference earlier this month, has been one of the BI companies pushing AI forward.

Francois Ajenstat, chief product officer for Tableau, is bullish on the potential for AI to impact not only business intelligence and analytics, but just about every other software platform out there.

“I think GPT technology will be part of every platform, every technology stack,” he told Datanami during an interview at Tableau Conference 2023 two weeks ago. “GPT has captured people’s imaginations and it’s making people see things and dream of the way technology can help them.”

At the show, Tableau made a big announcement around the adoption of LLMs, including the use of ChatGPT and Salesforce's Einstein GPT in Pulse, its latest offering. The company appears to be using generative AI in three main ways today.

Tableau Conference 2023 took place in Las Vegas, Nevada

First, it’s being used on the query side to drive a natural language conversation between the user and the system (although it’s not being used to actually generate executable SQL, as some of Tableau’s competitors are doing). This new interface lets users ask questions in a way that is “a lot more conversational than maybe the generations in the past,” Ajenstat said.

Second, Tableau uses GPT for summarization. The company already supports a feature called data stories, which builds a narrative around query results once they have been generated. With LLMs, those narratives become more compelling and more readily consumable.

Finally, Tableau is also able to use LLMs to pull insights from data and suggest questions the user hasn’t thought to ask yet. In Pulse, Tableau is using the generative power of LLMs to recommend queries, which are automatically run and then pushed to the user.
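Tableau has not published implementation details, but the division of labor described in the three items above can be illustrated with a generic sketch: the LLM proposes questions and narrates results, while a conventional, deterministic query engine produces the actual numbers. Everything below, including the call_llm and run_query helpers and the sample data, is hypothetical and is not Tableau or Salesforce code.

```python
# Hypothetical sketch: the LLM never computes the numbers; it only suggests
# questions and narrates results produced by a deterministic query step.

SALES = [  # stand-in for a customer's governed data source
    {"region": "East", "revenue": 120_000},
    {"region": "West", "revenue": 95_000},
]

def call_llm(prompt: str) -> str:
    """Placeholder for a hosted LLM call; a real integration would call an LLM API."""
    return f"[LLM narrative for prompt: {prompt[:60]}...]"

def run_query(metric: str, dimension: str) -> list[dict]:
    """Deterministic aggregation; this is the part that must not hallucinate."""
    totals: dict[str, float] = {}
    for row in SALES:
        totals[row[dimension]] = totals.get(row[dimension], 0) + row[metric]
    return [{dimension: key, metric: value} for key, value in totals.items()]

def suggest_question(fields: list[str]) -> str:
    # Generative step: propose a question the user has not asked yet.
    return call_llm(f"Suggest one business question answerable with fields {fields}.")

def answer(metric: str, dimension: str) -> str:
    rows = run_query(metric, dimension)                           # trusted numbers
    return call_llm(f"Summarize for a business reader: {rows}")   # narrative only

print(suggest_question(["region", "revenue"]))
print(answer("revenue", "region"))
```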

While the embrace of AI in BI is predicted to be a boon to productivity, we’re still in the beginning stages. LLMs must be kept within guardrails because of their potential to hallucinate, or generate output with no basis in the underlying data. That poses a real risk to businesses, which can’t make decisions based on hallucinations.

“With data, you can’t be wrong,” Ajenstat said. “You have to be right. So how do we make it more deterministic? How do we just leverage the work that’s there, train it on the data that’s accurate for the customer, and then make it available in a way that makes sense?”

Some theorize that LLMs could eventually replace human data analysts (CHUAN CHUAN/Shutterstock)

That is still being hashed out at the moment. Some BI experts speculate that LLMs like ChatGPT will spell the end of SQL. Other vendors are embracing LLMs to generate SQL, albeit with checks to ensure the queries aren’t malformed.
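None of these vendors detail their checks publicly, but the basic pattern, generate a candidate query with an LLM and refuse to run it unless it parses and is read-only, can be sketched in a few lines. The llm_to_sql helper below is a hypothetical stand-in for the model call, and the validation shown uses SQLite's EXPLAIN purely as an illustrative syntax gate; production systems would rely on their own parsers and permission layers.

```python
# Hedged sketch of "generate SQL, but verify it before running it."
import sqlite3

def llm_to_sql(question: str) -> str:
    """Hypothetical stand-in for an LLM that translates a question into SQL."""
    return "SELECT region, SUM(revenue) FROM sales GROUP BY region"

def safe_run(question: str, conn: sqlite3.Connection) -> list:
    sql = llm_to_sql(question)
    if not sql.lstrip().upper().startswith("SELECT"):
        raise ValueError("only read-only queries are allowed")
    try:
        conn.execute(f"EXPLAIN {sql}")   # fails fast if the generated SQL is malformed
    except sqlite3.Error as err:
        raise ValueError(f"rejected generated SQL: {err}") from err
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 120000.0), ("West", 95000.0)])
print(safe_run("Which region sold the most revenue?", conn))
```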

But Tableau has not yet taken that step. According to Pedro Arellano, senior vice president and general manager for Tableau, LLMs will exist on the periphery of the product suite until the hallucination problem can be solved.

“At least for now, you can’t really rely on GPT to be your analytics engine,” he said at TC23. “That’s where you get hallucinations from. That’s why people have a hard time trusting insights that come from the AI…. So we still have Tableau do the analytics, and then we have GPT present them.”

While Tableau isn’t ready to hand the keys over to generative AI yet, that doesn’t mean the company isn’t seeking to give AI a greater role. The field of AI is moving quite fast at the moment, and new breakthroughs could be just around the corner.

“When you talk to people that have been studying artificial intelligence and natural language for years, they’ve seen everything. Nothing surprises them,” Arellano said. “And they tell you this feels like magic. You know that it’s something pretty special.”

The reason Tableau is so excited by generative AI, Arellano said, is that it helps the company reimagine how data problems can be solved. It also has the potential to lower the barrier to data and analytics, he said, and to put actionable analytics into the hands of many more people.

Tableau CPO Francois Ajenstat at TC23

About 30% of employees today make use of data to make decisions at work, according to figures shared by Tableau. That’s a big improvement compared to when Ajenstat broke into the BI business a quarter century ago with companies like Cognos and Microsoft.

“I think the late 90s or early 2000s, the number was 17%, so we’ve doubled adoption over that period of time,” Ajenstat said. “But we’re not where we want to be.”

There are two big barriers to driving that number higher, Ajenstat said. The first is skills: data is still intimidating for many people, he said. The second is that the technology hasn’t necessarily been easy to adopt.

“Although Tableau made data dramatically easier, it’s still out of reach of so many,” he said. “Again, that’s the motivation for Tableau Pulse to try to make data much easier, much more intuitive.”

AI as it exists today has the potential to chip away at these barriers, a lack of skills and difficult-to-use technology, and put more data analytics into the hands of regular, everyday workers. While not everybody needs data and analytics to do their job, many people can benefit from greater access to them, and AI and LLMs will play a role in enabling that, Ajenstat said.

“A lot of people are just trying to do their jobs. They’re not data people. They’re people,” he said. “But we think that every job can be augmented by data. Every person who makes decisions can make a better decision with data. And so part of what we’re trying to do with Pulse is integrate it in the workflow, so you don’t have to go to the data. The data just comes to you.”

Related Items:

Beyond the Moat: Powerful Open-Source AI Models Just There for the Taking

LLMs Are the Dinosaur-Killing Meteor for Old BI, ThoughtSpot CEO Says

Tableau Jumps Into Generative AI with Tableau GPT
