September 11, 2023

Teradata Gives Customers Their Own LLMs with Ask.ai


Enterprises that want to train their own custom large language model (LLM) have a new option today with the launch of Teradata’s ask.ai. The analytics vendor partnered with Microsoft to build the new offering, which enables users to ask a question in plain English and instantly receive a response.

Teradata’s new ask.ai offering, now in public preview for VantageCloud Lake on Azure customers, utilizes an LLM from OpenAI running in the Microsoft cloud. Through a Web or mobile interface, users can query their company’s own data, which was used to train a customized LLM running in their company’s own Azure cloud instance.

“It’s an individual model in the customer’s own tenant,” says Teradata Chief Product Officer Hillary Ashton. “Obviously, they have to opt in. They have to decide it’s something that they’re comfortable doing. But the data and the model is not shared outside of the customer tenant.”

In a demo, Ashton queried ask.ai with a detailed and hypothetical question about medical claims. The product returned an equally detailed response, generated from the LLM trained on the hypothetical data contained in the VantageCloud Lake data warehouse.

“So traditionally, you’d have to write all the SQL to be able to do that,” Ashton says. “You probably could go into some tools to go do that. Today, you can do that automatically in the ask.ai interface.”

The simplicity of the ask.ai interface belies a lot of activity going on behind the scenes. To get the right answer from the LLM, there’s quite a bit of prompting that must go on to explain to the model what the context of the question is. All of that is automated for customers, Ashton says.
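The behind-the-scenes prompting Ashton describes — wrapping the user’s plain-English question with enough warehouse context for the model to answer it — might look something like the following minimal sketch. All function names, the prompt template, and the schema format here are hypothetical illustrations; Teradata has not published ask.ai’s internals.

```python
# Hypothetical sketch of automated prompt construction for natural-language
# querying. The template and schema format are assumptions for illustration,
# not Teradata's actual implementation.

def build_prompt(question: str, schema: dict[str, list[str]]) -> str:
    """Wrap a plain-English question with table context so an LLM
    can generate SQL against the right warehouse tables."""
    schema_lines = "\n".join(
        f"- {table}({', '.join(cols)})" for table, cols in schema.items()
    )
    return (
        "You are a SQL assistant for a Teradata VantageCloud Lake warehouse.\n"
        "Available tables:\n"
        f"{schema_lines}\n"
        f"Question: {question}\n"
        "Respond with a single ANSI SQL query."
    )

# Example with made-up medical-claims tables, echoing the demo scenario:
prompt = build_prompt(
    "What was the average claim amount per provider last quarter?",
    {"claims": ["claim_id", "provider_id", "amount", "filed_date"],
     "providers": ["provider_id", "name", "specialty"]},
)
```

The point of a wrapper like this is that the end user never sees the schema or the prompt template; they only type the question, which is the automation Ashton is referring to.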

The product is designed to expand the number of users who can get access to data and analytics. Rather than requiring users to write code or hold a license for an expensive BI tool, Teradata is allowing them to simply ask questions in natural language and receive an accurate response.

“What we heard from our customers is that they have super users, but they want to democratize access to data and analytics quickly and easily,” Ashton tells Datanami. “And so by providing the ask.ai experience to our customers, we believe that it will continue that democratization of knowledge and analytic outcomes in a way that previously you had to be pretty much a super user.”

In addition to enabling regular users to query data, ask.ai can open up doors to other personas. For instance, administrators can use it to get information about the use of the product, data architects can use it to get information about table design, and data scientists can use it to generate code for them, Teradata says.

The natural language experience is bolstered through the application of industry models that Teradata has created over the past decade. The company has about a dozen models for industries like telecommunications, retail, financial services, and manufacturing, among others. According to Ashton, the models help customers by providing industry-specific syntax for things such as fraud detection and managing stock-outs, thereby providing greater context for natural language processing.

The answers provided by ask.ai are only as good as the data, which could be of poor quality, so there’s definitely a need for humans in the loop. Ask.ai is not meant to replace human analysts, Ashton says. Instead, it will augment them with greater access to data, enabling them to ask more and better questions.

“Humans are still a critical part of the equation,” she says. “You have to bring your brain and look at it. You can’t just take the answer that a computer gives you and say, well, that means that we’re going to go do this action. You’ve got to look at it and make sure that it’s validated.”

Looking forward, ask.ai will eventually be offered on other public cloud platforms, starting with AWS. Customers running their Teradata warehouse on-prem would also be able to leverage the cloud-based LLM in a hybrid manner, Ashton says. In that situation, the queries would be pushed out using Teradata’s QueryGrid federation technology.

“The system doesn’t care if your data is on prem,” she says. “The model is running in the cloud, but it has access, if you give it access, to your on prem data as well.”

Related Items:

GenAI Debuts Atop Gartner’s 2023 Hype Cycle

Teradata Puts New Cloud Architecture to the 1,000-Node Test

He Couldn’t Beat Teradata. Now He’s Its CEO
