June 11, 2019

AI And Data Streaming In The Enterprise: A Marriage Made In Heaven?

Seth Wiesman


As the adoption of artificial intelligence in the enterprise hits a decisive moment, it is worth examining its relationship with data streaming and how the two technologies share similar adoption challenges. Artificial intelligence (AI) and real-time data have become a closely interlinked duo. Both bring major technological advances to the enterprise, with AI relying on a steady supply of data, delivered in real time or with low latency, to feed its models. Streaming data platforms provide the fabric behind the scenes that delivers data to machine learning models; they also forward insight to users in real time, so that immediate value can be generated from the data.
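As an illustration of that fabric, here is a minimal sketch of a stream processing job that scores each event against a model the moment it arrives. It uses the DataStream API of Apache Flink (the framework created by the founders of the author's company, per the bio below); the inline source, the threshold-based stand-in "model," and the job name are invented for illustration only.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class StreamingScoringJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // A self-contained stand-in source; in production the events would
        // come from a message bus such as Kafka or Kinesis.
        DataStream<Double> features = env.fromElements(0.2, 1.5, 3.7, 0.9);

        // Score each event as it arrives. The threshold rule below is a
        // placeholder for invoking a real, pre-trained model.
        DataStream<String> scored = features.map(new MapFunction<Double, String>() {
            @Override
            public String map(Double x) {
                return x > 1.0 ? "anomaly(" + x + ")" : "normal(" + x + ")";
            }
        });

        // Forward the insight downstream immediately (here, to stdout).
        scored.print();
        env.execute("streaming-model-scoring");
    }
}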

There are clear connections between stream processing and AI that make their marriage a unique opportunity for companies dealing with massive volumes of real-time events. Both are distributed, organized in logical units, and support incremental updates and iterative tasks. Last but not least, both are asynchronous by nature.
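To make the "incremental updates" point concrete, the following sketch shows a keyed Flink function that maintains a running average one event at a time, updating a small piece of state per event instead of recomputing over a full dataset. The class name, state name, and tuple layout are illustrative assumptions, not taken from the article.

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

// Maintains a per-key running average incrementally.
public class RunningAverage
        extends RichFlatMapFunction<Tuple2<String, Double>, Tuple2<String, Double>> {

    // (count, sum) kept in Flink's fault-tolerant keyed state.
    private transient ValueState<Tuple2<Long, Double>> countAndSum;

    @Override
    public void open(Configuration parameters) {
        countAndSum = getRuntimeContext().getState(new ValueStateDescriptor<>(
                "count-and-sum",
                TypeInformation.of(new TypeHint<Tuple2<Long, Double>>() {})));
    }

    @Override
    public void flatMap(Tuple2<String, Double> event,
                        Collector<Tuple2<String, Double>> out) throws Exception {
        Tuple2<Long, Double> current = countAndSum.value();
        if (current == null) {
            current = Tuple2.of(0L, 0.0);
        }
        current.f0 += 1;         // incremental count update
        current.f1 += event.f1;  // incremental sum update
        countAndSum.update(current);
        out.collect(Tuple2.of(event.f0, current.f1 / current.f0));
    }
}

Applied as events.keyBy(e -> e.f0).flatMap(new RunningAverage()), each key's average is refreshed with every new event, the same incremental pattern that online model updates rely on.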

A recent study revealed that most organizations plan to increase the share of their IT budgets spent on AI projects over the course of 2019. The study goes on to predict a growing gap between the leaders in AI adoption and those falling behind. A lack of data and a lack of skilled people are the two primary bottlenecks those organizations face; company culture and an inability to identify relevant use cases cause similar holdups.

Stream processing and AI share the qualities that make each unique, but they also share the same challenges and bottlenecks in their adoption. How can enterprises overcome these challenges and make the right investments and decisions about their data architecture?

The skills gap is a common challenge for the adoption of both AI and stream processing in the enterprise. The study quoted above finds that unmet demand for machine learning experts, data scientists, and data and infrastructure engineers is one of the most frequently cited obstacles to AI adoption. Organizations should look to remedy this by transforming their teams at both a technical and a cultural level.

Adopting AI or stream processing requires investment in the workforce's skill set and training, along with embracing a 'sharing' culture across the organization. By minimizing data silos and making data accessible to the multiple teams working on different projects and applications, companies can significantly reduce time-to-market and improve the odds of a successful deployment.

The survey also cites a lack of data among respondents. This could refer either to insufficient amounts of data to adequately support the training of machine learning and AI models and applications in the enterprise or, in the case of streaming data, to data quality issues.

Companies that adopt an event-driven data strategy built on stream processing frameworks are better positioned to make sense of their data: they can transform and enrich data in real time and, in parallel, obtain valuable insight from events as they arrive from backend systems, connected devices, or websites.

This is preferable to storing information in a data lake and only then trying to make sense of what happened retrospectively. Stream processing allows companies to react to information in real time, making AI models and applications iterative and responsive to change as it happens.
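As a sketch of what such real-time enrichment can look like, the Flink job below joins each incoming click event with reference data the instant the event arrives, rather than reconciling the two datasets later in a data lake. The class name, sample events, and user-segment lookup table are all invented for illustration; a production job would read events from a message bus and the reference data from a database or a second stream.

import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ClickEnrichmentJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical raw events in "userId,pageId" form.
        DataStream<String> clicks = env.fromElements("u1,home", "u2,pricing", "u1,docs");

        // Reference data used for the enrichment.
        Map<String, String> userSegments = new HashMap<>();
        userSegments.put("u1", "trial");
        userSegments.put("u2", "enterprise");

        // Enrich each click the moment it arrives.
        clicks
            .map(line -> {
                String[] parts = line.split(",");
                String segment = userSegments.getOrDefault(parts[0], "unknown");
                return parts[1] + " visited by " + segment + " user " + parts[0];
            })
            .returns(Types.STRING) // explicit type hint for the Java lambda
            .print();

        env.execute("real-time-click-enrichment");
    }
}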

What else prevents the adoption of AI in the enterprise? As with any new technology, data and analytics leaders need to embrace an open and inclusive culture in their teams, challenge the status quo, and find ways to leverage each team's potential to achieve outcomes faster and more cost-effectively.

Stream processing and AI can become catalysts for organizational change in IT and data departments. Such change requires moving away from data silos and hierarchical structures and bringing the data, operations, and product teams together. Uniting these teams significantly reduces the time needed to deliver real-time applications, AI and machine learning models, and deep learning algorithms.

AI adoption will only continue to grow over time, and identifying relevant use cases for the technology will therefore become easier. AI is already being used to power customer service operations as well as applications in finance, accounting, marketing, and advertising. Recently, many organizations have introduced the role of the Analytics Translator as a bridge between the data science and executive teams.

The role has attracted much attention, with McKinsey identifying the Analytics Translator as one of the most in-demand analytics positions of 2019. This signals the direction enterprises are taking: focusing effort and investment not only on the appropriate data infrastructure, but also on translating data into actionable insight for the business. Executives should focus on finding talent that can identify use cases for both real-time data and AI and drive such projects forward, enabling the business to leap ahead in its digital transformation journey.

About the author: Seth Wiesman is a senior solutions architect at Ververica (formerly Data Artisans), which was founded by the original creators of Apache Flink. Wiesman works with consulting clients to maximize the benefits of real-time data processing for their businesses. He supports customers in the areas of application design, system integration, and performance tuning. Prior to joining Ververica, he was a data engineer on the reporting team at MediaMath. He holds a master's degree in computer science from the University of Missouri.

Related Items:

Understanding Your Options for Stream Processing Frameworks

Real-Time Streaming for ML Usage Jumps 5X, Study Says

Fueled by Kafka, Stream Processing Poised for Growth
