
People to Watch 2019

Steve Wilkes
CTO and Co-founder
Striim

Steve Wilkes is a co-founder and CTO of Striim, a provider of streaming data solutions. Steve is a life-long technologist, architect, and hands-on development executive.

Prior to founding Striim, Steve was Senior Director of the Advanced Technology Group at GoldenGate Software, where he focused on data integration. He continued in this role following the acquisition by Oracle, where he also took the lead on Oracle’s cloud data integration strategy. His earlier career included Senior Enterprise Architect at The Middleware Company, Principal Technologist at AltoWeb, and a number of product development and consulting roles, including in Cap Gemini’s Advanced Technology Group.

Steve has handled every role in the software lifecycle and most roles in a technology company at some point during his career. He still codes in multiple languages, often at the same time. Steve holds a Master of Engineering degree in microelectronics and software engineering from the University of Newcastle-upon-Tyne in the UK.

Datanami: Steve, Hadoop was almost synonymous for big data, for a while there, but the shine has come off Hadoop lately. Does the industry need a new technology or other entity to rally and congregate around?

Steve Wilkes: I think it already has, and it has chosen cloud – with real-time integration as its supporting actor. There are lots of candidate technologies for the next big thing – AI / Machine Learning, distributed ledgers, fast analytics, quantum computing, and IoT. But what all of these have in common is that they are increasingly being implemented in the cloud, and they all require continuous data movement.

As part of this, large-scale on-premise Hadoop and other Big Data environments are moving to cloud storage and cloud warehouses like BigQuery and Snowflake, with Machine Learning being implemented on this scalable dataset. On-premise databases are being moved to cloud databases, especially cross-cloud supported databases like PostgreSQL, both as one-off migrations and for cloud-bursting elastic scalability.

At the same time many organizations are adopting ‘cloud first’ strategies for new applications, fast analytics, and reporting purposes, sourcing data from on-premise databases, log files, messaging systems and IoT. Scalable distributed ledgers such as Hyperledger and Ethereum are already available as AWS and Azure services, and costly technologies like quantum computing can be accessed through cloud subscriptions without the bulky hardware and cryogenics required to maintain on-premise installations.

Cloud has changed, and continues to change, the way we work with technology, and almost always requires real-time integration to make it a reality.

Datanami: What are the biggest challenges facing customers who want to implement real-time analytics? Are they architectural, process-oriented, or technology challenges?

Steve Wilkes: Three major things – the sourcing of real-time data, ease of implementation, and the perceived cost of real-time systems.

Many organizations are familiar with processing data in batch, and use existing investments in ETL to process and move data. For some use cases, such as end-of-day reporting, this may be sufficient, but increasingly often the insights are required much faster, and batch doesn’t cut it. Some enterprises, especially in finance, are ahead of the curve, having already invested in distributed event hubs into which they stream most of their real-time data. But a lot of data is still locked up in databases, log files, and big data systems – accessible only through queries with a lot of inherent latency. To get to true real-time analytics, technologies like change data capture and file tailing are required to push new data into stream processing platforms as it’s created.
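As a rough illustration of the file tailing Wilkes describes – not Striim’s implementation, and with the file path and polling interval purely assumed – a minimal Python sketch of pushing new records into a stream as they are written might look like this:

import time

def tail(path):
    # Yield lines appended to a file, similar to `tail -f`.
    # A minimal illustration only; production collectors also handle
    # checkpointing, file rotation, and delivery guarantees.
    with open(path, "r") as f:
        f.seek(0, 2)  # jump to the end so only newly written data is streamed
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)  # nothing new yet; poll again shortly
                continue
            yield line.rstrip("\n")

if __name__ == "__main__":
    # Hypothetical usage: each new log record is pushed downstream as it arrives.
    for event in tail("/var/log/app/events.log"):  # illustrative path
        print(event)  # in practice, publish into a stream processing pipeline

Change data capture plays the same role for databases, turning committed inserts, updates, and deletes into a continuous event stream instead of polling with queries.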

But this can’t happen unless the whole process is simplified. Organizations shouldn’t have to struggle with multiple pieces of technology from multiple vendors, or have to write lots of code, to get to the point where they can do streaming analytics. Messaging systems with third-party connectors end up being like Lego constructions that appear solid but are brittle at the joins, making it difficult to meet rigid reliability and scalability SLAs. An intuitive UI is also essential to guide people through the steps needed to source real-time data, and to make it easy to start building analytics in familiar languages like SQL.
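To make the analytics side concrete: the kind of computation streaming SQL expresses declaratively – for example, a GROUP BY over a time window – can be sketched in a few lines of Python. The window length and event keys below are assumptions for illustration, not Striim’s API:

import time
from collections import Counter

def windowed_counts(events, window_seconds=10):
    # Count events per key in fixed time windows - a toy stand-in for a
    # streaming SQL aggregation such as GROUP BY key over a 10-second window.
    window_start = time.time()
    counts = Counter()
    for key in events:  # `events` is any iterator yielding one key per record
        now = time.time()
        if now - window_start >= window_seconds:
            yield window_start, dict(counts)  # emit the completed window
            window_start, counts = now, Counter()
        counts[key] += 1  # windows close lazily, when the next event arrives

# Hypothetical usage, chained onto the tail() sketch above:
#   for start, totals in windowed_counts(line.split()[0] for line in tail("app.log")):
#       print(start, totals)

The point of a streaming platform is that users describe this logic in SQL through an intuitive UI rather than wiring up code like the above by hand.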

Finally, the myth that real-time analytics – and the supporting software and hardware it requires – is expensive needs to be dispelled. With today’s computing platforms, CPU and memory are orders of magnitude cheaper than they were decades ago, and the software is often much cheaper than equivalent ETL and data warehouse solutions.

Datanami: What hopes do you hold for the adoption of real-time analytics in 2019? Will this be a breakout year?

Steve Wilkes: Pretty high hopes. We have been talking about streaming data for the last seven years. The first few years were an uphill battle, educating and evangelizing, trying to help enterprises understand the benefits of real-time data.

This has now reversed, and we are being sought out to help organizations unlock their data and gain access to real-time event streams. The majority of our initial implementations are streaming integration: sourcing real-time data, processing it, and moving it elsewhere. However, once data is streaming, the power of real-time analytics can be unleashed. Some of our earlier customers already have hundreds of real-time analytics applications, and some new customers go straight to analytics.

But to truly achieve a breakout year, the majority of enterprises need to start with streaming integration to gain access to real-time data, and the momentum appears to be there.

Datanami: Outside of the professional sphere, what can you share about yourself that your colleagues might be surprised to learn – any unique hobbies or stories?

Steve Wilkes: I started coding when I was 13 years old on a ZX Spectrum and haven’t stopped since. You may recognize the ZX Spectrum from Bandersnatch – I also coded an adventure game in Z80 assembly. I play Pokemon Go with my wife and Fortnite with my kids, I am trained in improv, and I will sing karaoke at the drop of a hat, much to the dismay of all around.

 

Sri Ambati
H2O.ai
Julia Angwin
The Markup
Pedro Domingos
University of Washington
Ali Ghodsi
RISELab
Cassie Kozyrkov
Google
Hilary Mason
Cloudera
Bob Muglia
Snowflake
Oliver Ratzesberger
Teradata
Dr. Tony Slonim
Renown Health
Kostas Tzoumas
dataArtisans
Evan Weaver
Fauna
Steve Wilkes
Striim

 

 

 

