October 1, 2019

Neo4j Gets Hooks Into Kafka


Today at the Kafka Summit, Neo4j unveiled a new product called Neo4j Streams that will make it easier to connect streaming data from Apache Kafka with the company’s graph database. Analytics use cases, such as financial fraud analysis, knowledge graphs, and customer 360, are the expected beneficiaries.

Kafka has emerged as the de facto standard for stream processing, a field that's exploding in popularity thanks to the huge amounts of event data being generated by people, applications, sensors, and other instrumented entities. This week, the Kafka community is descending upon San Francisco for the Kafka Summit, which brings two days of keynotes and education sessions.

Neo4j Streams will be made available to users of open source Apache Kafka, as well as to paying customers of both Confluent and Neo4j. The goal is to help funnel Kafka data into and out of the Neo4j database, which uses graph database techniques to efficiently analyze data in ways that would be too expensive in a traditional relational database.

The new software is the delivery vehicle for a previously developed sink connector that pushes data from Kafka into the Neo4j database. The connector, which ships in Neo4j Streams, received the Verified Gold certification from Confluent, assuring users that it’s fully compatible with the Kafka Connect API.
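To give a sense of how the sink side is wired up, here is a minimal sketch of registering the connector with a Kafka Connect worker over the standard Connect REST API, written in Python. The endpoint, credentials, topic, and Cypher template are illustrative, and the property names follow the neo4j-streams connector's documented configuration, so treat the exact keys as assumptions to verify against the current docs:

    import json
    import requests

    # Hypothetical Kafka Connect worker endpoint and Neo4j credentials.
    CONNECT_URL = "http://localhost:8083/connectors"

    # A minimal sink connector definition: every record arriving on the
    # "orders" topic is handed to a Cypher template that merges it into
    # the graph. Inside the template, the Kafka message is bound to the
    # `event` variable.
    connector = {
        "name": "neo4j-orders-sink",
        "config": {
            "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
            "topics": "orders",
            "neo4j.server.uri": "bolt://localhost:7687",
            "neo4j.authentication.basic.username": "neo4j",
            "neo4j.authentication.basic.password": "password",
            "neo4j.topic.cypher.orders": (
                "MERGE (c:Customer {id: event.customerId}) "
                "MERGE (o:Order {id: event.orderId}) "
                "MERGE (c)-[:PLACED]->(o)"
            ),
        },
    }

    resp = requests.post(CONNECT_URL, json=connector)
    resp.raise_for_status()
    print(json.dumps(resp.json(), indent=2))

The per-topic Cypher property is the heart of the mapping: each incoming record is bound to an `event` variable and run through the template, which is how flat stream events become nodes and relationships in the graph.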

In addition to pushing new event data from Kafka topics into Neo4j and converting it to graph data structures, Neo4j Streams also exposes the results of processing in the graph database back to Kafka, where they can be used to drive downstream systems.
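The outbound path is exposed inside the database itself: the Streams plugin adds a `streams.publish` procedure that a Cypher query can call to write its results onto a Kafka topic. Below is a minimal sketch using the official Neo4j Python driver; the connection details, topic name, and query are hypothetical:

    from neo4j import GraphDatabase

    # Hypothetical connection details for a Neo4j instance running the
    # Streams plugin.
    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))

    with driver.session() as session:
        # Find high-value orders in the graph and publish each one to a
        # Kafka topic for downstream consumers to act on.
        session.run(
            "MATCH (c:Customer)-[:PLACED]->(o:Order) "
            "WHERE o.total > 10000 "
            "CALL streams.publish('flagged-orders', "
            "  {customer: c.id, order: o.id, total: o.total}) "
            "RETURN count(o)"
        )

    driver.close()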

It’s a win-win for the vendors and users of the technology, says Neo4j’s SVP of Business and Corporate Development, Fawad Zakariya.

“We’ve integrated the world’s most powerful graph database with the most popular streaming platform, fully supported for Confluent and Neo4j enterprise customers,” Zakariya said in a press release. “As enterprises demand near real-time capabilities to fight fraud, respond to customer behavior, and react to business events, this integration unlocks new ways for them to innovate by combining the power of connected context with real-time streaming data.”

Data is only as valuable as the speed with which someone can act on it, right down to the millisecond it was created, says Simon Hayes, the vice president of corporate and business development at Confluent.

“That’s why 60 percent of Fortune 100 companies have put an event streaming platform at the heart of their businesses,” Hayes said. “Through this integration with Neo4j Streams, customers can more quickly and easily see the relationship between events and turn that data into context so it can be acted upon faster.”

This is just the start of the relationship between Confluent and Neo4j, says David Allen, a partner solution architect at Neo4j.

“The significance of the relationship is that we add graph superpowers to their already formidable framework, and together focus on mutual verticals of strength like finance,” Allen tells Datanami. “Neo4j and Confluent share a significant overlapping customer base and are seeing strong demand for graphs applied to streaming data for real-time fraud detection, customer 360, and recommendations in particular. Imagine taking snippets from your data stream, detecting communities and writing them back, and then looking at how the communities evolve over time.”
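As a rough illustration of that last idea, the sketch below runs one community detection pass and writes each node's community id back to the graph, so successive runs over fresh stream data can be compared. It assumes Neo4j's Graph Data Science library is installed alongside the Streams plugin, which is a separate add-on; the graph name, label, and relationship type are hypothetical:

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))

    with driver.session() as session:
        # Project the relevant slice of the graph into memory.
        session.run("CALL gds.graph.project('snapshot', 'Customer', 'KNOWS')")
        # Run Louvain community detection and persist each node's
        # community id, so later passes can show how communities evolve.
        session.run(
            "CALL gds.louvain.write('snapshot', {writeProperty: 'communityId'})"
        )
        # Release the in-memory projection.
        session.run("CALL gds.graph.drop('snapshot')")

    driver.close()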

Kafka Summit continues today.

Related Items:

Kafka in the Cloud: Who Needs Clusters Anyway?

Want Kafka on Kubernetes? Confluent Has It Made

Kafka ‘Massively Simplifies’ Data Infrastructure, Report Says

 
