August 15, 2017

How Kafka Helped Rabobank Modernize Alerting System

Customers of Rabobank now receive alerts on bank account activity in a matter of seconds, rather than the hours it took on the bank's previous transactional platform, and it's all because of the speed and simplicity of Apache Kafka.

With more than 900 locations around the world and about €700 billion in assets, Rabobank is among the 30 biggest financial institutions on the planet. The Dutch firm, which owes its unusual name to the 1972 merger of Raiffeisen and Boerenleenbank, maintains a focus on the agriculture sector, and is considered one of the safest banks in the world.

The company’s central organization, Rabobank Nederland, has run its core banking systems on secure mainframe computers that are considered bullet-proof for transactional tasks. But Rabobank found the mainframe to be too slow for some customer-facing activities, notably sending customers alerts about credits or debits made to their accounts.

What’s more, the process of developing new alerts in the mainframe environment also proved too slow for Rabobank’s liking.

Rabobank’s Big BEB

Earlier this year, Rabobank Nederland overhauled the system behind the alerts and migrated it to a Kafka setup in order to increase the speed. Jeroen van Disseldorp, the founder of the Dutch Kafka consultancy Axual, recently shared his experience with developing the new Kafka-based system in a blog post for Confluent, the commercial outfit behind Kafka.

Real-time alerting is important to Rabobank

“For the past years, Rabobank has been actively investing in becoming a real-time, event-driven bank,” van Disseldorp writes. “If you are familiar with banking processes, you will understand that that is quite a step. A lot of banking processes are implemented as batch jobs on not-so-commodity hardware, so the migration effort is immense.”

Specifically, the bank has sought to build a Business Event Bus (BEB) as the underlying mechanism through which business events can be shared across the organization. Kafka is the main technology powering the flow of data within the BEB, while Kafka Streams is the component that executes business logic on the stream of data.
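
To make the idea concrete, here is a minimal sketch of what publishing a business event onto a BEB topic could look like with the plain Kafka producer API. The topic name, broker address, account key, and JSON payload are illustrative assumptions, not Rabobank's actual setup.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BusinessEventPublisher {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");               // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // A debit event published to an assumed BEB topic, keyed by account number so that
            // all events for one account land in the same partition and stay in order.
            producer.send(new ProducerRecord<>(
                    "business-events",
                    "NL00RABO0123456789",                                // illustrative account key
                    "{\"type\":\"DEBIT\",\"amount\":42.50,\"currency\":\"EUR\"}"));
        }
    }
}
```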

Like all major financial institutions, Rabobank has extensive investments in application and data integration systems used to connect various business systems. According to an IBM case study, Rabobank invested in IBM's Sterling Commerce electronic data interchange (EDI) middleware to facilitate communication with partners for non-core activities, such as ordering cash for ATMs. Internally, Rabobank uses other technologies for secure application-to-application communication, including IBM's MQSeries message bus.

For communicating with its customers, Rabobank used a custom-developed system called Rabo Alerts.

“Rabo Alerts is a service that allows Rabobank customers to be alerted whenever interesting financial events occur,” van Disseldorp writes. “A simple example of an event is when a certain amount was debited from or credited to your account, but more complex events also exist. Alerts can be configured by the customer based on their preferences and sent via three channels: email, SMS and mobile push notifications.”

Rabo Alerts isn’t a new service. In fact, it’s been in production for over 10 years and is available to millions of account holders. However, the system wasn’t able to keep up with customer expectations. “The former implementation of Rabo Alerts resided on mainframe systems,” van Disseldorp writes. “The implementation was very stable and reliable, but there were two issues that Rabobank wanted to solve: (1) lack of flexibility and (2) lack of speed.”

Enter the Kafka

With the old Rabo Alerts system, it could take up to five hours before a customer received his or her notification. That may have been fine for 2007, but it’s a tad slow for 2017’s technically sophisticated and demanding culture.

Similarly, developers found the old system somewhat rigid, which made it difficult to add new alerts to the service. Consumers are becoming more and more comfortable with electronic banking, and capabilities such as seeing when a paycheck has been deposited or whether there were insufficient funds to pay a bill are growing in importance.

The company made the decision to build a new Rabo Alerts system on a Kafka messaging backbone, with Kafka Streams providing the intelligence and business logic atop the data flowing through the BEB. As van Disseldorop explains, the first step was redesigning the alerting process upon a Kafka-based system that could do several things, including:

  • Tap into the core banking system to view transactions;
  • Translate account numbers to a list of customers with read permission on the account;
  • Look up Rabo Alert settings for each account number;
  • Determine if the transaction meets alert criteria;
  • And send out an alert if it meets the criteria.
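
The steps above map naturally onto a stream-processing topology. As a rough illustration only, the Kafka Streams sketch below joins a transaction stream with account-to-customer and alert-settings tables, filters on the alert criteria, and writes matching events to an output topic read by the senders. The topic names, string-encoded values, and criteria check are assumptions, not Rabobank's actual design.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class AlertTopologySketch {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Step 1: transactions from the core banking system, keyed by account number (assumed topic).
        KStream<String, String> transactions = builder.stream("payment-transactions");

        // Steps 2 and 3: customers with read permission and alert settings per account,
        // modeled as changelog-backed tables (assumed topics).
        KTable<String, String> accountCustomers = builder.table("account-customers");
        KTable<String, String> alertSettings = builder.table("alert-settings");

        transactions
                // Step 2: translate the account number to the customers allowed to see it.
                .join(accountCustomers, (txn, customers) -> txn + "|" + customers)
                // Step 3: look up the Rabo Alerts settings configured for the account.
                .join(alertSettings, (enriched, settings) -> enriched + "|" + settings)
                // Step 4: keep only transactions that match the customer's alert criteria.
                .filter((account, enriched) -> matchesAlertCriteria(enriched))
                // Step 5: hand matching events to the senders (email, SMS, push).
                .to("alerts-to-send");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "rabo-alerts-sketch");     // assumed id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");       // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        new KafkaStreams(builder.build(), props).start();
    }

    // Placeholder for the real criteria check (thresholds on balance, credits and debits).
    private static boolean matchesAlertCriteria(String enrichedEvent) {
        return true;
    }
}
```

In a real deployment the joins would use proper serdes for the event types, and the permission and settings tables would be kept up to date from the bank's master data rather than hard-coded topic contents.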

Axual configured each of these data components as Kafka topics flowing through the Kafka cluster (actually two parallel clusters for redundancy). The Kafka Streams API provides the interface to a series of microservices written in Spring Boot, a Java framework developed by Pivotal.
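
Van Disseldorp's post does not show the service code itself, but with recent spring-kafka versions a Spring Boot microservice hosting a Kafka Streams topology could be wired roughly as follows. The application id, broker address, and topic names here are assumptions for illustration.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.annotation.KafkaStreamsDefaultConfiguration;
import org.springframework.kafka.config.KafkaStreamsConfiguration;

@SpringBootApplication
@EnableKafkaStreams
public class AlertServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(AlertServiceApplication.class, args);
    }

    // spring-kafka builds and starts the KafkaStreams instance from this configuration bean.
    @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
    public KafkaStreamsConfiguration streamsConfig() {
        Map<String, Object> props = new HashMap<>();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "rabo-alerts-service");      // assumed id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-cluster-a:9092");  // placeholder; one of two clusters
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        return new KafkaStreamsConfiguration(props);
    }

    // The topology itself; in the real system this is where the alerting logic would live.
    @Bean
    public KStream<String, String> alertPipeline(StreamsBuilder builder) {
        KStream<String, String> events = builder.stream("business-events");  // assumed topic
        events.to("alerts-to-send");                                         // assumed topic
        return events;
    }
}
```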

The new Rabo Alerts function now runs on Kafka Streams

Van Disseldorp developed the first rudimentary version of the system, which could process four types of alerts (the balance is above or below a threshold, and credits or debits are above a threshold), and decided to take it for a test drive. The results were good. “The whole round trip from payment order confirmation to the alert arriving on a mobile device takes only one to two seconds, more often being around one rather than two,” he writes.
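
For reference, the check behind those four initial alert types could be as simple as the sketch below. The parameter names and the exact boundary conditions are illustrative guesses, since the real settings are configured per customer.

```java
public class AlertCriteria {

    // Assumed thresholds; in the real service these come from each customer's alert settings.
    public static boolean shouldAlert(double amount, boolean isCredit, double balanceAfter,
                                      double balanceAboveThreshold, double balanceBelowThreshold,
                                      double creditThreshold, double debitThreshold) {
        boolean balanceAbove = balanceAfter > balanceAboveThreshold;   // balance rose above a threshold
        boolean balanceBelow = balanceAfter < balanceBelowThreshold;   // balance dropped below a threshold
        boolean creditAbove  = isCredit && amount > creditThreshold;   // a credit above a threshold
        boolean debitAbove   = !isCredit && amount > debitThreshold;   // a debit above a threshold
        return balanceAbove || balanceBelow || creditAbove || debitAbove;
    }
}
```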

Much of that time can be attributed to the payment factory, and its need to validate the payment order and process the transaction, he writes. The Kafka portion of the alerting chain — from the time an entry is posted on Kafka to the time the senders push out a message to a customer — is typically executed within 120 milliseconds.

The new Kafka-based system is “elegant and simple, consisting of only a few Java classes,” van Disseldorp writes. While the code took about four weeks to write, architecting the topology took about six months.

After some internal testing by Rabobank employees, the new system went live on June 8. “It was a very exciting moment for us,” van Disseldorp writes, “not only because it works, but also because we can never go back.” The next step involves migrating about 10 other alert types from the mainframe to the new Kafka system, he writes.

“And it does not stop there! The new implementation also spurred a flood of new ideas, which we’ll be able to talk publicly about (and even demonstrate) soon,” he writes.

Related Items:

A Peek Inside Kafka’s New ‘Exactly Once’ Feature

Kafka ‘Massively Simplifies’ Data Infrastructure, Report Says

The Real-Time Future of Data According to Jay Kreps
