Event-based integration in DNB

By
  • Atif Usman
Oct. 20, 2023 · 4 min. read time
  • Data architecture
  • AWS

Modern data architectures allow organizations to process and analyze large volumes of data in real time. By utilizing event-based integration, we can build reliable and scalable data architectures that adapt to changing business needs and help organizations stay ahead in today's rapidly evolving world.

The last decade has seen an immense rise in the popularity of event-driven architecture. It has been widely adopted for implementing asynchronous, real-time event-processing solutions. In this blog, we will explain the concept of event-based integration, why organizations should invest in it, and some key factors to consider when designing event-driven applications in general.

What is event-driven integration?

This is a frequently asked question, and answering it requires an understanding of two related terms: event-driven architecture and event-driven integration, and how the two differ.

Event-driven architecture (EDA) is a design pattern focused on application decoupling, in which loosely coupled applications asynchronously produce and consume events via an event broker. "Event" is the industry term for a message in EDA. The simplest definition of an event is "a signal that a system state has changed". These signals must carry enough data for consumers to act upon. In event-driven architecture, applications (or the teams behind them) take responsibility for the data, such as its creation, transformation, validation, and filtering, and for communication with an event broker such as Apache Kafka. The event broker decouples producers from consumers by retaining each message for a defined time period, so that any consumer can read, or re-read, messages whenever it is ready.
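The decoupling described above can be illustrated with a minimal in-memory sketch. This is not Kafka itself, just a toy stand-in showing the two properties the text names: time-based retention, and consumers that progress at their own pace via their own offsets. All class, topic, and field names here are illustrative.

```python
import time
from collections import defaultdict


class EventBroker:
    """Toy broker: retains every event for a set period so consumers
    can read (or re-read) whenever they are ready, mimicking the
    retention-based decoupling an event broker like Kafka provides."""

    def __init__(self, retention_seconds=3600):
        self.retention = retention_seconds
        self.topics = defaultdict(list)  # topic -> [(timestamp, event), ...]

    def publish(self, topic, event):
        # Producers only know the broker, never the consumers.
        self.topics[topic].append((time.time(), event))

    def read(self, topic, offset=0):
        # Each consumer tracks its own offset, so it progresses
        # independently of the producer and of other consumers.
        now = time.time()
        retained = [e for ts, e in self.topics[topic]
                    if now - ts <= self.retention]
        return retained[offset:]


broker = EventBroker()
broker.publish("account-events",
               {"type": "AccountUpdated", "account_id": "42", "balance": 100.0})
broker.publish("account-events",
               {"type": "AccountUpdated", "account_id": "42", "balance": 95.5})

# A consumer that joins later still sees both events.
events = broker.read("account-events")
```

A real broker adds partitioning, replication, and durable storage on top of this idea, but the contract toward producers and consumers is the same.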

Event-driven integration is an application of EDA: it applies EDA to data-integration problems, moving data asynchronously between multiple applications. It allows an application to take a subset of the data it produces (data that may be useful to others), package it as an event, and share it with other applications via an event broker, using data-movement patterns such as pub/sub, data streaming, and queuing. In large organisations, many teams develop applications, each with its own ways to connect, store, transform, enrich, validate, and transmit data. These are typical integration concerns. Event-driven integration builds on EDA with supporting technologies and brings standardization to such integration challenges.
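The "subset of their data" idea above can be sketched as a small mapping function: the producing team publishes only the agreed fields, keeping internal details out of the contract other teams depend on. The record and event fields below are hypothetical examples, not a real DNB schema.

```python
# Internal record with fields other teams should NOT depend on.
internal_customer = {
    "customer_id": "C-1001",
    "name": "Ola Nordmann",
    "internal_risk_score": 0.82,        # internal only
    "crm_notes": "follow up in Q4",     # internal only
    "segment": "retail",
}


def to_customer_event(record):
    """Map an internal record to the published event: only the agreed
    subset, with enough data for consumers to act without calling
    back to the producing application."""
    return {
        "event_type": "CustomerSegmentChanged",
        "customer_id": record["customer_id"],
        "segment": record["segment"],
    }


event = to_customer_event(internal_customer)
```

Keeping the published event narrow is what lets the producer evolve its internal model without breaking every subscriber.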

Picture-1.jpg

Why is event-driven architecture important for today's organisations?

EDA has evolved over many years. It began with the electronification of trading, online gaming, and betting in the early 1990s, and today it is becoming mainstream, especially in financial services and in any major enterprise going through digital transformation.

Why is it important?

The reason is disruption in technology and business. For example, look at the major micro-trends of disruption within banking. Originally there was only online banking (in the early internet era), but now users arrive through a variety of internal and external channels: retail and corporate channels, mobile apps, PSD2, ERP systems, and more, and a true omnichannel experience is not far away. Imagine the rapid increase in volumes, and consider what happens behind the scenes in the downstream applications.

All these technology shifts require a more sustainable environment than ever before. On one side there are scalability challenges; on the other, sky-high customer expectations of information that is nothing less than real-time. This demands real-time information flow and application access to data at scale.

Events appear throughout the organisation all the time. The ability to move those events from where they occur to where they can be useful will dictate future success. Companies that do this well improve operational efficiency, customer experience and satisfaction, reduce costs, and become more innovative.

At the heart of all of this is event-driven architecture. As we push through these technology disruptions, EDA is positioned to become the lifeblood of the organisation.

What key factors should be kept in mind when designing applications on an event-driven architecture?

Events fire every moment within the environment, at a considerably high frequency. Whether technical or business events, they are valuable to the organisation. In addition, timing and speed are critical to realising that value.

The value of an event diminishes dramatically with time: not over weeks or months, but over seconds, and sometimes in less than a second. Events go from being a pillar of operational or business efficiency to being useful only for historical purposes. For example, if a road incident is reported but the event reaches its subscribers hours later, it may be of no use.
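The time-decay described above often shows up in code as a freshness check: events older than some window are routed to historical storage rather than triggering real-time action. Below is a minimal sketch; the five-minute window and the event fields are assumed values for illustration, not a DNB standard.

```python
from datetime import datetime, timedelta, timezone

# Assumed freshness window for a real-time alert such as a road incident.
MAX_AGE = timedelta(minutes=5)


def is_actionable(event, now=None):
    """True if the event is still fresh enough to act on in real time;
    older events are only of historical value."""
    now = now or datetime.now(timezone.utc)
    occurred = datetime.fromisoformat(event["occurred_at"])
    return now - occurred <= MAX_AGE


fresh = {"type": "RoadIncident", "occurred_at": "2023-10-20T08:00:00+00:00"}
stale = {"type": "RoadIncident", "occurred_at": "2023-10-20T05:00:00+00:00"}
now = datetime.fromisoformat("2023-10-20T08:02:00+00:00")
```

In practice the same split drives architecture: fresh events go to real-time consumers, everything is also retained for batch or analytical use.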

Picture-2.jpg

As today's applications become more agile, moving data dynamically across platforms, closer to the applications that need it, is key. This introduces an important concept and layer: the event mesh. In DNB, this is complemented by the event-hub platform.

What is the event-hub platform?

Event-hub is a data-streaming platform: an ecosystem of software components that enables application teams to govern, connect, collect, persist, and process real-time events at scale. At its core, the event-hub platform runs Apache Kafka, a distributed storage system built for high scalability, fault tolerance, and availability. Kafka is known for its large-scale data-processing capabilities, and the platform adds a set of components that enable applications to ingest and process data events and define data models.
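Part of how Kafka achieves that scalability is key-based partitioning: events with the same key always land on the same partition, preserving per-key ordering while spreading load across partitions. The sketch below is a simplified stand-in for that idea (Kafka's default partitioner actually uses a murmur2 hash; MD5 is used here only because it is in the standard library, and the partition count is an assumed topic setting).

```python
import hashlib

NUM_PARTITIONS = 6  # assumed topic configuration


def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Deterministically map an event key to a partition: the same key
    always yields the same partition, so events for one entity stay
    ordered while different entities are processed in parallel."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions


# All events for one account go to one partition; other accounts
# may land on other partitions and be consumed concurrently.
p1 = partition_for("account-42")
p2 = partition_for("account-42")
```

Choosing a good key (account, customer, order) is therefore one of the most important design decisions when putting an application on the platform.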

Picture-3.jpg

Strategically, the event-hub platform will be deployed across multiple clouds to maintain data gravity while sharing required events across platforms, creating a data mesh across the DNB IT landscape.

Picture-4.jpg
Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of DNB.

© DNB
