Journey to the event-driven business with Kafka

In case you missed our webinar on the topic of event streaming with Kafka for business, this blog gives you the highlights.

This blog covers why event streaming has become so important for business success and what it takes to become an event-driven organisation with Confluent and Kafka.

There are four key elements a solution needs in order to give the organisation event-sourced, real-time situational awareness: it must be built for real-time events, scale to all data, be persistent and durable and, finally, be capable of enrichment.

Traditional data platforms, like databases, messaging solutions, ETL solutions or data warehouses, fail to meet all four criteria, and this is where event streaming platforms and Apache Kafka come into play.

Apache Kafka was originally developed at LinkedIn, which wanted a unified, high-throughput, low-latency solution for handling real-time data feeds. LinkedIn donated the open-source platform to the Apache Software Foundation, and it has since been widely adopted by many well-known enterprises, like Airbnb, Uber, Netflix, Wikipedia and Twitter. Today, more than 60% of Fortune 100 companies use Apache Kafka.

Pre-streaming era

Companies have been using many different solutions to integrate their systems, applications and data, often siloed within the confines of the various business lines. In doing so, they created maze-like architectures that are expensive to maintain and that communicate primarily through batch processes. This hindered agility and made companies reactive, rather than proactive, to change. Think of fraud, cyber attacks or online shopping: the inability to act in real time when an event occurs can cost the business hundreds of thousands.

Streaming era

Entering the streaming era, organisations need to start thinking in terms of events instead of enterprise architectures. Events are discrete units of information, also known as data points, that originate from the various systems that continuously generate data. With event streaming capabilities and an event-focused architecture, companies enable a constant flow of events that can be retained, transformed and enriched while they are being streamed and, most importantly, consumed in real time. This ability to consume events in real time opens the door for IT to start developing contextual, event-driven applications that act instantly on a specific event that has just happened.
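
To make this concrete, here is a minimal sketch of publishing a single business event with the standard Kafka Java client; the broker address, topic name and payload are assumptions for illustration only.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");              // assumed local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each record is one discrete event: key = order id, value = the event payload.
            ProducerRecord<String, String> event = new ProducerRecord<>(
                    "orders", "order-1001", "{\"status\":\"CREATED\",\"amount\":42.50}");
            producer.send(event);
            producer.flush();
        }
    }
}
```

Any consumer subscribed to the same topic receives the event within milliseconds of it being produced, which is what makes the contextual, event-driven applications described above possible.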

Securing stakeholder sponsorship

Today there are over 200,000 Kafka installations around the world, but many are still at an experimental stage, with IT teams failing to secure stakeholder sponsorship. This is because there is no clear path between the infrastructure use case that architects and data engineers are working on and the business value that this use case can generate.

When IT teams start thinking about use cases and strategic drivers from the business's perspective, and connect the dots of how the infrastructure can support them, the benefits become clearer. Take a financial institution whose key business objective is to mitigate risk: the business use cases could be fraud detection and regulatory compliance. From there, IT will receive requirements such as online security, online fraud detection and faster processing and analysis of transactions. These requirements can be met by event streaming, feeding back up to the business objective of mitigating risk.

It is therefore critical, especially at the early stages of event streaming adoption, for IT to be able to identify and prioritise use cases in terms of value to the business and, of course, feasibility. The higher a use case scores on value and feasibility, the higher its priority should be.

Stage 1: Streaming awareness and pilot

The journey to adopting an event streaming platform typically begins with a developer, infrastructure engineer, data engineer or architect showing early interest in Apache Kafka.

This may be triggered by the business pressing for more timely visibility of business events or, more commonly, by IT's frustration with existing systems and with managing large volumes of data in a resilient and performant way. Here, we typically see one or two individuals within IT experimenting with Kafka.

Two key drivers for a pilot or POC:

  1. Connecting to data stuck in external data stores or legacy systems (see the sketch after this list).
  2. Producing new streams of data (new business models).
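
In practice the first driver is usually addressed with Kafka Connect and an off-the-shelf source connector, but the underlying idea can be sketched in plain Java: poll a table in a legacy database and publish each changed row as an event. The connection string, table, columns and topic below are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LegacyTableToKafka {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");              // assumed broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props);
             // Hypothetical legacy database holding order data.
             Connection db = DriverManager.getConnection("jdbc:postgresql://legacy-host/erp", "reader", "secret");
             Statement stmt = db.createStatement();
             ResultSet rows = stmt.executeQuery(
                     "SELECT id, status FROM orders WHERE updated_at > now() - interval '1 minute'")) {

            // Every recently changed row becomes one event on the 'legacy.orders' topic.
            while (rows.next()) {
                producer.send(new ProducerRecord<>("legacy.orders",
                        rows.getString("id"),
                        "{\"status\":\"" + rows.getString("status") + "\"}"));
            }
            producer.flush();
        }
    }
}
```

A Kafka Connect source connector does the same thing continuously and at scale, handling offsets and failures for you.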

Stage 2: Early production streaming

This stage focuses on evolving the small-scale project into an 'Event Pipeline'. At this stage we typically see some, albeit minimal, funding, which enables the small team from stage 1 to grow into a slightly larger, coordinated group - the Product team.

This first team is typically responsible for both building the platform and supporting it. The ‘project’ evolves from ‘set-up’ to ‘continuous delivery’.

Value drivers of business-application adoption:

  1. Connecting to data stuck in external data stores or legacy systems.
  2. Producing new streams of data (new business models).

Stage 3: Mission critical, integrated streaming

First, the team running the Universal Pipeline (from stages 1 and 2) tends to move towards a shared-service, or centralized, utility serving and integrating apps across multiple LoBs.

With centralization come efficiencies and economies of scale.

That is, an enterprise should aim to avoid creating a different streaming platform for each business unit or LoB, reinventing the wheel each time.

Mission critical: The streaming platform becomes tied more closely to the business. Capabilities that matter include security, durability and exactly-once guarantees, together with the ability to monitor event flows across multiple applications and maintain data-completeness SLAs.
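
As a concrete illustration of those capabilities, the sketch below shows the settings the standard Kafka Java producer exposes for durability and exactly-once delivery: acknowledgement from all in-sync replicas, idempotent writes and a transaction that makes a batch of events atomic. The topic names and transactional id are hypothetical.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ExactlyOncePaymentsProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");               // assumed broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all");                                       // wait for all in-sync replicas (durability)
        props.put("enable.idempotence", "true");                        // no duplicates on retry
        props.put("transactional.id", "payments-producer-1");           // enables transactions (hypothetical id)

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            producer.send(new ProducerRecord<>("payments", "tx-42", "{\"amount\":99.90}"));
            producer.send(new ProducerRecord<>("payment-audit", "tx-42", "{\"status\":\"RECORDED\"}"));
            // Either both events become visible to consumers, or neither does.
            producer.commitTransaction();
        }
    }
}
```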

Integrated streaming: Kafka helps break silos by joining events that originate in separate parts of the business. A streaming platform can be used to join streams of business facts and provide a unified view of data in real time.
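
For example, a small Kafka Streams application (sketched below for recent client versions, with hypothetical topic names) can join an 'orders' stream owned by one line of business with a 'payments' stream owned by another, correlating events by key within a time window and publishing the unified view as a new stream.

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;

public class OrderPaymentJoin {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-payment-join");   // assumed application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");      // events from one LoB
        KStream<String, String> payments = builder.stream("payments");  // events from another LoB

        // Correlate order and payment events that share a key and arrive within 10 minutes of each other.
        orders.join(payments,
                    (order, payment) -> "{\"order\":" + order + ",\"payment\":" + payment + "}",
                    JoinWindows.ofTimeDifferenceWithNoGrace(Duration.ofMinutes(10)))
              .to("orders-with-payments");                              // the unified, real-time view

        new KafkaStreams(builder.build(), props).start();
    }
}
```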

Stage 4: Global streaming

Once the event-streaming platform proves itself for mission-critical use cases, things often accelerate - and fast.

This is when organizations enter the flywheel of adoption; the more apps you add, the more events there are and the more powerful the platform becomes.

In some large enterprises, we’ve seen over 30 new applications leveraging Confluent Platform in just a matter of weeks.

Challenges a business needs to address:

  • How do you make data available across different data centers or regions?
  • How do you serve data efficiently from closer geos?
  • How do you implement data sovereignty rules, like GDPR?


Stage 5: Central nervous system

Finally, your event streaming platform becomes ubiquitous, and contextual, event-driven applications become the enterprise standard.

Here, the business has largely transformed into a real-time digital business. You will be responsive to your customers and be able to create business outcomes in a way that was never possible before.
