
The key to staying competitive in an always-on, data-driven world

Part of the value of data stream processing lies in its ability to guarantee up-to-date, accurate information

In today’s digital economy, innovation, pricing, and a slew of other factors that determine business competitiveness are heavily influenced by the quality and availability of data. We see this daily across the GCC – businesses individualising their marketing campaigns, services, products, and aftersales in a hundred different ways through data made available in real time. The frontline professionals who use this data rely on the backend systems that make digital processes possible. Speed up your ability to process data and you increase the rate at which you receive insights – and, in turn, the rate at which you can innovate.

How is this acceleration achieved? Through data stream processing. This method allows organisations to handle continuous, real-time flows of data and innovate fluidly rather than waiting for batch processing to feed them insights periodically. We lived with batch processing for years, and it still fits some modern use cases where time is not a factor. But if you are in a neck-and-neck race to gain an insight that will pull you ahead, you must be able to access data that is accurate, current, and delivered at high velocity.

Holger Temme, Director of Product at Confluent

Stream on

To understand the importance of real-time stream processing, consider just one use case (albeit one that ultimately impacts all businesses and consumers) – digital payments. The UAE’s market for digital payments alone is projected to reach a transaction value of almost US$29 billion this year. Spare a thought for the authorities trying to sift fraudulent transactions from legitimate ones in the seconds it takes those transactions to complete.

Such real-time checks are made possible by data stream processing. Through stream processing, IT teams can break millions of transactions into billions of discrete events that can be processed the moment they occur. For this to work effectively, we must turn to event-driven architecture, such as that provided by technology rooted in the Apache Software Foundation (ASF). ASF has won a special place in the hearts of developers and is recognised worldwide across the open-source community. By embracing that community and finding ways to bring developers together, the Foundation has resonated with individual developers and organisations alike, making Apache projects a cornerstone of the open-source ecosystem.
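To make “transactions becoming events” concrete, here is a minimal sketch using the standard Apache Kafka producer client. The topic name (“payments”), the stage names, the card identifier, and the toy “stage,amount” value encoding are all illustrative assumptions rather than a prescribed schema; the point is simply that one transaction is published as several small events, each inspectable the instant it lands on the log.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // One card transaction is decomposed into a series of events.
            // Topic, stage names, and encoding are hypothetical.
            String cardId = "card-42";
            double amount = 1_250.00; // illustrative amount
            for (String stage : new String[] {"AUTH_REQUESTED", "RISK_SCORED", "SETTLED"}) {
                // Keying by card ID keeps all events for one card in order.
                producer.send(new ProducerRecord<>("payments", cardId, stage + "," + amount));
            }
        }
    }
}
```

Because every stage is its own event, a fraud check need not wait for the whole transaction to finish; it can act on the authorisation request alone.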

Apache delivers event-driven architecture (essential for stream processing) through two pillars: storage and processing. Apache Kafka is a distributed streaming platform that acts as the storage layer, designed for developers to build high-performance data pipelines, streaming analytics, and more. Apache Flink is an open-source stream-processing framework that provides the distributed processing engine. Together, Kafka and Flink allow organisations to deliver the real-time data stream processing architecture that leads to real advantage. This potent combination delivers the speed that makes fluid insights possible. But we should remember that it only delivers the data; it is up to the enterprise to devise ways of interpreting and acting on it.
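As a sketch of how the two pillars fit together, the following Flink job reads the hypothetical “payments” events above from Kafka (the storage pillar) and flags large amounts as they stream past (the processing pillar). The broker address, topic, and the simple threshold rule are placeholders; a production system would use a proper schema and a trained risk model rather than a hard-coded cut-off.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class PaymentsPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Storage pillar: Kafka holds the durable, replayable event log.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("payments") // hypothetical topic from the sketch above
                .setGroupId("payments-pipeline")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Processing pillar: Flink transforms events as they arrive.
        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "payments-source");

        events
                // Toy "stage,amount" encoding from the producer sketch.
                .filter(value -> Double.parseDouble(value.split(",")[1]) > 10_000) // placeholder rule
                .print(); // a real job would alert or write to a sink

        env.execute("payments-pipeline");
    }
}
```

The same pipeline could keep richer state – per-card running totals, velocity checks, windowed aggregates – which is where Flink’s distributed engine earns its keep.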

Enabler of effective AI

Part of the value of data stream processing lies in its ability to guarantee up-to-date, accurate information. We must learn an important lesson about the nature of results: they are only as good as the underlying data that feeds them. When we deliver data stream processing, we move away from traditional AI models built on static data – data that can expire and end up giving results that are only relevant in a historical context, albeit a history that is only a few weeks old. Imagine an advanced AI model built on data that is continuously refreshed. Its results could be just as current.
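A minimal sketch of what “continuously refreshed” can mean in practice, again using the standard Kafka consumer client: instead of a nightly batch job recomputing a per-card spend figure, the figure is updated the moment each event arrives, so any model reading it scores against current behaviour. The topic, the “stage,amount” schema, and the in-memory map standing in for a feature store are all assumptions made for illustration.

```java
import java.time.Duration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LiveFeatureUpdater {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "live-features");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Stand-in for a feature store: per-card running spend, always current.
        Map<String, Double> liveSpend = new HashMap<>();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payments")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    double amount = Double.parseDouble(record.value().split(",")[1]);
                    // The feature refreshes per event, not per overnight batch,
                    // so a model reading it sees up-to-the-second behaviour.
                    liveSpend.merge(record.key(), amount, Double::sum);
                }
            }
        }
    }
}
```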

As you can see, stream processing is the future of AI models. Many of the caveats around AI centre on data quality and relevance. A model that corrects itself with fresh data can rid itself of drift and remain in service for longer. Competitive advantage is baked right into this system: with the most up-to-date information available, business leaders can keep pace with ever-changing markets rather than worrying about them.

Once again, to understand the potential impact, let’s turn to an example. Consider two competing retail businesses. Both have hopped on the AI train and use the technology to deliver product-recommendation services. Now imagine that both use plug-ins for weather data, but one relies on overnight batch updates while the other receives real-time updates from meteorologists. The first may be advising shoppers to buy umbrellas because its overnight data predicted rain. But the second has moved on: current data shows a higher probability of sun, so it recommends sunscreen instead. When the day is done and customers from both websites compare notes, the second will have gained a bump in brand reputation.

Right time, right place

Stream processing has real-world advantages. It empowers businesses to make better decisions based on the best possible version of the truth at every moment of operation. Stream processing gives a competitive edge to any business in any industry – the right data at the right time in the right context.