Data flow has been stuck in the past, slowing down decision making in key areas such as supply chain, retail and finance. The data agility required during the pandemic was a baptism of fire for many organizations.
Companies needed to change operations and business models more quickly than their well-worn legacy applications would allow. Trying to rearchitect a business model so quickly ultimately exposed some significant gaps between IT systems – gaps that limited organizations’ ability to quickly move essential data.
Going forward, there is a clear and pressing need to get data “in motion” and at speed, to deliver the real-time insights craved by business leaders, employees and, most of all, the customers they serve.
As a result, businesses are looking at how they can best stream data events across their organization, whether for customer requests, inventory updates, or sensor readings.
With this in mind, here are my predictions for three emerging trends in streaming data:
1) Data Will Shift from Static to Fluid
In 2022 and beyond, moving data in real-time between increasingly distributed application architectures will be a high priority.
In response, Forrester sees event-driven architecture (EDA) as the first key trend impacting software development this year. Specifically, “the growth of distributed application architecture hits a wall when only using synchronous APIs for integration due to fragility and scalability limitations. Over the years, EDA has gained more interest as it addresses this wall through APIs, microservices, and integration. We predict that in 2022, that interest will expand, with 35% of enterprises putting a major focus on EDA.”
EDA enables data to be moved in real-time event streams via an event mesh – helping link together previously unconnected processes across a business running multiple siloed systems. A global study of C-suite and IT architecture professionals found that 85% of organizations are looking to incorporate real-time data and EDA into their operations.
Leading edge industries looking to exploit real-time movement of data include financial services, retail and manufacturing – all industries where automation, APIs and IoT technologies are converging. As we expand out from this event-driven approach to move data in real-time, we see its mainstream impact on some more macro-level trends set to define the next twelve months.
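The core idea behind the event-driven approach is that producers and consumers of data never talk to each other directly – they publish and subscribe to topics on a broker, so one event can fan out to any number of downstream processes. A minimal in-memory sketch in Python (illustrative only – a production event mesh would use a real broker such as Solace PubSub+, Kafka, or MQTT, and the topic names and event fields here are invented for the example):

```python
from collections import defaultdict
from typing import Callable, DefaultDict, List

class EventBroker:
    """Toy in-memory event broker: producers publish to topics,
    consumers subscribe to topics; neither side knows about the other."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan-out: every subscriber on this topic receives the event.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received: list = []

# Two previously unconnected processes react to the same inventory event.
broker.subscribe("inventory/updated", lambda e: received.append(("analytics", e["sku"])))
broker.subscribe("inventory/updated", lambda e: received.append(("replenishment", e["sku"])))

# The publisher emits the event once; the broker delivers it to both.
broker.publish("inventory/updated", {"sku": "SG-100", "on_hand": 42})
```

The point of the pattern is the decoupling: adding a third consumer (say, a fraud check or a dashboard) requires no change to the publisher – which is what lets an event mesh link siloed systems without rewiring them.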
2) 5G Multiplies Data Flow Across the Supply Chain
We’ve seen during the pandemic the need for smart inventory management, product tracking and supply chain optimization, all of which rely heavily on the essential flow of data. The data underpinning these processes is wide and all-encompassing – think location, weather, order status, and the whole lead-to-cash and source-to-pay processes. All of this is becoming real-time and connected as event streams.
Enter 5G. If we combine this event-driven infrastructure with faster connectivity such as 5G, we see the potential for richer data to move quickly over an event mesh, between processes such as eCommerce, CX, warehouses, plants, transport and logistics, and business insights and reporting – all in real-time.
Statistics indicate that 5G increases peak data rates from 1 Gbps to 20 Gbps, grows data traffic from 7.2 exabytes/month to 50 exabytes/month, and reduces latency from 10ms to under 1ms.
Everyone involved in a supply chain – from finance to manufacturing and distribution and, critically, the end-customer – will become more connected via real-time streams.
3) The Metaverse Will Boost Streaming Data
In the next few years, how do you think we will buy a new pair of sunglasses, or the latest dress or shoes online? Your avatar will roam a virtual mall in 3D, try items on in 3D and pay, all with the click of a button or a tap on a smart device. Welcome to the Metaverse, where research finds that 66% of consumers say they are particularly interested in using AR.
Granted, we are still in the Nokia 3310 era of virtual reality; a smartphone-like form factor is probably around five years away. But the common theme will be moving data at speed to support such next-gen Metaverse interactions – and the key enabler in making this happen is event-driven data architecture.
Consider the potential impact of the Metaverse on another data-heavy industry, financial services, where transaction numbers are huge and systems have to run 24x7x365. Decentralized Finance (DeFi) is on the rise, with predictions that the market will reach $800 billion within the next year.
The convenience of a decentralized approach is clear to see, with financial institutions bypassing intermediaries to deal directly with each other in a secure, rapid manner. But to make Decentralized Finance a reality, financial institutions must be able to interface their core systems with a distributed ledger, and this is where event brokers – and in turn the event data mesh – come into play.
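To make the bridging role of the event broker concrete, picture a core banking system publishing settled transactions as events, with a ledger adapter subscribing and appending each one to a hash-chained, append-only log. The sketch below is a deliberately simplified stand-in for a distributed ledger node (the transaction fields and class names are invented for illustration; real DLT integration involves consensus, signing, and replication that are out of scope here):

```python
import hashlib
import json

class HashChainedLedger:
    """Toy append-only ledger: each entry's hash covers the previous
    entry's hash, so tampering with history breaks the chain."""

    def __init__(self) -> None:
        self.entries: list = []

    def append(self, event: dict) -> None:
        # Genesis entries chain back to a well-known zero hash.
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev, "hash": digest})

ledger = HashChainedLedger()

# In practice these events would arrive via the event broker's
# subscription callback; here we append them directly.
ledger.append({"tx": "T-001", "amount": 250.0})
ledger.append({"tx": "T-002", "amount": 99.5})
```

Because the adapter consumes events rather than calling the core system's internal APIs, the same settlement stream can feed the ledger, reporting, and compliance consumers without the core system knowing any of them exist.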
McKinsey believes there is more to be done to address the digital divide in customer experiences, and data will be at the core of this change: “With the rise of big data and predictive analytics, companies can now build a customer-management capability that is holistic, predictive, prioritized, and value-focused.”
The Metaverse (or Omniverse) by definition cannot be static, non-reactive, or disconnected from the physical world. It needs to be a 3D digital twin of the real world. Most important, it needs to be connected in real-time with the real world; it needs to be in motion. The Metaverse needs an event-driven architecture and a real-time event mesh to support it.
About the Author:
Sumeet Puri, Chief Technology Solutions Officer, Solace