Global supply chains are suffering unprecedented disruption in the wake of the COVID-19 pandemic, with the average large business reporting a loss of $182 million in annual revenue as a result. But many supply chain challenges predate the chaos of the last couple of years. Digital transformation across domestic and global supply chains is long overdue.
Improved visibility, increased flexibility and effective communication are crucial for efficient and resilient supply chain operations. Yet the focus until now has been on generating data, which is why the Internet of Things (IoT) has not lived up to its hype as a cure for supply chain problems. More than 10 billion IoT devices worldwide constantly add data to already overflowing data stores. The problem is not a lack of data, so IoT on its own is not the answer.
In isolation, IoT data is meaningless; it’s just another strand of data. Supply chains inherently comprise many stakeholders, each performing its own critical functions which, collectively, make up the overall supply chain network. The data generated across the supply chain sits siloed among these many stakeholders. IoT data does add value, but without context, not very much.
The missing ingredient that’s key to unlocking the hidden value buried deep in the supply chain is to bring the various strands of data together in a way that provides meaning. For example, many vehicles are fitted with some form of tracking and monitoring capability that can report swathes of information wirelessly. But unless the IoT device is paired with a specific vehicle registration, that data is fairly pointless. What the vehicle is transporting, for whom, and to where is the information that adds value — providing context and helping organizations harness data to drive operational efficiency.
Data accessibility at scale
This simple example highlights the real challenge of supply chain digital transformation. The picture that business leaders need in order to cope with the ever-growing challenges of modern supply chains requires data to be captured and brought together from a myriad of supply chain systems, owned, operated and controlled by numerous independent organizations. The battleground now is data accessibility at scale.
There is much debate as to how this can be achieved. But it all boils down to four central questions. How do you connect to the various systems to gather the data you need, when they all have different maturities and interfacing capabilities? How can you ensure the data is accurate and trusted? How can the data be put together in a scalable, consistent and coherent way? How can the security and privacy of the respective organizations be maintained, ensuring only relevant data is captured and shared only with credentialed others?
Intelligent data orchestration
The breakthrough has come from new data mesh technology, which is based on a distributed architecture and enables users to easily access and query data where it lives, without first transporting it to a data lake or data warehouse. Sourcing data from the systems that organizations use in the day-to-day running of their operations adds confidence in its accuracy and validity: if a domain fails to keep the data in its own systems accurate, its own business suffers.
Using data mesh technology, each system connects directly and only to a central “conductor” platform. The conductor platform must be flexible enough to interface with target systems in a way that requires little or no change — for example, with APIs, via FTP, or perhaps even offering manual entry applications to facilitate data capture from systems that do not have external interface capabilities.
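As a rough illustration of that pluggable-connector pattern, here is a minimal Python sketch of a conductor platform that pulls records from heterogeneous systems through a common interface. All class and field names (`Connector`, `ApiConnector`, `ManualEntryConnector`, `Conductor`) are hypothetical, not a real product API, and the API connector is stubbed rather than making real HTTP calls.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """One adapter per target system; each returns records in a common shape."""
    @abstractmethod
    def fetch(self) -> list[dict]:
        ...

class ApiConnector(Connector):
    """For systems with an API. The HTTP call is stubbed for illustration."""
    def __init__(self, records_endpoint: str):
        self.records_endpoint = records_endpoint

    def fetch(self) -> list[dict]:
        # In practice this would be an HTTP GET against self.records_endpoint.
        return [{"source": "api", "consignment_id": "C-100"}]

class ManualEntryConnector(Connector):
    """Fallback for systems with no external interface: data keyed in by hand."""
    def __init__(self):
        self.queue: list[dict] = []

    def submit(self, record: dict) -> None:
        self.queue.append(record)

    def fetch(self) -> list[dict]:
        records, self.queue = self.queue, []
        return records

class Conductor:
    """The central platform: every system connects directly and only to it."""
    def __init__(self):
        self.connectors: list[Connector] = []

    def register(self, connector: Connector) -> None:
        self.connectors.append(connector)

    def gather(self) -> list[dict]:
        # One pull across every connected system, regardless of its maturity.
        return [r for c in self.connectors for r in c.fetch()]
```

The design point is that adding a new source system means adding a connector, not changing the target system itself.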
To ensure the data is structured consistently, in a way that humans and IT systems can easily interact with and use, and that only relevant data is captured, “digital twins” are created within the central platform. These predefined twins represent objects within the supply chain. A digital twin of a consignment, for example, provides a central “object” to which all relevant data can be added. Intelligent data orchestration then captures and maps the data, defining and assigning policies as the consignment digital twin is established. Each new piece of data guides the next round of capture, ensuring only relevant data is sourced from the connected systems.
For example, consignment and inventory data can be combined with transport schedules and allocated transport. IoT data captured from a vehicle telematics system can then be added to the relevant consignment digital twin, offering real-time information contextualized to an individual consignment.
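The pairing step can be sketched in a few lines: a consignment twin holds the allocated vehicle registration, and only telematics readings from that vehicle are attached to it. All field names and values here are illustrative assumptions, not a real schema.

```python
# Hypothetical consignment digital twin: the central object to which
# data from separate systems is attached.
consignment_twin = {
    "consignment_id": "C-100",
    "contents": "chilled produce",
    "destination": "Leeds DC",
    "allocated_vehicle": "AB12 CDE",  # pairs IoT data with a registration
    "telemetry": [],
}

# A raw telematics feed covers the whole fleet, not just this consignment.
telematics_feed = [
    {"vehicle": "AB12 CDE", "ts": "2024-03-01T09:00:00", "lat": 51.5, "lon": -0.1},
    {"vehicle": "XY99 ZZZ", "ts": "2024-03-01T09:00:00", "lat": 53.8, "lon": -1.5},
]

def attach_telemetry(twin: dict, feed: list[dict]) -> dict:
    """Attach only the readings from the vehicle allocated to this twin."""
    for reading in feed:
        if reading["vehicle"] == twin["allocated_vehicle"]:
            twin["telemetry"].append(reading)
    return twin

attach_telemetry(consignment_twin, telematics_feed)
```

After this step the raw IoT position stops being an anonymous coordinate and answers a business question: where is consignment C-100 right now?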
Supply chain visibility
When it comes to supply chain visibility, the requirements of each stakeholder are different. The telematics system used by a hauler, for example, will provide visibility of all its vehicles, all the time — something necessary for the company to monitor its vehicle fleet. The manufacturer whose goods are being transported, on the other hand, simply wants to know where their consignment is in real time. They only need the GPS data of a specific vehicle for a specific period of time while the goods are being moved.
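That per-stakeholder scoping amounts to a filter over the fleet feed: one vehicle, one time window. The sketch below shows the idea; the data shapes and the transit window are illustrative assumptions.

```python
from datetime import datetime

def scope_positions(positions: list[dict], vehicle: str,
                    window_start: datetime, window_end: datetime) -> list[dict]:
    """Filter a full-fleet GPS feed down to one consignment's transit window,
    so the manufacturer sees only what is relevant to its goods."""
    return [
        p for p in positions
        if p["vehicle"] == vehicle and window_start <= p["ts"] <= window_end
    ]

fleet_feed = [
    {"vehicle": "AB12 CDE", "ts": datetime(2024, 3, 1, 8, 0)},   # before pickup
    {"vehicle": "AB12 CDE", "ts": datetime(2024, 3, 1, 10, 0)},  # in transit
    {"vehicle": "XY99 ZZZ", "ts": datetime(2024, 3, 1, 10, 0)},  # other vehicle
]

visible = scope_positions(
    fleet_feed,
    vehicle="AB12 CDE",
    window_start=datetime(2024, 3, 1, 9, 0),
    window_end=datetime(2024, 3, 1, 12, 0),
)
```

The hauler keeps full fleet visibility in its own system; only the single in-transit reading crosses the boundary to the manufacturer, which is how privacy between stakeholders is preserved.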
It’s crucial to consider the challenge of supply chain digital transformation in layers. Think of it like an orchestra. First you have the instruments. In isolation these are pretty dumb: capable of producing sounds, nothing more. The same goes for data generation. Then you have the musicians playing individual instruments in a composition. Think here of the point solutions used by single stakeholders for a specific purpose. A telematics system, using the data from IoT devices to monitor vehicles, is a classic point solution.
Standing in front of the musicians is a single conductor responsible for making the various musicians play together — delivering a more comprehensive piece of music collectively than they can individually. In the supply chain, the conductor platform captures and orchestrates data across multiple stakeholders to provide granular visibility.
The layer above is the audience. In a concert hall, the audience listens live, but individual listeners may want to listen to a recording on a portable device. The audience in a supply chain varies. Different stakeholders want different things, and want to interact with different segments of the data in a way that delivers value for them.
The application layer must be capable of delivering to multiple audiences in a variety of ways. But the central conductor platform must be able to deliver outputs that allow this flexibility. After all, without recording equipment, no music could be enjoyed on a portable device.
The digital twins created by the conductor platform must be independent, so that they can be interrogated individually and in different ways. This will ensure that the outputs — the applications — can range from custom dashboards to event-driven push notifications via email or SMS, or via APIs.
Operators and analysts
As each digital twin acquires more data, the dynamic intelligent data orchestration uses that data to capture key events. These events can be distributed to operators across the supply chain, helping to create efficiencies and streamline processes. They can also be interfaced with other supply chain systems — driving automation of processes and removing paperwork.
In addition, these events are plotted to form lifecycle records for each individual consignment. These lifecycle records can be used by analysts for macro analysis — bridging the current chasm between operators and analysts by ensuring they are all working from the same, contextualized data.
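The lifecycle idea can be sketched simply: key events are recorded against a consignment as they happen, the ordered record serves operators, and aggregates over many records serve analysts. Event names and timestamps below are assumed for illustration.

```python
# Hypothetical lifecycle store: one entry per key event, per consignment.
lifecycle: list[dict] = []

def record_event(consignment_id: str, event: str, ts: str) -> dict:
    """Capture a key event against a consignment's lifecycle record."""
    entry = {"consignment_id": consignment_id, "event": event, "ts": ts}
    lifecycle.append(entry)
    return entry

# Events captured as the consignment moves through the supply chain.
record_event("C-100", "collected", "2024-03-01T08:15:00")
record_event("C-100", "arrived_hub", "2024-03-01T11:40:00")
record_event("C-100", "delivered", "2024-03-01T15:05:00")

# Operators react to individual events (e.g. an arrival notification);
# analysts aggregate across records for macro analysis.
delivered_count = sum(1 for e in lifecycle if e["event"] == "delivered")
```

Because both audiences read from the same contextualized record, the operator’s real-time view and the analyst’s macro view can no longer drift apart.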
Research suggests that businesses with optimal supply chains can halve their inventory holdings, reduce their supply chain costs by 15% and triple the speed of their cash-to-cash cycle. And, of course, traditional supply chain monitoring technologies — such as sensors to monitor temperature-controlled goods — have their uses, not least to mitigate disputes if something goes wrong during transit. But they offer very narrow and limited value.
The real power of supply chain visibility technology is that it can start to move the supply chain to a point of autonomy, building on analysis of the detailed consignment lifecycle records generated. Decisions can be made — and processes started or paused — autonomously, for example. And predictions can be made at a macro level, rather than being limited to whether an individual product will arrive on time and in the right condition.
To achieve this kind of transformation, technology needs to be far more embedded in the entire supply chain. Businesses need to be able to connect every system they interact with — and the data needs to flow back into the organization to drive automation. The data also needs to be structured in a way that lends itself to macro analysis, enabling smarter and more precise macro decision-making in the future.
Supply chain visibility needs to offer more than a dot on a map. It’s time to join the dots and navigate the route to digitalization.
Toby Mills is founder and CEO of supply chain visibility company Entopy.