Supply chains have become so complex and tangled that the traditional way of navigating everything from suppliers and inventory to transportation and analytics has been upended.
As with all messes, there is deep commercial opportunity for any company that can handle clean-up in a novel, more streamlined way. In a post-2020 world, that need is stronger than ever — but it will take more than a traditional supply chain management system. It will need more data from more sources, all delivered and synthesized as close to real-time as possible.
Few companies can bring together so many disparate data sources with the backend platform tooling to match. Google is one of them.
The catch is that you have to use the cloud, and it has to be Google’s. Then again, supply chain management systems have always been proprietary and impenetrable.
Where else can companies most sensitive to supply chain disruptions go when each day seems to bring a new bottleneck? Weather, container and port backups, delayed flights and trucks, supplier outages and disruptions: these are all macro issues that are difficult to integrate into existing supply chain management systems in near-real-time.
It might be that only the hyperscalers can offer the kind of nuanced supply chain impacts companies need. They have the hooks into a massive data partner ecosystem, their own data in spades, the compute to bring to the table, and all the leverage in the world to get the big ERP and supply chain software makers to partner.
In other words, why let those ISVs have all the share when to get the grander view (and all the infrastructure) takes far more than a single database? AWS has had complex supply chain offerings for years and Microsoft has worked with SAP to get its own approach to supply chain and logistics down to a science. But when it comes to truly big data — the kind that matters in the post-2020 case — Google might have the most compelling tools at its fingertips.
Software startups, take heed: there may not be an opportunity to “disrupt” the supply chain software space, but there is big value in scoring a slot as a focused “data component” provider. 2021 might be the best possible year to get in on the ground floor with a highly nuanced, risk-driven assessment tool or service that feeds the giant engines of supply chain understanding, and the worst for anyone hoping to release a “mega-platform” despite the clear need. In this case in particular there is truly no competing with the cloud, especially once the integration piece is solved. That assumes, of course, that supply chain victims are in any mood for a big technology shift in the middle of the worst year imaginable.
Google Cloud is doing everything possible to make this offering look non-traditional, capitalizing on the AI buzz and on more under-the-radar concepts like digital twins.
Google Cloud’s approach to supply chain tech goes beyond a database-driven platform on the surface and into something we have been hearing about in supercomputing: digital twins. Instead of a traditional analyze-and-forecast workflow, a digital twin is a virtual representation of a company’s entire supply chain, one that can be manipulated to run what-if scenarios and make more accurate predictions. At the end of the day it is all still databases, but the sources, the ingest, and the simulation might make the difference, along with a stronger real-time emphasis.
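The digital twin idea can be illustrated with a toy model. The sketch below is purely illustrative (the stage names, numbers, and API are invented, not part of Google’s product): it mirrors a serial supply chain in memory and shows how a what-if delay at one stage propagates to the end-to-end delivery estimate without touching the real state.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    lead_time_days: int   # baseline time for goods to clear this stage
    delay_days: int = 0   # current real-world disruption, if any

class SupplyChainTwin:
    """Toy digital twin: an in-memory mirror of a serial supply chain."""

    def __init__(self, stages):
        self.stages = stages

    def eta_days(self):
        # End-to-end delivery time: sum of each stage's lead time plus known delays.
        return sum(s.lead_time_days + s.delay_days for s in self.stages)

    def simulate(self, stage_name, extra_delay_days):
        # What-if scenario: apply a hypothetical delay to one stage,
        # leaving the twin's actual state unchanged.
        return self.eta_days() + sum(
            extra_delay_days for s in self.stages if s.name == stage_name
        )

twin = SupplyChainTwin([Stage("supplier", 5), Stage("port", 3), Stage("trucking", 2)])
print(twin.eta_days())           # 10
print(twin.simulate("port", 4))  # 14: a four-day port backup pushes delivery out
```

A real twin would model a graph of suppliers, sites, and carriers rather than a straight line, but the principle is the same: keep a manipulable copy of the state, and run scenarios against the copy.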
The company’s newly announced “Supply Chain Twin” can pull together and synthesize public risk and weather data as well as Google’s own proprietary datasets. The service can mesh with existing ERP systems and integrate data from suppliers and transportation companies. Users can get alerts when conditions change and can use Google Cloud’s built-in machine learning to get suggested responses to any number of possible impacts.
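The alerting piece boils down to days-of-cover arithmetic. A minimal sketch (the SKU names, demand figures, and threshold logic are invented for illustration; the service’s actual rules are not public) flags items whose stock would run out before a replenishment lead time elapses:

```python
def inventory_alerts(stock, daily_demand, lead_time_days):
    """Return (sku, days_of_cover) pairs for SKUs at risk of a stock-out."""
    alerts = []
    for sku, qty in stock.items():
        demand = daily_demand.get(sku, 0)
        if demand <= 0:
            continue  # no consumption, nothing to flag
        days_of_cover = qty / demand
        if days_of_cover < lead_time_days:
            alerts.append((sku, round(days_of_cover, 1)))
    return alerts

stock = {"widget-a": 100, "widget-b": 500}
demand = {"widget-a": 20, "widget-b": 10}
print(inventory_alerts(stock, demand, lead_time_days=7))  # [('widget-a', 5.0)]
```

With five days of cover against a seven-day lead time, widget-a is flagged; widget-b, with fifty days of cover, is not. The value of the real service is in feeding this kind of check with live lead times that shift as weather, port, and carrier data arrive.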
Supply Chain Twin provides ready-to-deploy connectors and transformation pipelines based on Cloud Data Fusion to bring data from ERP systems like SAP into the BigQuery data platform. It uses Google Cloud public datasets and the Analytics Hub to enable secure access to curated datasets from multiple data providers without complex onboarding. This semantic layer spanning the private, community, and public data segments enables data to be leveraged directly and scalably for a variety of uses, including data science.
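The “connectors and transformation pipelines” amount to mapping source-system fields onto a common schema before loading into the warehouse. The plain-Python sketch below shows the shape of such a transform; the SAP-style field names are assumptions for illustration, and the actual Cloud Data Fusion connectors are configured in the service rather than hand-coded like this.

```python
# Hypothetical mapping from SAP-style inventory fields to a common schema.
FIELD_MAP = {
    "MATNR": "sku",          # material number
    "WERKS": "site",         # plant
    "LABST": "qty_on_hand",  # unrestricted stock
}

def normalize_inventory(raw_rows):
    """Map raw ERP inventory rows onto a common schema, dropping unknown fields."""
    out = []
    for row in raw_rows:
        rec = {FIELD_MAP[k]: v for k, v in row.items() if k in FIELD_MAP}
        rec["qty_on_hand"] = int(rec.get("qty_on_hand", 0))  # quantities arrive as strings
        out.append(rec)
    return out

raw = [{"MATNR": "A-100", "WERKS": "FR01", "LABST": "42", "UNUSED": "x"}]
print(normalize_inventory(raw))  # [{'sku': 'A-100', 'site': 'FR01', 'qty_on_hand': 42}]
```

Once every source lands in the same schema, the kind of cross-supplier queries and data-science work the semantic layer promises becomes a straightforward join rather than a bespoke integration project.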
The offering is still in preview mode but companies like European auto giant Renault have been early adopters. “At Renault, we are innovating on how we run efficient supply chains. Improving visibility to inventory levels across our network is a key initiative,” said Jean-François Salles, Supply Chain Global Vice President at Renault Group. “By aggregating inventory data from our suppliers and leveraging Google Cloud’s strength in organizing and orchestrating data, with solutions like the Supply Chain Twin we expect to achieve a holistic view. We aim to work with Google tools to manage both stock, improve forecasting, and eventually optimize our fulfillment.”
Hans Thalbauer, who leads Google Cloud’s Supply Chain and Logistics division, says the problems with existing systems are rooted in siloed and incomplete data. He points to this lack of visibility and says the digital twin approach to supply chain management lets users get “deeper insights into their operations, helping them optimize supply chain functions — from sourcing and planning, to distribution and logistics.”
Getting all of this to sync with existing systems and platforms is going to be the biggest challenge. While companies might know their existing supply chain approach is no longer robust enough, making the leap to something new during times of crisis might not seem like the natural thing to do. Google Cloud is working with Deloitte and Accenture, among others, on the integration side to merge relevant datasets.