Who’s Winning The Bloody Fight For Manufacturing Data


Summary: The battle to control manufacturing data is raging. But the days of the single source of truth are long gone. The answer is no longer MDM or the mBOM. The answer is a distributed network of stakeholders sharing data, with changes cascading throughout automatically and virtually in real time.

The term “data is the new oil” gets thrown around a lot.

The Economist even got David Parkins to draw them a nice cartoon.


It means that data is the most valuable resource on the planet.

In the context of manufacturing, it means that the companies who will win are the companies who can:

  1. Gather the most data from the value chain, specifically from sensors embedded in end products, and
  2. Work out how to leverage that data during the production cycle.

Essentially, the key to building better products that people want is to observe how products are used in the real world by collecting data at massive scale, and then feeding that data back into innovation.

But we’ve accidentally created a problem for ourselves: we have too much data and no comprehensive way to deal with it.


The problem with manufacturing data today

Let’s get the lay of the land. What does “manufacturing data” even mean today?

Manufacturing data is split between computer-aided design (CAD) files and bills of materials (BOMs).

Designers work in CAD, and then export the metadata from their 3D files into BOMs (essentially very long parts lists).

This BOM becomes the engineering BOM (eBOM), which reflects the ideal-state or “as designed” product.

That BOM becomes the manufacturing bill of materials, or mBOM.

This is usually owned in an ERP because it’s used to do things like order parts, pay suppliers, and fuel manufacturing execution systems (MES).

In a perfect world, where there are no revisions and no mistakes, this looks a bit like this:


The problem occurs between the different existing software stacks

CAD data is generated in programs like SolidWorks, which are great for 3D design but terrible at data management.

So organizations use a PDM for version control within SolidWorks, then (usually manually) export the CAD metadata to generate a BOM, which (again, usually manually) gets ported into a PLM

… at which point, the BOM data and the CAD data are no longer connected.

Organizations then need to move this disconnected bill of materials into an ERP and generate a manufacturing bill of materials (mBOM).

And all this is fine — until something needs to change.

If the underlying CAD file changes, then each of these steps needs to be repeated. What’s more, stakeholders in complex value chains who don’t necessarily have the same PLM/ERP software as the end brand/OEM still need access to BOM data.

For instance, just because Airbus uses SAP ERP software, that’s no guarantee that every one of their thousands of suppliers does too.


The result is a system where keeping data up to date is extraordinarily difficult.
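To make that disconnect concrete, here’s a minimal sketch (all part numbers and field names invented for illustration) of why the handoffs go stale: each export is a snapshot of the CAD metadata, not a live link, so a later CAD revision never reaches the downstream copies.

```python
from dataclasses import dataclass

# Hypothetical, highly simplified model of the CAD -> eBOM -> mBOM handoff.
@dataclass
class CadPart:
    number: str
    revision: str
    description: str

def export_ebom(cad_parts):
    # The "export" step: a one-time snapshot of CAD metadata as BOM rows.
    return [{"part": p.number, "rev": p.revision, "desc": p.description}
            for p in cad_parts]

cad = [CadPart("BRKT-001", "A", "Mounting bracket")]
ebom = export_ebom(cad)             # snapshot ported into the PLM
mbom = [dict(row) for row in ebom]  # copied again into the ERP

# A designer revises the CAD model...
cad[0].revision = "B"

# ...but the exported copies are snapshots, not links: they still say rev A.
print(cad[0].revision, ebom[0]["rev"], mbom[0]["rev"])  # B A A
```

Every change means re-running every export by hand, which is exactly where the disconnected versions of truth come from.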

The rise of master data management (MDM) for manufacturing data

Currently, the leading solution is master data management software, or MDM.

MDM is the idea that enterprises can link all their data together into a single metaphorical database, which then powers every technical tool, business unit, supplier, and contributor in the entire value chain.

According to Gartner’s PLM analyst Marc Halpern:

“managing BOMs with high-level MDM solutions has the most impact on life cycle performance”

And he’s no doubt correct.

That’s what drove PTC and GE Intelligent Platforms to form a partnership way back in 2014.

According to Gartner, MDM is going to solve these five main challenges:

  1. Master data has too many masters
  2. Effective new product development and introduction (NPD&I) requires unstructured collaboration
  3. Complex manufacturing chains make product data harder to track
  4. Orchestrating different enterprise applications is proving too costly and complex
  5. High business risks and costs of product data errors

And these are all problems. But MDM isn’t the solution.

The problem with MDM and Single Source of Truth

The underlying ethos of MDM is straightforward: if there’s a SINGLE repository of data, it’s a lot easier to keep track of. Update once, cascade everywhere is the basic idea.

And as engineering.com argued, this is a goal for major players: to control the mBOM and make that the heart of manufacturing, probably by marrying it to either PLM or ERP solutions.

But the underlying ethos, while fundamentally simple, is also fundamentally flawed.

Value chains are complex and not everyone will buy into the MDM walled garden

MDM doesn’t solve the core problem of complex value chains: there are a lot of suppliers, and they all need data access.


If there’s only a single source of truth, those who can’t (or won’t) buy into that access will just export data out to cheaper formats like Excel spreadsheets — ultimately creating multiple versions of disconnected truth.

Second, this isn’t just a problem for SMEs. The MDM owners are equally affected, because they’ll end up receiving the wrong parts from suppliers working from outdated data.

Finally, MDM and other single source of truth techniques don’t account for the dynamic nature of modern manufacturing. There needs to be a fluid movement of data throughout a broad, distributed system.

For instance, suppliers and participants in the supply chain need access to feedback from products in the field so they can fold it into their design processes.

It’s possible that MDM can fill this role, but we don’t think so.

What’s to be done?

The answer isn’t a single-source solution. It’s a shared version of truth across a distributed stakeholder network.


Shared vs single version of truth

First, shared and single are very different. A single version of truth is a single “master” copy of data that all other data springs from. This is usually imagined as a single database that gobbles up information and then spits out “truth”.

But keeping that single version of truth up to date is incredibly difficult, because as the number of people who can update it grows, the validity of the data declines.

Shared version of truth is when data is distributed across a network. There is no single repository; rather, suppliers contribute their unique bits to the greater whole. This works because:

  • It’s system agnostic — CAD software, PLM, ERP, SCM, etc… all integrate so data can cascade from one system to another.
  • Participation costs are nominal, so every stakeholder is in the same shared system. A walled-garden approach doesn’t really work for manufacturing, but here everyone is at the table.
  • Everyone has the ability and the right to create new iterations of truth and inform the network. Just like anyone can edit Wikipedia to make it better, anyone can iterate and improve their piece of the product without worrying about throwing everything out of whack.

Why shared version of truth isn’t chaos

At about this point, someone usually puts up their hand and says: “This anarchist paradise sounds great, but I work in a highly regulated industry making a highly complex product with thousands of interdependent parts. If any single one of them doesn’t work, the whole thing collapses!”

And that’s a fair point.

But let’s go back to Wikipedia. It works as a valuable resource because each author only edits the pages that they know about.

Likewise, with suppliers, each one only works on their discrete part of the project. Then, as their change comes through, all the other affected suppliers do the same, and so on. Thus, changes cascade through the entire complex value chain without there having to be a single repository of truth.
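The cascade described above is essentially a publish/subscribe pattern. Here’s a minimal sketch (part numbers and supplier roles invented for illustration) of how one upstream revision can ripple through a chain of subscribers without any central repository:

```python
# Hypothetical sketch of a shared-truth network: each supplier subscribes
# only to the parts it depends on, and revisions cascade automatically.
subscriptions = {}  # part number -> list of callbacks to notify

def subscribe(part, callback):
    subscriptions.setdefault(part, []).append(callback)

def publish(part, revision):
    # Notify every stakeholder who depends on this part; they may in turn
    # publish changes to their own parts, so updates cascade on their own.
    for callback in subscriptions.get(part, []):
        callback(part, revision)

log = []

# Supplier B's harness depends on Supplier A's bracket...
def supplier_b(part, rev):
    log.append(f"B saw {part} rev {rev}, re-routing harness HARN-7")
    publish("HARN-7", "C")  # B's own change cascades further downstream

# ...and the OEM's assembly depends on B's harness.
def oem(part, rev):
    log.append(f"OEM saw {part} rev {rev}, updating assembly")

subscribe("BRKT-001", supplier_b)
subscribe("HARN-7", oem)

# One upstream CAD change ripples through the whole chain:
publish("BRKT-001", "B")
for line in log:
    print(line)
```

Each participant only reacts to the parts it actually touches, which is the Wikipedia analogy in miniature: local edits, global consistency.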


Who’s winning the war?

In 2014 the battle lines were drawn around the mBOM: would it live in an ERP, or would PLM take over?

The idea was that the mBOM would become an MDM system, fuelling collaboration and growth.

Well, that didn’t happen.

Instead, single source of truth has proven to be a great idea that’s virtually impossible to execute on.

Suppliers don’t use the same systems, and even intra-company collaboration is difficult. The problem is that, inevitably, you end up with multiple disconnected versions of the truth, and everyone loses.

The real solution is a distributed data network that connects the entire value chain to create a shared version of truth:

  • Everyone has access because access can be bought at an extremely low rate.
  • Changes in one area cascade out automatically, keeping files and users up to date in the tools they’re already using (e.g. if someone updates a CAD file, the Excel files another supplier uses update automatically).

With IoT fast approaching, the data wars are sure to heat up as enterprise software leaders grapple for control.

But in our humble opinion, the only real solution is democratized data for the entire value chain.
