Years ago, the term “big data” was coined, and today, that big data comes with big challenges: dozens of systems that need to talk to one another, legacy systems from mergers that are unwieldy to deal with, and regulatory requirements that differ from one country to the next.
To manage this intricate web of potential issues, you’ve got to start with a good data foundation and robust data governance.
Mo’ Data, Mo’ Problems
Big data brought with it the eternal conundrum: quality vs. quantity. When it comes to quantity, we have plenty.
How much?
Well, the “Global Datasphere,” as IDC calls it, stood at 33 zettabytes (ZB) in 2018. By 2025, it’s slated to reach 175 ZB. That’s a lot of data.
Quality, on the other hand…well, that’s a different story. Ask most enterprise leaders who work with data and they will tell you the same thing: data quality remains a major challenge for global organizations. The culprits are many: the sheer number of different systems, a legacy of mergers and divestitures, regulatory requirements that differ from country to country, a sprawling web of third-party vendors and partners, and a diverse product portfolio, to name just a few.
Bad data might be data that hasn’t been updated across systems or data that simply contains incorrect information, and its impact is anything but small. Poor data can cause operational headaches, like incorrect addresses on shipping labels, or serious fines for falling out of compliance with industry regulations. The cost adds up, too: companies can spend 15-25% of their annual revenue correcting data errors and dealing with the problems those errors cause.
In most transformation projects, the longest pole in the tent is data: it has to be harmonized before it can be transformed, and it has to be cleaned before it can be harmonized. That work always takes the longest, and market leaders know it from painful experience, so, if they’re smart, they start on the data well ahead of any software development or process design.
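To make that clean-before-harmonize ordering concrete, here is a minimal sketch in Python with pandas. The two source systems, their column names, and the country mappings are all hypothetical placeholders; the sequence is the point: clean each source first, then harmonize to a shared schema, and only then transform.

```python
import pandas as pd

# Hypothetical customer records from two systems inherited via a merger.
legacy_erp = pd.DataFrame({
    "cust_id": ["A-001", "A-002"],
    "country": ["usa ", "Germany"],
})
acquired_crm = pd.DataFrame({
    "customer_number": ["A-002", "A-003"],
    "country_code": ["DE", "US"],
})

# Step 1: clean each source (trim whitespace, normalize country values).
COUNTRY_MAP = {"usa": "US", "united states": "US", "us": "US",
               "germany": "DE", "de": "DE"}

def clean_country(value: str) -> str:
    key = value.strip().lower()
    return COUNTRY_MAP.get(key, key.upper())

legacy_erp["country"] = legacy_erp["country"].map(clean_country)
acquired_crm["country_code"] = acquired_crm["country_code"].map(clean_country)

# Step 2: harmonize both sources to one shared schema.
harmonized = pd.concat([
    legacy_erp.rename(columns={"cust_id": "customer_id"}),
    acquired_crm.rename(columns={"customer_number": "customer_id",
                                 "country_code": "country"}),
], ignore_index=True)

# Step 3: only now is the data ready to transform (here, deduplication).
print(harmonized.drop_duplicates(subset="customer_id", keep="first"))
```

Skip step 1 and the same customer shows up under both “usa” and “DE,” and every downstream transformation inherits the mess.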
Data in a Bubble Is Useless
Big data such as market research or even post-market vigilance (e.g., complaint monitoring) is only as useful as your ability to link it to existing operational data: supply chain operations, financial costing, and profitability analysis at the individual product level.
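As a rough illustration (all table and column names here are hypothetical), linking a big-data signal like complaints to operational data can be as simple as a join on a shared product identifier; the hard part in practice is keeping that identifier clean and consistent across systems.

```python
import pandas as pd

# Hypothetical post-market complaint counts and operational costing data.
complaints = pd.DataFrame({
    "product_id": ["P100", "P200", "P300"],
    "complaints_q1": [42, 3, 17],
})
costing = pd.DataFrame({
    "product_id": ["P100", "P200", "P300"],
    "unit_margin": [12.50, 48.00, 7.25],
    "units_sold_q1": [10_000, 1_200, 30_000],
})

# Link the big-data signal to operational data at the product level.
linked = complaints.merge(costing, on="product_id", how="inner")

# Example insight: complaint rate per thousand units, right next to margin.
linked["complaints_per_1k_units"] = (
    1000 * linked["complaints_q1"] / linked["units_sold_q1"]
)
print(linked.sort_values("complaints_per_1k_units", ascending=False))
```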
And yet, getting data into multiple systems for the greater good of the collective, especially legacy systems post-merger, can be as challenging as trying to conduct an orchestra on Zoom. The cello may be slightly out of sync, and the harp may be playing a totally different tune.
What can prevent this disharmony is a solid data foundation and data governance plan. Your data governance strategy should include policies on how you will ensure data quality and accuracy, who should manage the data, and what systems need which data components.
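One way to keep such a policy from being merely aspirational is to codify it as automated checks. The sketch below is illustrative only, with hypothetical field names, owners, and systems: it expresses the three policy questions above (quality rules, ownership, and system scope) as data, then audits a record against them.

```python
# A minimal sketch of governance policies expressed as code.
# All field names, owners, and systems are hypothetical placeholders.
GOVERNANCE_POLICY = {
    "customer_email": {
        "owner": "crm-team@example.com",    # who manages the data
        "systems": ["CRM", "ERP"],          # which systems need this field
        "required": True,                   # quality rule: must be present
        "validator": lambda v: isinstance(v, str) and "@" in v,
    },
    "shipping_address": {
        "owner": "logistics-team@example.com",
        "systems": ["ERP", "WMS"],
        "required": True,
        "validator": lambda v: isinstance(v, str) and len(v.strip()) > 0,
    },
}

def audit_record(record: dict) -> list[str]:
    """Return a list of governance violations for a single record."""
    violations = []
    for field, policy in GOVERNANCE_POLICY.items():
        value = record.get(field)
        if value is None:
            if policy["required"]:
                violations.append(f"{field}: missing (owner: {policy['owner']})")
        elif not policy["validator"](value):
            violations.append(f"{field}: failed quality check (owner: {policy['owner']})")
    return violations

# A record with a malformed email and no shipping address fails both checks.
print(audit_record({"customer_email": "not-an-email"}))
```

Checks like these, run continuously, are what turn a governance document into actual governance.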
If you don’t have robust data governance (you should), invest in data quality early in your technology project to de-risk it and surface the landmines that would otherwise cost you far more down the line.