Examining the data culture and governance trends in North America and the role of advanced analytics in expediting their adoption

This is part I of a three-part blog series that discusses the data culture trends and challenges in the North American market, comprising the US and Canada. A related blog homes in on the data trends and challenges of Europe and South Africa.

As recently as fifteen years ago, North American companies did not place data governance and data quality high on their agenda. These were perceived as nice-to-haves that complemented their long-term roadmap and existing system landscape.

But as more and more companies recognize the value of high-quality data, this viewpoint has undergone an almost 180° turn. Business processes, no matter how efficient and optimized, simply won’t work to a company’s advantage without data quality at the foundation.

Data governance prescribes how data is managed, handled, and accessed, with the main goal of upholding data quality. It’s becoming a strategic imperative, with its implementation taking center stage alongside crucial transformation initiatives like business process reengineering and platform migration.

This view is held not just amongst the clients and prospects we’ve engaged with, but amongst ERP technology providers like SAP, Infor, and others. It’s evident in their product differentiation, which branches out into master data management and data governance.

How Advanced Analytics changed the game

In the last 5 years, advanced analytics and its applications have become more mainstream. According to Gartner, “Advanced Analytics is the autonomous or semi-autonomous examination of data or content using sophisticated techniques and tools, typically beyond those of traditional business intelligence (BI), to discover deeper insights, make predictions, or generate recommendations”.

As the definition suggests, advanced analytics helps in use cases that require some degree of foresight, such as predictive maintenance, quality control, and customer buying patterns, by leveraging technologies like IoT and machine learning (ML). It didn’t take long for organizations to realize that good foundational master data is a prerequisite for getting maximum value from their investments in advanced analytics.

In predictive maintenance, IoT components collect sensor readings from different machines and run algorithms on them to predict impending failures. But if the master data for the equipment is incorrect or incomplete (e.g., part number, model number, maintenance plan), you’d face massive rework, repeating the whole cycle of maintenance planning, reordering, and so on.
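As a minimal sketch of that idea, the check below validates equipment master records before their sensor data feeds a prediction step. The field names and records are purely illustrative assumptions, not any specific system’s schema:

```python
# Hypothetical master-data completeness check for predictive maintenance.
# Field names below are illustrative, not a real ERP schema.
REQUIRED_FIELDS = ("part_number", "model_number", "maintenance_plan")

def missing_fields(record: dict) -> list[str]:
    """Return the required master-data fields that are absent or blank."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

equipment = [
    {"id": "PUMP-01", "part_number": "P-100",
     "model_number": "M-7", "maintenance_plan": "MP-3"},
    {"id": "PUMP-02", "part_number": "", "model_number": "M-7"},  # incomplete
]

# Only complete records proceed to the prediction step; the rest are
# flagged for remediation instead of silently skewing the results.
clean = [r for r in equipment if not missing_fields(r)]
flagged = {r["id"]: missing_fields(r) for r in equipment if missing_fields(r)}
```

Gating analytics on a check like this surfaces bad master data up front, rather than after a mispredicted failure triggers unnecessary rework.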

To predict buying decisions, a plethora of customer data points are needed to train the ML models. Again, if the data is riddled with fundamental errors like duplicate names and addresses, the ML models could produce erroneous predictions that lead you even further from the truth.
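The duplicate problem can be illustrated with a simple sketch that normalizes names and addresses before records reach an ML training set. The normalization rules here are illustrative assumptions, not a production matching algorithm:

```python
# Hypothetical duplicate detection for customer master data.
def normalize(text: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace."""
    cleaned = "".join(c for c in text.lower() if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

def dedupe_customers(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each normalized (name, address) pair."""
    seen, unique = set(), []
    for r in records:
        key = (normalize(r["name"]), normalize(r["address"]))
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

customers = [
    {"name": "Acme Corp.", "address": "12 Main St, Toronto"},
    {"name": "ACME Corp", "address": "12 main st Toronto"},  # same customer
    {"name": "Beta LLC", "address": "9 Oak Ave, Chicago"},
]
```

Without this kind of cleansing, the two “Acme” rows would count as two customers and quietly distort whatever the model learns about buying patterns.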

Efficiency and productivity suffer this way, not to mention the bottom line.

This realization has contributed to the upward trend in the adoption of master data management and governance solutions amongst North American companies over the last five years.

But some common challenges have made the need for data governance and data quality even more pressing.

Check out part II to learn about these challenges!



Richard Anderson,

Prospecta’s Executive Vice President Sales – Americas