Imagining possible applications for Artificial Intelligence is not difficult, and you can probably formulate concrete ideas of your own. But ultimately it all comes down to feeding AI with quality data, for the old cliché ‘garbage in, garbage out’ still holds true today.

Focus on data quality

The focus on data quality is far from new. With the advent of new digital technologies, organizations began recognizing decades ago that success in a digital environment requires a solid foundation. Data about customers, products, people and an organization's own processes, among other things, must be in order, because only then can you arrive at better insights and thus better decisions. This applies at all levels – from strategic decisions in the boardroom to daily actions in the workplace – and in all domains. As a CFO, you can better predict where opportunities lie; as a doctor, you can make better diagnoses; and as a marketer, you can identify customer needs with more precision.

All of this, of course, only works if those insights are based on reliable data, and practice shows that this is not a given. Systems become cluttered with outdated records, conflicting data definitions circulate, mergers and integrations leave gaps in the data, and numerous other problems arise.

This has been a concern within many organizations for years, and as digitization became more and more of a strategic backbone, the stakes of addressing data quality rose accordingly. Yet in practice, the attempts were not always successful. Data Management – securing reliable and consistent data – and Data Governance – having clear rules of the game and responsibilities – were in many cases treated as an unavoidable necessity and often produced only temporary improvements. Too often they became objectives in themselves, with no clear view of their added value.

The need for data quality in an AI era

Many organizations realize that this is no longer tenable, especially now that the application of AI is becoming a critical part of strategy. More than ever before, there is a clear business case for investing in data quality. Data quality may not often be seen as a sexy topic, but in an age full of AI, hardly anyone doubts its necessity anymore.

On the one hand, in our experience, speed is required. To be able to deploy AI – and not fall behind the competition – organizations must ‘upgrade’ their data to a level where AI can produce reliable and useful results. Only then is there a good foundation to, for example, use AI to advise customers personally – with a chatbot or in some other form – or to streamline internal processes. Specialized AI tooling can itself play an important role in cleaning up the data, by detecting inconsistencies or omissions in large volumes of data. Such tooling speeds up these projects considerably.
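To make this concrete: AI-based cleansing tools are typically products in their own right, but the checks they automate can be sketched in a few lines. Below is a minimal, rule-based illustration in Python (pandas) – not AI itself, but the kind of omissions and inconsistencies such tooling detects at scale. The column names (customer_id, email, country) and the reference set of country codes are hypothetical examples.

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Scan a (hypothetical) customer table for common quality issues."""
    report = {}
    # Omissions: share of missing values per column
    report["missing_ratio"] = df.isna().mean().to_dict()
    # Duplicates: records sharing the same business key
    report["duplicate_ids"] = int(df["customer_id"].duplicated().sum())
    # Inconsistencies: values outside an expected reference set
    valid_countries = {"NL", "BE", "DE"}  # assumed reference data
    report["invalid_country_rows"] = int((~df["country"].isin(valid_countries)).sum())
    # Simple format check: email addresses lacking an '@'
    report["malformed_emails"] = int(
        (~df["email"].astype(str).str.contains("@", na=False)).sum()
    )
    return report

if __name__ == "__main__":
    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", "b@example.com", None, "no-at-sign"],
        "country": ["NL", "BE", "XX", "DE"],
    })
    print(quality_report(df))
```

A real cleansing project would of course go much further – fuzzy matching of near-duplicates, cross-system reconciliation – but even a simple profile like this makes the scale of the problem visible quickly.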

A structured approach is essential

On the other hand, it also requires a structured and somewhat longer-term approach. After all, the best way to bring data quality up to standard for good is to set up processes and systems in which checks on data are enforced at the point of entry. This cannot be done in a few months, if only because it involves not just technical measures – such as automatic validation and workflows that enforce complete data entry or checks, as sketched below – but also building broad awareness throughout the organization. This structured approach deserves at least the same urgency.
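To illustrate what such an enforced check can look like in practice, the sketch below uses pydantic (v2) to validate a hypothetical customer record at the point of entry. The field names and the reference set of country codes are assumptions for the sake of the example; in a real system they would follow from the organization's agreed data definitions.

```python
from pydantic import BaseModel, field_validator

class CustomerRecord(BaseModel):
    """Hypothetical record: validation is enforced before data is stored."""
    customer_id: int
    email: str
    country: str

    @field_validator("email")
    @classmethod
    def email_must_look_valid(cls, v: str) -> str:
        if "@" not in v:
            raise ValueError(f"malformed email address: {v!r}")
        return v

    @field_validator("country")
    @classmethod
    def country_must_be_known(cls, v: str) -> str:
        if v not in {"NL", "BE", "DE"}:  # assumed reference set
            raise ValueError(f"unknown country code: {v!r}")
        return v

# A valid record is accepted; an invalid one is rejected at entry,
# before it can pollute downstream systems.
CustomerRecord(customer_id=1, email="a@example.com", country="NL")
try:
    CustomerRecord(customer_id=2, email="not-an-email", country="XX")
except Exception as exc:
    print(exc)
```

The point of such a gate is that bad data is stopped where it originates, rather than being repaired downstream over and over again.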

The best approach is to build a portfolio of improvement projects that targets both short-term wins and long-term improvements, so that the organization can respond quickly and fundamentally to a reality in which Artificial Intelligence takes a leading role.