
Against the backdrop of the increased use of big data, ever more complex requirements are being placed on IT systems and data. In the age of cloud computing and dashboard culture, data integrity and quality are becoming increasingly important and form the basis for modern reporting. Not only is the volume of data to be processed constantly growing (data volumes are doubling roughly every two years), but reporting cycles are also becoming more frequent.

Only with an established, well-functioning data quality management system can unstructured data be transformed into meaningful insights and competitive advantages. Well-maintained data sets are not only the basis for institutional client reporting (e.g. Solvency II, Basel III) but increasingly also for ever more extensive regulatory reporting (AIFMD, BCBS 239).

      Data quality does not receive the necessary attention

In recent years, many KVGs (Kapitalverwaltungsgesellschaften) and custodians have devoted extensive resources to building centralised data warehouses (DWH) and have invested heavily in the associated system and application landscape. However, the issue of data quality (DQ) has not yet received the necessary attention. No matter how up-to-date and well parameterised an IT system may be, it cannot eliminate data deficiencies on its own. The individual components and data streams usually originate from a large number of different source systems, which rarely share a common origin or are compatible with one another.

In the area of investment compliance, for example, incorrect position, market and master data regularly lead to a large number of so-called "false alerts", i.e. alleged investment limit breaches that subsequently turn out to be unfounded. This not only costs companies time and resources, but also impairs the acceptance and reputation of the compliance function.
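How a single data deficiency propagates into a false alert can be sketched in a few lines. The example below is purely illustrative and assumed for this article: the 10% issuer limit, the issuer names and the prices are hypothetical, and the check is a simplified stand-in for a real compliance engine.

```python
# Illustrative sketch only: the limit, issuers and prices are assumed,
# not taken from any real compliance system.

ISSUER_LIMIT = 0.10  # hypothetical 10% issuer concentration limit


def issuer_weights(positions):
    """positions: iterable of (issuer, quantity, price) tuples.
    Returns each issuer's share of total portfolio market value."""
    values = {}
    for issuer, quantity, price in positions:
        values[issuer] = values.get(issuer, 0.0) + quantity * price
    total = sum(values.values())
    return {issuer: value / total for issuer, value in values.items()}


def limit_alerts(positions, limit=ISSUER_LIMIT):
    """Issuers whose portfolio weight exceeds the limit, sorted."""
    return sorted(i for i, w in issuer_weights(positions).items() if w > limit)


# With correct market data, eleven equally weighted issuers (~9.1% each)
# stay below the limit and no alert is raised.
clean = [(f"ISSUER_{i}", 100, 10.0) for i in range(11)]

# A single decimal-point error in one source price (10.0 -> 100.0)
# inflates that issuer's weight to 50% and triggers a false alert.
corrupt = [("ISSUER_0", 100, 100.0)] + clean[1:]

print(limit_alerts(clean))    # []
print(limit_alerts(corrupt))  # ['ISSUER_0']
```

The point of the sketch: the limit logic itself is correct in both runs; only the quality of the input data decides whether the alert is genuine.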

      Development and establishment of a DQ control process

The standardised definition of DQ criteria and DQ standards, particularly for risk reporting and the creation of an integrated database, is therefore essential for functional reporting. This includes the development and establishment of a DQ control process for measuring, analysing and correcting DQ problems, including clearly assigned responsibilities. A control approach that measures data quality along the data processing chains ensures consistently high quality. Condensing the resulting information into usable findings and integrating them into existing company processes creates lasting added value.
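The measurement step of such a DQ control process can be sketched as a small rule engine: each rule encodes one DQ criterion, and the results are aggregated per rule so they can feed into analysis and remediation. The rule set, field names and sample records below are assumptions made for illustration, not a KPMG tool or standard.

```python
# Minimal sketch of a DQ measurement step (assumed rule set and schema).
from datetime import date

# Each rule checks one DQ criterion on a single record.
RULES = {
    "price_present":   lambda r: r.get("price") is not None,
    "price_positive":  lambda r: r.get("price") is not None and r["price"] > 0,
    "isin_length":     lambda r: isinstance(r.get("isin"), str) and len(r["isin"]) == 12,
    "date_not_future": lambda r: r.get("as_of") is not None and r["as_of"] <= date.today(),
}


def measure(records):
    """Return, per rule, the ids of the records that fail it."""
    report = {name: [] for name in RULES}
    for rec in records:
        for name, rule in RULES.items():
            if not rule(rec):
                report[name].append(rec["id"])
    return report


# Hypothetical sample: record 2 has a truncated ISIN and a negative price.
sample = [
    {"id": 1, "isin": "DE0001234567", "price": 101.5, "as_of": date(2020, 1, 2)},
    {"id": 2, "isin": "XS123",        "price": -3.0,  "as_of": date(2020, 1, 2)},
]

report = measure(sample)
print(report["price_positive"])  # [2]
print(report["isin_length"])     # [2]
```

Running such checks at each stage of the processing chain, rather than only at the reporting end, is what localises a defect to the source system that introduced it.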

Analysing and evaluating balance sheet and business data has been part of KPMG's core business for more than 100 years. We not only process data efficiently; we condense the information into usable insights and integrate them into existing corporate processes.

      Take advantage of this wealth of experience and get in touch with us.


      Your contact

      Elmar Schobel

      Partner, Financial Services

      KPMG AG Wirtschaftsprüfungsgesellschaft