Our readers responded exceptionally positively to the article “Liquidity planning in heterogeneous system landscapes” (Treasury Newsletter No. 140), in which we looked at the challenges facing treasury departments in liquidity planning: rising interest rates, volatile markets, and often a legacy, heterogeneous IT landscape that complicates precise management. The use of modern BI, ETL and advanced analytics platforms was introduced as a flexible and pragmatic solution for integrating and processing data from different sources. Our readers have expressed interest in learning how this crucial first step looks in practice.
That's why this article takes a closer look and uses real-world examples to show how data from different sources can all be connected and how smart data modeling creates a solid basis for reliable and analyzable liquidity planning—even without a fully integrated system solution.
The ultimate goal: a central, structured and trustworthy database that serves as the starting point for all further analysis and planning.
1. Data connection – Putting the puzzle pieces together
The first step in overcoming data silos is to connect all relevant source systems to a central data analysis and automation platform (referred to as the “platform” below). These platforms typically offer a variety of connectors and interfaces for different data formats and systems. Flexibility is key here.
- ERP systems (e.g., SAP S/4HANA, SAP ECC, Oracle Financials): The core source of much of the financial data.
- Specific example (SAP): Rather than developing their own complex solutions, many platforms (such as Alteryx, Power BI, Tableau Prep) use certified connectors. One common method is access via ODP (Operational Data Provisioning), which allows standardized extractors (DataSources) for financial accounting (e.g., FI-GL, FI-AP, FI-AR) or even specific treasury modules (SAP TRM) to be tapped. Alternatively, remote function calls (RFCs) can be used to access business application programming interfaces (BAPIs). It is technically possible to query tables directly in the database, but this is often not recommended for performance and governance reasons. A pragmatic, if less elegant, approach is to automatically export reports or tables to formats such as CSV or XLSX and then import them into the platform.
- Important: Check access rights and use dedicated technical users with the minimum permissions needed.
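To illustrate the RFC/BAPI route mentioned above, the following Python sketch uses the pyrfc library to call a standard BAPI. The connection parameters, the technical user, and the choice of BAPI (here simply the company code list as a master data example) are assumptions for illustration; which extractors or BAPIs are actually used depends on the SAP release, the modules in scope, and the authorizations granted.

```python
# Sketch only: reading SAP data via RFC with the pyrfc library.
# Host, system, client, user, and the chosen BAPI are illustrative assumptions.
from pyrfc import Connection

conn = Connection(
    ashost="sap.example.com",   # assumed application server
    sysnr="00",
    client="100",
    user="TREASURY_RO",         # dedicated technical user with minimal, read-only rights
    passwd="********",
)

# BAPI_COMPANYCODE_GETLIST is a standard BAPI returning the company code master list;
# FI line items or treasury positions would be read analogously via suitable
# BAPIs or ODP extractors and written to the platform's staging layer.
result = conn.call("BAPI_COMPANYCODE_GETLIST")
for cc in result["COMPANYCODE_LIST"]:
    print(cc["COMP_CODE"], cc["COMP_NAME"])
```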
- Bank data (account information): Crucial for analyzing actual cash flows.
- Specific example: Electronic account statements in standard formats such as MT940 or camt.053 are the norm. Many platforms offer modules or allow scripts (e.g., Python) to automatically retrieve these files from an SFTP server of the bank or a service provider (e.g., TIS, Serrala). The platform then processes the XML or MT940 structure and extracts relevant fields such as value date, posting date, amount, currency, purpose, and counterparty data (if available). Direct bank APIs (within the framework of PSD2/XS2A or specific corporate APIs) are increasingly gaining ground, as they allow for timely, often transaction-based retrieval and can be accessed directly via the platform.
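As an illustration of the statement processing step, the following Python sketch extracts the relevant fields from a camt.053 file using only the standard library. It assumes the camt.053.001.02 namespace and a simplified structure (one amount and one unstructured remittance line per entry); productive statements are richer, and the retrieval from the SFTP server (e.g., via a library such as paramiko) is omitted here.

```python
# Minimal sketch: extracting booking entries from a camt.053 account statement.
# Assumes the camt.053.001.02 namespace and a simplified structure.
import xml.etree.ElementTree as ET

NS = {"c": "urn:iso:std:iso:20022:tech:xsd:camt.053.001.02"}

def parse_camt053(path):
    """Return a list of dicts with the fields relevant for the actuals fact table."""
    entries = []
    root = ET.parse(path).getroot()
    for stmt in root.findall(".//c:Stmt", NS):
        iban = stmt.findtext("c:Acct/c:Id/c:IBAN", default="", namespaces=NS)
        for ntry in stmt.findall("c:Ntry", NS):
            amt = ntry.find("c:Amt", NS)
            sign = -1 if ntry.findtext("c:CdtDbtInd", namespaces=NS) == "DBIT" else 1
            entries.append({
                "iban": iban,
                "amount": sign * float(amt.text),
                "currency": amt.get("Ccy"),
                "booking_date": ntry.findtext("c:BookgDt/c:Dt", namespaces=NS),
                "value_date": ntry.findtext("c:ValDt/c:Dt", namespaces=NS),
                "purpose": ntry.findtext(
                    "c:NtryDtls/c:TxDtls/c:RmtInf/c:Ustrd", default="", namespaces=NS
                ),
            })
    return entries
```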
- Treasury management systems (TMS, e.g., Coupa Treasury, Kyriba, ION Treasury): a source for financial transactions, bank account management, guarantees, etc.
- Specific example: Many modern TMSs offer REST APIs that can be used to retrieve data such as maturing FX transactions, money market investments or loan disbursements/repayments. Alternatively, it is often possible to access the TMS database directly (SQL) or to export data as CSV files, which are then processed by the platform. The aim is, for example, to feed the future cash flows of financial transactions that have already been concluded into the planning.
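A minimal sketch of this API route could look like the following. The base URL, endpoint, query parameters, and response fields are purely hypothetical placeholders, since every TMS exposes its own API, authentication scheme, and data model.

```python
# Illustrative sketch only: endpoint, parameters, and fields are hypothetical;
# actual TMS APIs (Kyriba, Coupa Treasury, ION, ...) differ considerably.
import requests

BASE_URL = "https://tms.example.com/api/v1"   # assumed base URL
TOKEN = "..."                                  # e.g., obtained via OAuth2 client credentials

def fetch_due_fx_deals(maturity_from, maturity_to):
    """Retrieve FX deals maturing in a date range, to be loaded as planned cash flows."""
    resp = requests.get(
        f"{BASE_URL}/fx-deals",                # hypothetical resource
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"maturityFrom": maturity_from, "maturityTo": maturity_to},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()                         # deals to be mapped to the planned-cash-flow fact table
```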
- Excel files and other flat files: Often used for manual planning data or data from niche systems.
- Specific example: Controlling generates a monthly Excel file with sales forecasts for each company and saves it to a defined network drive. The platform is set up to monitor this folder, recognize new files, validate their structure (e.g., column names, data formats), and import the forecast data. The same approach is used for HR data (planned salary payments) and specific project data.
- Challenge: Consistent structure and naming of files and worksheets.
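The watch-folder pattern from the Controlling example above can be sketched in a few lines of Python with pandas. The network path, file pattern, sheet name, and expected columns are illustrative assumptions; the structural validation is exactly the point at which inconsistent files are rejected rather than silently loaded.

```python
# Sketch of a simple "watch folder" import for the monthly controlling forecast.
# Folder path, file pattern, sheet name, and expected columns are assumptions.
from pathlib import Path
import pandas as pd

INBOX = Path(r"\\fileserver\treasury\forecast_inbox")   # assumed network drive
EXPECTED_COLUMNS = {"CompanyCode", "PlanningPosition", "Month", "Currency", "Amount"}

def import_new_forecasts(already_loaded: set) -> pd.DataFrame:
    """Validate and import all forecast files that have not been loaded yet."""
    frames = []
    for file in sorted(INBOX.glob("forecast_*.xlsx")):
        if file.name in already_loaded:
            continue
        df = pd.read_excel(file, sheet_name="Forecast")   # assumed sheet name
        missing = EXPECTED_COLUMNS - set(df.columns)
        if missing:
            # Reject the file instead of loading inconsistent data
            raise ValueError(f"{file.name}: missing columns {missing}")
        df["SourceFile"] = file.name
        frames.append(df)
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()
```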
The strength of these platforms lies in combining these different connection methods and in automating and monitoring the data loading process. If a source fails or a new one is added, the necessary adjustments can usually be made quickly.
2. Data modeling – Bringing structure to the data jungle
Raw data from a variety of sources is rarely suitable for coherent liquidity planning without some form of processing. It comes in different structures, levels of detail and qualities. This is where data modeling comes in. It creates a uniform, logical structure that serves as a single source of truth for all analyses and reports. A proven approach for this is the star schema.
- Concept: The core consists of one or more fact tables containing the actual metrics (in this case: cash flows). These fact tables are grouped around dimension tables that provide descriptive attributes (who, what, when, where?).
- At the core: the fact table(s)
- Specific example: You need at least one fact table for actual cash flows (from bank data/ERP) and often a separate one for planned cash flows (from planning tools, Excel, TMS due dates).
- Actual Cashflows fact table: The key figure fields for the amounts in local and group currency are central here. In addition, links to the dimension tables are created, above all to the time, account, company, and currency dimensions, as well as to the dimension holding the plan position information.
- Planned Cashflows fact table: Structured like the Actual Cashflows table, but with the plan amount as its key figure field; the planning scenario can be added as additional dimension information.
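As a simple illustration of this star schema core, the following sketch creates the two fact tables with their links to the dimension tables, using SQLite only to keep the example self-contained and runnable. Table and column names are assumptions; in practice the target would typically be the company's existing data warehouse.

```python
# Minimal sketch of the two fact tables and their foreign keys to the dimensions.
# SQLite is used purely for illustration; names are assumed, not prescribed.
import sqlite3

conn = sqlite3.connect("liquidity_model.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS FactActualCashflows (
    DateKey             INTEGER NOT NULL REFERENCES DimTime(DateKey),
    BankAccountKey      INTEGER NOT NULL REFERENCES DimBankAccount(BankAccountKey),
    CompanyCodeKey      INTEGER NOT NULL REFERENCES DimCompanyCode(CompanyCodeKey),
    CurrencyKey         INTEGER NOT NULL REFERENCES DimCurrency(CurrencyKey),
    PlanningPositionKey INTEGER NOT NULL REFERENCES DimPlanningPosition(PlanningPositionKey),
    AmountLocal         REAL,
    AmountGroup         REAL
);

CREATE TABLE IF NOT EXISTS FactPlannedCashflows (
    DateKey             INTEGER NOT NULL REFERENCES DimTime(DateKey),
    CompanyCodeKey      INTEGER NOT NULL REFERENCES DimCompanyCode(CompanyCodeKey),
    CurrencyKey         INTEGER NOT NULL REFERENCES DimCurrency(CurrencyKey),
    PlanningPositionKey INTEGER NOT NULL REFERENCES DimPlanningPosition(PlanningPositionKey),
    ScenarioKey         INTEGER REFERENCES DimScenario(ScenarioKey),
    PlanAmount          REAL
);
""")
conn.commit()
```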
- Dimension tables: The context
- Time dimension (DimTime): Essential for any time-based analysis.
The time dimension should contain all the standard calendar fields (date, day, month, quarter, year). In addition, company-specific calendar information can also be stored here, for example which date is used as the monthly reporting date (last calendar day vs. last working day) or entity-specific holidays. This table is generated once and covers a relevant period (e.g., 2010-2030).
- Plan position dimension (DimPlanningPosition): The key structure from a treasury perspective. This is where the “clearly defined plan position” mentioned in the first article is mapped. Along with a primary key and the names of the plan positions (and any other meta information), the plan position hierarchy is also mapped here. This dimension must be carefully defined together with Treasury and forms the basis for the subsequent mapping of raw data in the ETL process.
- Company dimension (DimCompanyCode): Master data of your own group companies.
- Bank account dimension (DimBankAccount): Master data of your own bank accounts.
- Currency dimension (DimCurrency): List of relevant currencies.
- Business partner dimension (optional) (DimCounterparty): For cases where customer/vendor information can be extracted from the sources.
- Scenario analysis dimension (optional) (DimScenario): This is where the necessary information is stored if planning data is created for different scenarios. For example: Scenario name and parameter data.
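The time dimension is a good candidate for generation by script. The following pandas sketch builds DimTime for 2010-2030 and flags the last working day of each month as the reporting date; the column set and the Monday-to-Friday working-day simplification (no holiday calendars) are assumptions for illustration.

```python
# Sketch: generating the time dimension once for 2010-2030 with pandas.
import pandas as pd

dates = pd.date_range("2010-01-01", "2030-12-31", freq="D")
dim_time = pd.DataFrame({
    "DateKey": dates.year * 10000 + dates.month * 100 + dates.day,
    "Date": dates,
    "Year": dates.year,
    "Quarter": dates.quarter,
    "Month": dates.month,
    "IsWorkingDay": dates.dayofweek < 5,   # simplification: Mon-Fri, no holiday calendar
})

# Flag the last working day of each month as the monthly reporting date
working = dim_time[dim_time["IsWorkingDay"]]
month_end = working.groupby(["Year", "Month"])["Date"].max()
dim_time["IsReportingDate"] = dim_time["Date"].isin(month_end.values)
```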
This structured model is the objective of the ETL (Extract, Transform, Load) process, which extracts the raw data from the connected sources, cleans it, transforms it (e.g., maps it to plan positions and converts currencies), and finally loads it into this target model. The model itself then provides a clean, consistent basis for dashboards, reports and further analyses such as plan/actual comparisons or forecasts.
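To make the transformation step more tangible, the following sketch maps parsed statement entries to plan positions via simple keyword rules and converts amounts to group currency. The rule logic, field names, and FX rate table are illustrative assumptions rather than a prescribed implementation; real mapping rule sets are defined and maintained together with Treasury.

```python
# Simplified sketch of the "transform" step: plan position mapping and currency conversion.
import pandas as pd

UNMAPPED_POSITION_KEY = 9999  # hypothetical key of a "to be reviewed" plan position

def transform_actuals(entries: pd.DataFrame,
                      mapping_rules: pd.DataFrame,
                      fx_rates: pd.DataFrame) -> pd.DataFrame:
    """entries: parsed statement lines (amount, currency, booking_date, purpose, ...);
    mapping_rules: columns Keyword, PlanningPositionKey;
    fx_rates: columns Currency, Date, RateToGroup (all assumed)."""
    df = entries.copy()

    # Rule-based mapping: the first keyword found in the purpose text wins;
    # anything without a match goes to a review bucket for manual clarification.
    def map_position(purpose):
        for _, rule in mapping_rules.iterrows():
            if str(rule["Keyword"]).lower() in str(purpose).lower():
                return rule["PlanningPositionKey"]
        return UNMAPPED_POSITION_KEY

    df["PlanningPositionKey"] = df["purpose"].apply(map_position)

    # Convert local amounts to group currency using daily rates
    df = df.merge(fx_rates, how="left",
                  left_on=["currency", "booking_date"],
                  right_on=["Currency", "Date"])
    df["AmountGroup"] = df["amount"] * df["RateToGroup"]
    return df
```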
Conclusion & outlook
Systematically integrating all relevant data sources and building a well-thought-out, centralized data model are no small feats, but they are the key to successful and efficient liquidity planning, especially in complex system landscapes. This does not replace the need for clean data collection at the source, but it does give Treasury more control over the data and creates a unified view. BI/ETL platforms provide the technical flexibility needed to do this. When implementing the data model, it makes sense to use platforms that are already in place in your company, for instance databases in SAP BW, Oracle, or IBM Db2. Since the data model outlined above is relatively simple, it is also well suited as a pilot for introducing a cloud-based database solution, which can then provide more comprehensive analysis tools depending on the vendor.
Building on this solid data foundation, the next value-adding steps can then be taken: implementing the transformation and cleansing logic, performing the actual planning or forecasting as well as compiling meaningful analyses and reports. These aspects warrant further consideration and show how the data foundation created can be used to generate operational and strategic added value for Treasury. Ultimately, this approach makes financial management more robust and strengthens the company's resilience in dynamic times.
Source: KPMG Corporate Treasury News, Edition 154, May 2025
Authors:
Börries Többens, Partner, Finance and Treasury Management, Corporate Treasury Advisory, KPMG AG
Jakob Fisahn, Manager, Finance and Treasury Management, Corporate Treasury Advisory, KPMG AG