New approaches to data and analytics programs can help organizations fail fast, optimize costs, and realize business value faster
Without good data, it is difficult to trust insights—especially with complex analyses involving machine learning and artificial intelligence. Our KPMG Global Tech Survey Report shows that one of the top challenges businesses face when adopting new and emerging technologies is that they are working with sub-optimal data and data management practices.
When asked to describe their organization’s position regarding data and analytics, 30 per cent of respondents said leadership supports their overall strategy and has provided funding. Yet implementation of data and analytics strategies is slower than expected or behind schedule.
That is because many businesses are taking a technology-first approach, rather than focusing on solving complex business issues with analytics. Read on for leading practices to transform your data and analytics strategy in ways that will help create value in both the short and long term.
Start by framing the business problem first and work back to the data problem
Oftentimes, organizations get lost in their data and don’t know how to move forward. They might spend a lot of time and money looking for the ‘right’ analytics platform to solve all their business problems at once, rather than adding value and finding quick wins by resolving one specific business problem.
A different approach to large-scale, multi-year data programs is to take a business-first perspective and then look at how the data can help. Traditionally, all data has been treated in the same way. But not all data is equal. Data is complex, it’s constantly growing, and it has mixed value and quality.
By starting with your problem statement, outlining and testing assumptions, then working backwards to your data strategy, your organization will start seeing the value of transformation faster.
We undertook a large global modernization of an insurer’s valuation systems environment. The work included rationalizing models, automating data feeds, and providing analysis tools that would allow the business to explain their results, including the analysis of change from one quarter to the next.
Getting analysis requirements correct and curating the right data is typically the biggest challenge. This reporting process is complicated with multiple data sources and thousands of potential attributes.
We worked with their end-users to help build the reporting and analysis solution in a series of iterations. As a result, the end-users were able to interact with their data in their strategic tool and understand the art of the possible.
This is the opposite of the traditional “paper-based” process, where requirements are formed through meetings and workshops that simply deliver the current state in newer technology, or, even worse, fail to deliver a user interface at all because the cost of data sourcing consumed most of the budget. By starting with the user experience rather than the “data problem,” we were able to help make an impact faster and stay on budget.
Starting small and failing fast
Starting with small, achievable objectives and focusing on a business problem—rather than on finding the right analytics platform—can add significant value to the organization and help shape long-term objectives. When you start moving into more advanced analytics with AI (Artificial Intelligence), it becomes even more important to start small and build credibility to achieve value.
We often see businesses run out of budget before they reach the final mile, and the final mile is where the value lies. Our approach is to tackle the final mile first and implement a strategy that solves small, incremental problems as you work toward the more advanced analytics and technology platforms.
By making small, incremental changes, you can test hypotheses and solutions quickly. In this framework, you’re reducing the cost of failure by failing fast. This iterative approach is important to being able to “turn off” tactical solutions and “turn on” strategic solutions.
We embarked on a 5-year program with a multi-national company. In year 1, we focused on finding the biggest pain point to solve, which turned out to be statutory reporting for one of their entities.
The team was in perpetual close because their annual reporting processes took over 42 days and bled into the monthly timetable. This resulted in significant (90 per cent) staff attrition and penalties for material errors in their disclosures.
We spent 2 weeks assessing thousands of spreadsheets to confirm that we could help solve the problem. This was followed by a series of sprints, each starting with us sharing our hypothesis about how many reports we could finish and how we would do it. The business-as-usual team was able to challenge our assumptions, and we would then test our hypothesis and update our assumptions.
We proceeded to deliver 21 of the 23 disclosures and reduced the reporting timeline to 6 days. Our solution delivered a more efficient reporting process, using their strategic visualization tool, while sourcing data tactically. We replaced each of the tactical feeds one-by-one as we extended the scope of their Finance and Actuarial data warehouse. We were able to help improve their reporting and the employee experience by starting with a small hypothesis, testing, and learning from each iteration.
Switch up your hiring profile
In addition to sub-optimal data, other top challenges faced by organizations are a lack of skilled talent to carry out key technology roles and a risk-averse corporate culture that is slow to embrace digital change and disruption.
While there is a global talent shortage, organizations are also struggling to find the right talent for data transformation projects because they are still hiring the way they did 10 years ago—yet the hiring profile has changed.
Many businesses gravitate toward data scientists to run their analytics programs. But data scientists are trained to solve the ‘data’ problem rather than the ‘business’ problem. While data scientists have a valuable role to play, they should be part of a broader multidisciplinary team rather than the central driving force behind an analytics program.
A different approach is to seek out multidisciplinary individuals who understand how to solve business problems but also have an aptitude for technology. For example, an actuary may understand the financial cost of specific risks to the business, but an actuary who also has an aptitude for technology could help to frame a data solution around that business problem—which can then be handed over to a technology specialist to harden the architecture.
It is not always easy to find these multidisciplinary individuals, but organizations can develop these skill sets in-house, which also helps with talent retention. These individuals should be comfortable with ambiguity and able to solve problems or build solutions through an iterative approach.
Bring agility into the work environment
An agile work environment embraces failure. It means failing fast, so you can try again and make iterative changes to get the result you want. It’s where people can share their work, get feedback, and adjust, but they need to feel comfortable exposing flaws.
A data program requires a hybrid approach between agile and waterfall methodologies. An agile environment can help to articulate the problem, do the data discovery, and build out the solution, but it then must go through a series of tests—including bias testing—before going live.
This hybrid approach requires working in smaller teams, with people who have multidisciplinary skill sets—and a solid grasp of both technology and the business. It also requires structure, rigour, and governance around data, as well as support further up the chain to ensure the data is fit for purpose.
Our approach to creating a cultural shift needs to remain constant. It is not just something that happens during kick-offs; it needs to happen within the teams themselves. We have found that identifying champions within each team ensures the status quo is constantly challenged.
Tying it all together with data governance
Data is the foundation for analytics. An analytics platform provides insights from data, whether for simple analysis, such as helping users visualize their data, or more complex AI applications. Without governance, it is easy to misinterpret the output of analytical platforms and therefore make poor decisions. Lack of governance also makes it easier to introduce bias into analysis.
While analytics is meant to replace gut-based decisions with data-driven ones, decisions based on poor data can be even more damaging, because you do not know whether that data is accurate or biased. Analytics is only as good as the data that underpins it. Users should understand why they made a decision, not simply that a tool told them to make it. Governance around how the data is interpreted inside the engine is important to the analytics outcome. This includes bias testing, auditability, and the ability to explain what the output is being used for and whether that analysis is appropriate for that objective.
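As a concrete illustration of what one piece of bias testing can look like in practice, the sketch below implements a simple "disparate impact" check: comparing the rate of favourable outcomes a model produces for two groups and flagging the result when the ratio falls below a common review threshold. This is an illustrative example only; the function names, the sample data, and the 0.8 threshold are assumptions, not a description of any specific client solution.

```python
# Illustrative sketch of one simple bias test: a disparate impact check
# comparing favourable-outcome rates between two groups. All names, data,
# and the 0.8 review threshold are assumptions for illustration.

def selection_rate(outcomes):
    """Fraction of favourable outcomes (1s) in a group's results."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.

    A ratio well below 1.0 means one group receives favourable
    outcomes much less often; values under ~0.8 are commonly
    flagged for human review.
    """
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high if high else 1.0

# Hypothetical model outputs: approvals (1) vs. declines (0) per group.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]  # 6/8 approved
group_b = [1, 0, 0, 1, 0, 0, 1, 0]  # 3/8 approved

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Flag for review: possible disparate impact")
```

In a governed analytics program, a check like this would run automatically alongside the model, with the result logged for auditability rather than acted on blindly; a flagged ratio prompts a human review of the data and the model, not an automatic decision.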
We were brought on to help standardize risk/liability assessment models across the group and integrate them into an automated data solution.
The teams working on the models were not able to define their data requirements for data integration upfront. Without these requirements, however, the models would need to handle data transformation themselves, reducing consistency across the group and increasing the cost of ownership.
The modelling work required an iterative approach. We created a data team to work alongside the modelling team to help develop the future state data layer using modern cloud services and an agile approach. This agile data team started with the data interface and worked back to source systems, providing intra-day changes to the data interface as required. Once development work on the models had been completed, the data solution was optimized and made more operationally robust. We were also able to leverage the group’s chosen future state data technology stack.
This approach had the added benefit of testing and validating the data mappings and driving transformation alongside the models. This provided greater confidence in the result amongst business users.
It comes back to finding the right people and the right governance framework. Overcoming the challenge of sub-optimal data requires finding and training multidisciplinary individuals who challenge the status quo and become part of a larger culture shift around data and analytics. By starting small, failing fast, making iterative changes, and solving immediate business problems, data can become an asset to the organization.
How we can help
KPMG in Canada can assist you in streamlining processes to achieve operational excellence. No matter where you are in your journey to operational excellence, from reviewing your current processes to implementing game-changing new technology, our team of experienced professionals is here to help you increase business operational efficiency by designing and embedding effective and transformative business analytics across your organization. Contact us to see how our team can help you drive change and improve your data and analytics capabilities across front, middle, and back-office functions.