Optimising financial processes

‘Getting the Data Right’ is Key to Financial Performance

A number of discussion themes collided this week. 

Since I don’t believe in coincidences, I thought I would share the common thread.

An article in CFO Dive “scratched an itch” that is getting quite a lot of attention right now.

“To drive digital transformation, ‘getting the data right’ is key”.

Grace Noto asserts that “putting data skills and governance front and center for both leaders and employees is essential for CFOs to make digital transformation headway.”

I can’t argue with that.

One of the challenges is that the “Data” discussion is bifurcating (understandably) with a strong center of gravity around the AI narrative and everything associated with Large Language Models, ChatGPT et al. That is a great topic, but only indirectly associated with this specific challenge.

Getting core data right as early as possible is fundamental in core finance processes and for streamlining operational performance. 

We know from painful experience that poor-quality and incomplete master data is one of the biggest causes, if not the biggest cause, of systemic process failure: the critical master data on Customers, Vendors, Employees and so on.

Years of neglect, system migrations and consolidations have left master data that is duplicated, conflicting, incomplete and just plain wrong, creating confusion, processing errors, unnecessary rework, wasted effort and damaged customer and supplier relationships.
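To make the problem concrete, here is a minimal sketch of how a first-pass audit might flag duplicated and incomplete vendor records. The records, field names (`tax_id`, `iban`) and matching rules are illustrative assumptions, not a prescription for any particular ERP's vendor master schema.

```python
import re

# Hypothetical vendor master records; field names are illustrative.
vendors = [
    {"id": "V001", "name": "Acme Ltd.",  "tax_id": "GB123", "iban": "GB29NWBK601613"},
    {"id": "V002", "name": "ACME LTD",   "tax_id": "GB123", "iban": ""},
    {"id": "V003", "name": "Globex plc", "tax_id": "",      "iban": "GB82WEST123456"},
]

def normalise(name: str) -> str:
    """Lower-case, strip punctuation and common legal suffixes for matching."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    return re.sub(r"\b(ltd|plc|inc|gmbh|llc)\b", "", name).strip()

# Flag likely duplicates: same normalised name or the same tax ID.
seen = {}
duplicates = []
for v in vendors:
    for key in (normalise(v["name"]), v["tax_id"]):
        if key and key in seen:
            duplicates.append((seen[key], v["id"]))
            break  # one match is enough to flag the pair
        elif key:
            seen[key] = v["id"]

# Flag incomplete records missing critical payment fields.
incomplete = [v["id"] for v in vendors if not v["iban"] or not v["tax_id"]]
```

Even a crude pass like this surfaces the "ACME LTD" duplicate and the records that cannot be paid straight through because a bank account or tax ID is missing.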

In the same vein, I recently discussed Source-to-Pay transformation with Dario Kulic, an end-to-end Procure-to-Pay (P2P) leader and Global Process Owner. His experience reinforced the critical role of vendor master data quality and integrity in achieving Straight-Through Processing, Payment on Time (PoT) and productive supplier relationships. You can listen to that discussion here . . .

Many finance leaders understand the master data problem but feel that it is too big a “monster” to tackle.

It is easy to get diverted into broader strategy discussions about MDM (Master Data Management) and MDG (Master Data Governance) but still make no progress on the core “here and now” issues holding back progress.

There are smart routes to rapid success, though: not least, applying the Pareto Principle to prioritise the customers and vendors that are currently active, which often amount to just 15-20% of the master data volume.
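That Pareto-style scoping can be sketched very simply: filter the master data down to records with recent activity and cleanse those first. The vendor IDs, dates and twelve-month window below are assumptions for illustration.

```python
from datetime import date, timedelta

# Illustrative last-transaction dates per vendor ID (assumed field).
last_activity = {
    "V001": date(2023, 5, 10),
    "V002": date(2021, 1, 3),
    "V003": date(2023, 4, 22),
    "V004": date(2019, 11, 8),
    "V005": date(2023, 6, 1),
}

def active_subset(activity: dict, today: date, window_days: int = 365) -> list:
    """Return vendor IDs with activity inside the window: the records
    worth cleansing first under a Pareto-style prioritisation."""
    cutoff = today - timedelta(days=window_days)
    return sorted(v for v, d in activity.items() if d >= cutoff)

active = active_subset(last_activity, today=date(2023, 6, 30))
share = len(active) / len(last_activity)  # fraction of master data in scope
```

The point is the scoping decision, not the code: a "monster" of tens of thousands of records often shrinks to a tractable few thousand once dormant entries are set aside.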

Once we have taken action, the critical need is to change behaviours so that we don't pollute the master data again and revert to the dismal situation before the "cleanse". Changing habits and behaviours is hard, but doable with consistency and determination.

I also participated in a great call this week with some of the luminaries of the Accounts Payable Association (APA), led by Jamie Radford. We shared some collective experiences and tactics on this very topic.

To come back to the CFO Dive commentary: if firms lack data governance, or if their systems are cluttered — clogged with duplicate customers and addresses, for example — then the insights they gain from predictive analytics are "not going to be very valuable".

But we can use these very analytics to accelerate the data cleanse and to monitor alignment with agreed practices, policies and behaviours. As my mother once told me, "Don't put off till tomorrow what you can do today." There is only NOW.
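Monitoring that alignment can start with something as simple as a recurring completeness score per record, tracked over time. The critical-field list and sample records below are assumed for illustration; in practice the list would come from your own payment and compliance requirements.

```python
# Sketch of a recurring data-quality check: score each record's completeness
# against a list of critical fields, then track the share that is fully complete.
CRITICAL_FIELDS = ["name", "tax_id", "iban", "payment_terms"]  # assumed schema

def completeness(record: dict) -> float:
    """Fraction of critical fields that are populated."""
    filled = sum(1 for f in CRITICAL_FIELDS if record.get(f))
    return filled / len(CRITICAL_FIELDS)

records = [
    {"name": "Acme Ltd", "tax_id": "GB123",
     "iban": "GB29NWBK601613", "payment_terms": "NET30"},
    {"name": "Globex plc", "tax_id": "",
     "iban": "GB82WEST123456", "payment_terms": ""},
]

scores = [completeness(r) for r in records]
pass_rate = sum(s >= 1.0 for s in scores) / len(scores)  # share fully complete
```

Published weekly, a metric like this makes backsliding visible early, which is exactly the behavioural reinforcement the cleanse needs to stick.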

It is 2023. There is no excuse for not tackling this problem. It is limiting progress, efficiency, morale, transformation and core financial performance.

You can read the full CFO Dive article here . . .

You can learn from P2P colleagues in the APA global community here . . .

Thanks for reading . . .