We recently had the opportunity to speak at the CFO Strategies Conference in sunny Monte Carlo (writer's note: OK, someone had to do it) about the role of dashboards and scorecards in performance management initiatives - a great setting, great CFOs and great dialog on some really important issues affecting every one of their organizations. However, every time the discussion turned to how to implement a dashboard or scorecard to gain better visibility into their financial operations, it circled back to the issue they all saw with - you guessed it from the title of the column - data quality.

Now, that these executives would understand the need for correct data is obvious - after all, it's their heads on the chopping block if the numbers are not reported correctly. No, what surprised us the most was their grasp of the technical difficulties of getting good data onto the scorecard and dashboard in the first place. And to us, that represented a clear opportunity for IT managers of all sorts to make data quality a joint effort between the line-of-business and technical functions within the organization.

There are obviously pitfalls to doing this. For sure there are enough CFOs who view data quality like a jar of Cheez Whiz - no desire to understand what is in it, they just want it on their nachos. And we've certainly spoken to enough IT managers who are regretting they ever invited the line-of-business folks into their inner workings, because their lives haven't been the same since. However, what if one of the performance management projects you worked on with your CFO related to data quality throughout the organization? What if the "data quality dashboard" became a key tab on the executive cockpit you've been trying to build?
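To make the "data quality dashboard" idea concrete, here is a minimal sketch of the kind of metrics such a tab might surface - per-field completeness and validity rates over a batch of records. The record layout, field names and validation rules are our own illustrative assumptions, not anything prescribed by a particular tool.

```python
# Hypothetical sketch: metrics that could feed a "data quality dashboard" tab.
# Field names and validation rules below are illustrative assumptions.

def quality_metrics(records, validators):
    """Return {field: (completeness_pct, validity_pct)} for a list of dicts."""
    metrics = {}
    total = len(records)
    for field, is_valid in validators.items():
        # Completeness: how many records have a non-empty value for the field.
        present = [r[field] for r in records if r.get(field) not in (None, "")]
        # Validity: of the present values, how many pass the business rule.
        valid = [v for v in present if is_valid(v)]
        completeness = 100.0 * len(present) / total if total else 0.0
        validity = 100.0 * len(valid) / len(present) if present else 0.0
        metrics[field] = (round(completeness, 1), round(validity, 1))
    return metrics

records = [
    {"account": "1001", "amount": 250.0},
    {"account": "", "amount": -75.0},     # missing account
    {"account": "1003", "amount": None},  # missing amount
    {"account": "1004", "amount": 90.0},
]
rules = {
    "account": lambda v: str(v).isdigit(),
    "amount": lambda v: isinstance(v, (int, float)),
}
print(quality_metrics(records, rules))
# {'account': (75.0, 100.0), 'amount': (75.0, 100.0)}
```

A pair of numbers like this per field - trended over time - is exactly the sort of thing a CFO can read at a glance without needing to understand the plumbing underneath.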

The benefits of doing this were apparent to the CFOs we spoke with. They consistently recited the litany of excuses they were dealing with in their own organizations about why people always reverted to the complex spreadsheets and macros they had created in their silos. In the end, people trusted their own data and no one else's. In fact, as a group they cited this perception gap as one of the main reasons their performance management initiatives were not progressing faster - people could not make the leap to a centralized system because they did not trust the numbers.

How can IT help with this? Unfortunately (and especially unfortunately for us as big fans of the listing concept), there were no easy answers and no uniform way of solving the problem - everyone has different systems, different needs and different starting points. But the key point the CFOs seemed to be making is that IT needs to start "somewhere - and soon." The CFO is being hit from all directions, both in the U.S. and in Europe, by corporate accountability and compliance management vendors. But to them, many of these systems only present more "garbage in, garbage out" situations to manage. To get the value out of their performance management initiatives, they first need insight into how the data is aggregated and, second, need to understand the impact on the data when conditions change in the systems' environment. If you're meeting fierce resistance to the dashboarding or scorecarding project you've been trying to push, this may be a good area in which to start.

That takes us all the way from the flashy, sexy, visually rich dashboard environment back into the dirt, grime and muck of the data integration process. Its user interfaces are not the prettiest, but trust us, when done correctly, it's a beautiful thing to see. We've noticed a significant upturn in data integration and ETL sales in the last six to nine months and, given the feedback from the CFOs, that is not surprising. Why the sudden surge? Well, if our CFOs are a reliable barometer, it's due in part to business folks' weariness of seeing the same old "garbage" in their reports and dashboards.

What does your data integration strategy look like today? Are you still maintaining a hard-coded legacy system? What happens to the end-user reports when a table structure changes in the data warehouse or new tables are added? Do you have visibility into the effects of these changes down the line? What about fault tolerance? If one of your key operational systems goes down, is there a way for you to quickly monitor the effects on your reports? These are all tough questions and, unfortunately, it takes time and research to understand the answers. The questions may also produce competing requests and requirements, which makes the problem all the more complex to solve. You have to ask yourself: why spend people resources on hand coding when you can solve this problem out of the box for not much money at all?
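The "what breaks downstream when a table changes" question above is, at heart, an impact-analysis problem over lineage metadata. Here is a minimal sketch of the idea, assuming you maintain a mapping of which reports depend on which warehouse columns; the table name, columns and report names are hypothetical.

```python
# Hypothetical sketch: impact analysis for warehouse schema changes.
# Table, column and report names below are illustrative assumptions.

EXPECTED_SCHEMA = {
    "fact_sales": {"order_id", "amount", "region", "posted_date"},
}

# Lineage metadata: which end-user reports read which warehouse columns.
REPORT_DEPENDENCIES = {
    "quarterly_revenue": [("fact_sales", "amount"), ("fact_sales", "posted_date")],
    "regional_summary": [("fact_sales", "region"), ("fact_sales", "amount")],
}

def impacted_reports(actual_schema):
    """Return {report: [missing (table, column) pairs]} against the live schema."""
    impacted = {}
    for report, deps in REPORT_DEPENDENCIES.items():
        missing = [(t, c) for t, c in deps
                   if c not in actual_schema.get(t, set())]
        if missing:
            impacted[report] = missing
    return impacted

# Simulate a change: "region" was renamed to "territory" in the warehouse.
live = {"fact_sales": {"order_id", "amount", "territory", "posted_date"}}
print(impacted_reports(live))
# {'regional_summary': [('fact_sales', 'region')]}
```

Commercial ETL and metadata tools do this far more thoroughly, but even a toy check like this - run whenever the warehouse schema changes - answers the CFO's question of which numbers can no longer be trusted before the broken report lands on someone's desk.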

If we learned one thing this week while navigating the roadblocks set up for the weekend's running of the Monaco Grand Prix, it's that the companies that solve their data quality challenges first are the ones that will clearly become the leaders in their markets - and the ones that will survive in today's business environment.
