For as long as I can remember, analytic applications and other downstream, non-mission-critical operational applications have been at the mercy of the data quality of transaction systems such as point of sale, inventory, AR and AP - really the entire ERP system. Even the most impressive systems - those that jump over hurdles, through hoops and back again to give end users the view of data and functionality they want - are not effective if the data within them is not accurate.

The first response to bad data is often, "Let's get it cleaned up in the source system." In my experience, that is rarely the solution that actually gets implemented. Also absent is adherence to the old acronym GIGO.

I searched for the origin of the phrase "garbage in, garbage out," and as you might expect, it dates back to the start of computer programming. It correctly states that a computer or application will do exactly as it is told: if you put bad data in one end, you'll get bad data out the other unless there is code cleaning things up.

There are tools to help with standardization and correction, but that's not what I'm talking about here. I'm talking about the value that data can provide to an organization when it is complete and accurate. Even in this day and age when the importance of data quality is widely accepted, I've seen a lack of effort or ability around capturing and storing quality information in the ERP system that other downstream applications need and use.

Why does the feed have transactions associated to products that aren't really products (or aren't in my product feed)? Why can't I see a grouping of products or customers the way I want to without maintaining spreadsheets outside the ERP, when association to a group or type could be made at setup time?
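The first of those complaints is the kind of check downstream teams end up writing themselves. A minimal sketch - assuming a hypothetical product-master export and transaction-feed format, with all names illustrative - shows both the validation and the setup-time grouping that would make the external spreadsheets unnecessary:

```python
# Hypothetical product master exported from the ERP. The "category" field
# is the kind of grouping that could be assigned at product setup time
# instead of being maintained in spreadsheets outside the ERP.
product_master = {
    "SKU-100": {"description": "Widget", "category": "hardware"},
    "SKU-200": {"description": "Gadget", "category": "hardware"},
    "SKU-300": {"description": "Service plan", "category": "services"},
}

# Illustrative transaction feed; SKU-999 is a "product that isn't
# really a product" (it does not exist in the product master).
transactions = [
    {"txn_id": 1, "sku": "SKU-100", "amount": 25.00},
    {"txn_id": 2, "sku": "SKU-999", "amount": 10.00},
    {"txn_id": 3, "sku": "SKU-300", "amount": 99.00},
]

def validate_feed(transactions, master):
    """Split a feed into clean rows (enriched with the master's category)
    and rows that reference products missing from the master."""
    clean, rejected = [], []
    for txn in transactions:
        product = master.get(txn["sku"])
        if product is None:
            rejected.append(txn)
        else:
            clean.append({**txn, "category": product["category"]})
    return clean, rejected

clean, rejected = validate_feed(transactions, product_master)
print(f"{len(clean)} clean rows, {len(rejected)} rejected")
```

Rejected rows can then be reported back to the ERP team as evidence for a source-system fix, rather than silently patched in the integration layer.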

Problems like these have been documented and talked about, but when projects get under way, timelines are tight and downstream integration is often neglected. The prospects are even dimmer when the ERP system has been in place and an enhancement would be required to assist a downstream system. I often sit in meetings where the idea of requesting the change from the source is rejected outright. (If the request is made, it is often done so with the assumption that it will be rejected, so a workaround effort is started immediately as well.)

Data governance initiatives, master data management and technology-based advances have come a long way. But as I go from company to company, I still see, on the whole, massive resistance to even approaching the source system when a change is advantageous, opting instead to add rules to data integration.

I understand that the source system's main purpose is not to feed data to applications. I understand that speed and accuracy in getting a customer what they want when they want it is critical, but I don't see the company-wide benefit of maintaining this status quo. In fact, I see significant savings, data accuracy and data usefulness in tighter integration of all IT applications.

As with all initiatives and enhancements, return on investment needs to be the deciding factor. Will the change I invest in bring more customers to my door, sell more product, position the company to realize savings or do anything that improves the bottom line? Proof of ROI is core to the decision-making process. But information management in today's environment should span all of IT.

Enhancements to ERP applications should include impact analysis for downstream applications, and downstream applications should be asking more of the ERP. Getting data right at the beginning (great data in) will greatly improve the usability and functionality of downstream applications (great product out). Everyone in the company will benefit.
