Information overload? Is your company’s data not being used meaningfully? Join the club.


Has your company grown significantly through acquisition? Is your company’s IT infrastructure decentralized? If you answered yes to either of these questions, then the problems that confront you on a daily basis most likely stem from an inability to derive timely, meaningful information from your corporate financial or operational systems. Nothing has changed from the company’s perspective: executives still need to make uniform and profitable decisions for the organization as a whole, not just a segment or unit. With a history of decentralization or growth through acquisition, however, it is often difficult to compile information from these divisions in a timely fashion, leading to escalating costs, lost productivity and an unquantifiable loss of potential revenue while decision-makers wait for information. Couple these issues with the rising volume of data that an organization creates, retains and disseminates daily, and obtaining information becomes an even more daunting task for everyone involved. Clearly, this problem is not going to solve itself.


As an example, take Company A, a privately held manufacturer with locations in Germany, the Czech Republic, France and Spain. Each location, operating autonomously and in virtual isolation, reported its summarized results to corporate on a monthly basis. The effort to effectively audit multiple systems operating on various platforms not only made the month-end close process burdensomely long, but usually resulted in invalid data. As a result, the company has been in distress for more than three years, has been through three rounds of private equity funding (including a sale from one equity firm to another), has closed operations in one location and is currently assessing the need to close another. It has yet to contemplate either a new IT system or a data consolidation tool to assist its decision-making process. The company, like many others, could be on the brink of a death spiral, systematically eliminating the “worst” performing unit or location based on erroneous data.


The ability to aggregate, compile and display information in a uniform manner from multiple disparate IT systems has become more of a necessity than a desire over the past several years, and it will only grow more pertinent to a company’s success in the coming years. Companies that have grown through acquisition, and multinational organizations that have experienced rapid segmentation, have typically not standardized the financial or operational systems of the overall company because of the significant cost and time involved. This leaves segments, subsidiaries or geographies utilizing different charts of accounts, different expense and product codes, and various groupings and rollup categories when reporting their results.


For example, Company B is a joint venture with six segments operating in Australia, China, Brazil, Spain, England and Germany. In order to reconcile revenue allocation between the two owners, six different extractions of data were required, each with its own chart of accounts, classification of revenue and cost breakdown. The normalization, cleansing and alignment process to track and establish “like kind” pro forma revenue amounts on an ongoing basis was extremely complicated given the unique systems involved. As a result, only limited analyses were performed on an ongoing basis, and decisions to expand in certain geographic markets and limit growth in others were made on less-than-perfect data. After review, an imbalance of revenue favoring one side of the joint venture was found to have accumulated to approximately $15 million over an eight-year period for a company with top-line revenue of about $20 million per year (roughly a 10 percent impact on revenue annually).


Because companies traditionally do not see this as an IT-related issue, the requirement falls to the business planning and analysis (BPA) group to implement a method of standardizing this information across these systems, not only for monitoring but to provide meaningful information and analyses for the executive team. This standardization process takes time, but as the example above shows, not doing it ultimately costs the company significant money.
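At its core, the standardization work described above is a translation exercise: each subsidiary’s local account codes must be mapped onto one standard chart of accounts before any consolidated analysis is possible. The sketch below illustrates the idea; the entity codes, account numbers and account names are hypothetical, and in practice each mapping table would be built and maintained with the controller’s group for every source system.

```python
# A minimal sketch of chart-of-accounts standardization.
# All codes below are illustrative, not taken from any real ledger.

# Illustrative mapping tables: local account code -> standard account.
ACCOUNT_MAP = {
    "DE": {"8400": "Revenue", "6300": "SG&A"},       # hypothetical German ledger
    "FR": {"701000": "Revenue", "613000": "SG&A"},   # hypothetical French ledger
}

def standardize(entries):
    """Translate raw ledger entries into the standard chart of accounts."""
    for entity, account, amount in entries:
        try:
            yield entity, ACCOUNT_MAP[entity][account], amount
        except KeyError:
            # Unmapped accounts are surfaced, not silently dropped --
            # auditability is the whole point of the exercise.
            raise ValueError(f"No standard account for {entity}/{account}")

raw = [("DE", "8400", 120_000.0), ("FR", "701000", 95_000.0)]
print(list(standardize(raw)))
```

The key design choice is that an unmapped account stops the process rather than falling into an “other” bucket, forcing the mapping tables to stay complete as subsidiaries add accounts.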


In order for the BPA group, or other analytical parts of the company, to spot trends, identify problems and forecast needs, processes or methodologies must be proposed, developed and implemented to consolidate and interpret the information extracted from these disparate systems. The results of any analyses have to be summarized and accepted by the organization as truly depicting the overall business. This is traditionally where the executive team becomes uncertain about the reliability of the data on which it is making decisions. Ultimately, the true complexity and variety of data and systems becomes evident to decision-makers through simple requests or “deep-dive” questions on specific line items within the result sets or analyses they rely on to make critical decisions about the company’s future. When the BPA group, or even the system owners, cannot provide this information in a timely manner, or at all, the executive team loses confidence in the accuracy of the underlying data. This leads to delayed decisions and can consequently cost a company millions in lost revenue or additional costs as windows of opportunity pass while BPA and others scramble for data across the enterprise. Therefore, a tool that can aggregate, normalize and report from multiple systems in a reliable and timely manner becomes a critical component of a company’s success.


What to Use and How?


The development of reliable, relatively easy-to-implement and fairly easy-to-use tools to help with the problem of data consolidation has been slow. Historically, firms have relied on manual “cut and paste” consolidation in Microsoft Excel, or on more advanced consolidation and querying in Microsoft SQL Server, Oracle or other, more antiquated database tools. These tools have tremendous learning curves, and institutional knowledge of them tends to reside with a select number of individuals within the IT segment of an organization rather than with the business users. During the past several years, however, the marketplace has given birth to several tools that can help. Many of these new tools were designed on the premise that companies are not willing to expend the time or money to implement new IT systems in order to standardize reporting. While replacing systems may remain the most complete solution for a company, budgets realistically focus on customer-oriented solutions and improvements, and IT remains an in-house concern for most executives. Therefore, a product had to be designed that could be attached to, or placed on top of, the existing architecture and used to aggregate and visualize information in a meaningful manner.


One example of this problem and solution is as follows:


Problem - Company C was unable to complete its monthly financial close process in a timely manner; the books took between 30 and 45 days to close. This delay was causing the executive team to lose credibility with investors and analysts, and because of how long it had been occurring, the company had suffered a 20 percent decline in its stock price.


Why was it so difficult to close the books? The company had completed nine acquisitions over the past two years, and each entity had a different chart of accounts, one of more than four different general ledger systems, multiple product codes for the same product, and so on.


Solution - The implementation of a data aggregation tool that created a standardized set of accounts and uniform product codes, and produced auditable, reliable monthly financial and product reports within 10 to 15 days of month end.
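Once every entity’s entries have been translated to the standard set of accounts, the consolidation step itself reduces to a simple roll-up by account across entities. The sketch below shows that step under the assumption that standardization has already happened; the entity names, accounts and amounts are illustrative and not drawn from Company C’s actual figures.

```python
# A minimal sketch of consolidation over pre-standardized ledger entries.
# Entity names, accounts and amounts are illustrative only.
from collections import defaultdict

entries = [
    ("Entity1", "Revenue", 120_000.0),
    ("Entity2", "Revenue",  95_000.0),
    ("Entity1", "SG&A",    -30_000.0),
]

def consolidate(entries):
    """Roll up standardized entries into one report line per account."""
    totals = defaultdict(float)
    for _entity, account, amount in entries:
        totals[account] += amount
    return dict(totals)

print(consolidate(entries))  # {'Revenue': 215000.0, 'SG&A': -30000.0}
```

Because the hard work lives in the mapping tables rather than in this roll-up, the same consolidation runs unchanged as new acquisitions are added, which is what makes a 10-to-15-day close plausible.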


The implementation of this new tool is now a major topic of discussion within the company. Contrary to the traditional means of IT software implementation, this tool was not identified by the company’s IT group; rather, it was suggested by a third party and then designed and implemented in conjunction with the controller’s group - the accounting department. The company’s IT group remained involved, but only to ensure that adequate hardware infrastructure was obtained. As noted above, this is exactly the opposite of the manner in which most new IT solutions are developed and implemented. Because the business problem must be truly understood and identified, and because of the simplicity of many of these new tools, a new paradigm of systems design and implementation is emerging: the business owner is becoming the designer and report writer for these new systems. The tools allow for more interactive reporting and automatic drill-down to transactional detail, and they can produce visualizations of trends, patterns and comparisons. Gone are the days of creating charts and graphs by hand from a stagnant set of data in Microsoft Excel. We are entering the age of data visualization, with immediate, “shift on the fly” demonstrative representations of underlying data from numerous data sets.


This new manner of solving a technology-related business problem will significantly alter the IT landscape over the next few years and will allow executives to make decisions using aggregated, reliable data faster than ever. Executives who are used to manipulating information with tools like Microsoft Excel, Essbase and Hyperion will become the power users of new tools like Endeca, Tableau, Visual Sciences, FAST and many others, drastically reducing the need for the IT group to be involved in the systems design and implementation process.


So What are the Benefits of These Tools?


Simply put, these tools empower the management team to make timely decisions from data extracted from multiple sources simultaneously. They eliminate the onerous, cumbersome process of converting data from one system to another simply to have a uniform method of creating management reports. They eliminate the need to manually standardize all of our enterprise resource planning, financial, operational and other enterprise-wide systems, thereby saving significant IT expenditures, personnel costs and capital investment. We do not have to change the way subsidiaries or segments manage their business; we instead give the executive team and the boardroom better tools to interpret the data emerging from those subsidiaries and segments. We have introduced a tool that resides across multiple systems with no disruption to the daily users, while simultaneously providing tremendous visibility for the management team through the aggregation and visualization of information from disparate data sources in a timely, reliable and meaningful manner.


With the explosion of systems, and the nature of large multinational companies and those that have grown through acquisition, there is an inherent need to view information across the organization in order to make the reliable, informed decisions required to execute either an exit strategy or a business strategy. Hence, timely and cost-effective data consolidation and aggregation is key to a company’s overall health. Executives making smart, timely decisions will always improve the bottom line.
