Will the Real Analytic Application Please Stand Up?

By Ken Rudin and David Cressy
Published March 1, 2003

Many years ago, industry pundits began pointing out the need to marry data analysis tools with operational systems to maximize your enterprise's competitiveness. In response, many companies were created to provide these analysis tools. This gave rise to the industry known as business intelligence. A few years ago, these same pundits began saying you need more than a tool – you need an analytics platform to build sophisticated analytics solutions. Seemingly overnight, every data analysis tool vendor's marketing materials spoke about its products as analytics platforms. Pundits have taken the next step. Now, they say, the real answer lies in providing analytic applications. Almost every vendor now claims to provide analytic applications.

You can't blame the vendors for talking about their products. However, we are left with crucial questions: What are the components of an analytic application? How should such a system be developed and executed? Is it advantageous to choose a prepackaged solution over a custom design/build/integrate model?

Cornerstones of an Analytic Application

To accelerate the availability of actionable business insight that transforms gains in operational efficiency into effective return on investment (ROI), organizations need analytic applications. As the old saying goes, "If you can't measure it, you can't manage it." However, implementing, integrating and testing every component necessary to build an analytic application architecture is a complex, expensive and inherently high-risk undertaking. Introducing analytical capability is not as simple as merely installing a query or reporting tool on top of an operational system.

Analytical information must span all processes and systems, not simply individual billing, resource planning or sales systems. However, many organizations struggle to achieve this level of integration because they have developed their businesses around silos of operational information – products, distribution channels, billing – and few have optimized their customer-facing business processes. Organizations need to step beyond the fragmented, incomplete information contained within their operational silos and build a broader understanding of their businesses that incorporates both historically aggregated and real-time data.

To overcome these challenges, organizations must implement a prebuilt analytic application – one that has built-in functionality tailored to the needs of the business; one that seamlessly and quickly integrates with the company's existing data and systems infrastructure; and one that delivers up-to-date analytical information in the right format, at the right moment, to everybody in the organization, not just a few power users.

View Across Multiple Sources

What are the building blocks of analytic applications? Clean data, drawn from a multitude of sources and delivered in an easily computable format, is the first cornerstone. To achieve this, organizations need a data warehouse. To ensure that it addresses core business issues, the data warehouse must be carefully designed to support specific subjects useful to business users.

Unless the data warehouse is small and simple, designing and managing the data extraction, transformation and load (ETL) process from the operational system will require a powerful and flexible ETL capability.

Moreover, the database structures supporting these operational systems are often complex. It is not unusual for the databases in these source systems to have several thousand tables. Extracting the necessary information requires an in-depth understanding of the data stored within each table, as well as knowledge of how all the tables relate to each other.
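
To make the discussion concrete, the sketch below shows the basic shape of an ETL step in Python, using in-memory SQLite databases as stand-ins for the operational source and the warehouse. The table names, columns and sample rows are hypothetical; a production ETL tool would add scheduling, incremental loads, error handling and meta data management on top of this pattern.

    import sqlite3

    # Stand-ins for the operational database and the data warehouse.
    src = sqlite3.connect(":memory:")
    dwh = sqlite3.connect(":memory:")

    # Extract: pull the joined operational data the warehouse needs.
    src.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                             amount REAL, order_date TEXT);
        INSERT INTO customers VALUES (1, 'Acme Corp', 'West');
        INSERT INTO orders VALUES (100, 1, 2500.0, '2003-02-14');
    """)
    rows = src.execute("""
        SELECT o.id, c.name, c.region, o.amount, o.order_date
        FROM orders o JOIN customers c ON c.id = o.customer_id
    """).fetchall()

    # Transform: reshape into the warehouse's subject-oriented fact format,
    # keeping only the year-month portion of the date for aggregation.
    facts = [(order_id, customer, region, amount, date[:7])
             for order_id, customer, region, amount, date in rows]

    # Load: insert the transformed rows into the warehouse fact table.
    dwh.execute("""CREATE TABLE sales_fact (order_id INTEGER, customer TEXT,
                                            region TEXT, amount REAL, period TEXT)""")
    dwh.executemany("INSERT INTO sales_fact VALUES (?, ?, ?, ?, ?)", facts)
    print(dwh.execute("SELECT * FROM sales_fact").fetchall())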

Linking Real-Time and Historical Data

Although the data warehouse provides a source of unified and aggregated historical information, in today's business environment it is vital to augment historical data with real-time data taken from specific tables in various operational systems. This integration is necessary to make decisions based on an up-to-the-minute, complete picture of the business. For example, a telesales manager assessing priorities for a product promotion needs immediate access to key opportunity and customer profile attributes to assess demand. In conjunction with the offer, the telesales manager also needs current inventory levels to effectively evaluate potential supply problems that would delay shipment.
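
The sketch below illustrates the idea of blending warehouse history with a real-time operational lookup. All of the product names, demand figures and the current_inventory function are hypothetical stand-ins for warehouse queries and a live call into the inventory system.

    # Historical, aggregated demand from the data warehouse (units sold
    # last quarter); the figures are illustrative.
    historical_demand = {
        "WIDGET-A": 1200,
        "WIDGET-B": 300,
    }

    def current_inventory(product_id: str) -> int:
        """Stand-in for a real-time lookup in the operational inventory system."""
        live_levels = {"WIDGET-A": 150, "WIDGET-B": 900}
        return live_levels.get(product_id, 0)

    # The blended view a telesales manager would see: historical demand next
    # to live supply, flagged when a promotion is likely to outrun the stock.
    for product, demand in historical_demand.items():
        on_hand = current_inventory(product)
        at_risk = on_hand < demand * 0.25  # simple illustrative threshold
        print(f"{product}: last-quarter demand={demand}, on hand={on_hand}, "
              f"supply risk={'YES' if at_risk else 'no'}")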

The business model should intelligently draw together all the information from various historical and real-time sources. It should then make this information available to the business users in a way that is easy to understand and navigate, without them needing any technical knowledge of underlying systems. Designing the business model is frequently the most time-consuming and contentious portion of any analytic application development cycle because it defines how the data is hierarchically structured.

For example, a financial services company typically organizes its business by country and region, broken out into districts, each supported by a client relationship manager. This structure enables an end user to navigate the business by starting at the country level, selecting a country and drilling down to see the regions of that country, drilling down again into a particular region to see each district, and then drilling down to see the client relationship manager associated with a particular district. The user can then drill down to an even deeper level and locate the names and account balances of each client associated with a particular relationship manager.
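
A minimal sketch of this drill-down hierarchy, using hypothetical names and account balances, might look like the following. In a real analytic application the hierarchy would be defined as dimension tables in the business model rather than a nested dictionary, but the navigation path is the same.

    # Hypothetical book of business: country -> region -> district ->
    # client relationship manager -> client account balances.
    book_of_business = {
        "United States": {
            "Northeast": {
                "District 7": {
                    "J. Alvarez": {
                        "Client A": 1_250_000,
                        "Client B": 430_000,
                    },
                },
            },
        },
    }

    def drill_down(node, *path):
        """Navigate the hierarchy one level at a time."""
        for key in path:
            node = node[key]
        return node

    # Start at the country level, then drill all the way down to the clients
    # handled by one relationship manager.
    print(drill_down(book_of_business, "United States"))
    print(drill_down(book_of_business, "United States", "Northeast",
                     "District 7", "J. Alvarez"))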

The Right Information at the Right Time

Defining the various hierarchies and data relationships is critical to ensuring that information is easy to find. Executives, managers and frontline employees need varying degrees of information, presented in a format that allows them to perform critical job functions more effectively. For example, a consumer goods company's sales manager needs to know total revenue by brand per quarter and the average length of a sales cycle for the various types of distribution outlets to negotiate better business terms. By contrast, the company's call center service representatives need to access their average call-handling time relative to pre-agreed targets. Marketing managers want to know which customers to target in forthcoming campaigns to effectively up-sell and cross-sell their products.

A business model defines the information the user can navigate and display. However, it doesn't actually do any of the navigation and displaying. For that purpose, organizations need to evaluate and select a user presentation tool that will convert the information into reports, charts and graphs. The presentation tool must enable users to interact with the information, such as filtering portions of the data, drilling down into more detail or navigating between different reports. Typically, this tool must also provide some level of generic portal functionality so organizations can flexibly design how they want the data to be presented.

The reports, graphics and dashboards should be designed to answer users' most common questions and satisfy most of their immediate needs for information and insight. However, answers to some questions will often lead to further questions, which require additional reports or graphs. Traditionally, IT departments created these reports, which invariably resulted in significant delays and potentially lost business opportunities. To avoid this bottleneck, the presentation tool must provide an ad hoc interface to the data. This component enables users to quickly create their own analyses by selecting the types of data they'd like to see in a report or graph. Such a self-service model allows users to receive answers to specific questions in minutes, not weeks.
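
As a rough illustration of the self-service idea, the sketch below lets a user pick a dimension and a measure and assembles the aggregate without IT writing a report. The sales rows and field names are hypothetical; a real presentation tool would generate queries against the business model rather than grouping an in-memory list.

    from collections import defaultdict

    # Hypothetical detail rows exposed by the business model.
    sales = [
        {"brand": "Alpha", "quarter": "Q1", "revenue": 120000},
        {"brand": "Alpha", "quarter": "Q2", "revenue": 95000},
        {"brand": "Beta",  "quarter": "Q1", "revenue": 60000},
    ]

    def ad_hoc_report(rows, group_by, measure):
        """Aggregate the chosen measure by the user's chosen dimension."""
        totals = defaultdict(float)
        for row in rows:
            totals[row[group_by]] += row[measure]
        return dict(totals)

    # The user, not IT, decides how the data is grouped.
    print(ad_hoc_report(sales, "brand", "revenue"))    # revenue by brand
    print(ad_hoc_report(sales, "quarter", "revenue"))  # same data, regrouped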

Proactive Intelligence

As part of the development of a world-class analytic application, organizations also need to evaluate and select a proactive information delivery mechanism to alert users when significant events occur. As businesses increasingly emphasize agility and responsiveness, the traditional model of providing static reports only in response to a user's request is no longer adequate. What is required instead is a mechanism that continually monitors all relevant data and proactively alerts the user to any important issues via e-mail, PDA or other mobile devices. For example, a customer service agent is alerted the moment a critical service request is raised by one of his or her largest customers; or an events manager is alerted in real time when the number of registrations exceeds the capacity of the planned venue.
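
The following sketch shows the essence of such an alerting mechanism: a rule with a threshold, checked against the latest metrics, firing a notification when the threshold is crossed. The rule, metric name and recipient are hypothetical, and the print statement stands in for an e-mail or mobile gateway.

    from dataclasses import dataclass

    @dataclass
    class AlertRule:
        name: str
        metric: str
        threshold: float
        recipient: str

    # One hypothetical rule: warn the events manager when registrations
    # exceed the venue capacity.
    rules = [AlertRule("Venue over capacity", "event_registrations",
                       500, "events.manager@example.com")]

    def check_rules(latest_metrics):
        """Compare the latest metric values against every rule's threshold."""
        for rule in rules:
            value = latest_metrics.get(rule.metric)
            if value is not None and value > rule.threshold:
                # Stand-in for an e-mail/PDA notification gateway.
                print(f"ALERT to {rule.recipient}: {rule.name} "
                      f"({rule.metric}={value}, threshold={rule.threshold})")

    check_rules({"event_registrations": 512})  # fires the alert
    check_rules({"event_registrations": 480})  # stays silent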

Any application that will be used by a wide range of users must include appropriate security and visibility rules to ensure that individuals can see only information that is relevant and appropriate for them. Therefore, organizations must either define these rules for each user – and keep them constantly updated as users change roles within an organization – or have a mechanism for importing these rules from a standardized access-control mechanism.
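
A simple way to picture such visibility rules is a mapping from each user (or role) to the slices of data that user may see, applied as a filter before any report is rendered. The users, regions and revenue figures in the sketch below are hypothetical; in practice the mappings would be imported from the organization's access-control system rather than hard-coded.

    # Hypothetical mapping from user to the regions that user may see.
    visibility = {
        "regional_mgr_west": {"West"},
        "vp_sales": {"West", "East", "Central"},
    }

    revenue_rows = [
        {"region": "West", "revenue": 1100000},
        {"region": "East", "revenue": 950000},
        {"region": "Central", "revenue": 700000},
    ]

    def visible_rows(user, rows):
        """Filter the data down to the regions the user is allowed to see."""
        allowed = visibility.get(user, set())
        return [row for row in rows if row["region"] in allowed]

    print(visible_rows("regional_mgr_west", revenue_rows))  # one region only
    print(visible_rows("vp_sales", revenue_rows))           # the full picture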

Role-Based, Guided Analytics

The pattern of usage for analytic applications varies enormously across the organization. The information one user views when analyzing a situation might be different from the information another user needs when analyzing a similar situation. As a result, each user defines his or her own process for analyzing a situation, which can lead to inconsistent or even inappropriate business decisions. To overcome this challenge, organizations must implement guided processes for using information, in conjunction with suitable reports and dashboards for displaying it. Together, these guided processes and role-based dashboards lead users through the key analyses, ultimately enhancing their business effectiveness. Guided analytics are essential to a robust analytic solution.

Rich analytical information must be available to all users, not just top management or back-room "power users." Analytics are relevant to everyone in the organization. For example, a senior executive uses intelligence on the company's latest sales figures to get an aggregated, top-level view of progress against revenue goals for promoting prepaid services. Additionally, a local retail store manager needs access to real-time sales data to hit store revenue targets.

Even individual staff members benefit from analytics that allow them to review and improve their personal performance against predetermined objectives. For example, in addition to simply measuring the number and length of calls against personal targets, a contact center service agent can review the number of calls logged against specific complaints or issues to identify and diagnose the root cause, thereby enhancing effectiveness.

However, to fully enhance operational effectiveness, analytics must be seamlessly integrated with the operational systems. There is little point in generating colorful graphical reports of sales trends unless users are empowered to act upon the intelligence contained within. When a senior manager in a retail company is alerted by the analytic application to a slowdown in a particular clothing range, he needs to act fast. He must be able to coordinate his team to quickly identify, develop and execute successful marketing and sales initiatives within each of the stores, aimed at turning this downward trend into upward demand.

The Traditional Custom Design, Build and Integrate Model

The traditional approach to developing the analytical reference architecture was a straightforward custom design, build and integrate model: create a project team; allocate a large budget; buy the various necessary tools; design each component, such as the data warehouse and the business model; build and test the solution; and finally roll out the solution to the end users. Given its nature, complexity and sheer scale, building such an architecture is a costly, time-consuming, difficult and error-prone process.

Consider the process of building a data warehouse. META Group estimates that it costs between $2 million and $3 million to build a data warehouse covering a single business subject area. Also, experienced analytic application integrators estimate that building a business model for any large-scale solution takes a team of two or three people a minimum of six months, and often a year. Finally, there is a high level of risk associated with the entire project because of the large number of required components and end users. Statistics on building customized analytic applications for broad use show that the chances of failure are very high.

Prepackaged Analytic Applications

Instead of purchasing the various tools and then building a bespoke analytics architecture, the new approach is to purchase prebuilt analytic applications, which deliver prepackaged, out-of-the-box architecture. This new approach is a natural evolution of what has happened historically in the software industry. Approximately 20 years ago, companies decided that rather than buying a COBOL compiler and building their own accounts payable, accounts receivable and general ledger applications, it made more sense to buy prepackaged back-office applications that could be configured to meet their specific needs – thus establishing the basis for the ERP industry. Then, approximately a decade ago, companies again realized that rather than building their own sales force automation, service automation and marketing automation applications, it made more sense to buy prepackaged front-office applications and tailor them to meet their needs. Thus, the CRM industry was born.

Now, organizations are taking the next logical step: buying and configuring prepackaged analytic applications rather than buying ETL tools, meta data design tools, report writers and data access tools and building their own analytic applications from scratch.

Benefits of a Prebuilt Analytic Application

The ability to leverage prebuilt analytic applications yields tremendous benefits in lowering costs and increasing return on investment.

First, prebuilt applications ensure the lowest total cost of ownership (TCO) because organizations do not have to buy separate components and incur costs and risks associated with implementation and integration. The development costs of a homegrown analytic application must be amortized over just a single company, whereas the development costs of a prebuilt analytic application can be amortized over hundreds of customers.

Second, prebuilt analytic applications are implemented quickly and easily. Because all of the components are prebuilt and preconfigured, implementations take weeks, not years.

Third, analytic applications deliver real and rapid return on investment. Because prebuilt analytic applications incorporate best practices garnered from the design input of hundreds of organizations, the solution's effectiveness rests on proven industry expertise.

Finally, because the applications are built on a robust analytics architecture, they have the underlying performance, scalability and flexibility to grow and evolve as the organization's analytic needs increase and change over time.

The key is to integrate these prepackaged analytic applications closely with the associated operational systems to achieve a shift in emphasis away from simple operational efficiency toward greater effectiveness and return on investment.
