I’ve been working recently with a small financial institution that has a significant need for analysis capabilities but does not have the budget for a lengthy project or expensive technology. The organization currently has a standalone marketing customer information file (MCIF) that provides precanned profitability and product ownership reports and a fairly robust householding engine. It also has call center reporting capabilities through a partial license for a popular business intelligence (BI) access tool that came with its newly installed customer relationship management (CRM) application. While these tools provide some analysis capability, both suffer from a lack of integration (each presents a siloed view), both retain little to no history, and neither really fulfills the organization’s analysis needs.

While we pondered the conundrum of a large need and a small budget, the client suggested, hopefully, “How about a series of independent data marts that we could convert to a data warehouse later?” In this case, the organization really does understand the benefits of a data warehouse, so the question did not engender the panic it might have if they had been serious. Instead, it caused us to sit back and jointly work out a set of steps that would enable them to construct a data warehouse in an environment with limited funding, limited resources and a shortened time to delivery. Here is what we came up with.

Leverage known requirements. Actively look for leverage points that might jump-start the project and shorten delivery time. Look at existing analysis and reporting applications such as the MCIF and the call center reporting application. Either of these could become the basis for the first BI project, particularly if there are enhancements that can be provided by migrating the applications into a data warehouse. Shortening the time and effort to create new reports by leveraging the data warehouse, improving data quality for more accurate householding, providing the business with the ability to drill down into the details of summary information where they once had only a static report and providing history are all powerful value additions that don’t require the company to start from scratch. Migrating existing applications can give the organization a leg up on time and effort because the requirements are generally known, the data is documented and quality problems are understood.
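To make the data-quality point concrete, householding at its simplest is a matching problem: records that share a normalized surname and address are grouped into one household. The sketch below is a deliberately minimal illustration, not the client’s actual MCIF engine; the field names ("last_name", "address") and the matching rule are assumptions for the example.

```python
# Minimal householding sketch: group customer records that share a
# normalized surname and street address. Field names and the matching
# rule are illustrative assumptions, not a production algorithm.
from collections import defaultdict


def normalize(text: str) -> str:
    """Uppercase and collapse whitespace so near-duplicates match."""
    return " ".join(text.upper().split())


def assign_households(customers):
    """Return a dict mapping a household key to its member records."""
    households = defaultdict(list)
    for cust in customers:
        key = (normalize(cust["last_name"]), normalize(cust["address"]))
        households[key].append(cust)
    return households


records = [
    {"last_name": "Smith", "address": "12 Oak St"},
    {"last_name": "smith", "address": "12  Oak St"},  # same household, messy data
    {"last_name": "Jones", "address": "4 Elm Ave"},
]
print(len(assign_households(records)))  # 2 households
```

Cleansing the data once in the warehouse, rather than in each downstream application, is what makes the householding more accurate across every report that consumes it.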

Utilize the existing toolsets. Data warehouse technology can be quite expensive, and the selection and user education processes can be very time-consuming. If there are existing tools within the organization that can be utilized for the project, it can save both time and money. In my example, the financial institution already had a partial license for a robust BI access tool and had gained valuable experience in using it to develop CRM reports. Expanding to a full license for this tool would allow them to further capitalize on the investment already made in the CRM application and save substantially on selection and education time. Additionally, the MCIF interface and existing reports already serve as a comprehensive independent data mart. Turning this into a dependent data mart by feeding the application from the data warehouse leverages all the existing reports and queries and adds value at the same time.

Satisfy existing power users. Power users are those individuals or departments that are already doing some type of analysis. The MCIF users fit this bill nicely: they are analysts by trade, they don’t have to be educated on how to use the information residing in the BI environment and they don’t have to be convinced of the value of integrated information. Also, they already have a good understanding of the requirements or information needs that can jump-start the project.

Carefully monitor the scope. Best practice delivery time frames for BI should not exceed 120 days; however, the time-to-delivery requirement of the financial institution mandates an even shorter project time frame. To shorten the delivery time frame, the organization must clearly document the deliverables up front, dictate exactly who the users will be, identify potential risks to scope and develop appropriate mitigating actions. Continued vigilance should help prevent the common occurrence of scope creep as users identify yet another report, programmers pull in just a few additional data elements and business units clamor for inclusion.

Limit sources and users. Up to 80 percent of the effort in a typical BI project goes into the processes that extract information from source systems and move it into the data warehouse. Depending on the complexity of the source, the source system analysis alone can take anywhere from two to six weeks, before a single line of code is written. It is imperative that the financial institution limit the number of sources and data elements added to the data warehouse in a project if short delivery time frames are to be achieved. Limiting the number of users is also important. The larger the number of users, the longer the training and initial support, the more queries and reports are requested, and the higher the potential for requests for additional sources and data. Choosing a small group of power users and leveraging an existing set of requirements should help in limiting the sources and user base.
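One lightweight way to hold that line is to make the agreed scope executable: the extract layer pulls only the sources and data elements documented up front, so a “just a few more columns” request becomes a visible scope decision rather than a quiet code change. The source names and columns below are hypothetical, a sketch of the idea rather than a real feed.

```python
# Sketch of enforcing a limited project scope at extract time: only the
# sources and columns agreed to up front are pulled. Source names and
# column lists are hypothetical illustrations.
APPROVED_SOURCES = {
    "core_banking": ["account_id", "customer_id", "balance"],
    "call_center":  ["customer_id", "call_date", "reason_code"],
}


def extract(source: str, rows):
    """Yield only the approved columns; fail fast on an out-of-scope source."""
    if source not in APPROVED_SOURCES:
        raise ValueError(f"{source} is out of scope for this iteration")
    cols = APPROVED_SOURCES[source]
    for row in rows:
        yield {col: row[col] for col in cols}


raw = [{"account_id": 7, "customer_id": 3, "balance": 9.5, "branch": "West"}]
print(list(extract("core_banking", raw)))
# The unapproved "branch" column is dropped at the boundary.
```

Columns and sources can still be added later, but only by changing the documented list, which keeps the scope conversation in front of the project sponsors.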

At the conclusion of our brainstorming session, the client and I both agreed that we had identified a series of actions that would enable them to begin implementing their data warehouse despite the constraints of limited budget and short time-to-delivery requirement. The steps identified should help any institution facing budget or resource issues.
