There is a lot of buzz around the idea of automating analytics. Companies, weary of investing huge amounts of time and money in projects that yield only a one-time boost in sales or profits, are lured by the promise of automated analysis that solves problems without teams of specialized experts or high-priced consultants.

The reality is that there is no magic button. Analytics practitioners can automate many analytical processes, allowing your experts – whether they investigate claims or build marketing lists – to work more efficiently. Accounting software didn’t replace accountants, and automating analytic functions doesn’t replace modelers and analysts. The initial processes still need to be built and automated. Maintenance and modifications also need to be performed on the processes over time as business needs, data structures or other factors change.

What automation can do is power huge efficiency gains and allow a company to cost-effectively explore and test models to find the right customers for a specific offer or the optimal way to flag suspect claims. The combination of a well-designed data warehouse and high-powered analytics helps automate scoring, validation and tuning, leaving business users more time to create and explore. It allows companies to work with large volumes of data quickly and efficiently. Having an analytical development environment, or sandbox, within your data warehouse or in an attached analytic appliance is key to enabling analysts to develop, test, automate and deploy their processes.

How Automation Helps Companies Expand Into New Markets

Consider the experience of one major multichannel retailer. When the company introduced a new store and catalog geared toward the teen market, it needed to quickly determine which of 33 million households were the best to target with a catalog featuring new products. Working from several candidate variables, a modeler created several lists of 30,000 to 50,000 households each, scoring every household on potential profitability with several types of models, including traditional regression analysis, logistic regression, two-stage modeling and neural networks. It sounds like a time-intensive undertaking. In reality, the process is so automated that three modelers create more than 100 models a year, allowing the retailer to send upward of 100 catalogs to different subsets of its household list. This project was just one of many the small modeling team could handle without adding staff or hiring consultants.
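To make that concrete, here is a minimal sketch of multi-model household scoring in Python with scikit-learn. The table, column names and features are hypothetical stand-ins; the article does not describe the retailer's actual variables or tooling.

```python
# A minimal sketch of multi-model household scoring, assuming a pandas
# DataFrame `households` with numeric feature columns, a past-response
# flag `responded` (0/1) and observed `spend` for responders. All
# column names here are hypothetical stand-ins.
import pandas as pd
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

FEATURES = ["head_of_household_age", "income", "prior_purchases", "teens_in_home"]

def score_households(households: pd.DataFrame) -> pd.DataFrame:
    X, y = households[FEATURES], households["responded"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0
    )

    # Candidate response models: logistic regression and a small neural net.
    logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    nnet = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                         random_state=0).fit(X_train, y_train)

    # Compare candidates on held-out data before picking one to deploy.
    for name, model in [("logit", logit), ("neural net", nnet)]:
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(f"{name}: holdout AUC {auc:.3f}")

    # Two-stage model: P(response) times expected spend given response,
    # the latter fit by linear regression on past responders only.
    responders = households[households["responded"] == 1]
    spend = LinearRegression().fit(responders[FEATURES], responders["spend"])

    # Score the full file and rank it; the mailing list is then the
    # top 30,000-50,000 rows of the result.
    scored = households.copy()
    scored["p_response"] = logit.predict_proba(X)[:, 1]
    scored["expected_profit"] = scored["p_response"] * spend.predict(X)
    return scored.sort_values("expected_profit", ascending=False)
```

Once a pipeline like this is scripted, rerunning it against a refreshed household table takes minutes rather than weeks, which is what lets a three-person team turn out more than 100 models a year.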

When the process is automated, it is much easier to build multiple models, test them and tweak them. If it takes weeks or months to build a single model, the market changes before a company can test several, unless it engages many more modelers in the process. Once the right model is selected, it can be deployed repeatedly for the specific business issue (selecting potential shoppers for teen offers) and tweaked along the way to account for customers "aging" out of the market or for new stores opening in a geographic area.
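In practice, "deployed repeatedly and tweaked" usually means the scoring run is a parameterized job rather than a one-off analysis. A hedged sketch, reusing the hypothetical score_households() above; the filters and parameters are illustrative, not the retailer's:

```python
# Illustrative re-deployment of the same model for the same business
# question, with tweaks applied as simple parameters. Assumes the
# score_households() sketch above and a callable that reads refreshed
# household features from the warehouse.
def teen_catalog_run(read_households, mail_size=50_000, new_store_zips=()):
    households = read_households()  # refreshed data, not a stale extract

    # Tweak: drop households whose customers have aged out of the teen market.
    households = households[households["teens_in_home"] > 0]

    # Tweak: focus the mailing on trade areas around newly opened stores.
    if new_store_zips:
        households = households[households["zip"].isin(new_store_zips)]

    scored = score_households(households)
    return scored.head(mail_size)  # the next catalog's mailing list
```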

The Importance of Speed and Accuracy

Making the technical changes and process adjustments required to implement a high-powered analytic sandbox pays huge dividends, as one financial company discovered. The firm offers a credit card and wanted to do a better job of identifying customers who needed to be called to encourage debt repayment. Before it established a data warehouse that could be accessed easily for analytics, the model-building process was cumbersome: it took three weeks to prepare the data and upward of 14 weeks to build a model, and model runtime typically took seven days. Inconsistent data compromised results.

Because of this, the company often took a shortcut, building each model against a 350,000-customer chunk of its list instead of the entire database.

When the company created a data store with 1,400 variables and moved to a high-powered analytic approach, data preparation took just 90 minutes. As is common, simply standardizing and automating the creation of key metrics pays huge dividends by itself: an analyst who can get right to analysis with up-to-date data, instead of spending time gathering it, is far more productive. In the case of the financial firm, models were built in two weeks (versus 14) and runtime dropped to 36 minutes. Most importantly, the new approach brought in about $1 million a month in payments from customers at risk of default.
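The "standardize and automate key metrics" step can be as simple as one scheduled job that materializes an analysis-ready table, so every model starts from the same current inputs. A hedged sketch follows; the table and column names are hypothetical, and the firm's actual 1,400 variables are not described in the article.

```python
# Hypothetical nightly job that materializes an analysis-ready metrics
# table from raw transactions, so analysts start from prepared,
# up-to-date data instead of gathering it themselves.
import pandas as pd

def build_customer_metrics(transactions: pd.DataFrame,
                           as_of: pd.Timestamp) -> pd.DataFrame:
    # Restrict to the trailing 12 months as of the scoring date.
    window = transactions[
        (transactions["posted_date"] > as_of - pd.DateOffset(months=12))
        & (transactions["posted_date"] <= as_of)
    ]
    metrics = window.groupby("customer_id").agg(
        last_payment_date=("posted_date", "max"),
        payment_count_12m=("amount", "size"),
        avg_payment=("amount", "mean"),
        current_balance=("balance", "last"),
    )
    metrics["days_since_last_payment"] = (
        as_of - metrics["last_payment_date"]
    ).dt.days
    return metrics.reset_index()

# Scheduled via cron, Airflow or similar; the result is written back to
# the warehouse, e.g. with DataFrame.to_sql("customer_metrics", engine).
```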

How Not to Automate

There are good and bad ways to use automation. In a nutshell: don't apply an existing model to a brand-new question, and don't let a process outlast its reasonable life span. Say you have a data set you've used to sell products to those teenagers, and now you want to predict who will respond to a coupon offering a 10 percent discount. Even if the mail is going to the same group and the goal is the same, it is critical to build a new model: it's a different question, so it needs a different model and an analysis with a different focus. Likewise, with the financial services company, the data set of late payers might be a starting point for a product designed for people with less-than-stellar credit, but you wouldn't use the same model to build your marketing list.

When you implement an automated process, it is also important to set specific dates to revisit it. Even the best process weakens over time as the market changes, the business evolves and customers mature. Expect a process to be outdated, and in need of updating or retirement, within a few quarters or at most a few years. Just as you can't buy a house and never maintain or renovate it, you can't build an analytic process and then leave it untouched.
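One lightweight way to enforce those revisit dates is to build them into the automated job itself, so the process refuses to run past its planned life span. A hedged sketch, with illustrative dates and thresholds:

```python
# Hypothetical guardrails for an automated scoring job: fail loudly when
# the model is past its planned review date or its holdout performance
# has drifted below the level it was approved at. The date and the AUC
# floor are illustrative assumptions.
from datetime import date

REVIEW_BY = date(2013, 6, 30)   # set when the model is deployed
MIN_HOLDOUT_AUC = 0.70          # performance floor agreed at sign-off

def check_model_health(current_auc: float, today: date | None = None) -> None:
    today = today or date.today()
    if today > REVIEW_BY:
        raise RuntimeError(
            f"Model past its scheduled review date ({REVIEW_BY}); "
            "revalidate or retire it before scoring again."
        )
    if current_auc < MIN_HOLDOUT_AUC:
        raise RuntimeError(
            f"Holdout AUC {current_auc:.3f} is below the floor "
            f"{MIN_HOLDOUT_AUC}; the model may have decayed as the "
            "market changed."
        )
```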
