Jack Noonan started his tech career fixing computers for IBM in February of 1967. A year later, three academics, first at Stanford and then at the University of Chicago, began building statistical software to improve decision-making, software that would become SPSS. In the decades that followed, Noonan honed his software chops in key roles at system-builder Amdahl and later at Candle, where he first came across SPSS. “The technology wasn’t really OEM-able, but it interested me and I stayed in contact with the company,” he recalls. In 1992, Noonan left Microrim, where he’d been president and CEO, to take the same top role at SPSS, and he says he’s been happy ever since. When he’s not in his office in Chicago’s Sears Tower, you’ll find him wakeboarding in the summer and fixing up vintage boats or his prized old Jeep in the winter. Noonan recently sat down with DM Review Editorial Director Jim Ericson for an update on predictive analytics, information resellers and the data mining space.

DMR: How do data mining and predictive analytics fit into the modern picture of business intelligence [BI]?
Jack Noonan: I think they’re alike in that they both start with the same fundamental data. The main difference is in the reporting: BI looks at the past, while predictive analytics is forward-looking. The BI community today talks about applications, and what they usually mean are dashboards associated with a set of key performance indicators by vertical market. What we add to that are key performance predictors, or KPPs, associated with those key performance indicators, because if it’s worth measuring the past, it’s surely worth measuring the future.

DMR: How does that connect with the use of operational data for nearer to real-time business intelligence?
JN: You can update the automobile analogy: the old BI was driving while looking in the rear-view mirror. Operational analytics gets you as close to real time as you can get, the equivalent of looking through the windshield. What predictive analytics can do is actually help you look around the next corner.

DMR: The predictive analytics users I talk to are very customer oriented and building propensity scores and lifetime value figures for individuals. Is that universal?
JN: I can tell you that SPSS has always focused on data about people. SPSS stands for Statistical Package for the Social Sciences. The founder of SPSS was building propensity scores for voter preference back in the late ‘60s and writing his Ph.D. on the subject. When you think about it, we’re doing exactly the same thing today, for voter preferences but also for thousands of other preferences. But we really do five things. We focus on finding new people, keeping the best ones longer, selling them more stuff or improving outcomes, detecting fraud and minimizing risk. Those five things drive what we call revenue enhancement. Just like me, every CEO is looking to drive the most effective revenue in their operation. You want the most profitable revenue you can get, and that’s what this technology is great for.

DMR: Would your customers pursue those five things separately or are they tied together?
JN: I look at customer relationship management [CRM] as something that takes in all five. Deciding not to up-sell or cross-sell somebody who is a risk or potentially fraudulent is just part of that bigger continuum of CRM. We’re like that BASF commercial: we don’t build CRM software, we make CRM software better. It doesn’t matter whether you’re dealing with a customer through an ATM, a Web store, a call center or at point of sale in your store; you want to make the interaction as positive as possible. Five years ago everything was about cutting cost. You would focus predictive analytics on changing business processes, but typically to save money. Now the focus is on changing business processes to increase revenue profitably.

DMR: Partly because of cost issues, we are seeing a new generation of “failed CRM” stories.
JN: We’ve all been investing for the last 10 years to improve the effectiveness of dealing with customers. That includes everything from BI software to our CRM, financial and EDI systems. Those are all about improving effectiveness with customers. By layering predictive analytics on top I can actually get the return on investment many of those vendors promised 10 years ago.

DMR: That sounds like just another promise.
JN: Yes, but the change in the business over the last five years is what I call real-time analytics. That’s the ability to take the output of an analytic process, which is a predictive model, and literally deploy it into a business process to change that process in real time based on forward-looking information.

DMR: It sounds like a rules engine.
JN: Exactly. In fact, the integration of predictive analytics and business rules is an absolute must. Predictive analytics automates the decision-making process, so you’re looking to plug this stuff into existing business processes to shorten them. Business process definitions typically assume the worst case, which really means long and expensive. If I can get forward-looking information along that process, I can “not” do things. By not doing them, I save money and increase revenue. I was involved in the quality movement at Amdahl 20 years ago, when statistical process control was the way to improve manufacturing and increase quality. What we learned then is the same thing going on now in the front office: if you shorten cycle time, quality goes up and costs go down. Typically you would use predictive analytics where you have a very high volume of complex rules that change often and are difficult to manage, and replace those rules with data-driven predictive models.
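
[Editor’s note: To make the “not doing things” point concrete, here is a minimal, illustrative sketch in Python, not SPSS code. A predictive score is computed inside an existing business process so that a low-risk prediction lets the process skip its expensive worst-case step. The feature names, model choice and threshold are hypothetical assumptions.]

```python
# A predictive score plugged into a business process: low predicted risk
# lets us skip the expensive manual-review step entirely.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical history: [order_value, past_chargebacks, account_age_days]
X_history = np.array([[120, 0, 900], [40, 2, 30], [300, 0, 1500], [75, 1, 60]])
y_fraud = np.array([0, 1, 0, 1])  # 1 = later turned out to be fraudulent

risk_model = LogisticRegression().fit(X_history, y_fraud)

def process_order(order_features, review_queue):
    """Route an order, skipping manual review when predicted risk is low."""
    fraud_probability = risk_model.predict_proba([order_features])[0][1]
    if fraud_probability < 0.2:          # forward-looking information...
        return "auto-approved"           # ...lets us NOT do the costly step
    review_queue.append(order_features)  # worst case: the long, expensive path
    return "sent to manual review"
```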

DMR: Give us an example.
JN: Say I’m going to up-sell someone. Instead of building 20 rules, I could create a simple rule with predictive analytics around 15 or 20 variables such as age, background or history to predict the best product for you to buy. I’d also want a rule to tell me if the product is in stock. If I were using a tree algorithm, I get to the bottom and say, “Here’s the prediction.” I can build that in a rules language. With a predictive analytic, I can also rescore a particular record without having to recode the rules. I start with a huge and complex rule set [and replace it] with a single model.
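
[Editor’s note: A minimal sketch of the idea, again illustrative rather than SPSS code: a decision tree trained on a handful of customer variables predicts the best product to offer, and one remaining business rule checks stock. The variables, products and training data are made-up assumptions.]

```python
# Replacing a hand-written rule set with a single tree model plus one rule.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training history: [age, tenure_years, prior_purchases]
X = [[25, 1, 2], [52, 10, 8], [38, 4, 5], [61, 15, 1], [29, 2, 6]]
best_product = ["A", "C", "B", "C", "B"]  # product actually bought after the offer

tree = DecisionTreeClassifier(max_depth=3).fit(X, best_product)

in_stock = {"A": True, "B": False, "C": True}  # the one rule we keep

def upsell_offer(customer):
    """Walk the tree to the bottom, then apply the stock rule."""
    prediction = tree.predict([customer])[0]   # "Here's the prediction"
    return prediction if in_stock[prediction] else None

print(upsell_offer([45, 7, 3]))
# Rescoring a record later just means calling predict() on a retrained model;
# the surrounding rules never have to be recoded.
```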

DMR: How do you deliver confidence in these models?
JN: Part of using predictive analytics is the ability to prove whether the stuff actually works by using holdout samples: you take a piece of history, withhold it from the model or algorithm during training, and then run the model against that data to see how well it predicts what’s already happened in the past. There are some good techniques that have evolved over the decades to prove the validity of this stuff before you apply it to a real situation.
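
[Editor’s note: A minimal sketch of holdout validation as described above, using synthetic data and a generic model rather than any SPSS-specific workflow: a slice of history is withheld from training, then the model is scored on how well it predicts outcomes that already happened.]

```python
# Holdout validation: withhold part of history, then score the model on it.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Hold back 30% of history; the model never sees it during training.
X_train, X_holdout, y_train, y_holdout = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)

# Run the model against the withheld history and measure how well it
# predicts what actually happened.
print("holdout accuracy:", accuracy_score(y_holdout, model.predict(X_holdout)))
```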

DMR: Shifting gears, information brokers such as D&B and Experian seem to be changing the market. Do you agree?
JN: Data providers are getting smarter all the time. They’re not just supplying data, they’re augmenting data with what are effectively propensity scores to enhance their services. I know one that uses our technology to build over 1,000 scores per record. A little over half of those are sold to everyone and a little less than half are built specifically for clients using their data.

DMR: Will it lead to outsourcing data services down the road?
JN: It depends on your core competency. If you’re an organization that believes data is important to your future success, you’ll in-source it. If you believe it’s just something you need but it isn’t your core competency, you’ll outsource it.

DMR: What about the data-hosting providers?
JN: Companies like Experian and Equifax built their data warehouses to integrate with customer data because warehousing of data has only a certain amount of value. It’s difficult to sell a service if all you’re doing is trying to host data less expensively than the next guy. You have to add value in some form.

DMR: There’s a lot of scrutiny around data mining these days. How do you see the future playing out?
JN: Privacy is an evolution. [When the] automobile was invented, there had to be changes in legislation, traffic signals, speed limits, taxes for roads. A whole industry was created, with controls and rules that followed. Think of manufacturing and OSHA, where there are many controls and rules. I see exactly the same thing in the data environment. As we use data more, we’ll learn how to legislate as we get smarter.
