Tom Davenport

President's Chair, Information Technology and Management at Babson College

Q: You’ve been writing that the connection between information and decision-making is not as certain as people assume.

A: In the loosely coupled environments that make up the bulk of our BI and analytic decision-making, we’ve created a sort of decision playground. We go and get data out of the data warehouse, we use analytical tools, but we don’t look closely at how people use that information to make decisions. As a result, they often don’t make decisions effectively.

Q: You point out that too many or too few people might be looking at the data, that good information goes unused or unsuitable information gets used, and that there’s no methodology.

A: I don’t think it boils down to one methodology; there will be a lot of tools, frameworks and approaches for becoming more effective. But my bet is that more and more organizations will become serious about this, because it’s kind of the next frontier. We’ve made some really bad decisions over the past several years. We don’t even know the extent of the problem, because most organizations don’t identify, assign responsibility for, or track the results of key decisions. We’ve invested all this money and effort trying to produce information for better decision-making, but we rarely link the two.

Q: You studied dozens of companies and found some, like P&G and Citi, that have created decision management groups.

A: I found four or five. At P&G it’s mostly a name change, but it was intended to be symbolic of a real change. At other places it’s more than that; it’s people who are analytically oriented saying, "It doesn’t really help us that much if we just come up with an answer and nobody uses it in a decision, so let’s broaden the idea and call it decision management."

Q: Could this be an “uh-oh” moment that creates a roadblock where decision-making has to work its way through a group?

A: Yes, but I hope that’s not the usual case. I found, for example, that at Motorola there were some people who thought decision-making should be an officially defined, IEEE-style process, and I think that would be a big mistake. But other people at the organizations I am talking about are coaches and advisers who don’t actually approve anything. The degree to which they are effective comes from having a close relationship with the decision-maker and demonstrating better results. This idea of jumping through decision hoops would generally be a bad one.

Q: How does all this relate to data and information governance as we discuss it?

A: I think governance is farther down in the stack. Governance is usually asking whether the data has integrity and if it is being defined the right way. Those are important attributes but don’t ensure that information will be used to make effective decisions.

Q: What are downsides of the loosely-coupled decision model?

A: It’s the most common situation, and it’s easy to see why: it’s quite effective for the information providers, who can say, "Here’s everything you need: your data warehouse, your quality data, your tools; be fruitful and multiply," and then those IT providers tend to back away. But how many times have we heard about drilldown and ad hoc query and reporting, only to find it never really happens as much as we’d like to believe? There just aren’t enough people who understand the structure of the data and the tools, and the more analytically complex the tools are, the less likely you’ll have people doing the right thing with them.

Q: And at the other extreme you have very automated decision models. What are the pluses and minuses there when it comes to decision-making?

A: With those systems you are often very near the customer, making decisions such as whether to give this customer a loan, though often there is an intermediary between the system and the customer. The nice thing about automated systems is that you know the decision is being used, because it’s built into the rules base or the algorithm, and you know it’s going to be made quickly, because computers are doing the work. On the downside, if you have the wrong decision logic or algorithm, you can make a lot of bad decisions really quickly. Creating automated systems takes time and money and requires a pretty structured decision environment to pull it off. But it’s clear those systems have expanded and will continue to; they’re all over the place.
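To make that point concrete, here is a minimal, hypothetical sketch in Python of a rules-based loan decision of the kind Davenport describes. The field names, thresholds and rules are illustrative assumptions, not drawn from any system he studied; what the sketch shows is that the decision logic is encoded once and applied identically to every case, which is both the appeal and the risk.

from dataclasses import dataclass

@dataclass
class LoanApplication:
    credit_score: int        # e.g., 300-850 (illustrative scale)
    annual_income: float     # dollars per year
    requested_amount: float  # dollars

def decide(app: LoanApplication) -> str:
    """Return 'approve', 'refer' or 'decline' using purely illustrative rules."""
    if app.credit_score < 580:                          # rule 1: weak credit is declined outright
        return "decline"
    if app.requested_amount > 0.5 * app.annual_income:  # rule 2: large requests go to a human underwriter
        return "refer"
    return "approve"                                    # rule 3: everything else is approved automatically

if __name__ == "__main__":
    print(decide(LoanApplication(credit_score=700, annual_income=60000, requested_amount=20000)))  # approve
    print(decide(LoanApplication(credit_score=550, annual_income=60000, requested_amount=5000)))   # decline

Change one threshold in decide() and the outcome changes for every applicant that rule touches, which is exactly how the wrong decision logic produces a lot of bad decisions really quickly.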

Q: Are businesses automating the right things or the things they can automate first?

A: This gets back to the need for a more systematic process for deciding what we should do with decisions, which really implies some kind of inventory of the key decisions that matter most to your organization: what we do with them, how we make them better and in what order we should go. It is not just about what somebody decides ought to be automated or improved; it’s about what really matters to the success of the business. Even organizations that have created decision management groups haven’t done much to prioritize the decisions that really matter.

Q: This gets us back to a middle ground of what you describe as structured human decisions.

A: Yes. It’s doing something to get information used more effectively in the context of decisions. Maybe it’s a framework, maybe it’s an analytical project, maybe a group of people to help address the issue. One of my favorite examples, which wasn’t included in the report, looked at the Miami School District. They had a pretty conventional loosely coupled environment. There were some efforts to get teachers to use the information better: the superintendent looked at it frequently and would call a principal or a teacher and ask how they felt the information in the data warehouse reflected on their school or classroom. Then I noticed that the New York City school system has a similar performance assessment environment called ARIS (Achievement Reporting and Innovation System). Instead of assuming that educators would be able to look at the data and figure out what it means for improving student outcomes, they developed Inquiry Teams of three or four people who are already familiar with the data and the problems in the schools. They could go to the sixth-grade English teacher and say, "The data suggests we have a problem with student performance. Let’s work together to figure out what it means." They’re acting as intermediaries between the data and the people who are able to take action. I thought that was an interesting example of the difference between the loosely coupled and the structured human approach. Plus, if I had a kid in that school I wouldn’t want somebody saying, "Our system has decided that your son needs to be held back next year." I’d rather have humans making the call.

Q: Looking ahead, if we’re still getting our arms around making better use of information in the decision process, is there a logical path for business units? If you just give it to IT, won’t you just be back to more technology?

A: To the last part of your question: yes, you don’t want IT to own this. I don’t think you necessarily need to establish a decision management group now, but maybe you already have some sort of performance improvement organization or a Six Sigma group. Those teams haven’t traditionally focused on decision-making, but they could. They might be a good group to say, "Let’s identify, working with our management team, the most important decisions we make around here, and then which ones are worthy of intervention because we know they’re not made very well or are made too slowly." It may be a good first step simply to assess how well decisions are made. The fundamental philosophy here is that a decision is not something an individual manager or employee makes alone, and making it better is not just the job of that individual manager or employee. It’s more of an organizational responsibility to decide how a decision gets made. There’s a lot of knowledge you can bring to bear on better decision-making that too often doesn’t get applied. Just the idea that you should have some alternatives considered for every decision wasn’t something our previous government was very good at. I think you could argue that approach had some negative effects.
