Counterintuitive Behavior

Published March 1, 1998, 1:00am EST

The cybercorp, a global human-electronic creature with conditioned reflexes and a nervous system, has certain types of behavior inherent in its rules and mechanisms. It may, under certain circumstances, behave in ways that we do not anticipate or do not understand. It may have surprises in store, and the surprises can be pleasant or unpleasant.

On Black Monday, October 19, 1987, the Dow Jones Industrial Average plunged 508 points with almost no warning, a collapse that almost all investors at the time thought impossible. The crash was made larger and more precipitous by the electronic trading systems then in use, and chaos spread rapidly to every stock exchange on the planet.

Much of the skill of managers is based on what experience has taught them. They observe the results of their actions and learn from them. However, there are certain results which they cannot observe because these results take place far away or in the future. Intuition is trained by experience, but that experience has a black hole in it: it cannot directly observe cause and effect when the effect is distant in time or space.

In the cybercorp world, we increasingly build systems that span long distances, separate organizations and long stretches of time; cause and effect are therefore separated, and managers are unable to learn correctly from observable experience. Such systems can give counterintuitive results: a manager does the obvious thing, but it does not produce an obvious outcome.

For over twenty years, MIT's Sloan School of Management has run a classroom simulation of a retailer/wholesaler distribution system known as "The Beer Game." In the game, retailers respond to customer orders for a specialty beer by placing orders with a wholesaler; wholesalers, in turn, place orders with the factory that makes the beer. Classes are divided into three groups playing the roles of retail managers, wholesaler managers and factory warehouse managers. Each group is told that it will be judged on how well it runs its business and is instructed to manage its inventory optimally, placing orders so as to maximize profits.

What happens astonishes the participants. They think they are making clever decisions, but the results, far from maximizing profits, are disastrous. The beer game has been played innumerable times with people trying to run their simulated businesses as well as possible, and all produce similarly catastrophic results: they build up excessive inventories of beer they cannot unload, and orders swing through wild oscillations. The fault lies not in the intelligence of the decision-makers but in the structure of the system: a single modest shift in customer demand triggers wild overreactions throughout the chain.
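The mechanics are simple enough to reproduce in a few lines of code. The sketch below is a toy version of this dynamic, not the MIT classroom game itself: three stages pass orders upstream, shipments arrive after a two-week delay, and each manager follows the seemingly sensible rule of ordering enough to cover this week's demand plus any shortfall against a target stock level. The stage names, delay length, target level and ordering rule are all illustrative assumptions.

```python
# A minimal sketch of the beer-game dynamic, not the MIT classroom game
# itself. Three stages -- retailer, wholesaler, factory warehouse -- each
# see only the orders arriving from the stage below them, and shipments
# take DELAY weeks to arrive. All numbers are illustrative assumptions.
from collections import deque

DELAY = 2     # weeks between placing an order and receiving the shipment
TARGET = 12   # inventory level each manager tries to hold
WEEKS = 30

class Stage:
    def __init__(self, name):
        self.name = name
        self.inventory = TARGET               # negative inventory = backlog
        self.pipeline = deque([4] * DELAY)    # shipments already in transit
        self.orders = []

    def step(self, demand):
        self.inventory += self.pipeline.popleft()  # this week's shipment arrives
        self.inventory -= demand                   # ship what was asked for
        # The naive rule most players use: cover this week's demand and close
        # the gap between target and current stock -- while ignoring what is
        # already on order. That omission is what drives the overswing.
        order = max(0, demand + (TARGET - self.inventory))
        self.orders.append(order)
        return order

stages = [Stage("retailer"), Stage("wholesaler"), Stage("factory")]

for week in range(WEEKS):
    demand = 4 if week < 5 else 8   # one modest, permanent jump in customer orders
    for stage in stages:
        demand = stage.step(demand)              # each order becomes upstream demand
    for stage in stages:
        stage.pipeline.append(stage.orders[-1])  # supplier ships after DELAY weeks

for stage in stages:
    print(f"{stage.name:>10}: peak weekly order = {max(stage.orders)}")
```

Running this shows the peak weekly order growing at every step up the chain, even though every stage follows the same reasonable-looking rule. That is exactly the point: the structure, not the players, produces the oscillation.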

Erratic or sub-optimal behavior is usually caused by decision-makers not having the right information. Long-span webs of activities tend to have bad information because of time delays and long distances. Information is often channeled in an organization and not given to a decision-maker who needs it. The flow of information needs to be changed.

Peter Senge of MIT draws three lessons from the beer distribution simulation. First, systems cause their own crises. Different people trying to maximize profits produce similar results; the cause of the problem is not external forces, such as customer behavior, and not mistakes by individuals. The cause is the system, and the systemic mechanisms need to be redesigned. Second, human decision making is part of the system, and its effects are subtle. We translate perceptions, goals, rules and cultural behavior into action, and this often has counterintuitive effects. Third, leverage comes from new ways of thinking. The cause of a system's bad behavior is often not understood; if it were, we could redesign the system to avoid it. We fail to see that the design of the process itself causes the instability.

In the cybercorp world, the wholesaler and factory should have immediate knowledge of customer orders. In general, the shorter the delays in a system, the less prone it is to extreme overshoots or high-amplitude oscillations. Knowledge of sales ought to pass directly from the customer outlet to the factory production planner. In worldwide organizations like Benetton, information about customers' buying patterns should go immediately from retail stores to the central computers that plan production and distribution.
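The claim about delays can be illustrated with the same toy model by treating the shipping delay as a parameter. This is an assumption-laden sketch rather than a general proof, but shortening the delay visibly damps the overswing:

```python
# Toy illustration of the claim that shorter delays damp overswings:
# the same three-stage ordering chain as in the sketch above, with the
# shipping delay as a parameter. All numbers are illustrative.
from collections import deque

def peak_factory_order(delay, weeks=40, target=12):
    inventory = [target] * 3                           # retailer, wholesaler, factory
    pipeline = [deque([4] * delay) for _ in range(3)]  # shipments in transit
    peak = 0
    for week in range(weeks):
        demand = 4 if week < 5 else 8                  # one modest jump in demand
        orders = []
        for i in range(3):
            inventory[i] += pipeline[i].popleft()      # shipment arrives
            inventory[i] -= demand                     # ship downstream (backlog if negative)
            orders.append(max(0, demand + (target - inventory[i])))
            demand = orders[-1]                        # becomes upstream demand
        for i in range(3):
            pipeline[i].append(orders[i])              # supplier ships after `delay` weeks
        peak = max(peak, orders[-1])                   # track the factory-level peak
    return peak

for delay in (1, 2, 4):
    print(f"delay = {delay} week(s): peak factory order = {peak_factory_order(delay)}")
```

In this sketch, each extra week of delay roughly doubles the peak order the factory sees. That is the intuition behind routing point-of-sale data straight to the production planner rather than letting it trickle up the chain.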

Throughout corporations today, one finds numerous examples of not getting the right information to the right people at the right time. For every decision and planning process, the cybercorp designer should ask, "What is the best possible information? What system is needed to produce it fast enough?"

Corporations of the cybercorp world will be increasingly intertwined in complex interdependencies. We need to redesign old procedures with an understanding of long-span webs of interactions. The speed of electronic linkages makes it possible to simplify interactions and eliminate many of the delays that cause overswings. At the same time, new complexities are being created as trading partners form virtual relationships through computer-to-computer links. IT organizations should design new systems with an understanding of cybercorp interactions.
