I was pretty distraught over a column I recently read in the Economist. The article, Light Work, challenges the legacy of the Hawthorne effect, an esteemed experimental finding on human behavior that I'd held as sacred ever since I took Industrial Psychology as a college freshman.
The Hawthorne effect derived from a series of worker productivity studies conducted at Western Electric's Hawthorne Works plant in Cicero, Illinois, from 1924 to 1932. In an early study, Hawthorne Works' investigators set out to determine whether worker productivity was affected by the level of light in the workplace. While initial measures demonstrated that enhanced lighting improved worker productivity, follow-on investigations seemed to indicate that diminished lighting increased productivity as well. Study coordinators at the time explained the findings in a social context, noting that workers felt better about their work knowing that management was attentive to them and their performance. Over the years, some researchers have been less charitable to that interpretation, hypothesizing instead that the effects are probably spurious, the result of confounding that arises when experimenters fail to realize how the consequences of subjects' performance affect subjects' behavior. The Hawthorne effect thus became a metaphor for unintended consequences in behavioral field experiments that threaten internal validity, a topic discussed in Analytical Designs for BI Part 2.
There's been no shortage of academic revisionism and commentary on the Hawthorne Works experiments over the years, so I was a bit surprised to see the work featured prominently in a current Economist article. I'm pretty sure, though, that the attention had a lot to do with the names and pedigree of the latest researchers: celebrated Freakonomics author Steven Levitt and his University of Chicago economics colleague, field experiment expert John List. Levitt and List were able to retrieve original study data from archives in Milwaukee and Boston and analyze them with modern statistical and econometric techniques.
The authors' re-analyses confirm that some of the findings might be spurious, the result of inadequate experimental design. Lighting was always changed on Sunday, when the plant was closed. Monday's output was generally superior to Saturday's, and productivity continued to increase over the next few days of the week. The authors, however, observed this same productivity pattern even in the absence of lighting changes. Could it be the timing of the change, rather than the change itself, that caused the differences?
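To see how this kind of confounding fools an analyst, consider a minimal simulation sketch (entirely hypothetical numbers, not the actual Hawthorne data): productivity follows a weekly rhythm that dips on Saturday and rebounds after the Sunday rest day, with no lighting effect at all. Because the "treatment" is always applied on Sunday, a naive before/after comparison credits the lighting change with a gain that is really just the day-of-week pattern.

```python
import random

random.seed(42)

# Hypothetical weekly rhythm: output sags by Saturday and rebounds
# after the Sunday rest day, independent of any lighting change.
DAY_EFFECT = {"Mon": 3.0, "Tue": 4.0, "Wed": 4.5, "Thu": 4.0, "Fri": 3.5, "Sat": 2.0}
BASELINE = 50.0

def output_for(day):
    """Simulated daily output: baseline + weekly rhythm + random noise."""
    return BASELINE + DAY_EFFECT[day] + random.gauss(0, 0.5)

# Lighting is "changed" on Sunday (plant closed), so the naive comparison
# is Saturday's output versus the following Monday-Wednesday average.
before = output_for("Sat")
after = sum(output_for(d) for d in ("Mon", "Tue", "Wed")) / 3

# The simulation contains no lighting effect, yet the comparison
# attributes the weekly rebound to the lighting change.
print(f"Saturday (before change): {before:.1f}")
print(f"Mon-Wed (after change):   {after:.1f}")
print(f"Apparent 'lighting effect': {after - before:.1f}")
```

The fix, of course, is the one Levitt and List's re-analysis implies: compare against the same days of the week in periods with no intervention, so the weekly rhythm cancels out.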
In addition, Levitt and List found the evidence for the Hawthorne effect proper (enhanced productivity engendered by the attention of the experiment itself) not as strong as commonly thought. The lower worker productivity that resumed at the study's conclusion could alternatively be explained by the study's ending in the summer, a period of historically diminished productivity. On the other hand, in acknowledgement of a potential Hawthorne effect, the authors note that productivity was more responsive to changes in artificial than in natural light. Since the amount of artificial light exposure was under the control of the investigators, the authors conclude this experimental attention might indeed represent a mild Hawthorne effect.
Levitt and List note: "Our analysis of the newly found data reveal little evidence to support the existence of a Hawthorne effect as commonly described; i.e. there is no systematic evidence that productivity jumped whenever changes in lighting occurred.... We conclude that the evidence for a Hawthorne effect in the studies that gave the phenomenon its name is far more subtle than has been previously acknowledged."
Levitt and List's findings should serve as a cautionary tale for BI investigators, highlighting the need to carefully consider confounding factors when designing BI investigations. Even when randomization drives the design, analysts must be attentive to factors outside the study that can trip up findings. Analytical validity, and the factors that can contaminate results, should thus be given critical-path priority in BI investigation planning.
Steve Miller's blog can also be found at miller.openbi.com.