- Category: Olivier's blog
- Published on Monday, 30 January 2012 09:58
- Written by Olivier Morin
As an illustration of the power of priming experiments to produce astonishing findings, a recent study shows that people tend to underestimate their age (but not their father’s) after listening to the Beatles’ song “When I’m Sixty-Four”. The study was published in Psychological Science.
"We asked 20 University of Pennsylvania undergraduates to listen to either “When I’m Sixty-Four” by The Beatles or “Kalimba.” Then, in an ostensibly unrelated task, they indicated their birth date (mm/dd/yyyy) and their father’s age. We used father’s age to control for variation in baseline age across participants. An ANCOVA revealed the predicted effect: According to their birth dates, people were nearly a year-and-a-half younger after listening to “When I’m Sixty-Four” (adjusted M = 20.1 years) rather than to “Kalimba” (adjusted M = 21.5 years), F(1, 17) = 4.92, p = .040."
The effect is both statistically significant and fairly large: it really seems that the song induces a downward bias in subjects' estimation of their own age. Incredible? Maybe, but no more so than other priming studies. It has been shown, after all, that subjects primed with words related to old age walk more slowly than others (here); that infants are twice as likely to help an adult spontaneously after they have seen two puppets facing each other (rather than turning their backs to each other) (here); and that people are more generous when they are holding a cup of hot (versus iced) coffee (here). Strange as they are, those are widely cited results. Yet the Beatles’ song experiment was not greeted with the same enthusiasm. Why was that?
The Beatles’ song study did not convince anyone because its authors made it clear that they had intentionally produced a bogus result. The result was bogus not because the authors fabricated their data or made glaring mistakes. Rather, they relied on very common procedures: testing many variables and reporting just a few, adding subjects until the results were significant, and so on. The biases thus created are usually invisible to a reviewer’s eye, which of course was the authors’ point.
"Current standards for disclosing details of data collection and analyses make false positives vastly more likely [than the 5% threshold of ‘statistical significance’]. In fact, it is unacceptably easy to publish “statistically significant” evidence consistent with any hypothesis."
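How easy it is to beat the 5% threshold can be checked with a small simulation. The sketch below (plain Python, entirely hypothetical data) illustrates just one of the practices mentioned above, adding subjects until the result is significant. Both groups are drawn from the same distribution, so any "effect" is a false positive; peeking at the data after every batch and stopping at the first p < .05 pushes the false-positive rate well past the nominal 5%.

```python
import math
import random

random.seed(1)

def z_test_p(xs, ys):
    # Two-sample z-test, assuming unit variance (both groups are N(0, 1)).
    n = len(xs)
    z = (sum(xs) / n - sum(ys) / n) / math.sqrt(2.0 / n)
    # Two-sided p-value from the standard normal CDF, built with math.erf.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run_once(peek):
    # Both groups come from the SAME distribution: the null hypothesis is true.
    xs = [random.gauss(0, 1) for _ in range(10)]
    ys = [random.gauss(0, 1) for _ in range(10)]
    if not peek:
        # Honest design: fix n = 50 per group in advance, test exactly once.
        xs += [random.gauss(0, 1) for _ in range(40)]
        ys += [random.gauss(0, 1) for _ in range(40)]
        return z_test_p(xs, ys) < 0.05
    # Optional stopping: test after every batch of 10 subjects per group,
    # and stop as soon as the result is "significant".
    while len(xs) <= 50:
        if z_test_p(xs, ys) < 0.05:
            return True
        xs += [random.gauss(0, 1) for _ in range(10)]
        ys += [random.gauss(0, 1) for _ in range(10)]
    return False

trials = 2000
fixed = sum(run_once(peek=False) for _ in range(trials)) / trials
peeking = sum(run_once(peek=True) for _ in range(trials)) / trials
print(f"fixed-n false positive rate:     {fixed:.3f}")    # close to the nominal .05
print(f"optional-stopping positive rate: {peeking:.3f}")  # substantially above .05
```

Note that nothing here is fraud in the narrow sense: every test is computed correctly. The inflation comes purely from the freedom to decide, after looking at the data, when to stop collecting them.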
Of course, this point goes beyond social psychology and priming studies, beyond psychology even, and it would be unfair to lay the blame on a tiny portion of the literature. Still, I could not help thinking of this paper when I read the recent reanalysis of one of the most iconic priming studies: John Bargh’s experiment showing that people walk more slowly after being exposed to age-related words in a questionnaire. Not only do the authors fail to replicate the famous effect; they also suggest an explanation for why it appeared in the first place. Their culprit is not false-positive statistics, but the good old ‘Clever Hans’ effect, whereby an experimenter unconsciously induces a subject to produce the desired behavior (more details here). These authors recruited subjects whom they trained to act as experimenters, and managed to bias them through a combination of scientific persuasion and outright deception (the first subject was a confederate who acted exactly as he would have if the hypothesis had been true). When the experimenters expected people to walk more slowly in the test condition, they proved it true. Yet when they expected subjects to walk faster, they found a way to prove that as well.
Then again, the ‘Clever Hans’ effect is a potential bias in many fields beyond priming studies — but that is not exactly comforting.