Blog author: jcarter
Tuesday, October 23, 2012

In March 2009 the deputy chief of Italy’s Civil Protection Department and six scientists who were members of a scientific advisory body to the Department held a meeting and then a press conference, during which they downplayed the possibility of an earthquake. Six days later an earthquake of magnitude 6.3 killed 308 people in L’Aquila, a city in central Italy. Yesterday, the seven men were convicted of manslaughter and sentenced to six years in prison for failing to give the residents adequate warning of the deadly disaster.

The news reports imply that the scientists were sentenced because of their failure to predict the earthquake. But Roger Pielke, Jr., a professor of environmental studies at the University of Colorado, says “one interpretation of the Major Risks Committee’s statements is that they were not specifically about earthquakes at all, but instead were about which individuals the public should view as legitimate and authoritative and which they should not.”

Whether it was because of their predictions or because of the authority with which they made their claims, the scientists were sent to prison for making an erroneous prediction about how nature would act. Such a judicial ruling would strike most of us Americans as absurd. We’d rightly worry that it would give scientists an incentive not to make any predictions at all. As Thomas H. Jordan, a professor at the University of Southern California, says, “I’m afraid it’s going to teach scientists to keep their mouths shut.”

This seems reasonable until you consider what this says about the current incentive structure. As Stephen J. Dubner recently wrote, “the world is awash in prediction, much of it terrible, in large part because there are strong incentives to make bold predictions and weak penalties for bold predictions that don’t come true.”

This would be a trivial concern if there were no cost associated with “bold predictions that don’t come true.” But in many cases someone—though often not the predictor—has to pay a significant price either to protect against the predicted outcome or to prevent it from occurring. Take, for example, the case of anthropogenic climate change. Some scientists claim that we need to take drastic (and expensive) action to prevent global warming. Other scientists claim the threat is overstated and believe we should avoid implementing costly preventive measures.

In the first case, climate scientists expect the public to make an expensive bet that they’ll be proven right. In the latter, scientists expect us to make a low-cost bet that they will be correct—even if we will have to pay dearly later if they turn out to be wrong. In each case, the brunt of the cost of being wrong is transferred to the non-experts. The experts, however, often have an incentive to make a bold prediction even if there is a low probability of their being right. For them, there is almost no downside to being wrong. But for the rest of the world, economic deprivation or even loss of freedoms could result from their erroneous prediction.

What if scientists (and other predictors) faced a penalty for their inaccurate claims? Sending scientists to jail for being wrong about earthquakes is excessively harsh, of course. But what if they lost their jobs or had to pay a stiff fine when their prognostications failed to come true? I suspect the result would be that fewer bold predictions would be made, and that the ones that were made would be more reliable and grounded in stronger evidence. Whatever the case, we would likely all be better off if the personal cost of being wrong were substantially higher.


  • tom

    I have a suspicion that this would happen: many academics whose work is inconvenient to the powers that be would find themselves out of a job. Got the wrong opinion about economics? You’re done. Do you view heavy pesticide use as dangerous? Big agriculture will find that inconvenient, and you’ll find yourself out of a job.

  • http://www.facebook.com/profile.php?id=1060874428 Dylan James O’Brien Pahman

    I see the point about finding some way to discourage sensationally bold predictions, but this case still sets a bad precedent, in my opinion. Furthermore, the current system already has a merit/demerit system of its own: if a scientist (or economist or other professional) continually makes bad predictions, they will find that no one, or at least increasingly fringe groups, wants to publish their work or to hire them.

    I get that the immediate bad consequences usually are paid by others, but having intelligent people who try to use the best knowledge we have to accurately predict things like earthquakes ahead of time, potentially saving many lives, is better than no prediction, isn’t it? In the absence of these and similar scientists, no one would ever have any warning about any earthquakes. Indeed, one could even view their prediction as conservative, rather than bold: apparently, they didn’t think that there was sufficient evidence to recommend taking extreme and costly precautions.

    And it is, at the very least, unjust to convict someone of something that was not previously publicly prohibited or a matter of common sense. Manslaughter, in this case, is a huge stretch without previous precedent or prohibition. They were just doing their jobs.