Sunday, March 02, 2008
Uncertainty, Considering What We Don't Know
But what if we're wrong? What would be more costly, if Alarmists are wrong, or if Skeptics are wrong? Which is more dangerous?
Cass Sunstein's book Worst Case Scenarios is a good start at contemplating what we don't know.
Dr. Sunstein considers many modes of thought and the book should get you thinking.
Throughout the book Dr. Sunstein uses two main narratives to examine how people react to uncertainty: Terrorism and Global Warming. He provides a very PC perspective, probably to better reach a broad audience and get them thinking about how we deal with what we don't know. The book is an exercise to get the lay person thinking about uncertainty rather than an objective analysis of how to deal with uncertainties. It should get the reader to consider how we over- and under-react, and how costly that is.
Sunstein makes the very good point that people have over-reacted to the historic risk level of terrorism (at much unnecessary cost in lives and resources) and explores several reasons why. He also makes the point that probabilities alone are not enough to make decisions; context matters too. That said, he possibly overemphasizes the fact that people over-react to more salient risks, and he fails to thoroughly consider why people react strongly to risks involving justice and intent (especially that such reactions also carry a signaling component).
While most of us make the mistake of over-reacting to highly salient risks, Sunstein does the opposite: he makes the mistake of greatly exaggerating the risks of Global Warming. For the sake of argument, he makes up numbers. But the numbers are absurdly high, even when read as the likelihood that a risk will increase rather than the likelihood of an actual event occurring.
When dealing with the uncertainty created by human actions, Sunstein also neglects the uncertainty that already exists. The human component of global warming is small (and the greenhouse gas component of that is less than half). A large amount of uncertainty exists whether we reduce CO2 emissions or not. Building a particle accelerator creates a small, immeasurable risk of catastrophe. But Earth is already surrounded by natural particle accelerators, which bombard the planet and even produce collisions. Building one of our own doesn't significantly change the level of risk we face (and may provide us with knowledge that will help us avoid other risks).
My biggest qualm is that Sunstein advocates a policy of generational neutrality. He makes a good rhetorical argument, but logic is not on his side. He notes that the consensus among economists that future lives should be discounted is unraveling. And "unraveling" is an apt term: the principle was long considered obvious to economists because it is obvious. When Sunstein asks whether the life of a 10-year-old today is worth more than the life of a 10-year-old in 2040, he quickly answers, as many would like to, "No." But he fails to consider the obvious: that a 10-year-old now will have 10-year-olds of his own in 2040.
As to which is more dangerous, I think it's clear that regulation is far more dangerous than diversified growth. It's just that the imagery of catastrophic events associated with AGW (which is very unlikely to be mitigated by proposed regulation) is much more salient than the alternative of a much bigger, healthier, and wealthier future population.
Catastrophic events are discrete, but economic growth is compounding, so small changes in the growth rate make a big difference in well-being over time. That's why a life now is worth more than a life in the future: it affects the well-being of the future. A catastrophic event that happens in the future will affect a smaller portion of a wealthier society.
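The compounding point can be made concrete with a quick sketch. The growth rates and time horizon below are made-up numbers for illustration only, not figures from this post or from Sunstein's book:

```python
# Illustrative only: rates and horizon are hypothetical, chosen to show
# how compounding magnifies small differences in the growth rate.

def compound(rate, years):
    """Wealth multiple after `years` of steady annual growth at `rate`."""
    return (1 + rate) ** years

# A half-point difference in annual growth, compounded over a century:
low = compound(0.020, 100)   # roughly 7x today's wealth
high = compound(0.025, 100)  # roughly 12x today's wealth
print(f"2.0% growth: {low:.1f}x; 2.5% growth: {high:.1f}x "
      f"(about {100 * (high / low - 1):.0f}% richer)")
```

Under these assumed numbers, shaving half a point off annual growth leaves the population of 2108 more than a third poorer, which is the trade-off that salient catastrophe imagery tends to hide.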
And it's unknown whether we are making catastrophes more or less likely.
You may be interested in some commentary I wrote about the FT op-ed piece by Klemperer. His logic in assessing the risk of global warming is truly twisted.