I’ve had a number of anti-war folks ask me lately whether I’ve revised my opinion that eliminating Saddam was a good idea. I note that none of them has revised their opinions, and I have a theory as to why they are asking: they are watching news reports of bad events rather than actually examining any statistics about conditions in Iraq. This sample bias, exacerbated by the political bias of the media they read, makes it difficult for them to come to objective conclusions on the matter. Nassim Taleb wrote a great article about the problem of sample bias in a recent issue of Edge magazine.
Take an example of this probabilistic maladjustment. Say you are flying to New York City. You sit next to someone on the plane, and she tells you that she knows someone whose first cousin worked with someone who got mugged in Central Park in 1983. That information is going to override any form of statistical data that you will acquire about the crime rate in New York City. This is how we are. We’re not made to absorb abstract information. The first step is to make ourselves aware of it. But so far we don’t have a second step. Should newspapers and television sets come with a warning label?
The second example concerns journalists. On the day Saddam was caught, the bond market went up in the morning, and it went down in the afternoon. So here we had two headlines — “Bond Market Up on Saddam News,” and in the afternoon, “Bond Market Down on Saddam News” — and in both cases they had very convincing explanations of the moves. Basically, if you can explain one thing and its opposite using the same data, you don’t have an explanation. It takes a lot of courage to keep silent.
We are not made for type-2 randomness. How can we humans take into account the role of uncertainty in our lives without moralizing? As Steve Pinker aptly said, our mind is made for fitness, not for truth — but fitness for a different probabilistic structure. Which tricks work? Here is one: avoid the media. We are not rational enough to be exposed to the press. It is a very dangerous thing, because the probabilistic mapping we get from watching television is entirely different from the actual risks that we are exposed to. If you watch a building burning on television, it’s going to change your attitude toward that risk regardless of its real actuarial value, no matter your intellectual sophistication.