Wednesday, April 9, 2014

The Hedgehog who thought he was a fox -- a cautionary tale

The growing chorus of Nate Silver fans critical of (or at least perplexed by) the new Five Thirty Eight has attracted a great deal of media coverage, mainly for the wrong reasons. Conservatives have painted it as a case of liberals turning on one of their own. Pundits have tried to use the recent critiques of Silver to undercut his earlier, completely unrelated critiques of them (I'm debating whether or not to write a post on Dylan Byers' laughable misreading of Krugman's position. On the one hand, it's bad enough to support a post. On the other hand, I'm busy, Charles Pierce already did a good job with it, and I'm pretty sure that most people already know what Byers is).

There has been some good work on the subject. Jonathan Chait does a sharp analysis of Krugman's and Silver's personalities and how they shaped the conflict (best line: "Somewhere, David Brooks is reading Silver’s argument that Paul Krugman refuses to attack his colleagues and laughing bitterly."), but other issues that are (for me) more interesting have gotten less coverage than they merit: the culture of statistics, the often depressing career paths promising thinkers take these days,* and the dangers of a bad analogy.

It sometimes seems that there's a convention that once a debate has been framed, that framework must be respected, no matter how badly it holds up. Case in point, the fox and the hedgehog. Here's how Silver puts it:
Our logo depicts a fox (we call him Fox No. 92) as an allusion to a phrase originally attributed to the Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.” We take a pluralistic approach and we hope to contribute to your understanding of the news in a variety of ways.
This is a doubly flawed analogy. Expertise is a spiky, complicated thing that doesn't lend itself to scalar measures, let alone binary ones. Any attempt to assign people positions on the fox/hedgehog spectrum will be problematic at best, with the ordering shifting radically when weighting schemes change. If we do, however, decide to view the world through this framework, we immediately come to an even bigger objection to Silver's argument:

Nate Silver is a hedgehog.

There is nothing pejorative about this classification. Silver has done brilliant work. It's just that almost all of Silver's best work has been done using a small but powerful set of analytic tools to address thorny but structurally similar problems in sports and politics. In terms of methods, Silver is a specialist; in terms of data, he's a micro-specialist. Silver has an enormous body of knowledge about working with player stats or polling data, but most of that knowledge is completely field specific.

There's nothing wrong with this kind of specialization -- it's absolutely necessary for the kind of results Silver has produced -- but it can cause problems when researchers move out of their areas of expertise and fail to adjust for the change. In other words, the trouble starts when hedgehogs think they're foxes.

Being a fox means living with the constant fear that you've just done something stupid that will be immediately obvious to anyone knowledgeable in the field. Ideally that fear leads to a heightened feel for danger levels. Most experienced foxes have developed an instinct for when to seek out a hedgehog. As a corollary, a good fox is always (and I do mean ALWAYS) more willing to ask a stupid question than to make a stupid statement.

For a case study of what can go wrong when experts leave their area of expertise and don't adjust their caution levels, you don't have to look any farther than Silver's attempt to cover the climate change debate. Michael E. Mann assesses the damage:
And so I was rather crestfallen earlier this summer when I finally got a peek at a review copy of The Signal and the Noise: Why So Many Predictions Fail -- but Some Don't. It's not that Nate revealed himself to be a climate change denier; He accepts that human-caused climate change is real, and that it represents a challenge and potential threat. But he falls victim to a fallacy that has become all too common among those who view the issue through the prism of economics rather than science. Nate conflates problems of prediction in the realm of human behavior -- where there are no fundamental governing 'laws' and any "predictions" are potentially laden with subjective and untestable assumptions -- with problems such as climate change, which are governed by laws of physics, like the greenhouse effect, that are true whether or not you choose to believe them.
...
Unlike Levitt, Nate did talk to the scientists (I know. I'm one of them!). But he didn't listen quite as carefully as he should have. When it came to areas like climate change well outside his own expertise, he to some extent fell into the same "one trick pony" trap that was the downfall of Levitt (and arguably others like Malcolm Gladwell in The Tipping Point). That is, he repeatedly invokes the alluring, but fundamentally unsound, principle that simple ideas about forecasting and prediction from one field, like economics, can readily be appropriated and applied to completely different fields, without a solid grounding in the principles, assumptions, and methods of those fields. It just doesn't work that way (though Nate, to his credit, does at least allude to that in his discussion of Armstrong's evaluation of climate forecasts).
I'm singling out Silver here not because he's a bad statistician but because he's a very good one who fell into the increasingly common traps of believing that the world outside his specialty is simpler and that, if you understand the math, you automatically understand the problem. Each field is complex and, like Tolstoy's families, complex in its own way. If you want to have something useful to say in an unfamiliar area of research, knowing the statistics may be necessary but it is far from sufficient.

* On a related note, you can find my thoughts on Five Thirty Eight's business model here.

2 comments:

  1. I think statisticians are almost all foxes. I don't think subject-matter expertise makes one a hedgehog. I'm an expert in U.S. public opinion and elections but I think I'm still foxlike when I work in that area.

    1. Andrew,

      I originally had a passage about statisticians being foxes in this post, but it opened up bigger issues than I was ready to address. Part of the reason for this fox-tendency is the focus on methods, but I think that's secondary. For me, the bigger difference lies in a problem-centric culture very similar to that of engineers.

      People from outside that culture tend to equate statistics with the tools statisticians use. The result is often that non-statisticians working in unfamiliar fields tend to shortchange the thinking-about-the-problem stage.

      I know others will have different standards, but I'd list the ability to get your bearings quickly and think through problems in unfamiliar contexts as the main requirements for an effective fox.
