In the nineteenth century, news stories sometimes carried the disclaimer "Important if true." It was used for news that couldn't be confirmed but was too big to leave unreported. That line always struck me as a perfect representation of one of the three fundamental tensions of journalism, the one between getting it right and getting it first (the other two pit each of those goals against making the story interesting).
In a follow-up to a discussion about the way the press corps handled the 2000 election, I claimed that the culture of journalism was placing less and less value on accuracy. To support the accusation, I listed some examples from recent discussions, but there is a more direct and damning way to make the point.
Viewed in absolute terms, the level of accuracy in current journalism is certainly not what it could be and is arguably in decline, but what about relative to cost? One hundred years ago, the time required to check a story was substantial. Sources were difficult to contact, reference materials had to be consulted in person and were often unindexed, and there was simply less recorded information. That time requirement translated into both labor and opportunity costs.
Over the years (particularly the last twenty), these costs have dropped sharply. The time required to get a story right has fallen by at least a couple of orders of magnitude. If we hold the value of accuracy constant, we would expect to see factual errors all but disappear. Instead we're seeing something between "as bad as ever" and "even worse." That means the value journalism as an industry puts on accuracy must have dropped at least as sharply as the cost of fact-checking did.
Think of it as Moore's law of diminishing journalistic quality.