Saturday, May 5, 2012

DVR Alert

Sherlock returns this Sunday.

More on mergers and the growth fetish

As a follow-up to this post (which was part of a larger thread), this NBER paper (courtesy, I believe, of Felix Salmon or Brad DeLong) suggests that the business case for many if not most mergers is decidedly weak.

The study by professors at the University of California, Berkeley, concludes that acquisitions, while nearly always initially cheered by investors, end up hurting a company, and in particular its share price, in the end. Winning by Losing, which was released this week by the National Bureau of Economic Research, found that following an acquisition the stock of that company tends to underperform shares of similar companies by 50% for the next three years. Another finding of the study: Deals done in cash, which is often considered a more conservative way to pay for acquisitions, tend to do worse than deals done for stock. If an acquiring company doesn't want its new owners' shares, you shouldn't either.
Of course, just because the market overvalues acquisitions doesn't mean that it overvalues growth in general, but this is another piece of evidence for the pile.
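
For readers wondering how a claim like "underperforms similar companies for three years" actually gets measured, here's a minimal sketch of a buy-and-hold abnormal return (BHAR) calculation, the standard tool in this literature. The tickers and numbers below are invented, and the paper's actual matching methodology is considerably more involved:

```python
# Toy sketch of a buy-and-hold abnormal return (BHAR) calculation,
# the standard way long-run post-merger underperformance is measured.
# All numbers are invented for illustration; the NBER paper's actual
# methodology (matching firms on size, book-to-market, etc.) is more involved.

def buy_and_hold_return(monthly_returns):
    """Compound a series of simple monthly returns into one holding-period return."""
    total = 1.0
    for r in monthly_returns:
        total *= (1.0 + r)
    return total - 1.0

# 36 months (3 years) of hypothetical monthly returns
acquirer = [0.002] * 36      # the acquiring firm drifts up ~0.2%/month
matched_peer = [0.010] * 36  # a similar non-acquiring firm earns ~1%/month

bhar = buy_and_hold_return(acquirer) - buy_and_hold_return(matched_peer)
print(f"Acquirer 3-yr return: {buy_and_hold_return(acquirer):+.1%}")
print(f"Peer 3-yr return:     {buy_and_hold_return(matched_peer):+.1%}")
print(f"BHAR (relative performance): {bhar:+.1%}")
```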

Friday, May 4, 2012

The issue with targeted metrics

Felix Salmon talks about the Employment Participation Rate:


The number the politicians look at, however, is the unemployment rate, which ticked down to 8.1%. That’s still high, but it’s not a statistic to beat Obama round the head with.


and

For demographic reasons — the retirement of the baby boomers — the labor force participation rate is naturally going to fall over the next decade. But go back just one year, to March 2011, and look at the official CBO projection of the labor force participation rate. The CBO saw a rate of 64.6% in 2012 — a full percentage point higher than we’re at right now. The participation rate wasn’t expected to fall to today’s level of 63.6% until 2017.
What is interesting is that people focus on the unemployment rate, which actually makes it a less useful indicator of anything: once a number becomes the headline statistic, there are incentives to frame it in a way that looks better than it really is. Dropping people who would like a job but are no longer looking for one out of the labor force is a clever way to make the number look better.
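
To make the arithmetic concrete, here's a minimal sketch (with invented numbers) of the standard definitions, showing how reclassifying discouraged workers as out of the labor force lowers the unemployment rate even though nobody found a job:

```python
# Invented numbers illustrating the standard definitions:
#   unemployment rate  = unemployed / labor force
#   participation rate = labor force / working-age population

working_age_pop = 1000
employed = 580
unemployed = 60
labor_force = employed + unemployed  # 640

print(f"Unemployment rate:  {unemployed / labor_force:.1%}")       # 9.4%
print(f"Participation rate: {labor_force / working_age_pop:.1%}")  # 64.0%

# Now 10 unemployed people stop looking and are reclassified as
# "not in the labor force." Nobody found a job, but...
discouraged = 10
unemployed -= discouraged
labor_force -= discouraged

print(f"Unemployment rate:  {unemployed / labor_force:.1%}")       # falls to 7.9%
print(f"Participation rate: {labor_force / working_age_pop:.1%}")  # falls to 63.0%
```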

I suspect that the same issue will happen with any high-stakes metric. 

"Because he could do all these things, he imagined that he did do them."

Now that I think about it, I'm surprised that Pauline Kael's "Raising Kane" didn't come up earlier when we were discussing taking credit for other people's work.

The Mercury group wasn’t surprised at Welles’s taking a script credit; they’d had experience with this foible of his. Very early in his life as a prodigy, Welles seems to have fallen into the trap that has caught so many lesser men—believing his own publicity, believing that he really was the whole creative works, producer-director-writer-actor. Because he could do all these things, he imagined that he did do them. (A Profile of him that appeared in The New Yorker two years before Citizen Kane was made said that “outside the theatre … Welles is exactly twenty-three years old.”) In the days before the Mercury Theatre’s weekly radio shows got a sponsor, it was considered a good publicity technique to build up public identification with Welles’s name, so he was credited with just about everything, and was named on the air as the writer of the Mercury shows. Probably no one but Welles believed it. He had written some of the shows when the program first started, and had also worked on some with Houseman, but soon he had become much too busy even to collaborate; for a while Houseman wrote them, and then they were farmed out. By the time of the War of the Worlds broadcast, on Halloween, 1938, Welles wasn’t doing any of the writing. He was so busy with his various other activities that he didn’t always direct the rehearsals himself, either—William Alland or Richard Wilson or one of the other Mercury assistants did it. Welles might not come in until the last day, but somehow, all agree, he would pull the show together “with a magic touch.” Yet when the Martian broadcast became accidentally famous, Welles seemed to forget that Howard Koch had written it. (In all the furor over the broadcast, with front-page stories everywhere, the name of the author of the radio play wasn’t mentioned.) Koch had been writing the shows for some time. He lasted for six months, writing about twenty-five shows altogether—working six and a half days a week, and frantically, on each one, he says, with no more than half a day off to see his family. The weekly broadcasts were a “studio presentation” until after the War of the Worlds (Campbell’s Soup picked them up then), and Koch, a young writer, who was to make his name with the film The Letter in 1940 and win an Academy Award for his share in the script of the 1942 Casablanca, was writing them for $75 apiece. Koch’s understanding of the agreement was that Welles would get the writing credit on the air for publicity purposes but that Koch would have any later benefit, and the copyright was in Koch’s name. (He says that it was, however, Welles’s idea that he do the Martian show in the form of radio bulletins.) Some years later, when C.B.S. did a program about the broadcast and the panic it had caused, the network re-created parts of the original broadcast and paid Koch $300 for the use of his material. Welles sued C.B.S. for $375,000, claiming that he was the author and that the material had been used without his permission. He lost, of course, but he may still think he wrote it. (He frequently indicates as much in interviews and on television.)

“Foible” is the word that Welles’s former associates tend to apply to his assertions of authorship. Welles could do so many different things in those days that it must have seemed almost accidental when he didn’t do things he claimed to. Directors, in the theatre and in movies, are by function (and often by character, or, at least, disposition) cavalier toward other people’s work, and Welles was so much more talented and magnetic than most directors—and so much younger, too—that people he robbed of credit went on working with him for years, as Koch went on writing more of the radio programs after Welles failed to mention him during the national publicity about the panic. Welles was dedicated to the company, and he was exciting to work with, so the company stuck together, working for love, and even a little bit more money (Koch was raised to $125 a show) when they got a sponsor and, also as a result of the War of the Worlds broadcast, the movie contract that took them to Hollywood.

A truly remarkable piece of misreading

To some degree we've all been guilty of hearing what we expected to hear rather than what was said. At the risk of belaboring the obvious, this has always been a part of human nature and it's probably grown more common with the advent of the internet. Even with that in mind, though, this is a truly remarkable example.
Here's the context. A few days ago, Jonathan Chait wrote a long and unflinching take-down of Paul Ryan. He razed the man, left no stone on stone, salted the ground so... well, you get the point. What's more, Chait was just as direct and damning in his handling of the journalists who had given Ryan so many free passes, even going so far as to confront James Stewart on his Ryan pieces.

Here's a representative passage:
Ryan’s mastery of these details does not signify openness to evidence or a willingness to shape his views to real-world evidence. It actually signifies the opposite. And yet Ryan has grasped that the aura of specificity he has cultivated paradoxically renders the specifics themselves irrelevant.
For a virtuoso display of this principle in action, return to another vintage Ryan moment: his Dave profile from last year, where he awed a swooning reporter by opening up the budget to a random page and fingering a boondoggle. The item Ryan pointed to was the Obama administration’s reform of the student-loan industry. “Direct loans—this is perfect,” Ryan said. “So direct loans, that’s new spending on autopilot, that had no congressional oversight, and it gave the illusion that they were cutting spending.”
The exchange is so perversely revealing that it rewards explanation. For decades, the government helped make college more affordable through “guaranteed loans”—it encouraged banks to lend money to students by promising to repay the banks if the students defaulted. Banks were making billions of dollars in profits at virtually no risk. The General Accounting Office, a kind of in-house fiscal watchdog for the federal government, issued sixteen reports over the years noting how the government could save money simply by issuing the loans itself and cutting out the middleman.
It was the simplest, no-brainer pot of savings you could find—ending pure corporate welfare, just like in the movie Dave. The cause attracted support from think tanks, as well as the moderate Wisconsin Republican Tom Petri, an eclectic reformer who is sort of the real-life version of the Paul Ryan character who appears on television. Two National Review editors endorsed eliminating guaranteed loans in an article advocating a new reform conservatism.
The banks lobbied fiercely to protect their gravy train. Among the staunchest advocates of those government-subsidized banks was … Paul Ryan, who fought to protect bank subsidies that many of his fellow Republicans deemed too outrageous to defend. In 2009, Obama finally eliminated the guaranteed-lending racket. It could save the government an estimated $62 billion, according to the CBO.
Not everything in Ryan’s career, and possibly nothing at all, is quite so undeniably venal. You could pluck any other single example out of Ryan’s long history of strident conservatism and he would be able to defend it, at the very least, on ideological grounds. A tax cut for the rich, a hike in military spending—all those could be explained as a blow for the cause of Reaganism. This was an almost astonishingly unlucky break, an instance where he lacked even ideological cover—standing up for higher spending at the behest of a powerful lobby lacking any plausible rationale for its subsidy.
At the moment the page opened to that unfortunate item, Ryan’s heart must have stopped. Here was a reporter trying to cast him as a movie-hero outsider, and he was performing on cue. Yet the book opened to a page that, cruelly, just happened to expose the gap between Ryan’s image and the reality more clearly than anything else possibly could have.
Ryan probably knew, even in that split second, that he stood little chance of exposure. (The overlap between television news reporters and people with a detailed understanding of the federal budget is quite small.) Yet a lesser politician might have panicked, or hesitated, or possibly tried to flip to a different page. In that moment, Ryan revealed the qualities that have propelled him to his current position. As cool as can be, and as winsome as ever, he said, “This is perfect.”
Chait later said he was prepared for a wide range of responses, but he had to admit that this post from the left-wing site Crooks and Liars caught him off guard:
Praise Jesus and pass the awesome sauce. Paul Ryan's going to be the next Republican Saint, wrapped in a flag and waving down at all of us who are too stupid to understand the complex thinking and amazing nuance of St Paul's brain. Thank you, Jonathan Chait, for this awesome NYMag article telling us how to count the ways Paul Ryan is the Great American Hero. What would I have ever done without being enlightened in such an obsequious way, beginning with the title: The Legendary Paul Ryan?


It is, as Chait says, a fascinating read, despite going on for over eighteen hundred words counting two updates (neither of which improves on the original). Part of the fascination comes from seeing a writer with such exceptionally poor reading comprehension. She follows every quote from Chait with a reply that seems oddly inappropriate, as if she were listening to him in a noisy room and didn't really hear what he said.


And then there's the self-effacing "I'm not ambitious at all, no sir!" claim that Chait reinforces:
One trope that has marked Ryan’s media coverage from the outset is that he is consistently described as lacking ambition. It’s a sharp contrast with fellow Republican Eric Cantor, to whom the adjective “ambitious” is affixed like a tattoo. Ryan says, and many political reporters believe, that he is immune to the political concerns that distract his colleagues. He “has a level of disdain for the sort of rank political calculations required of people who want to climb the electoral ladder,” explains the Washington Post. Here is a telling description from Politico: “Of the partisan political game, Ryan confessed, ‘It’s not my natural tendency. I’m a policy guy.’ ” The operative word here is “confessed.”
Because wonks lack ambition? This would be why Ryan has abandoned St. Ayn Rand in recent days, eschewing her "I've got mine, screw the rest of you" philosophy for a kinder, gentler piety that "disagrees" with Catholic bishops and pretends to be a bipartisan kind of guy who gets along with everyone! Of course he's not ambitious. Jon Chait has told you so.

Of course, the operative word here is "operative." Chait specifically emphasizes that Ryan is telling us that Ryan's not ambitious. (The word "trope" is also telling.) It's a difficult point to miss, particularly given that a few lines later Chait points out that Ryan "had to elbow more experienced Republicans out of the way to grab his nomination, and then leapfrog other more experienced Republicans to claim the party’s leadership of the House Budget Committee in 2007."

Still, if this were just a badly reasoned post, I wouldn't be recommending you all read it. Seeking out something just because it's bad is mean-spirited and I try to avoid the habit (though the internet does encourage backsliding in that area). What makes the Crooks and Liars piece so fascinating is the way it shows how an interpretation can get embedded so deeply that everything we see confirms that view. As mentioned before, we've all had these moments but most of the time either we catch ourselves or someone shakes us out of it.

It's instructive to see what happens when you just keep going.

Wednesday, May 2, 2012

Outcome assessment

This idea from Dean Dad is only going to work with an extremely objective metric. It works fine for sales or time in a race, but I think that the assessment of soft skills is sufficiently difficult that this is likely to cause more trouble than it is worth. Consider:
With the overall allocation flat or shrinking, every college is pretty much up against it already. None of them has much margin for error.  That’s especially true given how fixed most of the costs are, and how old most of the buildings are.  (Gotta love 70’s architecture!) 
Now they’re being told that it’s not enough that they do well; they must do better than their counterparts.  If Northern State’s “score” -- however defined -- moves up five points, but Southern State’s moves up ten, then Northern State takes a cut. 
If the colleges had large endowments, this could be a spur to entrepreneurialism.  If they were gambling with money they could afford to lose, then this could be just the kick in the pants they needed to start trying more ambitious things.  The prospect of the big win can justify the risky play.
But when the colleges are running on empty at the outset, the prospect of any meaningful loss is simply intolerable.  Instead of spurring innovation, this will heighten the already-strong culture of loss aversion.  Taking a flyer on a strategy that would take years to pay off isn’t an option when the years in between could require layoffs.
Worse, any kind of statewide collaboration -- exactly the sort of thing that would “move the needle” on educational attainment, workforce development, or any social good you care to name -- would be entirely out of the question.  Why would I share my breakthrough innovation with Nearby State, when it would erode my competitive advantage? 
And just how long, exactly, do you think it would take before the quick fix of grade inflation starts to look attractive?
It is the quick fix of grade inflation that I consider the most important threat. Even with externally administered objective tests, we are creating massive incentives to find any possible way to improve the numbers. Some of these approaches will align with the goals of the program (e.g. improving classroom instruction). But others will load the deck in any way possible (inflating grades, changing admission standards, teaching to the test) in order to avoid these crushing cuts. After all, even small amounts of advantage will add up to more resources in the future. And I am becoming more and more convinced that a lot of outcomes are driven by resources as much as anything else.

I am not saying that no version of this approach can work, but it is always concerning when the outcomes being rewarded are soft rather than hard.
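
For what it's worth, here's a minimal sketch of the kind of zero-sum, relative-improvement allocation Dean Dad describes. The formula and the numbers are my own invention (the post doesn't specify one), but it shows how a college can genuinely improve and still take a cut:

```python
# Hypothetical zero-sum allocation: each college's share of a flat pot is
# proportional to its score improvement relative to the others. The rule
# and all numbers are invented; no actual state formula is specified.

def allocate(pot, improvements):
    total = sum(improvements.values())
    return {name: pot * gain / total for name, gain in improvements.items()}

pot = 100.0  # flat statewide appropriation, in $ millions

# Both colleges improve, but Northern improves less than Southern.
before = allocate(pot, {"Northern State": 5, "Southern State": 5})
after = allocate(pot, {"Northern State": 5, "Southern State": 10})

for name in before:
    print(f"{name}: {before[name]:.1f} -> {after[name]:.1f}")
# Northern State: 50.0 -> 33.3  (a cut, despite genuine improvement)
# Southern State: 50.0 -> 66.7
```

Under a rule like this, any improvement that doesn't keep pace with the pack reads as failure, which is exactly what makes the quick fixes so tempting.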

Tuesday, May 1, 2012

A perspective on Ayn Rand

From XKCD:
I had a hard time with Ayn Rand because I found myself enthusiastically agreeing with the first 90% of every sentence, but getting lost at 'therefore, be a huge asshole to everyone.'
It is true that Randianism has some rather perverse suggestions for human behavior and some rather idealistic assumptions about how actual human beings will act in the face of competition.  

Context

Context can really matter in terms of what we consider to be virtuous behavior:
So there it is: the difference between a stay-home mother and a welfare mother is money and a wedding ring. Unlike any other kind of labor I can think of, domestic labor is productive or not, depending on who performs it. For a college-educated married woman, it is the most valuable thing she could possibly do, totally off the scale of human endeavor. What is curing malaria compared with raising a couple of Ivy Leaguers? For these women, being supported by a man is good—the one exception to our American creed of self-reliance. Taking paid work, after all, poses all sorts of risks to the kids. (Watch out, though, ladies: if you expect the father of your children to underwrite your homemaking after divorce, you go straight from saint to gold-digger.) But for a low-income single woman, forgoing a job to raise children is an evasion of responsibility, which is to marry and/or support herself. For her children, staying home sets a bad example, breeding the next generation of criminals and layabouts.
I think that domestic work is extremely challenging to fit into our framework of how we define productive activity.  It is clear that domestic work is essential to the creation of the next generation of people and that it is not easy labor.  The idea that dignity can only come from paid employment rather than worthwhile work is perverse.

I would consider a Buddhist monk, for example, to have plenty of dignity even if their vocation never (ever) results in paid employment. I think that this might be part of the modern social Darwinist ideal that Jon Chait talks about, in which the market doesn't just act to increase economic efficiency but also grants moral status based on economic outcomes. There are a lot of activities in the world that are not paid employment but are worth doing.

Sunday, April 29, 2012

Sure it saved us twenty billion, but it sounds funny

I was very pleased to read this report (via Mr. Salmon) in the Washington Post:
Rep. Jim Cooper (D-Tenn.) believes it is time the sex life of the screwworm got its due.

On Wednesday afternoon, Cooper rose to the defense of taxpayer-funded research into dog urine, guinea pig eardrums and, yes, the reproductive habits of the parasitic flies known as screwworms--all federally supported studies that have inspired major scientific breakthroughs. Together with two House Republicans and a coalition of major science associations, Cooper has created the first annual Golden Goose Awards to honor federally funded research “whose work may once have been viewed as unusual, odd, or obscure, but has produced important discoveries benefiting society in significant ways.” Federally-funded research of dog urine ultimately gave scientists an understanding of the effect of hormones on the human kidney, which in turn has been helpful for diabetes patients. A study called “Acoustic Trauma in the Guinea Pig” resulted in treatment of early hearing loss in infants. And that randy screwworm study? It helped researchers control the population of a deadly parasite that targets cattle--costing the government $250,000 but ultimately saving the cattle industry more than $20 billion, according to Cooper’s office.
This is a good story in the sense that it's good news -- for too many years, important research with huge economic pay-offs has been ignored and often mocked -- but it's also a good story for a guy trying to write  a post for a science and technology blog because it illustrates so nicely some of the reasons that so much science reporting is so bad:

1. Most reporters have a weak grasp of what goes into good research. For example, studying conditions in different animals often produces giggles from the press (see the dog urine study) even though changing the population of animals studied is generally an excellent idea.

2. The press corps has an urban bias accompanied by a pronounced lack of interest in agriculture. As a result, even agricultural research of immense and obvious economic value is routinely mocked by publications like the New York Times.

3. The press corps also has decided ddulite tendencies and unfortunately this research doesn't sound cool. (Even though it is.)

Update: Upon review, I'm thinking that I didn't spell out my point sufficiently. The studies described in the WP piece were good, solid research that paid for itself. The media's inability to recognize good science makes it all the more difficult to fund and pursue good science.

Saturday, April 28, 2012

Organizational strategy

Brad DeLong:
The strategy that Berkeley has settled on is to seek to produce the funding stream necessary to maintain a great University by becoming a finishing school for the superrich of Asia. This may be the wrong strategy--I sometimes think so, many others think so, and you can certainly argue so. But it is the strategy that we have. And the worst strategy of all is to have no strategy. A bad strategy is vastly preferable to no strategy, or to an unimplemented strategy.
There is a lot of good stuff in this particular post.  In global terms, the critique of reducing access to education in the United States is probably the single most important point in Dr. DeLong's piece.

But I think that the point above is one that academics should pay a lot more attention to.  You may or may not agree with a particular strategy (or Dr. DeLong's specific recommendations as to how to approach the strategy) but it is critical that there be a strategy.  I have seen bad strategies work out for all sorts of unexpected reasons (nobody can know all possible variables).

But it is true that sticking to a strategy is a sensible plan. The costs of constantly changing strategies are non-trivial and may replicate the worst elements of having no strategy.

On the tech beat


Lots of interesting technology stories out that I ought to be blogging about, most via Felix Salmon.

Salmon examines the career of "the Man Who Makes the Future."

Neal Stephenson and Tyler Cowen have some questions about innovation in the internet age.

Meanwhile, the other side seems to be retreating to the position that we're doing fine -- just look at how slow progress was in 1900.

And finally, Noah Smith effectively makes the case that, in an era of limited budgets, particle accelerators are a bad place to spend our research dollars.

Wednesday, April 25, 2012

Is this really the basic lesson of economics?

Frances Woolley:
The basic lesson of economics is that people - including governments - aren't stupid. If it was possible to generate an immediate increase in tax revenues by reducing tax rates, taxes would be cut instantly. Taxes are what they are in part because reducing taxes creates an immediate revenue short-fall.
But it is worth noting that countries don't immediately engage in a "race to the bottom" on tax rates, so there has to be some underlying level of support for current policies. It is not that change is never good, but it pays to carefully examine the full set of incentives.
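
The arithmetic behind Woolley's claim is worth spelling out: revenue is roughly rate times base, so for a rate cut to pay for itself immediately, the base would have to expand by an implausibly large amount right away. A toy illustration, with an invented behavioral response:

```python
# Toy revenue arithmetic: revenue = rate * base. For a rate cut to pay for
# itself immediately, the taxable base would have to grow right away by a
# large, implausible amount. The response parameter below is invented.

base = 1000.0   # taxable income, $ billions
old_rate = 0.30
new_rate = 0.25

old_revenue = old_rate * base  # 300.0

# Suppose the cut immediately expands the base by 3% (a generous
# short-run behavioral response).
new_base = base * 1.03
new_revenue = new_rate * new_base  # 257.5 -- still a shortfall

# Break-even: the base would need to grow by old_rate/new_rate - 1 = 20%
breakeven_growth = old_rate / new_rate - 1
print(f"Revenue: {old_revenue:.1f} -> {new_revenue:.1f}")
print(f"Base growth needed to break even: {breakeven_growth:.0%}")
```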

Somebody is usually benefiting from social arrangements.  

Can you plagiarize folklore?

[the following is a follow-up of sorts to this earlier post on plagiarism.]

You can certainly steal the wording, perhaps the narrative structure, but does it make any sense to talk about plagiarizing something that has neither distinct authors nor authorship dates? That's the question raised by this kerfuffle over the following paragraph lifted by Karl Weick:
The young lieutenant of a small Hungarian detachment in the Alps sent a reconnaissance unit out into the icy wilderness. It began to snow immediately, snowed for two days, and the unit did not return. The lieutenant suffered, fearing that he had dispatched his own people to death. But the third day the unit came back. Where had they been? How had they made their way? Yes, they said, we considered ourselves lost and waited for the end. And then one of us found a map in his pocket. That calmed us down. We pitched camp, lasted out the snowstorm, and then with the map we discovered our bearings. And here we are. The lieutenant borrowed this remarkable map and had a good look at it. He discovered to his astonishment that it was not a map of the Alps but of the Pyrenees.
It's possible that this story really happened (I have reason to doubt it, but I'll get into that later), but that's not really important. Sometimes the events in folk tales and urban myths do happen, but that doesn't stop the tales and myths from functioning, culturally and aesthetically, as folklore.

The genre of worthless items proving valuable stretches at least from Stone Soup (which merits its own type in the Aarne-Thompson folktale classification system) to Mama's Bank Account. Add to that the related genre of false or misunderstood instructions and you can find literally thousands of antecedents.

Now check out the "original" version (again from Gelman):

1916: Albert Szent-Györgyi, a medical student in Budapest, serves in World War 1.
1930: Working in Szeged, Hungary, Szent-Györgyi and his colleagues discover vitamin C. In the next several decades, he continues to make research contributions and becomes a prominent scientist, eventually moving to the U.S. after World War 2. He dies in 1986.
1972: Medical researcher Oscar Hechter reports the following in the proceedings of “an international conference on cell membrane structure,” published in 1972:
Let me close by sharing with you a story told me by Albert Szent-Györgyi. A small group of Hungarian troops were camped in the Alps during the First World War. Their commander, a young lieutenant, decided to send out a small group of men on a scouting mission. Shortly after the scouting group left it began to snow, and it snowed steadily for two days. The scouting squad did not return, and the young officer, something of an intellectual and an idealist, suffered a paroxysm of guilt over having sent his men to their death. In his torment he questioned not only his decision to send out the scouting mission, but also the war itself and his own role in it. He was a man tormented.
Suddenly, unexpectedly, on the third day the long-overdue scouting squad returned. There was great joy, great relief in the camp, and the young commander questioned his men eagerly. “Where were you?” he asked. “How did you survive, how did you find your way back?” The sergeant who had led the scouts replied, “We were lost in the snow and we had given up hope, had resigned ourselves to die. Then one of the men found a map in his pocket. With its help we knew we could find our way back. We made camp, waited for the snow to stop, and then as soon as we could travel we returned here.” The young commander asked to see this wonderful map. It was a map not of the Alps but of the Pyrenees!
The moral of the story, as given by Hechter and by Bernard Pullman at another symposium a year later, is that the map gave the soldiers the confidence to make good decisions
...
1977: Immunologist Miroslav Holub publishes a poem (of the prosy, non-rhyming sort) telling the lost-soldiers story (again, crediting Szent-Györgyi) in the Times Literary Supplement, translated from the Czech. Holub may have actually attended the meeting reported on by Hechter.
Take a good look at the format here. The narrator says a person he knows told him a story, which he then repeats. The source is specific and reliable. The story is improbable, involves unnamed protagonists and a fairly non-specific setting, and has folkloristic aspects. This puts us squarely into urban myth territory, and a map of that territory is useful when you try to sort out what's happening here.

Much of the pernicious staying power of urban myths comes from the tendency to attribute the credibility of the source to the story itself. Of course, with an urban myth, the source is simply another link in the chain, just as we are when we repeat the story.

With that in mind, when Gelman emphasizes the importance of crediting Szent-Györgyi, it raises the question: what should we credit him with? What is Szent-Györgyi's role here? Though we can't say for certain, it seems unlikely that he came up with the story (and if he did, he certainly misrepresented it). Likewise, it doesn't seem that these events happened to him or that he witnessed them. Instead, based on the evidence we have in front of us, it seems obvious that Szent-Györgyi's role here was the same as Hechter's and Holub's and Weick's: he heard a story and he repeated it.

Weick certainly owes Holub an apology and an acknowledgement, but as for not mentioning Szent-Györgyi, I think he made the right call. Naming Szent-Györgyi implies that we know the source and can trust the story's veracity (I doubt that we do or can). Saying nothing about where the story came from is possibly more honest; it doesn't imply anything we have reason to believe is untrue. Instead it presents this as an apocryphal tale, a bit of folklore. As such, it has to stand on its own merits: is it interesting and thought-provoking? Does it make a valid point?

Weick unquestionably stole the words he used to tell this story, but I suspect the story itself has been told and retold since soldiers started carrying maps. Arguing about plagiarism at this point seems rather silly.

Monday, April 23, 2012

What I'm currently blogging about in some alternate reality

Via Andrew Gelman, Gary Rubinstein digs through NYC's data dump and produces a series of interesting posts. If you can't read the whole thing, make sure to check out part 1 and part 3.

Elsewhere in education, Felix Salmon points us to this case of a foundation providing high school lesson plans that push a definite agenda.

Back on the USPS beat, the NYT explains how the service's attempts to diversify are often blocked through the lobbying of competitors. Dean Baker has more on the subject.

And on the subject of the growth fetish (with a bit of ddulitism thrown in), Noah Smith looks at the performance of venture capital firms since the bubble burst.

On the plagiarism front, stealing from unpublished work is especially egregious.

When you're feeling old, reading Tennyson can improve your outlook.

Saturday, April 21, 2012

Like complaining about saucy language in Sodom and Gomorrah

Here's an idea for a novel: in a dystopian future/alternate history, the country is governed by a totalitarian central government that forces teams of teenagers to battle to the death in an annual televised event. In the hands of a competent writer, it's a premise that could generate plenty of drama and suspense, and it has highly cinematic elements.

I'll get back to that idea in a minute but first I want to direct your attention to this recent post by Andrew Gelman. Go ahead, take a look. I'll wait...

There are a number of things to discuss here but let's start with this assertion quoted by Gelman:

“The essence of plagiarism is passing off someone else’s work as your own."

This nicely captures the stark moral terms that we often see in this debate, but when we look at this more closely, particularly at what's entailed in different types of plagiarism and the reactions to those types, the picture is a bit murkier.

Let's go back to the idea from the top of the page and fantasy stories about young adults. Back in the mid-Nineties, J.K. Rowling came up with the inspired notion of combining the two great traditions of British juvenile literature. The concept and Rowling's skillful execution produced the enormously successful Harry Potter and the Philosopher's Stone.

Rowling's success was followed by a wave of science fiction and fantasy novels aimed at the young adult market. These included Percy Jackson, the Lorien Legacies (co-written by the disgraced James Frey), Gregor the Overlander, and, of course, Twilight and the Hunger Games.

But one thing Rowling's success didn't inspire was the idea I mentioned at the top. That one came from the Japanese writer Koushun Takami, who used it for a novel written in 1996 and published in 1999 under the name Battle Royale.

The book and the movie that followed a year later were huge international hits. Despite the somewhat disturbing subject matter, both generally received positive reviews. Here's the Guardian in 2001: "Some will find the explicit violence of this movie repulsive - or plain boring. But this is a film put together with remarkable confidence and flair. Its steely candour, and weird, passionate urgency make it compelling." And Stephen King, writing in Entertainment Weekly (February 1, 2007), gave the book an enthusiastic endorsement (while noting it had some elements in common with his novel The Long Walk).

A little bit more than a year and a half later, Scholastic published the Hunger Games.

Given the number of blogs by fans of science fiction and Japanese popular culture, it's not surprising that the resemblance was discussed at some length.

From Wikipedia:
The 2008 American young adult novel The Hunger Games by Suzanne Collins has been accused of being strikingly similar to Battle Royale in terms of the basic plot premise and the world within the book. While Collins maintains that she "had never heard of that book until her book was turned in", Susan Dominus of The New York Times reports that "the parallels are striking enough that Collins's work has been savaged on the blogosphere as a baldfaced ripoff," but argued that "there are enough possible sources for the plot line that the two authors might well have hit on the same basic setup independently."
That "might well have" is an awfully weak defense (particularly given the puff piece tone of the NYT article) and it points to one of the central problems in the plagiarism debate: while it's easy to prove the relatively trivial crime of lifting wording, it's next to impossible to prove more substantial thefts. We can look at the timeline. We can look at Collins' previous career as a writer of fairly derivative kids' shows (no Spongebob or Pete & Pete) and the author of the Underworld books (a series that bears a marked resemblance to Harry Potter). Nothing here gives us any reason to believe that she didn't steal the idea but also nothing that could be called evidence that she did.

This is not meant as an attack on Collins, who is, as far as I can tell, an excellent writer and who is doing a wonderful job getting kids to read. I'm in favor of what she's doing and I couldn't care less how she does it.

My point is that the theft of wording -- a problem that is both trivial and rare, but easy to prove -- is treated as a major offense, while stealing more substantial elements -- a problem that is both serious and common, but hard to prove -- is largely ignored.

If we truly want to embrace the inclusive definition of plagiarism, we quickly find ourselves in the uncomfortable position of pointing out the extensive lapses of friends and colleagues rather than the failings of a few convenient pariahs.

If we're going to be anywhere near consistent and proportional, we're going to have to ask ourselves whose names really belong on a research paper. I can think of at least one case where the credit was given to someone who happened to be the spouse of the main researcher's thesis advisor (the valid reasons for being listed as an author do not include marrying well). If you didn't substantially contribute to the research behind or the writing of a paper and you put your name to it, you're a plagiarist.

And we need to ask ourselves how much journalism consists of simply paraphrasing and regurgitating other people's ideas, arguments, and interpretations. When you hear someone talking about a "meme," what they're really describing is stories being borrowed and recycled on a massive scale.

Discouraging plagiarism in the broad sense is a worthy goal, but focusing exclusively on those few people who lift some phrases from other published work is simply a distraction.