Friday, January 19, 2024

"Hi Ho! Hi Ho! It's off to court we go!" -- More Disney litigation blogging


Though the Air Pirates case was more consequential (and arguably more egregious), it was Disney's legal response to the 1989 Academy Awards that did the most to solidify the company's industry reputation for overly aggressive litigation. As Variety put it, “Because the Academy did something kinda Dopey, Disney sure is Grumpy.”

The story behind the suit is the stuff of Hollywood legends.

Carr's reputation for hosting lavish, expensive parties and staging spectacular production numbers led AMPAS to hire him to produce the 61st Academy Awards on March 29, 1989. Carr promised to shift the show from its perceived dry and dull state to something different, inspired by Beach Blanket Babylon (created by Steve Silver), the musical revue show featuring Snow White during the Golden Age of Hollywood.[7] Three-time Academy Award winner Marvin Hamlisch was brought in as conductor. Promised as "the antithesis of tacky," the ceremony had no host; instead it rotated actors and actresses, generally presented in pairs as part of Carr's theme of "couples, companions, costars, and compadres," the most notable pair being Bob Hope and Lucille Ball (in her final public appearance before her death just a few weeks later).

The criticism of the ceremony stemmed mostly from the musical numbers that attempted to bring Old and New Hollywood together. The show began with a booming voice announcing that the "star of all time" would soon arrive, who turned out to be Snow White, played by Eileen Bowman (Lorna Luft declined, much to her subsequent relief), and who proceeded to try to shake hands with stars in the audience (much to the embarrassment of nominated actress Michelle Pfeiffer). Merv Griffin then performed his 1950 hit “I’ve Got a Lovely Bunch of Coconuts!”, complete with a re-creation of the Cocoanut Grove nightclub featuring established stars such as Vincent Price, Alice Faye, and Roy Rogers. Bowman returned to the stage afterwards to sing a duet of "Proud Mary" with Rob Lowe, which lasted twelve minutes.[8][9][4] A second production number later in the show, "I Wanna Be An Oscar Winner," featured "The Stars of Tomorrow," actors in their teens and mid-20s such as Christian Slater (sword fighting Tyrone Power Jr.), Chad Lowe (shouting "I am a Thespian"), and Patrick Dempsey (tap dancing along the stairwell).[10]

Steve Silver, asked about the opening number while looking at the reviews, said, "Janet Maslin says it is the worst production number in the history of the Oscars. I guess you can't top that. The publicity for Beach Blanket Babylon ought to be wonderful." An open letter released on April 7, signed by 17 Hollywood figures (such as former Academy president Gregory Peck) [More on that below -- MP], called the ceremony an embarrassment to the Academy and the industry. Two days later, the Los Angeles Times dedicated its entire letters section (ten letters in all) to hate mail about the ceremony under the heading "For Some, the Oscar Show Was One Big Carr Crash." The Walt Disney Company sued the Academy shortly after the ceremony for copyright infringement (no one at the Academy had asked Disney for permission to use Snow White), which forced the Academy to issue a formal apology.[11] Lowe defended his appearance by describing himself as a "good soldier" doing it for the Academy, while Bowman later said the show looked like a "gay bar mitzvah," speaking only years after the fact because she had signed a gag order barring her from discussing the show for thirteen years.[12]

Carr's reputation in Hollywood never fully recovered from the blowback, although the evening was not without lasting benefits. Ratings were marginally better than the previous ceremony's, with over 42 million viewers in 26 million American homes.[13] The hostless format would be repeated for the 2019 ceremony. Carr's decision to change the announcement from "And the winner is..." to "And the Oscar goes to..."[14] has been used at every Academy ceremony since. Carr also had retailer Fred Hayman recruit designers to dress the stars for their arrival on the red carpet, which became its own segment of focus in later years. Comedian Bruce Vilanch, hired as a writer for the show, would go on to work on the ceremony for the next two decades, eventually being promoted to head writer. Carr never produced another film or television show.

And check out who signed that letter (from a Variety thirtieth-anniversary article):

Filmmaker Joseph L. Mankiewicz griped to Variety that the show was “amateur night.” He was one of 17 individuals who wrote a letter to the Academy of Motion Picture Arts & Sciences saying, “The 61st Academy Awards show was an embarrassment to both the Academy and the entire motion picture industry,” and urging AMPAS president Richard Kahn and the board to ensure it didn’t happen again. Among other signatories: Julie Andrews, Stanley Donen, William Friedkin, Sidney Lumet, Paul Newman, Gregory Peck, Billy Wilder and Fred Zinnemann.

It's worth noting that the knives were out for Carr even before the show started. He was a flamboyantly gay man in a still-closeted era. Worse yet, he had quickly risen to the top of the Hollywood social scene, and this is a town that loves a good, tragic fall (they've made A Star Is Born four times). Disney was just piling on.

 

 

If you're up to it, you can see most of the actual number here. Check out the reactions of Gregory Hines and Robert Downey Jr. at the end (if you make it that far).






Thursday, January 18, 2024

Monumental greed, delusional narcissism, or just the drugs talking?

My money's on a combination of the three (but mainly the drugs).


From CNN

Elon Musk says he wants a significantly larger stake in Tesla than the one that already made him the richest person on the planet.

In a series of posts on X Monday night, Musk said that he would not want to grow Tesla to become a leader in artificial intelligence and robotics without a compensation plan that would give him ownership of around 25% of the company’s stock. That would be about double the roughly 13% stake he currently owns.

“I am uncomfortable growing Tesla to be a leader in AI & robotics without having ~25% voting control. Enough to be influential, but not so much that I can’t be overturned,” Musk wrote in a post on X. “Unless that is the case, I would prefer to build products outside of Tesla.”

Notably, Musk held a stake of more than 20% in Tesla before he sold a large number of shares to buy X, the social media company he purchased over a year ago for $44 billion.

Here's a handy explainer from the always reliable Common Sense Skeptic.





Wednesday, January 17, 2024

I know we've been pounding this hard for a long time, but the story of the 2024 GOP is first and foremost a story about feral disinformation

From last week:

At the risk of stating the obvious, having the majority of your party believe that you actually won the last election and should be in office now gives you a tremendous advantage in the primary. This is one of the antibodies that Josh Marshall has mentioned that makes it all but impossible for a challenger to take the nomination away from Donald Trump. (This is not to say that Trump can't be taken out, just that it would almost certainly require external forces like health issues or big legal developments).

We also pointed out that belief in these conspiracies was especially problematic for the establishment candidate, Nikki Haley.

If the post had come out a few days later, we might have included this.
 

 

 [More on HAARP's special place in the febrile mind tomorrow.]

With that and this in mind, check out this data point from Iowa.


I'm not going to go down the rabbit hole of asking to what degree belief drives support rather than the other way around (I'm sure you can find both), but the correlation is undeniable, as is the difference in the share of 2000 Mules believers between Haley and DeSantis voters. In normal times, having fewer conspiracy nuts among your supporters would be a good thing, but in this year's Republican primary, it's likely to be fatal.

Tuesday, January 16, 2024

[Almost] Ten years at the blog -- more thoughts on Richard Florida's creative class thesis

About ten years ago, we started noticing that a great deal of social science research had an analysis-by-outlier problem. Ten years later, the problem is still around and, in the case of San Francisco, has gotten worse. A city of less than a million that is neither the population center nor the employment center for the Bay Area, let alone the state, has somehow become the primary focus of the national debate over cities, and the results have not been pretty.

Monday, February 24, 2014

The Outlier by the Bay

[Homonym alert -- I dictated this to my smart phone then edited it late in the evening.]

There's an energetic debate going on over at Andrew Gelman's site regarding Richard Florida's theories of the creative class. I can understand the urge to rise to Florida's defense. After all, there's great appeal to the idea that the kind of smart, innovative people who tend to drive economic growth are attracted to diverse, tolerant, livable cities with vibrant cultures. To some extent, I believe it myself, but I also find myself having the same problems with Florida that I have with the rest of the urban utopianists: first, that they have a tendency to take interesting but somewhat limited findings and draw impossibly sweeping conclusions and TED-ready narratives; and second, that these narratives often mesh badly with the facts on the ground. I've already discussed the first (in probably overly harsh but still heartfelt language). Here are some thoughts on the second.

Part of my problem with a lot of urban research is that there just aren't enough major cities out there to make a really good sample, particularly when the data are this confounded and each area has so many unusual if not unique aspects. For some cities, with New York and San Francisco very close to the top of the list, these unique aspects make it difficult to generalize findings and policy suggestions.

When I look at Richard Florida's research, at least in the form that made it to the Washington Monthly article, the role of San Francisco strikes me as especially problematic.

What is by many standards the most Bohemian and gay-friendly area in America is also arguably the country's center of technological innovation. Even if there were no relationship in the rest of the country, that single point would create a statistically significant correlation. That would not be so troubling if we had a clear causal relationship or a common origin. Unfortunately, the main driver of the tech boom, if you had to limit yourself to just one factor, would have to be Stanford University, while the culture of San Francisco does not appear to have been particularly influenced by that school, especially when compared to Berkeley. In other words, had Leland Stanford chosen to establish his university in Bakersfield, we might still have had Haight-Ashbury, but we almost certainly would not have had Silicon Valley.
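To make the outlier point concrete, here's a minimal simulation sketch (purely made-up numbers, assuming NumPy and SciPy are handy): fifty "cities" where a bohemian index and a tech-output index are unrelated noise, plus one San Francisco-like point that is extreme on both.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Fifty hypothetical metro areas: bohemian index and tech output are independent noise
bohemian = rng.normal(0, 1, 50)
tech = rng.normal(0, 1, 50)
print(pearsonr(bohemian, tech))        # correlation near zero, unimpressive p-value

# Add a single San Francisco-like point that is extreme on both dimensions
bohemian_sf = np.append(bohemian, 5.0)
tech_sf = np.append(tech, 5.0)
print(pearsonr(bohemian_sf, tech_sf))  # now a sizable, nominally "significant" correlation
```

One data point, most of the correlation.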

What's more, when we start looking at this narrative on a city by city basis, we often fail to see what we would expect. For example, if you were growing up in a relatively repressive area of the Southeast and you were looking for a Bohemian, gay-friendly metropolitan area with a vibrant arts scene, the first name on your list would probably be New Orleans followed by, roughly in this order, Atlanta, Savannah, and Memphis. Neither Cary, North Carolina nor Huntsville, Alabama would have made your top 10.

Rather bizarrely, Florida discusses both the Research Triangle and New Orleans in his WM article, apparently without seeing the disconnect with his theories:
Stuck in old paradigms of economic development, cities like Buffalo, New Orleans, and Louisville struggled in the 1980s and 1990s to become the next "Silicon Somewhere" by building generic high-tech office parks or subsidizing professional sports teams. Yet they lost members of the creative class, and their economic dynamism, to places like Austin, Boston, Washington, D.C. and Seattle---places more tolerant, diverse, and open to creativity.
There are lots of reasons for leaving New Orleans for Austin, but tolerance, diversity and openness to creativity aren't among them.

Even stranger are Florida's comments about the Research Triangle:
Kotkin finds that the lack of lifestyle amenities is causing significant problems in attracting top creative people to places like the North Carolina Research Triangle. He quotes a major real estate developer as saying, "Ask anyone where downtown is and nobody can tell you. There's not much of a sense of place here. . . .The people I am selling space to are screaming about cultural issues." The Research Triangle lacks the hip urban lifestyle found in places like San Francisco, Seattle, New York, and Chicago, laments a University of North Carolina researcher: "In Raleigh-Durham, we can always visit the hog farms."
Remember, Florida said "Places that succeed in attracting and retaining creative class people prosper; those that fail don't," so is this spot withering away? Not so much:
Anchored by leading technology firms, government and world-class universities and medical centers, the area's economy has performed exceptionally well. Significant increases in employment, earnings, personal income and retail sales are projected over the next 15 years.

The region's growing high-technology community includes such companies as IBM, SAS Institute, Cisco Systems, NetApp, Red Hat, EMC Corporation and Credit Suisse First Boston. In addition to high-tech, the region is consistently ranked in the top three in the U.S. with concentration in life science companies. Some of these companies include GlaxoSmithKline, Biogen Idec, BASF, Merck & Co., Novo Nordisk, Novozymes, and Wyeth. Research Triangle Park and North Carolina State University's Centennial Campus in Raleigh support innovation through R&D and technology transfer among the region's companies and research universities (including Duke University and The University of North Carolina at Chapel Hill).
This is not to say that there is no truth to Florida's narrative or validity to many if not most of his insights. It does appear, however, that the effects he describes are far smaller than he suggests and that the absolute claims he is fond of making are often riddled with exceptions.

 

Monday, January 15, 2024

Welcome to the Iowa caucus, the Golden Globes of American politics

[Before we get started, check out this historical overview from the LA Times' Mark Z. Barabak.]


There was a point decades ago when the Iowa caucus and the New Hampshire primary actually served a useful purpose as search committees, allowing candidates with little name recognition and limited budgets to make their case to a small, politically engaged state. This is what happened with Jimmy Carter. To a degree, it also happened with Barack Obama, though he was hardly as obscure as Carter. I don't remember if the following was something I said or something said to me, but after his speech at the Democratic National Convention in 2004, one of us told the other that we had just seen the man who would be the first black president of the United States.

That one scenario, where someone most people have never heard of manages to come in first or second, is the only time the results from Iowa are important or newsworthy or even interesting. That is the entire case for Iowa, and to a degree New Hampshire, going first, and it has largely evaporated in the age of mega-budgets and the long campaign.

Take that away and Iowa becomes the Golden Globes of American politics: everyone knows it's absolutely meaningless, but it does come first, and all the journalists covering the real story are starved for something to talk about.

Why should we ignore Iowa?  Putting aside the size of the state, the caucus system is hopelessly confusing and, except for its ability to occasionally elevate the truly unknown, is horribly and incurably flawed.  Add to that the highly idiosyncratic nature of Iowa politics and, from time to time, the weather, and you have an event that tells us virtually nothing about the state of the primary.


Friday, January 12, 2024

Plagiarism and nuance

This is Joseph.

There has been a lot of discussion about plagiarism lately. But identifying it in academic writing is very much an art. Some sections, like the methods, need to present information clearly, and there are not that many ways to state the same information.

Compare this paper with this paper in terms of the first paragraph of the methods section: 



These papers were published one day apart (Nov 27/28 2022). There is not a single author in common and both reflect very different styles of developing the paragraph. But automated detection software will find a surprising correspondence because both papers will need to accurately describe the sampling frame for the study and many aspects of this are fixed (how many people, race/ethnicity, ages, field centers, exclusion criteria). Same thing with technical definitions and mathematical equations -- math does not do well with being creatively altered. 
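To give a rough sense of what the detection software is keying on (this is my own toy illustration, not any particular vendor's algorithm), here's a sketch of the shared-phrase measure such tools rely on. Two methods paragraphs describing the same cohort will share many word 5-grams even when neither copied the other; the sample sentences below are hypothetical.

```python
def ngrams(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """Lower-cased word n-grams, ignoring simple punctuation differences."""
    words = [w.strip(".,;:()") for w in text.lower().split()]
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str, n: int = 5) -> float:
    """Share of paragraph a's n-grams that also appear in paragraph b."""
    grams_a, grams_b = ngrams(a, n), ngrams(b, n)
    return len(grams_a & grams_b) / max(len(grams_a), 1)

# Hypothetical methods-section boilerplate from two unrelated papers
para1 = ("Participants were enrolled from four field centers between 2000 and 2002 "
         "and were excluded if they had prevalent cardiovascular disease at baseline.")
para2 = ("Participants were enrolled from four field centers between 2000 and 2002; "
         "those with prevalent cardiovascular disease at baseline were excluded.")

print(f"{overlap(para1, para2):.0%} of 5-grams shared")  # high overlap, zero copying
```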

I don't want to defend plagiarism, but there is a lot of nuance to academic writing. Deciding whether passages that are similar actually rise to the level of a problem takes real effort and judgment. Here is a case where a scientist looks at what the media found and has . . . questions.

It is not definitive, but it explains why careful investigation is the best path forward. 

Anyway, food for thought.

Thursday, January 11, 2024

Five years ago at the blog -- we were skeptical of the streaming business model

We've been hammering this one for about a decade, back when it was another one of our unpopular lose-friends-and-alienate-people positions. Now, in 2024, we know that the industry as a whole has been losing billions of dollars a year, that even the one (moderately) profitable company is cutting back and largely abandoning its old business model, and that pretty much everybody has admitted that spending a hundred-plus million a pop on shows that mostly drew cable-access audience numbers might not have been the best idea.

 

Friday, January 11, 2019

Yes, this is what a content bubble looks like

This post by LGM's Loomis is a perfect example of an unintentionally interesting piece, one that makes a tremendously important point in passing then never returns to it.

One of my New Year’s resolutions for 2018 was to watch more television. I’ve slowly increased my TV viewing over the last couple of years in response to the great stuff out there. Now it’s kind of weird, as the consensus is that there’s a lot of just OK shows and not much that’s really that compelling compared to the prestige dramas of earlier this decade. But I’m so far behind that this really barely matters to me.

There's a lot to unpack here.

One. While most of us are making New Year's resolutions to watch less television, Loomis is one of the very few who actually resolved to watch more, particularly those buzz-friendly critical darlings that the major streaming services are spending so heavily on. He is the ideal consumer with respect to this business model, and yet even he acknowledges that he will never be able to catch up with all the shows on his to-see list.

The dirty little not-so-secret secret of most of these must-see shows is that very few people actually watch them. For all their awards and feature stories, they remain more talked about than viewed. We could have an interesting discussion about their role in brand building and other indirect effects, but even with those taken into account, you have to have serious concerns about a business strategy that spends billions of dollars producing shows with such tiny audiences.

Two. Yes, N=1, but the perception that quality is slipping has tremendously disturbing implications for the business model. For a number of years, the formula for generating awards, buzz, and perceived quality was fairly simple. Obviously, making good shows did help, but the key to getting noticed was hiring big-name talent, spending stunning amounts on PR and marketing, and sticking as close as possible to a handful of genres that lent themselves to extensive coverage and favorable reviews ("it's a dark, edgy crime drama with a quirky sense of humor," "it's a dark, mind-bending science-fiction drama with a quirky sense of humor"). Now, though, there is reason to believe that through a combination of saturation and the half-life of novelty, the formula is losing its effectiveness. That means even more obscene amounts of money will have to be spent to create the same impact.

Three. Finally, as we have said many times before, content accumulates. The 500 or so series currently in production are not just competing against each other, but against everything that has come before. If someone like Loomis, who is almost genetically engineered to seek out new, trendy shows, is opting instead for something that has been off the air for years like The Sopranos, investors should definitely be taking note.




Wednesday, January 10, 2024

At the risk of stating the obvious, having the majority of your party believe that you actually won the last election and should be in office now gives you a tremendous advantage in the primary.

Consider this recent YouGov poll on conspiracy theories.

 


The first obvious thing that jumps out is that Republicans have a much more serious problem with conspiracy theories and feral disinformation than do Democrats. Not exactly surprising, but it's always nice to find data that backs up your intuition. 
 
Now take a look at these three questions...
 

 
There are lots of interesting things to look at here, but the big relevant takeaways for me are that Republicans tend to be far more paranoid about the establishment and that a majority of them believe the election was stolen.

The implications of this last point are huge. For one thing, these people believe that Donald Trump was the legitimate winner. From that it follows that January 6th could not have been an insurrection; instead, it was actually an attempt to prevent an insurrection. Once you believe this, all sorts of major events of the past couple of years are completely inverted. Suddenly, the January 6th rioters become political prisoners. The Jack Smith and Georgia investigations become nothing more than politicized attempts to keep the rightful winner out of office. The near universal condemnation of the stolen election lie in the non-partisan press becomes damning evidence of a vast leftist conspiracy, and the fact that most Americans don't believe the lie proves that the conspiracy has been frighteningly effective.

At the risk of stating the obvious, having the majority of your party believe that you actually won the last election and should be in office now gives you a tremendous advantage in the primary. This is one of the antibodies that Josh Marshall has mentioned that makes it all but impossible for a challenger to take the nomination away from Donald Trump. (This is not to say that Trump can't be taken out, just that it would almost certainly require external forces like health issues or big legal developments.)

The establishment question points out another almost insurmountable obstacle for those trying to knock the former president out before the general. Trump is seen, in many ways accurately, as the ultimate outsider candidate. This presents challengers with something of a catch-22. In order to have any chance of unseating Trump, they have to have the full support of the Republican establishment, particularly in its now weakened state, but if they are seen as having the support of the establishment, they don't have any chance of unseating Trump. This is particularly a problem for the mainstream media's flavor of the month.

 Nikki Haley’s Rocket Ride to Second Place

December 2, 2023

In April, Sarah Longwell of The Bulwark wrote what is still one of the most insightful reports about the Republican electorate. Longwell, strategic director of Republican Voters Against Trump, has sat through hundreds of focus groups to understand the mental state of the party. Her primary conclusion is that most GOP voters see the Trump era not as an interregnum but as a kind of revolutionary event she calls “Year Zero.”

“The Republican party has been irretrievably altered,” she wrote, “and, as one GOP voter put it succinctly, ‘We’re never going back.’” Such voters have bought into Trump’s argument that the party leaders who preceded him were weak losers. (This argument conveniently absolves Trump of blame for his own losses — he was sabotaged by the Establishment, you see.) “If you forged your political identity pre-Trump, then you belong to a GOP establishment now loathed by a majority of Republican primary voters,” she concluded. “Even if you agree with Trump. Even if you worked for Trump. Even if you were on Trump’s ticket as his vice president.”

Longwell laid out a roster of Republican politicians whom the voters could never accept for this reason. The first name on her list was Nikki Haley.


Even if Haley were running a good campaign (and in case you haven't been paying attention, she's not), the very things that make her so appealing to the establishment also make her toxic to a large portion of the GOP base.

Tuesday, January 9, 2024

100% Pure Extra Virgin Schadenfreude

This story involves two topics I normally don't consider worth the candle: plagiarism (which requires an enormous amount of spadework to discuss properly, inevitably more effort than the case merits) and Harvard. In this instance, however, we have someone so odious revealed to be a huge hypocrite in the most embarrassing manner possible. To paraphrase Oscar Wilde, you would have to have a heart of stone not to laugh at the travails of Bill Ackman.

Katherine Long and Jack Newsham, writing for Business Insider:

The billionaire hedge fund manager and major Harvard donor Bill Ackman seized on revelations that Harvard's president, Claudine Gay, had plagiarized some passages in her academic work to underscore his calls for her removal following what he perceived as her mishandling of large protests against Israel's bombardment of Gaza on Harvard's campus.

 ...

Her husband, Ackman, has taken a hardline stance on plagiarism. On Wednesday, responding to news that Gay is set to remain a part of Harvard's faculty after she resigned as president, he wrote on X that Gay should be fired completely due to "serious plagiarism issues."

"Students are forced to withdraw for much less," Ackman continued. "Rewarding her with a highly paid faculty position sets a very bad precedent for academic integrity at Harvard."

 And you'll never guess what happens next.

In [Ackman's wife, Neri] Oxman's dissertation, completed at MIT, she plagiarized a 1998 paper by two Israeli scholars, Steve Weiner and H. Daniel Wagner, a 2006 article published in the journal Nature by the New York University historian Peder Anker, and a 1995 paper published in the proceedings of the Royal Society of London. She also lifted from a book published in 1998 by the German physicist Claus Mattheck and, in a more classical mode of plagiarism, copied one paragraph from Mattheck without any quotation or attribution.

In addition to getting funnier every damn time you read it, this incident provides a dramatic example of one of the main problems with the outcry over minor plagiarism: selective enforcement.

Whenever the consequences for these offenses go beyond immediate public shaming -- when people pay real penalties for things that happened years ago -- the application will invariably be inconsistent and unfair, often deliberately so. Minor incidents of plagiarism are so common, and so much falls in a gray area, that most people have (intentionally or not) done something a bad-faith actor can go after them for.

 

 

 

While similar or worse cases are largely ignored.

 

Did some of the instances that so enraged Ackman go beyond the trivial and the sloppy? I couldn't tell you, but I can say that informed and objective opinions differ and that some of those who claim to have the least doubt have given us the most reason to doubt their impartiality.

 

One of the best parts of this story has been Ackman's reactions to the BI article. 




 

And I'll leave you with this closing thought.

Monday, January 8, 2024

Twelve years ago at the blog -- After stealing these characters, Marvel trademarked and copyrighted them, which was just adding injury to insult

It isn't all that relevant to the subject of the post, but I'm still surprised I didn't work this in. 

Captain Billy's Whiz Bang is one of the signs of sin and decadence given in "Ya Got Trouble" from Meredith Willson's The Music Man (anachronistically, since the play is set in 1912 and the magazine didn't debut until seven years later).

Mothers of River City!
Heed that warning before it's too late!
Watch for the tell-tale sign of corruption!
The minute your son leaves the house,
Does he rebuckle his knickerbockers below the knee?
Is there a nicotine stain on his index finger?
A dime novel hidden in the corn crib?
Is he starting to memorize jokes from Capt. Billy's Whiz Bang?
Are certain words creeping into his conversation?
Words like, like 'swell'?
And 'so's your old man'?
Well, if so my friends,

Ya got trouble,
Right here in River city!
With a capital "T"
And that rhymes with "P"
And that stands for Pool.

 



 

And while we're on the subject of Marvel and IP.

 

Sunday, January 8, 2012

Intellectual property and business life-cycles



A while back, we had a post arguing that long extensions for copyrights don't seem to produce increased value in properties created after the extension, but what about the costs of an extension? And who pays them?

New/small media companies tend to make extensive use of the public domain (often entailing a rather liberal reading of the 'public' part). The public domain allows a company with limited resources to quickly and cheaply come up with a marketable line of products which can sustain the company until it can generate a sufficient number of original, established properties.

Many major media companies have gotten their start mining the public domain, none more humbly than Fawcett. At its height, the company had magazines that peaked at a combined newsstand circulation of ten million a month, comics that outsold Superman, and the legendary Gold Medal line of paperbacks. All of this started with a cheaply printed joke magazine called Captain Billy's Whiz Bang.


Of course, Wilford Fawcett couldn't have reimbursed the unknown authors of those jokes even if he had wanted to. Disney, on the other hand, built its first success on a title that was arguably still under copyright.
Mickey had been Disney's biggest hit but he wasn't their first. The studio had established itself with a series of comedies in the early Twenties about a live-action little girl named Alice who found herself in an animated wonderland. In case anyone missed the connection, the debut was actually called "Alice's Wonderland." The Alice Comedies were the series that allowed Disney to leave Kansas and set up his Hollywood studio.

For context, Lewis Carroll published the Alice books, Wonderland and Through the Looking Glass, in 1865 and 1871 and died in 1898. Even under the law that preceded the Mouse Protection Act, Alice would have been the property of Carroll's estate and "Alice's Wonderland" was a far more clear-cut example of infringement than were many of the cases Disney has pursued over the years.

In other words, if present laws and attitudes about intellectual property had been around in the Twenties, the company that lobbied hardest for them might never have existed.
Another company that went from near bankruptcy to media powerhouse was a third-tier comics publisher that had finally settled on the name Marvel. The company's turnaround is the stuff of a great case study (though MBA candidates should be warned, Stan Lee's memoirs can be slightly less credible than his comics). Not surprisingly, one element of that turnaround was a loose reading of copyright laws.

Comic book writer and historian Don Markstein has some examples:
Comic book publisher Martin Goodman was no respecter of the property rights of his defunct colleagues. In 1964, he appropriated the name of a superhero published in the '40s by Lev Gleason, and brought out his own version of Daredevil. A couple of years later, he introduced an outright copy of a '50s western character published by Magazine Enterprises, Ghost Rider. It wasn't until late 1967, possibly prompted by a smaller publisher's attempt to do the same, that he finally got around to stealing the name of one of the most prominent comics heroes of all time, Captain Marvel. And this delay was odd, because the name of Goodman's company was (and remains) Marvel Comics.
(That would, by the way, be Fawcett's Captain Marvel so what goes around...)

(Fans of fantasy art should find the covers of the old Ghost Rider familiar)




This is how media companies start. A small music label fills out a CD with a few folk songs. An independent movie company comes up with a low-budget Poe project. An unaffiliated television station runs a late-night horror show with public domain films like Little Shop of Horrors and Night of the Living Dead. Then, with the payroll met and some money in the bank, these companies start getting more ambitious.

Expansion of the public domain is creative destruction at its most productive. Not only does it clear the way for new work; it actually provides the building blocks.

 

Friday, January 5, 2024

Politics and Jobs

This is Joseph.

So I think that this line of thinking is misguided:


Why? Well, the first problem is that purity tests, in general, have a tendency to be abused. The question of whether one's political beliefs make one capable of doing a job seems like exactly the sort of test that goes wrong, even in this exact context.

Now, if there is an actual act showing a physician behaving inappropriately with respect to patient care, that is different. It can also be helpful to include a broad range of marginalized voices in the discussion of priorities -- this is why we have a universal voting franchise. But "Zionism" is a very nebulous term in this context, and it is going to map onto a specific heritage rather more than chance alone would predict. I will also point out that many people see the political beliefs of their opponents as odious. You may say that real lives are at stake in the consequences of Zionism, but the same is true of abortion rights.

A pluralistic society needs to have room for disagreement on politics, especially international politics. I disagree with Russia's decisions in Ukraine, but I would not think it a good idea to interrogate the beliefs of Russian-Americans, even if some of them have perspectives on this conflict more positive than mine. People do not automatically become liable for the actions of groups they merely happen to be distantly related to, especially if they are not members of those polities.

I think that this crosses an important line in trying to keep a diverse and dynamic society together. 

Thursday, January 4, 2024

Meet Dan O'Neill, the one person who still isn't allowed to draw Mickey Mouse

Another post for copyright month.

Gene Maddaus, writing for Variety:

Dan O’Neill was 53 years ahead of his time.

In 1971, he launched a countercultural attack on Mickey Mouse. In his underground comic book, “Air Pirates Funnies,” the lovable mouse was seen smuggling drugs and performing oral sex on Minnie.

As O’Neill had hoped, Disney sued him for copyright infringement. He believed it was a legal parody. But after eight years in court, he was saddled with a judgment he could not pay. To stay out of prison, he agreed never to draw Mickey Mouse again.

“It’s still a crime for me,” said O’Neill, 81, in a phone interview from his home in Nevada City, Calif. “If I draw a picture of Mickey Mouse, I owe Walt Disney a $190,000 fine, $10,000 more for legal fees, and a year in prison.”

 


 

It's possible this agreement is no longer in effect -- I doubt very much that even Disney would try to enforce it at this point -- but the story is fascinating, and one we've talked about before.

Parody is a protected form of free speech as long as it isn't too good 

With IP law, you know in advance that the big boys are heavily favored to win. The suspense is in the legal twists and turns getting there.


From Terence Chua's thesis, "Messing with the Mouse."

In 1971, Disney sued a group of underground comic artists calling themselves the Air Pirates, who published two comics portraying Walt Disney characters in sex and drug-related situations. The resulting case lasted 8 years and ended in a settlement where both sides claimed victory. This thesis uses the case to examine the development of the law of copyright and parody as a defense and demonstrate that the court tends to rule against the parodist if the work is offensive or obscene, although these are irrelevant concerns. It also examines the case itself and the cultural and personal forces motivating the parody.

[Dan] O'Neill's affidavit was positively lyrical in justifying the artistic reasons behind Air Pirates Funnies, but it contained language that ultimately proved damaging to the Air Pirates' arguments. O'Neill stated that he drew cartoons to "relieve a basic human anxiety pattern, hysteria," by means of laughter. Mickey Mouse, he deposed, had started as a positive image, but as people grew older, it became a "non-positive adjective." To investigate why it had degenerated, O'Neill said he "chose to parody exactly the style of drawing and characters to evoke the response created by Disney (emphasis in original)."
...

[From here on, all emphasis added]

The Ninth Circuit delivered its 15-page decision on September 5, 1978, ruling three to zero against the Air Pirates on the charges of copyright infringement. Judge Walter J. Cummings, a sixty-year-old former Assistant United States Solicitor General and former partner in a Chicago-based law firm appointed to the bench by Johnson, penned the judgment. ... Cummings then considered fair use as a defense. He noted that the Pirates were not saying that the copying was not substantial enough to be infringing, merely that the infringement was defensible as an example of parody and thus fair use. Noting that Loew's case was the legal standard, the court found that Wollenberg's test of "substantial copying, combined with the fact that the portion copied constituted a substantial part of the defendant's work" that "automatically precluded the fair use doctrine" was unjustified. Such a reading would make any defense of fair use untenable, and would lead to a gap where a substantial amount was taken but not a substantial part of the defendant's work. Loew's was more properly read as "setting a threshold that eliminates from the fair use doctrine copying that is virtually verbatim," as in Jack Benny's burlesque of Gaslight. Loew's, in other words, was the upper limit to tell what was definitely not fair use. In the absence of "near-verbatim copying", the test would be Berlin's, as in whether the parodist had taken up more than was needed to "recall or conjure" the original.

The Ninth Circuit decided that the Pirates had done more than was needed. Ironically, the ubiquitous presence of Disney's characters in popular culture that made them such attractive targets was precisely why the Pirates had gone too far. Cummings wrote, "Given the widespread public recognition of the major characters involved here... very little would have been necessary to place Mickey Mouse and his image in the minds of the readers." He noted that Pirates did not parody how the characters looked, but their "personalities, their wholesomeness and their innocence." The Pirates would therefore have had a better argument if they had "paralleled... Disney characters and their actions in a manner that conjured up the particular elements of the innocence of the characters to be satirized... Here, the copying of the graphic image appears to have no other purpose than to track Disney's work... as closely as possible." Cummings dismissed the Pirates' arguments that they had to copy Disney exactly to make their point effectively. They were entitled to parody, but they were not entitled to the "best parody" they could make – that consideration had to be balanced with the rights of the copyright owner, and the Pirates had exceeded what was "necessary to place firmly in the reader's mind the parodied work and those specific attributes that are to be satirized." Because of this, Wollenberg's granting of summary judgment on copyright infringement was proper.

...

"Communiqu̩ #1" goes on to criticize the Ninth Circuit's decision in the Air Pirates case as being too vague. Misidentifying the Ninth Circuit as the "Supreme145 Court", O'Neill quotes the court as saying that the Pirates had taken too much of the original when effecting their parody. Although "'some' says the Court, is OK... no one, including the Court, is sure how much is 'some'..." O'Neill juxtaposed this with a drawing of Mickey's head on a realistic rat's body, its tail curled around a sign that says, "Is this some?" Minnie also points to her gloved hand Рwhich has five fingers instead of the usual cartoon four Рwith a caption, "Is this some?"
Something about the "best parody" section seems particularly off. It was not the juvenile and deliberately offensive attempts to shock that did the Air Pirates in but the loving homage. Though there was little that could be called copying -- only a few of the images call back directly to the source material --  O'Neill and friends beautifully captured the style and the sensibility of the original Gottfredson strips. 

One of the many ironies of this case is that had the artistic quality of the parody been worse, the defendants' legal case would have been stronger.

And that doesn't seem right. 

 

Wednesday, January 3, 2024

The semi-emancipation of proto-Mickey

So it finally happened. After countless successful battles involving armies of lobbyists, Mickey Mouse is in the public domain... sort of. He still has trademark protection, which isn't going away anytime in the foreseeable future, so you can't name your amusement park Mickey Mouse-land, nor can you draw him with white gloves and red pants (it is only the 1928 Steamboat Willie version whose copyright has expired). But if you are creating a movie, TV show, or comic book, you can introduce the familiar rodent as a character without being sued into oblivion by the Walt Disney Company.

This is a complicated story, so some background is helpful.

From Wikipedia:

The Copyright Term Extension Act (CTEA) of 1998 extended copyright terms in the United States by 20 years. Since the Copyright Act of 1976, copyright would last for the life of the author plus 50 years, or 75 years for a work of corporate authorship. The Act extended these terms to life of the author plus 70 years and for works of corporate authorship to 120 years after creation or 95 years after publication, whichever endpoint is earlier. Copyright protection for works published prior to January 1, 1978, was increased by 20 years to a total of 95 years from their publication date.
For twenty years, nothing entered the public domain.
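The arithmetic is easy to check. Here's a back-of-the-envelope sketch using only the terms quoted above (and ignoring the renewal-era wrinkles for pre-1978 works): a 1928 publication would have entered the public domain on January 1, 2004 under the old 75-year term; the 1998 extension pushed that to January 1, 2024.

```python
def public_domain_year(publication_year: int, term_years: int) -> int:
    """Year a pre-1978 published work enters the public domain:
    January 1 following publication_year + term_years."""
    return publication_year + term_years + 1

STEAMBOAT_WILLIE = 1928
print(public_domain_year(STEAMBOAT_WILLIE, 75))  # 2004 -- old 75-year term
print(public_domain_year(STEAMBOAT_WILLIE, 95))  # 2024 -- after the 1998 extension
```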

What made the Sonny Bono Act especially egregious was the fact that it came less than a quarter century after the 1976 act, which was itself a major copyright extension. For context, the previous statutory extension had been in 1909.

There was some justification for the 1976 law. Media had undergone huge innovations in those sixty-seven years, and the law very much needed to be updated with respect to movies, television, etc., but the case for the term extensions was far weaker, particularly with work-for-hire. What had changed was that this IP was now worth a tremendous amount of money. The middle fifty years or so of the 20th century had been stunningly fertile in terms of popular culture, creating tens, probably hundreds, of billions of dollars' worth of intellectual property, which was about to start sliding into the public domain unless action was taken.

In 1998, the impetus was obviously and almost entirely the desire of a handful of huge corporations to keep from handing back works that, for the most part, they had accumulated, almost always having paid the actual creators a fraction of the value of the original works. 

As for the wider economic impact, here's the invaluable Michael Hiltzik:

The fundamental error in this timeline is the notion that ever-longer protection is a good thing. It’s wrong on several counts. To some extent it’s based on the theory that creators (or their heirs) should be entitled to income from a work well into the distant future in order to incentivize creative artists to create.

But the truth is that the income stream from all but a tiny minority of published works largely evaporates after the first few years, and what does arrive decades in the future has a minuscule present value at the time of creation. The 20-year extension in the 1998 law, as 17 economists (including five Nobel laureates) wrote in a 2002 Supreme Court brief, provided “no significant incentive to create new works” and arguably less for existing works.

In fact, constraining entry into the public domain is a drag on creativity. 

Once a work enters the public domain, Jenkins says, “community theaters can screen the films. Youth orchestras can perform the music publicly, without paying licensing fees. Online repositories such as the Internet Archive, HathiTrust, Google Books, and the New York Public Library can make works fully available online. This helps enable access to cultural materials that might otherwise be lost to history. ... Anyone can rescue them from obscurity and make them available, where we can all discover, enjoy, and breathe new life into them.”

In some cases, extended copyright seems to work against the public interest. Consider the stringent control exercised by the estate of the Rev. Martin Luther King Jr. — mostly his children — over his speeches and writings such as the “I Have a Dream” speech he delivered in Washington, D.C., on Aug. 28, 1963.

On the day it was delivered, the speech was eligible for copyright protection through 2019. Congressional revisions extended the speech’s copyright until 2058, nearly a century after King delivered it to a massive crowd at the Lincoln Memorial and untold more viewers on television. Filmmaker Ava DuVernay had to put rewritten and paraphrased lines into the mouth of the actor portraying King in her film “Selma,” about the 1965 protests in support of the Voting Rights Act.

DuVernay’s options were limited because the King estate had sold the film rights to Steven Spielberg for a still-unproduced project. Even had she acquired the rights, she said, that might have required giving the family a voice in how King was portrayed, constraining her own artistic choices.
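The economists' present-value point is easy to make concrete. Here's a back-of-the-envelope sketch (the 7% discount rate is my own assumption, chosen purely for illustration): a dollar a year of royalties earned in years 76 through 95 after a work is created is worth only pennies at the time of creation, which is why the extra 20 years provide essentially no incentive.

```python
def present_value(t: int, rate: float = 0.07) -> float:
    """Value today of $1 received t years from now at the given discount rate."""
    return 1.0 / (1.0 + rate) ** t

# $1 per year over the 20 extension years (years 76 through 95 after creation)
extension_value = sum(present_value(t) for t in range(76, 96))
print(f"${extension_value:.2f}")  # roughly $0.07 today for $20 of nominal future royalties
```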

The next few years will be interesting. Things should be quiet for a while, but around 2034, assuming they don't try for another extension (and I doubt they'll push it that far), things will start to pop, particularly at Disney and Warner Bros., where such valuable characters as Donald Duck, Bugs, Daffy, Porky, Superman, Batman, Wonder Woman, Captain America, and Namor will be entering the public domain.

 We are about to enter the golden age of trademark enforcement.

 

Tuesday, January 2, 2024

The very idea of claiming, a year before an election, that a candidate has an X% chance of winning is gross statistical malpractice.

[This has been sitting in the queue for a while, but I think it still has another month or two before its sell-by date.]

A couple of issues make talking about predictive modeling difficult: 

Predictive range -- When we say someone accurately predicted an outcome, are we talking about an event that happened the next day or the next year? Most predictions are easier in the short range. Some are easier in the long (we'll all be dead) range. This has been particularly relevant with poll-based electoral predictions, where the track record for short-term models has been great and for long-term models has been disastrous. We have an extensive history of pundits bragging about successes in the first category while hoping you'll forget about their failures in the second.

So, Is Obama Toast? by Nate Silver


Then there's modelers' luck. The problem with checking any probabilistic claim is that being right (the predicted outcome happened) doesn't mean you were right (you used a sound approach to estimate reasonable odds). The person who told you not to try to fill an inside straight was right and the person who told you to go for it was wrong, even if you did end up getting the card you were looking for.
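For what it's worth, the inside-straight arithmetic is easy to check. A quick sketch, assuming one card to come and four outs among 47 unseen cards: the call is a long shot, and the fact that it occasionally pays off doesn't make the advice to fold wrong.

```python
import random

OUTS, UNSEEN = 4, 47            # four cards complete an inside straight; 47 unseen cards

print(OUTS / UNSEEN)            # ~0.085 -- the odds with one card to come

# Simulate a player who "goes for it" anyway
random.seed(1)
draws = 100_000
hits = sum(random.random() < OUTS / UNSEEN for _ in range(draws))
print(hits / draws)             # catches the card about 8.5% of the time; the advice was still right
```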

Back in 2011, Nate Silver said that, unless there was a major uptick in the economy, Obama had very little chance (think Russian roulette odds) of winning the election. Instead, the economy was basically flat and yet the incumbent not only won but won by a comfortable margin. It is safe to say the model was wrong, but was it bad or merely unlucky? Based on this article's long and admirably transparent explanation, I have to go with bad, and here are some of the reasons why.

The fundamental assumption of predictive modeling is that things still work like they used to. Correlations and causal relationships from the past still hold. Data are collected in roughly the same way and the statistics derived from them have the same definitions.

The first practical implication of the fundamental assumption is that you can't push the boundaries of your data back too far. If things were too different beyond a certain point, you can't reasonably assume that they will generalize to today.

How far back you can reasonably go depends on what kinds of questions you are trying to answer and what types of data you're relying on. In terms of re-elections, 1931 is certainly too far back for any kind of meaningful comparison. That would have been 80 years before Nate Silver did his analysis, which is a long time with respect to political or social comparisons. More importantly, the way public opinion was formed and measured was enormously different. Add to that the huge outlier that was the beginning of the Great Depression.

We are even further into outlier territory with the entire presidency of FDR, especially if we're talking about the concept of re-election. (Silver goes back to 1944 in his analysis.) Truman is also problematic for a number of reasons, not the least of which is that he was not technically re-elected. The same concerns apply to LBJ and Gerald Ford.

This leaves us with Eisenhower, Nixon, Carter, Reagan, HW Bush, Clinton, and George W Bush. 

N equals 7.

Even if we ignore the distinction between election and reelection (which is a pretty big jump) and look at all elections going back to 1952, which is about the maximum I would be comfortable with, we're still looking at 15 elections to take us to Obama versus Romney. 

N equals 15.

(If we were just looking at win/loss, one of those 15 data points is missing since we will never know who actually won the 2000 election.)

That would be a small sample under the best of circumstances, but in this case we also have messy data, major one-time events like the Cuban Missile Crisis, the Vietnam War, the Watts riots, and the Iranian hostage crisis, not to mention waaaaaaay more than 15 researcher degrees of freedom.
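A small simulation sketch (purely illustrative numbers) shows why 15 noisy data points plus lots of researcher choices is a recipe for overconfidence: screen enough unrelated "fundamentals" against 15 random outcomes and something will look significant most of the time.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
N_ELECTIONS, N_CANDIDATE_PREDICTORS, N_TRIALS = 15, 20, 1_000

false_finds = 0
for _ in range(N_TRIALS):
    outcome = rng.normal(size=N_ELECTIONS)                  # pure-noise "vote margins"
    predictors = rng.normal(size=(N_CANDIDATE_PREDICTORS, N_ELECTIONS))
    pvals = [pearsonr(x, outcome)[1] for x in predictors]   # try every candidate variable
    false_finds += min(pvals) < 0.05                        # keep whichever looks best

print(false_finds / N_TRIALS)  # typically well over half the trials "find" a significant predictor
```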

Case in point. Look at how Silver handles the 800 lb gorilla of the model.

A president’s approval rating at the beginning of his third year in office has historically had very little correlation to his eventual fate. In January 1983, Reagan had an approval rating of just 37 percent, but he won in a landslide. George H. W. Bush had a 79 percent approval rating in January 1991 and was soundly defeated. But voters start to think differently about a president over the course of his third year; they view him more on the basis of his performance and less on the hopes they had for him. These perceptions are sharpened by the beginning of the opposition party’s primary campaign, which, of course, accentuates the negatives.

A president’s approval rating toward the end of his third year, therefore, has been a decent (although imperfect [I love how Silver throws in these little qualifiers while getting further and further ahead of the data -- MP]) predictor of his chances of victory. Reagan saw his approval rating shoot up to 51 percent in November 1983 amid the V-shaped recovery from the recession of the previous year — the first sign that he was headed for a big win. Obama’s approval rating may have rebounded by a point or two from its lows after the debt-ceiling debacle — but not by much more than that. In late October, it ranged between 40 and 46 percent in different polls and averaged about 43 percent.

Look at the forks. Of the various factors we can put in the model, we pick approval rating, but the fit to our fourteen data points is still crappy, so we limit ourselves to an arbitrary interval. Silver tells a good story to justify setting the cut-off at the end of the third year, but that's all it is, a story, and even if it's true, we have no way of knowing if that particular cut-off will be appropriate going forward.

Silver also considered:

The good news is that voters have short memories. If there are hopeful signs during an election year, they may be willing to forget earlier problems. Reagan, Nixon, Eisenhower and Truman all won despite recessions earlier in their terms. Moreover, voters’ evaluations of the economy are relatively forward-looking. Even if the economy is below its full productive capacity — as it was in November 1984 when the unemployment rate was 7.2 percent, and as it certainly was in 1936, when it was still around 17 percent — voters may be willing to overlook this, provided it seems headed in the right direction.

Monday, January 1, 2024