Wednesday, January 31, 2024

Willy Wonka and the money-printing machine

[more January IP blogging]

I noticed this box of cereal the other day while shopping. My first thought was that they're still squeezing money out of a fifty-year-old film and the image of a star who has been dead since 2016.

 

 [Next to a box of cereal featuring sixty-year-old cartoon characters, but that's a topic for another post.]



The example was even more striking after I did a little research. This wasn't just a fifty-year-old film; it was a fifty-year-old flop.

 

The idea for adapting the book into a film came about when director Mel Stuart's ten-year-old daughter read the book and asked her father to make a film out of it, with "Uncle Dave" (producer David L. Wolper, who was not related to the Stuarts) producing it. Stuart showed the book to Wolper, who happened to be in the midst of talks with the Quaker Oats Company regarding a vehicle to introduce a new candy bar from its Chicago-based Breaker Confections subsidiary (since renamed the Willy Wonka Candy Company and sold to Nestlé). Wolper persuaded the company, which had no previous experience in the film industry, to buy the rights to the book and finance the picture for the purpose of promoting a new Quaker Oats "Wonka Bar".

 ...

Willy Wonka & the Chocolate Factory remained in obscurity in the years immediately following its original release. When the distribution rights lapsed in 1977, Paramount declined to renew, considering it not viable. The rights defaulted back to the Quaker Oats Company, which was no longer involved in the film business, and therefore sold them to Warner Bros. for $500,000. Wolper engineered the rights sale to Warner, where he became a corporate director after selling his production company to it the previous year.

By the 1980s, the film had experienced an increase in popularity due to repeated television broadcasts, and gained cult status with a new audience in home video sales. In 1996, there was a 25th anniversary theatrical re-release which grossed the film a further $21 million. In 2003, Entertainment Weekly ranked it 25th in the "Top 50 Cult Movies" of all time. The tunnel scene during the boat ride has been cited as one of the scariest in a film for children, for its surreal visuals, and was ranked No. 74 on Bravo's The 100 Scariest Movie Moments. The scene has also been interpreted as a psychedelic trip, though director Stuart denied that was his intention.

...

Willy Wonka & the Chocolate Factory was released by Paramount Pictures on June 30, 1971. The film was not a big success, eventually earning $4 million worldwide on a budget of $3 million, and was the 24th highest-grossing film of the year in North America.

A few important points here, both about the business of entertainment and about why most of the coverage of that business is so bad.

First off, every story in this industry is an IP story, both in who owns it and in the curve of its money-making potential. Warner picked up the rights to this film for what was, even at the time, a song. Since then it has grown far more profitable and, to point out the obvious, that's all gravy. You don't need to spend any money on production or acquiring rights and, in a sense, its advertising budget is negative, since you get paid for product placements that keep it in the public eye.

As we've discussed before, certain intellectual property has legs. It will continue to produce, often becoming even more popular, for decades after its creation. It is no coincidence that every few years armies of lobbyists from Disney and other big media companies descend upon Washington to get Congress to push back the expiration dates on copyrights.

Certain properties remain culturally relevant seemingly forever, and that relevance translates to multiple streams of revenue. They will be sold directly, remade or covered, streamed, licensed, and will serve as the basis for derivative works.

Universal's Frankenstein is over ninety years old and you will still see Jack Pierce's copyrighted makeup every Halloween. At the height of the pandemic, people watched over a billion hours of The Andy Griffith Show, a series that ran from 1960 to 1968. Artists are constantly covering the songs of Hank Williams, a songwriter who died seventy years ago. Frankenstein, The Andy Griffith Show, and the Hank Williams catalog were all enormously profitable at the time (Griffith was the number one show in the country the year it went off the air), but all ended up making far more money afterwards.

Looping back to the original subject, one place where one frequently finds IP with legs is the children's market. Kids are voracious consumers of media, have an extraordinary tolerance for repetition, and are frequently not all that discerning. Better yet, if your target audience is any age band under 10, every few years it will completely refresh itself, so you can just haul out the same old product. In one notorious example, Hanna-Barbera would crank out a single season of shows like The Jetsons, Jonny Quest, or Space Ghost and then run them in constant rotation on Saturday mornings for decades.

Television basically created the children's market, but it was home media that cranked it into high gear. I remember an interview with Robert Altman in which the director of films like MASH, McCabe & Mrs. Miller, Nashville, and The Player said the film that had made him the most money was Popeye.

Journalists covering the streaming industry, particularly East Coast-based journalists, have done an embarrassingly bad job with the IP aspect of the industry, which is probably the single worst thing they could screw up. Disney and Warners lost billions saturating the market with expensive shows that provided little incremental value -- how many additional subscribers do you think the $150 to $200 million investment in Moon Knight brought in? -- while the majority of viewers were there for their already incredibly rich catalogs. At the same time, these journalists did a piss-poor job reporting on which "Originals" Netflix actually owned the rights to and, perhaps more importantly, how few of the shows they did own had any kind of legs.

Tuesday, January 30, 2024

Twelve years ago at the blog -- The Devil's Candy with added Kael

We were talking a lot about education costs at the time, so the analogy of cost spirals was a bit more obvious then than it is now.

A few years after posting this, I came across a passage from a Pauline Kael essay that spelled out what went wrong with Bonfire a decade before the film was made.


Monday, January 30, 2012

The Devil's Candy -- movie bombs and college budgets

The recent discussion of higher education costs got me thinking about other spiraling budgets and about one of my favorite case studies on the subject, Julie Salamon's excellent The Devil's Candy, an account of the making of the movie adaptation of The Bonfire of the Vanities.

Salamon, already a well-established journalist, was given almost unprecedented access to the production. I say 'almost' because there is one other similar book, Picture, by Lillian Ross of the New Yorker, which describes John Huston's filming of Stephen Crane's The Red Badge of Courage. Huston's film has grown in critical stature over the years, but it was a notorious commercial flop, which should, perhaps, have been a warning to Brian De Palma and the other people behind Bonfire.

Of course, Hollywood is a world of its own, but there are some general lessons in The Devil's Candy. One is that enterprises have a right size, and if you try to scale past that size, things can go very wrong. As De Palma (who deserves serious points for forthrightness) put it:

"The initial concept of it was incorrect. If you're going to do The Bonfire of the Vanities, you would have to make it a lot darker and more cynical, but because it was such an expensive movie we tried to humanize the Sherman McCoy character – a very unlikeable character, much like the character in The Magnificent Ambersons. We could have done that if we'd been making a low-budget movie, but this was a studio movie with Tom Hanks in it. We made a couple of choices that in retrospect were wrong. I think John Lithgow would have been a better choice for Sherman McCoy, because he would have got the blue-blood arrogance of the character."

Another lesson is that, viewed individually, each of the disastrous decisions seemed completely reasonable. There's something almost Escher-like about the process: each decision seems to be a move up toward a better and more profitable film but the downward momentum simply accelerates, ending with a critically reviled movie that lost tens of millions of dollars. I suspect that survivors of similar fiascos in other fields would tell much the same story.

Finally, there's the way that the failure to control costs in one area limits the ability (or willingness) to control them in others. You might think that excessive spending on a cast would encourage producers to look for ways to spend less on something like catering, but the opposite often seems to happen when you have this kind of budget spiral. It's a delusional cousin of dynamic scoring: people internalize the idea that anything that might directly or indirectly improve box office performance will pay for itself, no matter how expensive it may be. Pretty soon you're bleeding money everywhere.

 

___________________________________________

 

From Why Are Movies So Bad? Or, The Numbers

If a big star and a big director show interest in a project, the executives will go along for a $14,000,000 or $15,000,000 budget even if, by the nature of the material, the picture should be small. And so what might have been a charming light entertainment that millions of people all over the world would enjoy is inflated, rewritten, to enlarge the star’s part, and overscaled. It makes money in advance and sends people out of theatres complaining and depressed. Often, when people leave theatres they’re bewildered by the anxious nervous construction of the film—by the feeling it gives them of having been pieced together out of parts that don’t fit. Movies have gone to hell and amateurism. A third of the pictures being made by Hollywood this year are in the hands of first-time directors, who will receive almost no guidance or help. They’re thrown right into a pressure-cooker situation, where any delay is costly. They may have come out of sitcoms, and their dialogue will sound forced, as if it were all recorded in a large, empty cave; they may have come out of nowhere and have never worked with actors before. Even if a director is highly experienced, he probably has certain characteristic weaknesses, such as a tendency to lose track of the story, or an ineptness with women characters; he’s going to need watching. But who knows that, or cares enough to try to protect the picture? The executives may have hired the director after “looking at his work”—that is, running off every other reel of one of his films. They are busy people. Network executives who are offered a completed movie commonly save time by looking at a fifteen-minute selection from it—a précis of its highlights—which has been specially prepared for them. God forbid that they should have to sit through the whole thing.

 

Monday, January 29, 2024

..."was later murdered by a vengeful dentist."

[Keeping with our IP theme for January.]

Marketplace's Sabri Ben-Achour has a good segment on the patent battle over the Apple Watch, complete with some very cool historical details. Both Joseph and I are always up for one of these stories, but I have to admit I'm mainly posting this one for the title.

Apple’s patent battle is just one in a long chain going back hundreds of years in the United States.

America had a patent law before it had a Bill of Rights.

“Even while the Constitution was being drafted, there were rival steamboat inventors lobbying members of the Constitutional Convention for patents,” said Christopher Beauchamp, a professor of law at Brooklyn Law School. The biggest patent wars in U.S. history actually happened 150 years ago. 

...

Alexander Graham Bell sued every company that used a telephone without his permission, Beauchamp said. A similar thing happened with waterwheels, sewing machines, barbed wire.

“The inventor of rubber dentures sued every dentist in the country and made basically thousands of people pay to use rubber dentures or face lawsuits,” said Beauchamp.

The head of the company was later murdered by a vengeful dentist. In the 20th century, as innovation was controlled by fewer and bigger companies, patent wars were less common, Beauchamp said. But recently they’ve increased again.

“Technology itself has become more complex,” said Rob Merges, author of “American Patent Law” and a professor at the University of California, Berkeley. “There’s more components in a lot of our products.”


Friday, January 26, 2024

As we've said before, if you want to understand the Republican party of 2024, you first have to understand feral disinformation in 2024.


Just in case you're not up to date on your conspiracy theories...

The High-frequency Active Auroral Research Program (HAARP) is a University of Alaska Fairbanks program which researches the ionosphere – the highest, ionized part of Earth's atmosphere.

The most prominent instrument at HAARP is the Ionospheric Research Instrument (IRI), a high-power radio frequency transmitter facility operating in the high frequency (HF) band. The IRI is used to temporarily excite a limited area of the ionosphere. Other instruments, such as a VHF and a UHF radar, a fluxgate magnetometer, a digisonde (an ionospheric sounding device), and an induction magnetometer, are used to study the physical processes that occur in the excited region.

Initially HAARP was jointly funded by the U.S. Air Force, the U.S. Navy, the University of Alaska Fairbanks, and the Defense Advanced Research Projects Agency (DARPA).[1] It was designed and built by BAE Advanced Technologies. Its original purpose was to analyze the ionosphere and investigate the potential for developing ionospheric enhancement technology for radio communications and surveillance.[2] Since 2015 it has been operated by the University of Alaska Fairbanks.[3]

 ...

HAARP is the subject of numerous conspiracy theories. Various individuals have speculated about hidden motivations and capabilities of the project. For example, Rosalie Bertell warned in 1996 about the deployment of HAARP as a military weapon.[36] Michel Chossudovsky stated in a book published by the Committee on Monetary and Economic Reform that "recent scientific evidence suggests that HAARP is fully operational and has the capability of triggering floods, hurricanes, droughts and earthquakes."[37] Over time, HAARP has been blamed for generating such catastrophes, as well as thunderstorms, in Iran, Pakistan, Haiti, Turkey, Greece and the Philippines, and even major power outages, the downing of TWA Flight 800, Gulf War syndrome, and chronic fatigue syndrome.[8][38][39]

 With apologies for repeating a point we've made before, a majority of the GOP believes in conspiracy theories and a substantial segment of the party bases its decisions on them.

 

When we talk about rational decision-making, we inevitably are making some pretty big assumptions about beliefs and available information. If you believe there is an all-powerful group that covertly rules the world, and you believe that the military-industrial complex possesses and has used a secret weapon that can control the weather, it is not all that unreasonable to speculate that a freak weather event that might help the candidate with the strongest establishment ties was engineered by that group.

We're not talking about the fringe here. These are beliefs held by much, often most, of the party and endorsed by influential figures. Every time you see a think piece about what Republicans really want, it needs to start with a section on what Republicans really believe.


Wednesday, January 24, 2024

Five years ago at the blog -- critiquing political journalism is basically a long-term gig with the Sisyphean landscaping company.

On the bright side, I can save so much time by recycling old posts.  

Monday, January 14, 2019

"Why Horse-Race Political Journalism Is Awesome" is prime Shafer

As mentioned before, Jack Shafer has always been a reliably obsequious sycophant when approaching those high placed in the establishment hierarchy while being a genuinely mean-spirited bully toward outsiders and other safe targets. Even under the best of circumstances, this would make him an odious character, but these are not the best of circumstances. The stakes are now much higher.

It was perhaps inevitable that, when he weighed in on the issue of horse race political coverage, he would do so in an attempt to undermine the work of reformers like Jay Rosen and Margaret Sullivan.

Horseracism might be scary if the campaign press corps produced nothing but who’s up/who’s down stories. But that’s never been the case. American newspapers overflow with detailed stories about the issues and the candidates’ positions. At the end of the 2008 campaign, Washington Post ombudsman Deborah Howell sorted Post political coverage over the previous year and found 1,295 horse-race stories compared with 594 stories about the issues. This ratio seems defensible, seeing as the who’s up/who’s down of the horse race can change daily. Issue stories don’t need that sort of constant revisiting, especially if they’re done well.
...
Horse-race coverage also helps clarify the voters’ minds when candidates converge on the issues, as happens regularly in the Democratic presidential derbies. If there’s little difference between the views of the candidate you favor and the leader’s, horse-race coverage helps optimize your vote by steering you toward the politician most likely to implement your views. Pundits aren’t the only ones who worry about a candidate’s electability. 
For the general election, these arguments are obviously inapplicable, but even on the primary level, his case is weak and sometimes self-contradictory. The very fact that Shafer, who has built much of his career defending the indefensible in the service of the journalistic establishment, falls back to the "seems defensible" standard tells us he knows he's got a weak hand.

How weak? For starters, while there is some value in helping voters determine electability, and polls play a role in this (though not a large one; early numbers mainly tell us about name recognition, and that's not really a factor in the general), it is dwarfed by the importance of helping voters understand the problems facing us and how each candidate plans to address them. What's more, these questions are enormously complex. This means that journalists were spending less than half of their time on topics that were both more important and needed to be explored in greater depth.

But it gets worse. Horse-race stories are primarily about prediction, who is more likely to win, and any prediction that swings wildly back and forth is, at best, chasing noise and is by definition bad. Even that is too generous a reading. The primary horse-race coverage that Shafer terms awesome didn’t just waste time and muddy the water by reporting every meaningless fluctuation as a trend, it also managed to get the actual trends wrong. Reporters and editors wanted badly to make the Democratic race seem more competitive than it was (at least, in part, to justify more horse-race stories). More to the point, they desperately wanted to convince themselves that Trump wasn’t the GOP front-runner.

On some level, this descent into denial was based on the knowledge that the rise of Trump was bad for the country, but there was another, less praiseworthy motive. The establishment media had invested heavily in false balance, both-siderism and radical centrist positions. The Trump candidacy required either painful soul-searching like we saw from the Washington Post, or a debilitating level of cognitive dissonance like we are still seeing from the New York Times.

Tuesday, January 23, 2024

Eight years ago at the blog -- back before Time Warner ruined "Adam Ruins Everything" (more IP blogging)

It was the fallout from the disastrous AT&T acquisition that killed the show. WB has been horribly mismanaged for years. Its successes despite this incompetence owed a lot to its fantastic IP and the same extension of copyrights Conover talks about in this segment.   

Monday, January 11, 2016

When threads converge -- "How Mickey Mouse Destroyed the Public Domain"

 For a while now we've been telling you to keep an eye on Adam Conover (and everybody else at CollegeHumor) and we've been bitching about the state of IP laws almost as long as we've been blogging.

For those reasons alone, we'd pretty much have to post this segment from Adam Ruins Everything even if it sucked, but we're safe on that score. This is one of Conover's best efforts, particularly for those in the audience nerdy enough to catch the steady stream of references to classic animation.






Monday, January 22, 2024

Reposted without comment

 Wednesday, August 24, 2022

I shouldn't have to say this but a 49-25 poll is not good news for the 25 (and it gets worse)

First off, the decision of the New York Times to even conduct a presidential poll more than two years before the election is irresponsible and bad for democracy. It distracts from important conversations and, since the data are largely worthless, its main function is to introduce noise into the conventional wisdom.

But while the data are not worth wasting any time analyzing, the analysis in the NYT piece by Michael C. Bender is worth talking about, and I don't mean that in a good way. This represents a disturbing throwback to the wishful analytics of the second half of 2015, showing that many data journalists and the publications that employ them have learned nothing in the past seven years.

Back in the early (and not so early) days of the last Republican primary, 538, the Upshot, and pretty much everyone else in the business were competing to see who could come up with the best argument for why being consistently ahead in the polls was actually bad news for Trump. These arguments, as we pointed out at the time, were laughably bad.

Just as being ahead in the polls was not bad for Trump in 2015, the results of this poll (to the extent that they have any meaning) are not bad for Trump in 2022. When elections approach, parties tend to converge on whoever has the clear plurality, and 49% is a big plurality, particularly when a large part of it consists of people who are personally loyal to Trump rather than to the GOP. On top of that, 53% of self-identified Republicans had a "very favorable" opinion of the former president and 27% were "somewhat favorable."

80% favorable is a good number.

Politically, this is a time of tumult, and all predictions at this point are little more than educated guesses, but given the losses and scandals Trump had seen by the time this poll was taken, his support was remarkably solid, which is the opposite of how Bender spun it.

And it gets worse

Here's the headline and the beginning of Bender's piece. [emphasis added.]

Half of G.O.P. Voters Ready to Leave Trump Behind, Poll Finds

Far from consolidating his support, the former president appears weakened in his party, especially with younger and college-educated Republicans. Gov. Ron DeSantis of Florida is the most popular alternative.

By focusing on political payback inside his party instead of tending to wounds opened by his alarming attempts to cling to power after his 2020 defeat, Mr. Trump appears to have only deepened fault lines among Republicans during his yearlong revenge tour. A clear majority of primary voters under 35 years old, 64 percent, as well as 65 percent of those with at least a college degree — a leading indicator of political preferences inside the donor class — told pollsters they would vote against Mr. Trump in a presidential primary.

Notice the phrase "GOP voters." That 49% refers to the respondents who said they thought they would vote in the Republican primary. Among that group, those who identified as Republicans went for Trump over DeSantis 56% to 21%.

If we're talking about who is likely to be nominated (which is, as mentioned before, an incredibly stupid and irresponsible question to be asking more than a year before the election), people who say they are going to vote in the primary are a reasonable group to focus on, but they cannot be used interchangeably with Republicans, which is exactly what Bender does.

While we're on the subject, this was a survey of 849 registered voters, so when we limit ourselves to those who said they were going to vote in the Republican primary and then start slicing and dicing that, we are building big conclusions on a foundation of very small numbers.
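For a sense of scale, here's a back-of-the-envelope sketch (assuming a simple random sample; design effects in real polls make the uncertainty even larger, and everything here beyond the 849 respondents and the 49% figure is invented for illustration):

```python
import math

def moe(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

full_sample = 849                        # registered voters surveyed
gop_primary = round(full_sample * 0.49)  # ~416 said they'd vote in the GOP primary

print(f"full sample (n={full_sample}): +/- {moe(full_sample):.1%}")
print(f"GOP primary voters (n={gop_primary}): +/- {moe(gop_primary):.1%}")
# Slice once more -- say, primary voters under 35 -- and n can fall below 100,
# pushing the margin of error past +/- 10 points.
```

At roughly plus or minus five points for the primary subsample, and worse for every slice within it, most of the "movement" these stories report is indistinguishable from noise.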



And it gets worse. [Emphasis added]

While about one-fourth of Republicans said they didn’t know enough to have an opinion about Mr. DeSantis, he was well-liked by those who did. Among those who voted for Mr. Trump in 2020, 44 percent said they had a very favorable opinion of Mr. DeSantis — similar to the 46 percent who said the same about Mr. Trump.

Should Mr. DeSantis and Mr. Trump face off in a primary, the poll suggested that support from Fox News could prove crucial: Mr. Trump held a 62 percent to 26 percent advantage over Mr. DeSantis among Fox News viewers, while the gap between the two Floridians was 16 points closer among Republicans who mainly receive their news from another source.

Here's a fun bit of context. Fox has been maxing out its support of DeSantis for years now.

Steve Contorno, writing for the Tampa Bay Times

(from August of 2021):

The details of this staged news event were captured in four months of emails between Fox and DeSantis’ office, obtained by the Tampa Bay Times through a records request. The correspondences, which totaled 1,250 pages, lay bare how DeSantis has wielded the country’s largest conservative megaphone and show a striking effort by Fox to inflate the Republican’s profile.

From the week of the 2020 election through February [2021], the network asked DeSantis to appear on its airwaves 113 times, or nearly once a day. Sometimes, the requests came in bunches — four, five, even six emails in a matter of hours from producers who punctuated their overtures with flattery. (“The governor spoke wonderfully at CPAC,” one producer wrote in March.)

There are few surprises when DeSantis goes live with Fox. “Exclusive” events like Jan. 22 are carefully crafted with guidance from DeSantis’ team. Topics, talking points and even graphics are shared in advance.

Once, a Fox producer offered to let DeSantis pick the subject matter if he agreed to come on.

If I were DeSantis's campaign manager, this poll would scare the shit out of me. Fox has pushed him to a degree unprecedented for a politician at that stage of his career. He has also gotten tremendous (and appallingly credulous) coverage from the mainstream press, but he just doesn't register. I know political scientists and data journalists don't like to talk about things like personality, let alone charisma, but for whatever reason, DeSantis has not made much of an impression.

It's possible cataclysmic events (of which we're seeing a definite uptick) will hand the Florida governor the nomination or maybe even the presidency, but if this poll had any meaning, it would be bad news for him and good news for Trump.

And it gets worse.

This wasn't just an article based on worthless data, sliced ridiculously thin and wishfully analyzed to get conclusions completely at odds with the actual numbers; this was an influential and widely cited article based on worthless data, sliced ridiculously thin and wishfully analyzed to get conclusions completely at odds with the actual numbers. It instantly became a fan favorite among political journalists.

The article was published on July 12th and immediately became part of the conventional wisdom. A little less than a month later, the FBI raided Mar-a-Lago, and the "Republicans are moving on from Trump" voices suddenly grew quieter, as even the highest-ranking party members responded with unhinged accusations and threats of retribution. Though the pundits desperately wanted to believe otherwise, they had to acknowledge that the GOP still belongs to Donald Trump.

Friday, January 19, 2024

"Hi Ho! Hi Ho! It's off to court we go!" -- More Disney litigation blogging


Though the Air Pirates case was more consequential (and arguably more egregious), it was Disney's legal response to the 1989 Academy Awards that did the most to solidify the company's reputation within the industry for overly aggressive litigation. As Variety put it, “Because the Academy did something kinda Dopey, Disney sure is Grumpy.”

The story behind the suit is the stuff of Hollywood legends.

Carr's reputation for hosting expensive and lavish parties and creating spectacular production numbers led AMPAS to hire him to produce the 61st Academy Awards on March 29, 1989. Carr promised to shift the show from its perceived dry and dull stature to something different, one that would be inspired by Beach Blanket Babylon (created by Steve Silver), the musical revue show featuring Snow White during the Golden Age of Hollywood.[7] Three-time Academy Award winner Marvin Hamlisch was brought in as conductor. With the promise of being "the antithesis of tacky," the ceremony would have no host, instead rotating actors and actresses, generally put in pairs, as part of Carr's theme of "couples, companions, costars, and compadres", with the most notable pair being Bob Hope and Lucille Ball (in her final public appearance before her death just a few weeks later). The criticism of the ceremony stemmed mostly from the musical numbers that attempted to cross Old and New Hollywood. It began with a booming voice stating that the "star of all time" would arrive soon, which came in the form of Snow White, played by Eileen Bowman (Lorna Luft declined, much to her subsequent relief), who proceeded to try to shake the hands of stars in the audience in the theater (much to the embarrassment of nominated actress Michelle Pfeiffer). Merv Griffin started the show with his 1950 hit "I've Got a Lovely Bunch of Coconuts!", complete with a re-creation of the Cocoanut Grove nightclub that featured a collection of established stars such as Vincent Price, Alice Faye, and Roy Rogers. Bowman returned to the stage afterwards to sing a duet of "Proud Mary" with Rob Lowe, which lasted twelve minutes.[8][9][4] A second production number was done later with "The Stars of Tomorrow" doing a number called "I Wanna Be An Oscar Winner" that featured actors in their teens and mid-20s such as Christian Slater (sword fighting Tyrone Power Jr.), Chad Lowe (shouting "I am a Thespian"), and Patrick Dempsey (tap dancing along the stairwell).[10]

Steve Silver, asked about the opening number while looking at the reviews, stated, "Janet Maslin says it is the worst production number in the history of the Oscars. I guess you can't top that. The publicity for Beach Blanket Babylon ought to be wonderful." An open letter was released on April 7, signed by 17 Hollywood figures (such as former Academy president Gregory Peck) [More on that below -- MP], that called the ceremony an embarrassment to the Academy and the industry. Two days later, the Los Angeles Times dedicated its entire letter section (ten in total) to hate mail about the ceremony, titling it "For Some, the Oscar Show Was One Big Carr Crash." The Walt Disney Company sued the Academy shortly after the ceremony for copyright infringement (no one at the Academy had asked for permission from Disney regarding Snow White), which forced the Academy to make a formal apology.[11] Lowe defended his appearance by describing himself as a "good soldier" doing it for the Academy, while Bowman would state later that the show looked like a "gay bar mitzvah", saying so years after the fact, having signed a gag order that kept her from talking about the show for thirteen years.[12] Carr's reputation in Hollywood never fully recovered from the blowback the ceremony received, although there were some lasting benefits. The ratings for the show were marginally better than the previous ceremony's, as over 42 million viewers in 26 million homes saw the ceremony in the United States.[13] The choice to go without a host would be replicated for the 2019 ceremony. Carr's decision to change the announcement from "And the winner is..." to "And the Oscar goes to..."[14] has been utilized for each Academy ceremony since. Carr elected to have retailer Fred Hayman get designers to dress the stars for arrival on the red carpet, which became its own segment of focus in later years. Comedian Bruce Vilanch, hired as a writer for the show, would work on it for the next two decades, eventually being promoted to head writer. Carr never produced another film or television show.

And check out who signed that letter (from a Variety thirtieth-anniversary article):

Filmmaker Joseph L. Mankiewicz griped to Variety that the show was “amateur night.” He was one of 17 individuals who wrote a letter to the Academy of Motion Picture Arts & Sciences saying, “The 61st Academy Awards show was an embarrassment to both the Academy and the entire motion picture industry,” and urging AMPAS president Richard Kahn and the board to ensure it didn’t happen again. Among other signatories: Julie Andrews, Stanley Donen, William Friedkin, Sidney Lumet, Paul Newman, Gregory Peck, Billy Wilder and Fred Zinnemann.

It's worth noting that the knives were out for Carr even before the show started. He was a flamboyantly gay man in a still-closeted era. Worse yet, he had quickly risen to the top of the Hollywood social scene, and this is a town that loves a good, tragic fall (they've made A Star Is Born four times). Disney was just piling on.

 

 

If you're up to it, you can see most of the actual number here. Check out the reactions of Gregory Hines and Robert Downey Jr. at the end (if you make it that far).






Thursday, January 18, 2024

Monumental greed, delusional narcissism, or just the drugs talking?

My money's on a combination of the three (but mainly the drugs).


From CNN

Elon Musk says he wants a significantly larger stake in Tesla than the one that already made him the richest person on the planet.

In a series of posts on X Monday night, Musk said that he would not want to grow Tesla to become a leader in artificial intelligence and robotics without a compensation plan that would give him ownership of around 25% of the company’s stock. That would be about double the roughly 13% stake he currently owns.

“I am uncomfortable growing Tesla to be a leader in AI & robotics without having ~25% voting control. Enough to be influential, but not so much that I can’t be overturned,” Musk wrote in a post on X. “Unless that is the case, I would prefer to build products outside of Tesla.”

Notably, Musk held a stake of more than 20% in Tesla before he sold a large number of shares to buy X, the social media company he purchased over a year ago for $44 billion.

Here's a handy explainer from the always reliable Common Sense Skeptic.





Wednesday, January 17, 2024

I know we've been pounding this hard for a long time, but the story of the 2024 GOP is first and foremost a story about feral disinformation

From last week:

At the risk of stating the obvious, having the majority of your party believe that you actually won the last election and should be in office now gives you a tremendous advantage in the primary. This is one of the antibodies that Josh Marshall has mentioned that makes it all but impossible for a challenger to take the nomination away from Donald Trump. (This is not to say that Trump can't be taken out, just that it would almost certainly require external forces like health issues or big legal developments).

We also pointed out that belief in these conspiracies was especially problematic for the establishment candidate, Nikki Haley.

If the post had come out a few days later, we might have included this.
 

 

 [More on HAARP's special place in the febrile mind tomorrow.]

With that and this in mind, check out this data point from Iowa.


I'm not going to go down the rabbit hole of asking to what degree belief causes support rather than the other way around (I'm sure you can find both), but the correlation is undeniable, as is the difference in levels of 2000 Mules believers between Haley and DeSantis voters. In normal times, having fewer conspiracy nuts among your supporters would be a good thing, but in this year's Republican primary, it's likely to be fatal.

Tuesday, January 16, 2024

[Almost] ten years ago at the blog -- more thoughts on Richard Florida's creative class thesis

About ten years ago, we started noticing that a great deal of social science research had an analysis-by-outlier problem. Ten years later, the problem is still around and, in the case of San Francisco, has gotten worse. A city of less than a million that is neither the population center nor the employment center for the Bay Area, let alone the state, has somehow become the primary focus of the nation's debate, and the results have not been pretty.

Monday, February 24, 2014

The Outlier by the Bay

[Homonym alert -- I dictated this to my smart phone then edited it late in the evening.]

There's an energetic debate going on over at Andrew Gelman's site regarding Richard Florida's theories of the creative class. I can understand the urge to rise to Florida's defense. After all, there's great appeal to the idea that the kind of smart, innovative people who tend to drive economic growth are attracted to diverse, tolerant, livable cities with vibrant cultures. To some extent, I believe it myself, but I also find myself having the same problems with Florida I have with the rest of the urban utopianists: first, that they have a tendency to take interesting but somewhat limited findings and draw impossibly sweeping conclusions and TED-ready narratives; and second, that these narratives often mesh badly with the facts on the ground. I've already discussed the latter (in probably overly harsh but still heartfelt language). Here are some thoughts on the first.

Part of my problem with a lot of urban research is that there just aren't enough major cities out there to make a really good sample, particularly when the data are this confounded and each area has so many unusual, if not unique, aspects. For some cities, with New York and San Francisco being very close to the top of the list, these unique aspects make it difficult to generalize findings and policy suggestions.

When I look at Richard Florida's research, at least in the form that made it to the Washington Monthly article, the role of San Francisco strikes me as especially problematic.

What is by many standards the most Bohemian and gay-friendly area in America is also arguably the country's center of technological innovation. Even if there were no relationship in the rest of the country, that single point would create a statistically significant correlation. That would not be so troubling if we had a clear causal relationship or a common origin. Unfortunately, the main driver of the tech boom, if you had to limit yourself to just one factor, would have to be Stanford University, while the culture of San Francisco does not appear to have been particularly influenced by that school, particularly when compared to Berkeley. In other words, had Leland Stanford chosen to establish his college in Bakersfield, we might still have had Haight-Ashbury, but we almost certainly would not have had Silicon Valley.
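You can see the mechanics in a toy simulation (purely illustrative; the fifty "cities" and their scores are made up):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# 49 ordinary metros: "bohemian" and "tech" scores drawn independently,
# so by construction there is no underlying relationship
bohemian = rng.standard_normal(49)
tech = rng.standard_normal(49)

# Add a single San Francisco-like outlier, extreme on both measures
bohemian = np.append(bohemian, 6.0)
tech = np.append(tech, 6.0)

r, p = stats.pearsonr(bohemian, tech)
print(f"r = {r:.2f}, p = {p:.4f}")
# One point out of fifty typically produces r around 0.4
# with p comfortably below 0.05
```

Drop the outlier and the correlation collapses back to noise, which is why any analysis of this kind should be rerun without San Francisco before anyone reaches for sweeping conclusions.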

What's more, when we start looking at this narrative on a city-by-city basis, we often fail to see what we would expect. For example, if you were growing up in a relatively repressive area of the Southeast and you were looking for a Bohemian, gay-friendly metropolitan area with a vibrant arts scene, the first name on your list would probably be New Orleans, followed by, roughly in this order, Atlanta, Savannah, and Memphis. Neither Cary, North Carolina nor Huntsville, Alabama would have made your top 10.

Rather bizarrely, Florida discusses both the Research Triangle and New Orleans in his WM article, apparently without seeing the disconnect with his theories:
Stuck in old paradigms of economic development, cities like Buffalo, New Orleans, and Louisville struggled in the 1980s and 1990s to become the next "Silicon Somewhere" by building generic high-tech office parks or subsidizing professional sports teams. Yet they lost members of the creative class, and their economic dynamism, to places like Austin, Boston, Washington, D.C. and Seattle---places more tolerant, diverse, and open to creativity.
There are lots of reasons for leaving New Orleans for Austin, but tolerance, diversity and openness to creativity aren't among them.

Even stranger are Florida's comments about the Research Triangle:
Kotkin finds that the lack of lifestyle amenities is causing significant problems in attracting top creative people to places like the North Carolina Research Triangle. He quotes a major real estate developer as saying, "Ask anyone where downtown is and nobody can tell you. There's not much of a sense of place here. . . .The people I am selling space to are screaming about cultural issues." The Research Triangle lacks the hip urban lifestyle found in places like San Francisco, Seattle, New York, and Chicago, laments a University of North Carolina researcher: "In Raleigh-Durham, we can always visit the hog farms."
Remember, Florida said "Places that succeed in attracting and retaining creative class people prosper; those that fail don't," so is this spot withering away? Not so much:
Anchored by leading technology firms, government and world-class universities and medical centers, the area's economy has performed exceptionally well. Significant increases in employment, earnings, personal income and retail sales are projected over the next 15 years.

The region's growing high-technology community includes such companies as IBM, SAS Institute, Cisco Systems, NetApp, Red Hat, EMC Corporation and Credit Suisse First Boston. In addition to high-tech, the region is consistently ranked in the top three in the U.S. with concentration in life science companies. Some of these companies include GlaxoSmithKline, Biogen Idec, BASF, Merck & Co., Novo Nordisk, Novozymes, and Wyeth. Research Triangle Park and North Carolina State University's Centennial Campus in Raleigh support innovation through R&D and technology transfer among the region's companies and research universities (including Duke University and The University of North Carolina at Chapel Hill).
This is not to say that there is not some truth to Florida's narrative or validity to many if not most of his insights. It does appear, however, that the magnitude of the effects he proposes is far smaller than he suggested and that the absolute claims he is fond of making are often riddled with exceptions.

 

Monday, January 15, 2024

Welcome to the Iowa caucus, the Golden Globes of American politics

[Before we get started, check out this historical overview from the LA Times' Mark Z. Barabak.]


There was a point decades ago when the Iowa caucus and the New Hampshire primary actually served a useful purpose as search committees, allowing candidates with little name recognition and limited budgets to make their case to a small, politically engaged state. This is what happened with Jimmy Carter. To a degree, it also happened with Barack Obama, though he was hardly as obscure as Carter. I don't remember if the following was something I said or something said to me, but after his speech at the Democratic National Convention in 2004, one of us told the other that we had just seen the man who would be the first black president of the United States.

That one scenario, where someone most people have never heard of manages to come in first or second, is the only time that the results from Iowa are important or newsworthy or even interesting. That is the entire case for Iowa and, to a degree, New Hampshire going first, and it has largely evaporated in the age of mega-budgets and the long campaign.

Take it away and Iowa becomes the Golden Globes of American politics: everyone knows it's absolutely meaningless, but it does come first, and all the journalists covering the real story are starved for something to talk about.

Why should we ignore Iowa? Putting aside the size of the state, the caucus system is hopelessly confusing and, except for its ability to occasionally elevate the truly unknown, is horribly and incurably flawed. Add to that the highly idiosyncratic nature of Iowa politics and, from time to time, the weather, and you have an event that tells us virtually nothing about the state of the primary.


Friday, January 12, 2024

Plagiarism and nuance

This is Joseph.

There has been a lot of discussion about plagiarism lately, but identifying it in academic writing is very much an art. Some sections, like the methods, need to present information clearly, and there are not that many ways to state the same information.

Compare this paper with this paper in terms of the first paragraph of the methods section: 



These papers were published one day apart (Nov 27/28, 2022). There is not a single author in common, and the two reflect very different styles of developing the paragraph. But automated detection software will find a surprising correspondence, because both papers need to accurately describe the sampling frame for the study, and many aspects of this are fixed (how many people, race/ethnicity, ages, field centers, exclusion criteria). The same goes for technical definitions and mathematical equations -- math does not do well with being creatively altered.
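To see why the software flags such passages, here is a toy version of the overlap check these tools perform (a sketch only: the two paragraphs below are invented stand-ins, and commercial detectors are far more sophisticated):

```python
def shingles(text, n=5):
    """Set of word 5-grams ("shingles"), the unit many overlap detectors compare."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Fraction of distinct 5-grams the two texts share."""
    ga, gb = shingles(a), shingles(b)
    return len(ga & gb) / len(ga | gb)

# Two invented methods paragraphs describing the same fixed sampling frame
paper1 = ("Participants were recruited from four field centers and were aged "
          "45 to 84 years and free of clinical cardiovascular disease at baseline.")
paper2 = ("As described previously, participants were recruited from four field "
          "centers and were aged 45 to 84 years and free of clinical "
          "cardiovascular disease at baseline.")

print(f"overlap: {jaccard(paper1, paper2):.0%}")  # high, with no copying intent
```

Two honest descriptions of the same cohort will score like near-duplicates, which is exactly why a raw similarity number can't be read as a verdict.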

I don't want to defend plagiarism, but there is a lot of nuance to academic writing. Deciding whether passages that are similar actually rise to the level of an issue takes real effort and judgment. Here is a case where a scientist looks at what the media found and has . . . questions.

It is not definitive, but it explains why careful investigation is the best path forward. 

Anyway, food for thought.

Thursday, January 11, 2024

Five years ago at the blog -- we were skeptical of the streaming business model

We've been hammering this one for about a decade, back when it was another one of our unpopular lose-friends-and-alienate-people positions. Now, in 2024, we know that the industry on the whole has been losing billions of dollars a year, that even the one (moderately) profitable company is cutting back and largely abandoning its old business model, and that pretty much everybody has admitted that spending a hundred-plus million a pop on shows that mostly got cable-access audience numbers might not have been the best idea.

 

Friday, January 11, 2019

Yes, this is what a content bubble looks like

This post by LGM's Loomis is a perfect example of an unintentionally interesting piece, one that makes a tremendously important point in passing then never returns to it.

One of my New Year’s resolutions for 2018 was to watch more television. I’ve slowly increased my TV viewing over the last couple of years in response to the great stuff out there. Now it’s kind of weird, as the consensus is that there’s a lot of just OK shows and not much that’s really that compelling compared to the prestige dramas of earlier this decade. But I’m so far behind that this really barely matters to me.

There's a lot to unpack here.

One. While most of us are making New Year's resolutions to watch less television, Loomis is one of the very few who actually resolved to watch more, particularly those buzz-friendly critical darlings that the major streaming services are spending so heavily on. He is the ideal consumer with respect to this business model, and yet even he acknowledges that he will never be able to catch up with all the shows on his to-see list.

The dirty little not-so-secret secret of most of these must-see shows is that very few people actually watch them. For all their awards and feature stories, they remain more talked about than viewed. We could have an interesting discussion about their role in brand building and other indirect effects, but even with those taken into account, you have to have serious concerns about a business strategy that spends billions of dollars producing shows with such tiny audiences.

Two. Yes, N=1, but the perception that quality is slacking off has tremendously disturbing implications for the business model. For a number of years, the formula for generating awards, buzz, and perceived quality was fairly simple. Obviously, making good shows did help, but the key to getting noticed was hiring big-name talent, spending stunning amounts on PR and marketing, and sticking as close as possible to a handful of genres that lent themselves to extensive coverage and favorable reviews ("it's a dark, edgy crime drama with a quirky sense of humor," "it's a dark, mindbending science-fiction drama with a quirky sense of humor"). Now, though, there is reason to believe that through a combination of saturation and the half-life of novelty, the formula is losing its effectiveness. That means even more obscene amounts of money will have to be spent to create the same impact.

Three. Finally, as we have said many times before, content accumulates. The 500 or so series that are currently in production are not just competing against each other, but against everything that has come before. If someone like Loomis, who is almost genetically engineered to seek out new, trendy shows, is opting instead for something that has been off the air for years, like The Sopranos, investors should definitely be taking note.




Wednesday, January 10, 2024

At the risk of stating the obvious, having the majority of your party believe that you actually won the last election and should be in office now gives you a tremendous advantage in the primary.

Consider this recent YouGov poll on conspiracy theories.

 


The first obvious thing that jumps out is that Republicans have a much more serious problem with conspiracy theories and feral disinformation than do Democrats. Not exactly surprising, but it's always nice to find data that backs up your intuition. 
 
Now take a look at these three questions...
 

 
There are lots of interesting things to look at here but the big relevant takeaways for me are that Republicans tend to be far more paranoid about the establishment and that a majority of them believe that the election was stolen.

The implications of this last point are huge. For one thing, these people believe that Donald Trump was the legitimate winner. From that it follows that January 6th could not have been an insurrection; instead, it was actually an attempt to prevent an insurrection. Once you believe this, all sorts of major events of the past couple of years are completely inverted. Suddenly, the January 6th rioters become political prisoners. The Jack Smith and Georgia investigations become nothing more than politicized attempts to keep the rightful winner out of office. The near universal condemnation of the stolen election lie in the non-partisan press becomes damning evidence of a vast leftist conspiracy, and the fact that most Americans don't believe the lie proves that the conspiracy has been frighteningly effective.

At the risk of stating the obvious, having the majority of your party believe that you actually won the last election and should be in office now gives you a tremendous advantage in the primary. This is one of the antibodies that Josh Marshall has mentioned that makes it all but impossible for a challenger to take the nomination away from Donald Trump.
(This is not to say that Trump can't be taken out, just that it would almost certainly require external forces like health issues or big legal developments.)

The establishment question points out another almost insurmountable obstacle for those trying to knock the former president out before the general. Trump is seen, in many ways accurately, as the ultimate outsider candidate. This presents challengers with something of a catch-22. In order to have any chance of unseating Trump, they have to have the full support of the Republican establishment, particularly in its now weakened state, but if they are seen as having the support of the establishment, they don't have any chance of unseating Trump. This is particularly a problem for the mainstream media's flavor of the month.

 Nikki Haley’s Rocket Ride to Second Place

December 2, 2023

In April, Sarah Longwell of The Bulwark wrote what is still one of the most insightful reports about the Republican electorate. Longwell, strategic director of Republican Voters Against Trump, has sat through hundreds of focus groups to understand the mental state of the party. Her primary conclusion is that most GOP voters see the Trump era not as an interregnum but as a kind of revolutionary event she calls “Year Zero.”

“The Republican party has been irretrievably altered,” she wrote, “and, as one GOP voter put it succinctly, ‘We’re never going back.’” Such voters have bought into Trump’s argument that the party leaders who preceded him were weak losers. (This argument conveniently absolves Trump of blame for his own losses — he was sabotaged by the Establishment, you see.) “If you forged your political identity pre-Trump, then you belong to a GOP establishment now loathed by a majority of Republican primary voters,” she concluded. “Even if you agree with Trump. Even if you worked for Trump. Even if you were on Trump’s ticket as his vice president.”

Longwell laid out a roster of Republican politicians whom the voters could never accept for this reason. The first name on her list was Nikki Haley.


Even if Haley were running a good campaign (and in case you haven't been paying attention, she's not), the very things that make her so appealing to the establishment also make her toxic to a large portion of the GOP base.

Tuesday, January 9, 2024

100% Pure Extra Virgin Schadenfreude

This story involves two topics I normally don't consider worth the candle: plagiarism (which requires an enormous amount of spade work to discuss properly, inevitably more effort than the case merits) and Harvard. In this instance, however, we have someone so odious revealed to be a huge hypocrite in the most embarrassing manner possible. To paraphrase Oscar Wilde, you would have to have a heart of stone not to laugh at the travails of Bill Ackman.

Katherine Long and Jack Newsham, writing for Business Insider:

The billionaire hedge fund manager and major Harvard donor Bill Ackman seized on revelations that Harvard's president, Claudine Gay, had plagiarized some passages in her academic work to underscore his calls for her removal following what he perceived as her mishandling of large protests against Israel's bombardment of Gaza on Harvard's campus.

 ...

Her husband, Ackman, has taken a hardline stance on plagiarism. On Wednesday, responding to news that Gay is set to remain a part of Harvard's faculty after she resigned as president, he wrote on X that Gay should be fired completely due to "serious plagiarism issues."

"Students are forced to withdraw for much less," Ackman continued. "Rewarding her with a highly paid faculty position sets a very bad precedent for academic integrity at Harvard."

 And you'll never guess what happens next.

In [Ackman's wife, Neri] Oxman's dissertation, completed at MIT, she plagiarized a 1998 paper by two Israeli scholars, Steve Weiner and H. Daniel Wagner, a 2006 article published in the journal Nature by the New York University historian Peder Anker, and a 1995 paper published in the proceedings of the Royal Society of London. She also lifted from a book published in 1998 by the German physicist Claus Mattheck and, in a more classical mode of plagiarism, copied one paragraph from Mattheck without any quotation or attribution.

In addition to getting funnier every damn time you read it, this incident provides us with a dramatic example of one of the main problems with the outcry over minor plagiarism: selective enforcement.

Whenever the non-immediate consequences for these offenses go beyond public shaming -- when people pay real penalties for things that happened years ago -- the application will invariably be inconsistent and unfair, often deliberately so. Minor incidents of plagiarism are so common, with so much falling in a gray area, that most people have (intentionally or not) done something that a bad-faith actor can go after them for.

 

 

 

While similar or worse cases are largely ignored.

 

Did some of the instances that so enraged Ackman go beyond the trivial and the sloppy? I couldn't tell you, but I can say that informed and objective opinions differ and that some of those who claim to have the least doubt have given us the most reason to doubt their impartiality.

 

One of the best parts of this story has been Ackman's reactions to the BI article. 




 

And I'll leave you with this closing thought.

Monday, January 8, 2024

Twelve years ago at the blog -- After stealing these characters, Marvel trademarked and copyrighted them, which was just adding injury to insult

It isn't all that relevant to the subject of the post, but I'm still surprised I didn't work this in. 

Captain Billy's Whiz Bang is one of the signs of sin and decadence given in "Ya Got Trouble" from Meredith Willson's The Music Man (anachronistically, since the play is set in 1912 and the magazine didn't debut until seven years later).

Mothers of River City!
Heed that warning before it's too late!
Watch for the tell-tale sign of corruption!
The minute your son leaves the house,
Does he rebuckle his knickerbockers below the knee?
Is there a nicotine stain on his index finger?
A dime novel hidden in the corn crib?
Is he starting to memorize jokes from Capt. Billy's Whiz Bang?
Are certain words creeping into his conversation?
Words like, like 'swell'?
And 'so's your old man'?
Well, if so my friends,

Ya got trouble,
Right here in River City!
With a capital "T"
And that rhymes with "P"
And that stands for Pool.

 



 

And while we're on the subject of Marvel and IP.

 

Sunday, January 8, 2012

Intellectual property and business life-cycles



A while back, we had a post arguing that long extensions for copyrights don't seem to produce increased value in properties created after the extension, but what about the costs of an extension? And who pays them?

New/small media companies tend to make extensive use of the public domain (often entailing a rather liberal reading of the 'public' part). The public domain allows a company with limited resources to quickly and cheaply come up with a marketable line of products which can sustain the company until it can generate a sufficient number of original, established properties.

Many major media companies have gotten their start mining the public domain, none more humbly than Fawcett. At its height, the company had magazines that peaked at a combined newsstand circulation of ten million a month, comics that outsold Superman, and the legendary Gold Medal line of paperbacks. All of this started with a cheaply printed joke magazine called Captain Billy's Whiz Bang.


Of course, Wilford Fawcett couldn't have reimbursed the unknown authors of those jokes even if he had wanted to. Disney, on the other hand, built its first success on a title that was arguably still under copyright.
Mickey had been Disney's biggest hit but he wasn't their first. The studio had established itself with a series of comedies in the early Twenties about a live-action little girl named Alice who found herself in an animated wonderland. In case anyone missed the connection, the debut was actually called "Alice's Wonderland." The Alice Comedies were the series that allowed Disney to leave Kansas and set up his Hollywood studio.

For context, Lewis Carroll published the Alice books, Wonderland and Through the Looking Glass, in 1865 and 1871 and died in 1898. Even under the law that preceded the Mouse Protection Act, Alice would have been the property of Carroll's estate and "Alice's Wonderland" was a far more clear-cut example of infringement than were many of the cases Disney has pursued over the years.

In other words, if present laws and attitudes about intellectual property had been around in the Twenties, the company that lobbied hardest for them might never have existed.
Another company that went from near bankruptcy to media powerhouse was a third tier comics publisher that had finally settled on the name Marvel. The company's turnaround is the stuff of a great case study (though MBA candidates should be warned, Stan Lee's memoirs can be slightly less credible than his comics). Not surprisingly, one element of that turnaround was a loose reading of copyright laws.

Comic book writer and historian Don Markstein has some examples:
Comic book publisher Martin Goodman was no respecter of the property rights of his defunct colleagues. In 1964, he appropriated the name of a superhero published in the '40s by Lev Gleason, and brought out his own version of Daredevil. A couple of years later, he introduced an outright copy of a '50s western character published by Magazine Enterprises, Ghost Rider. It wasn't until late 1967, possibly prompted by a smaller publisher's attempt to do the same, that he finally got around to stealing the name of one of the most prominent comics heroes of all time, Captain Marvel. And this delay was odd, because the name of Goodman's company was (and remains) Marvel Comics.
(That would, by the way, be Fawcett's Captain Marvel so what goes around...)

(Fans of fantasy art should find the covers of the old Ghost Rider familiar)




This is how media companies start. A small music label fills out a CD with a few folk songs. An independent movie company comes up with a low-budget Poe project. An unaffiliated television station runs a late-night horror show with public domain films like Little Shop of Horrors and Night of the Living Dead. Then, with the payroll met and some money in the bank, these companies start getting more ambitious.

Expansion of the public domain is creative destruction at its most productive. Not only does it clear the way for new work; it actually provides the building blocks.