Friday, July 19, 2013

First assume a fairy godmother...

This is one of those stories that illustrates just how bad journalists have gotten at covering life in the bottom quartile. Here, from Marketplace, is the set-up:
The fast food chain teamed up with Visa to create an online budget guide for its employees. And most of the criticism is directed at the fact that the company's budget doesn't list 'food' or 'heat' as monthly budget items. 
...
"Helping you succeed financially is one of the many ways McDonald's is creating a satisfying and rewarding work environment," the McDonald's site's about page states. "So you can take the next step towards financial freedom." 
To do that, the guide suggests journaling daily expenses, setting up a budget and outlining a savings goal. Sound reasonable? 
One problem: the sample budget offered by McDonald's (below) doesn't mention money for basic necessities like food, heat, gas and clothing. 
The budget also assumes a worker will need to maintain two jobs in order to make roughly $24,500 a year.

Here's the actual document:



A heated debate has broken out over whether it's possible to live on $24,500 a year. This is not a question that would perplex a group pulled at random from the general populace. People do it all the time. I've done it myself (and yes, I'm adjusting for inflation). I even have a musician friend in New York City who's doing it now.

You eat lots of beans and potatoes. You get a prepaid phone. You buy a set of rabbit ears (which, as mentioned before, would actually give you more channels and a better picture than the basic cable the WP article suggests). You live day-to-day. You constantly worry about money. You're one bad break away from disaster, but with the exception of the health insurance and heating items, nothing in the expenses, including rent, is that unreasonable.

There is, in fact, only one completely unrealistic item here:

Second job: $955

Angry Bear, which does get it, explains just how much work we're talking about.
Besides skipping certain expenses and skimping on others; to meet the income levels portrayed in the budget, McDonalds suggests associates work not one but two jobs. A full time job at McDonalds and a part time job elsewhere totaling 62 hours per week (if the worker resides in Illinois where the minimum wage is $8.25/hour). If, perchance, the worker resides in one of the other 48 states; the total hours needed to hit the suggested income level jumps to 74 hours/week due to a lower minimum wage (the equivalent of a second full time job). 
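For what it's worth, the arithmetic behind those hour counts is easy to check. Here's a minimal back-of-the-envelope sketch: the $1,105 and $955 net income lines come from the sample budget above, while the assumption that payroll tax is the only deduction is mine, so the exact totals will shift with whatever deductions you build in.

```python
# Rough check of the hours-per-week claim.
# Assumption (mine): the only deduction from gross pay is the 7.65%
# employee payroll tax -- no state or federal income tax.

MONTHLY_NET = 1105 + 955          # the two income lines in the sample budget
ANNUAL_NET = MONTHLY_NET * 12     # roughly the $24,500-a-year figure
TAKE_HOME_RATE = 1 - 0.0765       # FICA only

def hours_per_week(hourly_wage):
    """Weekly hours needed to clear the budget's net income at a given wage."""
    annual_gross = ANNUAL_NET / TAKE_HOME_RATE
    return annual_gross / 52 / hourly_wage

print(f"Illinois minimum ($8.25/hr): {hours_per_week(8.25):.0f} hours/week")
print(f"Federal minimum ($7.25/hr): {hours_per_week(7.25):.0f} hours/week")
```

That reproduces the 62-hour Illinois figure almost exactly; Angry Bear's 74-hour number presumably builds in additional deductions.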
And Marketplace explains how unlikely that 74 is:
At the same time, there’s been a sharp drop in the number of people who are holding down multiple jobs, and most of those are likely to be part-time, since there are only so many hours in a day. The number of multiple job-holders is down by more than 500,000 since 2007.  So, there are more people in part-time jobs, but fewer people able to cobble together two or more of those jobs to make ends meet.
...
This trend to more part-time work could be permanent. Employers like the flexibility, and the low cost. Benefits in many part-time jobs -- health care, retirement -- are slim to none.

But there’s a complication. For job-seekers, it’s now harder to find and keep multiple part-time jobs. “Among low-wage employers -- retail, hospitality, food service -- employers are requiring their employees to say they’re available for a full-time schedule, even when they know they’re never going to schedule them for full-time,” says Stephanie Luce at the City University of New York’s Murphy Institute.

Luce is a labor sociologist who studies union movements around the world. She co-authored, with the Retail Action Network, a study based on surveys of retail workers in New York, Discounted Jobs: How Retailers Sell Workers Short. “Managers are asked to schedule based on customer-flow, on weather, on trends in the economy, and to change the schedule day-to-day,” says Luce. “They don’t want employees that are going to say ‘I can’t come in, I have another job.’ They want employees that’ll say, ‘OK, I’ll come in if you need me. I won’t come in if you don’t need me.’”  


Thursday, July 18, 2013

If we just look at climate change, we should do these things. If we take climate change out of the picture, we should still do these things

Massoud Amin, chair of the Institute of Electrical and Electronics Engineers Control Systems Society’s Technical Committee on Smart Grids, has a must-read opinion piece up at Nature (unfortunately behind a pay wall though you can get most of the same information from this interview). He lays out a highly persuasive case based on both economic benefits like greatly reducing the number and duration of power outages (which are currently estimated to cost the US economy between US$80 billion and $188 billion each year) and environmental benefits like reducing carbon dioxide emissions by 12–18% by 2030.

Two things in particular struck me as I read this. The first was that Amin could make his argument on strictly economic terms or on strictly environmental ones. There's an obvious parallel here with fixing choke points in our freight rail system, an infrastructure improvement that would pay for itself two or three times over in areas like transportation costs, highway congestion and wear-and-tear on roads and bridges. Our inability to take action is so entrenched that we can't take significant steps to address climate change even when there's an overwhelming non-environmental case for moving forward.

The second thing that hit me was the cost Amin gives: around $400 billion, or $21 billion to $24 billion a year for 20 years. That is a great deal of money in absolute terms, but when you start looking at the various costs associated with climate change -- sea level rise, ocean acidification, droughts, extreme weather -- you get into the trillions fairly quickly (you might well hit a trillion in Florida alone).

What follows is very back-of-the-envelope, but based on the US share of carbon emissions, it looks like, if you make all sorts of simplifying assumptions, the smart grid impact would be around 2.5% of global totals. In gross terms, ignoring all other economic benefits, smart grids are not a particularly cheap way of reducing carbon emissions (in net terms they actually pay for themselves, but put that aside), but $400 billion for a two-and-a-half percent reduction doesn't seem that high.
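Spelled out, the arithmetic looks something like this. The roughly 17% US share of global carbon emissions is my assumed round number, and the sketch ignores emissions growth between now and 2030.

```python
# Back-of-the-envelope: what share of global CO2 emissions would a US
# smart grid buildout address, and what does it cost per year?
# The ~17% US share of global emissions is an assumed round number.

US_SHARE_OF_GLOBAL = 0.17
US_CUT_LOW, US_CUT_HIGH = 0.12, 0.18   # Amin's 12-18% reduction by 2030
TOTAL_COST = 400e9                      # roughly $400 billion
YEARS = 20

global_cut_low = US_SHARE_OF_GLOBAL * US_CUT_LOW
global_cut_high = US_SHARE_OF_GLOBAL * US_CUT_HIGH

print(f"Cut in global emissions: {global_cut_low:.1%} to {global_cut_high:.1%}")
print(f"Annual cost: ${TOTAL_COST / YEARS / 1e9:.0f} billion per year")
```

The midpoint of that range lands right around the 2.5% figure above.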

The climate change debate is often framed as a choice between saving the planet and saving the economy, but when you look at the proposed solutions, they either don't seem to put that much of a drag on the economy (carbon taxes, moving away from coal) or they actually pay for themselves (infrastructure improvements), while the potential economic damage of climate change dwarfs the combined costs of all of the proposed solutions.


Wednesday, July 17, 2013

Intellectual property and Marvel

(I told you I'd connect Stan Lee to this)

For about the past fifty years, the company which is now Marvel Entertainment has made a spectacular amount of money and has done it in virtually every medium, from comics to television to film to video games to novels to even, Heaven help us, the stage.

This is all the more remarkable when you consider that, shortly before its period of dominance, the company was a third-rate imprint that was, by some accounts, on its last legs.

The rise was a remarkable achievement in both American publishing and pop culture. In economic terms alone, it shows how a small company with almost no resources or structural advantages can come to dominate an industry and generate billions of dollars.

One aspect of that story which is particularly relevant given our recent posts on the subject is the way Stan Lee used public domain (and in some cases, not-quite-public domain) intellectual properties as an important part of his business model.

First a quick and hopefully painless bit of comic book history.

Superheroes were the first big original content hit of the medium. Starting with Superman in 1938, they dominated that side of the industry for almost a decade. Licensed titles (like Dell's Disney line) were, by some sources, bigger sellers but if you were creating characters specifically for comics in the early Forties, superheroes were where the money was. By the end of the decade, though, the boom was largely over and other fads such as crime, horror, Western, funny animals, funny teenagers and (with a very unlikely origin) romance took turns as the next big thing.

Of course, comic book publishers kept trying to bring back the genre that had started it all. Lots of companies tried to introduce new superheroes or dust off old ones, but without real success. Among others, the company that would become Marvel was particularly badly burned by a superhero push in the mid-Fifties. The big exception here is Magazine Enterprises' Ghost Rider in 1949, but as Don Markstein pointed out, that character blended the faded superhero genre with the up-and-coming western and horror genres.

It was not until 1956 that a team working under DC editor Julius Schwartz came up with a workable formula: take a dormant mid-tier character from the Forties and completely rework the character (sometimes keeping only the name) with a science fiction origin, streamlined jumpsuit-inspired costumes and a heavy emphasis on space-age themes.

In rapid succession and generally with great success, Schwartz applied this rebooting approach to a number of properties, and soon other companies were trying their hand. As early as 1959, Archie Comics (which had been known for a relatively successful and very violent collection of superhero titles in the Forties) had hired Joe Simon and Jack Kirby to rework their character the Shield. As the Sixties got going, almost everyone was in on the act.

In 1961, Marvel Comics joined in. Marvel was a small part of Martin Goodman's middling publishing company, but it did have a couple of significant assets: a few well-remembered Golden Age characters (the Human Torch, Namor and Captain America) and comics auteur Jack Kirby. Given the market conditions of the time, Kirby's brand was extremely valuable. There was a tremendous demand for all things associated with the previous era of superheroes, and Kirby had been a major player with an exceptional level of prominence. In an era when most stories went unsigned, his name was a big enough selling point to be featured prominently on the covers.

Much myth has accumulated around the creation of the Fantastic Four, partially because of the impact the title would go on to have and partially because none of the people involved (Goodman, Lee, Kirby) can be considered reliable narrators. But if you simply look at the book itself and what had been going on in the industry at the time, FF #1 is about what you would expect, combining Schwartz's formula with the group dynamics of Kirby's team comics and elements of the monster comics Marvel had been producing. (The two most unusual aspects, the lack of costumes and secret identities, were completely dropped when Marvel introduced Spider-Man less than a year later.)

I don't mean any disrespect for Marvel here. This is usually how companies really work. You find an existing business model, modify it slightly, then use it to establish a revenue stream and a loyal customer base. That's what Lee did. That's what Sam Walton did with Ben Franklin stores. The big ideas and innovation tend to come only after you have a successful, stable operation (which was certainly the case with Marvel). That leads to a much bigger point.

We have seen over the past few years a tendency to grant intellectual property protection to ideas that would previously have been considered general parts of a business plan (for example, offering free wi-fi to customers). What if the ability to borrow and recombine elements of business plans in a kind of de facto genetic algorithm is an important part of a creative economy? What if being derivative is the first step toward coming up with something original?

There are also some interesting IP questions involving the creation of Spider-Man (Wikipedia has a good summary), but that's a discussion for another time. The part of the story that's most relevant comes a couple of years later.

As mentioned previously, from approximately 1956 to 1966, the big thing in comics was to modernize and reboot Golden Age characters. This left Marvel with a problem: with the exception of Captain America, the Human Torch and Namor, the company had a very thin bench. You very soon got down to really obscure characters. The whole purpose of the reboot model is to cash in on name recognition, so rebooting the virtually forgotten is of limited value. (You have to wonder how many readers in the Sixties had ever heard of the Thirties Tarzan knock-off Ka-Zar.)

Lee's solution was to launch characters using at least the names of three of the biggest sellers of the Golden Age: Daredevil, Ghost Rider and Captain Marvel, none of which actually belonged to Marvel but were instead arguably in the public domain. It was the third one that required considerable nerve.

Captain Marvel had been, by some standards, the most successful character to come out of the Golden Age, outselling even Superman (nuisance suits from DC were a big factor in the decision to eventually cancel the series in 1953). What's more, the publisher, Fawcett, was big and, though out of the comics business, still very active, publishing titles including Family Circle, Woman's Day, Mechanix Illustrated and Gold Medal paperbacks.

Lee was betting (correctly, as it turns out) that Fawcett either wouldn't notice or wouldn't bother to sue over an obvious copyright infringement. It was a bold but not a reckless move. Attitudes toward copyrights have changed greatly since then, and many of those changes involve the earlier emphasis on active properties and going concerns. Up until recently, the primary reason you acquired and held copyrights was because you wanted to do something with those properties. As a result, if someone went out of the comics business and no one had an immediate interest in their properties, the copyrights were often allowed to lapse or (in the case of Fawcett) go unenforced.

There's a lesson here about creative destruction. Companies, particularly those in the creation business, often start out by borrowing business plans and skirting copyright and patent laws. You can certainly argue that this lowers the value of the intellectual property they are making use of, but I think you can also argue, as or more persuasively, that the returns on tolerating this behavior from small, young companies far outweigh the cost.

For more on the IP beat, click here, here, and here.

Tuesday, July 16, 2013

Krugman is on a roll

Krugman begins to build the case against a direct application of the theories proposed in Ayn Rand's magnum opus Atlas Shrugged here:
Of course, that’s not how we do things. We may live in a market sea, but that sea is dotted with many islands that we call firms, some of them quite large, within which decisions are made not via markets but via hierarchy — even, you might say, via central planning. Clearly, there are some things you don’t want to leave up to the market — the market itself is telling us that, by creating those islands of planning and hierarchy.


and here:
The thing is, however, that for a free-market true believer the recognition that some things are best not left up to markets should be a disturbing notion. If the limitations of markets in providing certain kinds of shared services are important enough to justify the creation of command-and-control entities with hundreds of thousands or even millions of workers, might there not even be some goods and services (*cough* health care *cough*) best provided by non-market means even at the level of the economy as a whole?

In a lot of ways, the corporation is a huge challenge for theories of free markets. Corporations are large organizations that have a great deal of political power and are famous for being able to survive sub-optimal decisions (see Dilbert, especially the Wally character). This use of political power can result in firms being successful due to favorable regulation (consider agricultural subsidies). 

This is quite different from the nation of shopkeepers envisioned by Adam Smith. When business is not organized into small units, the net result is that new entrants to the market can be strangled by existing firms (if nothing else, it is expensive to fend off spurious lawsuits). The Sears experiment that Krugman references is another good piece of evidence that there can be returns to cooperation as well as competition. Heck, a student of history who has read Caesar's account of his war against the Gauls should strongly suspect that cooperation can be extremely important in the success or failure of groups.

Relative versus absolute changes

Paul Krugman:
One implication of all this is what Gauti Eggertsson and I (pdf) call the paradox of flexibility: making it easier for wages to fall, as Hazlitt demanded then and his modern acolytes demand now, doesn’t just redistribute income away from workers to the wealthy (funny how that happens); it actually worsens the economy as a whole.
I think this is a very good example of a case where relative advantage and absolute advantage are confused.  Any one firm can become more competitive by reducing costs.  But if everyone reduces costs, then there is no advantage.  Things only get interesting if railroad workers and teachers are out of alignment on wages and you can change the wages of one group relative to the other. 

But there isn't a clear reason that this adjustment has to come from dropping wages.  Differences in wage increases over time can address the misalignment without causing massive hardship in a world where obligations are denominated in nominal dollars. 
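A toy example of the distinction (every number here is invented purely for illustration): an across-the-board 10% cut leaves relative wages exactly where they were but makes every fixed nominal obligation heavier, while a few years of differential raises realigns the two groups without cutting anyone's nominal pay.

```python
# Toy illustration of relative vs. absolute wage changes.
# Every number here is invented for the example.

rail_wage, teacher_wage = 50_000, 40_000   # annual nominal wages
debt_payment = 12_000                      # fixed nominal obligation (e.g., a mortgage)

# Case 1: everyone cuts wages 10%. Relative position is unchanged,
# but the fixed debt now eats a larger share of income.
cut = 0.90
print("Uniform 10% cut:")
print("  wage ratio :", round(rail_wage * cut / (teacher_wage * cut), 2))
print("  debt burden:", round(debt_payment / (teacher_wage * cut), 2))

# Case 2: hold one group's nominal wage flat and raise the other 3% a year.
# Relative wages realign over time with no nominal cuts and no added debt burden.
w_rail, w_teacher = rail_wage, teacher_wage
for _ in range(5):
    w_teacher *= 1.03   # teachers get raises; railroad wages stay flat
print("Five years of differential raises:")
print("  wage ratio :", round(w_rail / w_teacher, 2))
print("  debt burden:", round(debt_payment / w_teacher, 2))
```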

So I think Paul Krugman is right to be skeptical about deflation as a solution.  


Monday, July 15, 2013

Believe it or not, I'm going to tie this in with the Fantastic Four





Joseph Stiglitz has a NYT opinion piece entitled "How Intellectual Property Reinforces Inequality" which lays out, in one of the most serious areas imaginable, just how costly out-of-control IP laws can be (a point we've been making in much more frivolous terms).
The drug industry, as always, claimed that without patent protection, there would be no incentives for research and all would suffer. I filed an amicus brief with the court, explaining why the industry’s arguments were wrong, and why this and similar patents actually impeded, rather than fostered, innovation. Through my participation in the case, I heard heart-rending stories of women who didn’t take the actions they should have, because they believed the false-negative results of Myriad’s inferior tests. The better tests would have told them they did in fact have a gene associated with cancer.

The good news coming from the Supreme Court was that in the United States, genes could not be patented. In a sense, the court gave back to women something they thought they already owned. This had two enormous practical implications: one is it meant that there could now be competition to develop better, more accurate, less expensive tests for the gene. We could once again have competitive markets driving innovation. And the second is that poor women would have a more equal chance to live — in this case, to conquer breast cancer.

But as important a victory as this is, it is ultimately only one corner of a global intellectual property landscape that is heavily shaped by corporate interests — usually American. And America has attempted to foist its intellectual property regime on others, through the World Trade Organization and bilateral and other multilateral trade regimes. It is doing so now in negotiations as part of the so-called trans-Pacific Partnership. Trade agreements are supposed to be an important instrument of diplomacy: closer trade integration brings closer ties in other dimensions. But attempts by the office of the United States Trade Representative to persuade others that, in effect, corporate profits are more important than human lives undermines America’s international standing: if anything, it reinforces the stereotype of the crass American.

Economic power often speaks louder, though, than moral values; and in the many instances in which American corporate interests prevail in intellectual property rights, our policies help increase inequality abroad. In most countries, it’s much the same as in the United States: the lives of the poor are sacrificed at the altar of corporate profits. But even in those where, say, the government would provide a test like Myriad’s at affordable prices for all, there is a cost: when a government pays monopoly prices for a medical test, it takes money away that could be spent for other lifesaving health expenditures.

Call me suspicious...

But I've gotten to the point where I look for signs of manipulation in all business news and brokers' recommendations. Case in point, Disney had a bad week recently. As you've probably heard, The Lone Ranger reboot is on track to lose a lot of money (the figure $100 million keeps being tossed around), but that doesn't cover the full drop in expected value. Disney was shooting for another Pirates franchise (complete with the same writers, director, producer and star). The first four installments of that series have done almost four billion dollars in box office, and the fifth and sixth chapters are in the works. And that box office total doesn't include toys and tee shirts and all of the other ways Disney could make money off something like this. Investors who had priced in the possibility of this being another Pirates will need to recalibrate.

Disney is a huge company, but even there the old saying applies -- "a billion here... a billion there... pretty soon you're talking about real money." Not enough to threaten the company but worth taking into account when thinking about stock price. Fortunately for Disney, this terrible news was balanced out by quite a bit of good (enough to bump the price up a bit). Credit Suisse analyst Michael Senno estimated a global take of $1.2 billion for Star Wars Episode VII and Motley Fool* ran a string of positive stories arguing that Disney was adding value to Marvel and that "Buena Vista Pictures is earning as much as ever." That second claim was supported with a year over year comparison:
Disney won't be as fortunate with The Lone Ranger, which is why so many are comparing this flop-in-the-making to John Carter, last year's $250 million box office bomb that effectively ended the Disney career of former studio chief Rich Ross. 
If only he knew then what we know now. John Carter, for as big a disaster as it was, did nothing to diminish the House of Mouse's theatrical prowess. Here's a closer look at the year-over-year numbers from Jan. 1 through June 30: 
Buena Vista Year-Over-Year Comparison
                         YTD 2013           YTD 2012           Change
Number of films          10                 12**               (2)
Total U.S. box office    $886.8 million     $949.8 million     (6.6%)
Per-film average         $88.7 million      $79.2 million      11.9%
Source: Box Office Mojo.
** Includes a 3D rerelease of The Lion King. 
After achieving $800 million in domestic box office receipts only once since 2000 (in 2010), Disney has done at least that in both 2012 and this year. Impressive may be too timid a word for how well Buena Vista is doing right now.
Notice anything missing? How about budgets, marketing costs, performance of comparable films from other studios? Keep in mind that in absolute numbers, John Carter did pretty well:
John Carter earned $73,078,100 in North America and $209,700,000 in other countries, for a worldwide total as of June 28, 2012 of $282,778,100.
In relative terms, not so much:
Paul Dergarabedian, president of Hollywood.com, noted, "John Carter’s bloated budget would have required it to generate worldwide ticket sales of more than $600 million to break even...a height reached by only 63 films in the history of moviemaking"
(According to the New York Times, the Lone Ranger would have to hit $800 million to break even.)
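To put some of that missing context in numbers, here's a quick sketch using only the figures quoted above: the per-film averages from the Motley Fool table, the Box Office Mojo grosses and the break-even estimate Dergarabedian cites.

```python
# Context the year-over-year comparison leaves out, using only the
# figures quoted above.

# Per-film averages from the Buena Vista table
for label, films, box_office in (("YTD 2013", 10, 886.8e6),
                                 ("YTD 2012", 12, 949.8e6)):
    print(f"{label}: ${box_office / films:,.0f} per film")

# John Carter: worldwide gross vs. the ~$600 million break-even estimate
john_carter_gross = 73_078_100 + 209_700_000      # domestic + international
shortfall = 600e6 - john_carter_gross
print(f"John Carter: ${john_carter_gross:,.0f} worldwide, "
      f"roughly ${shortfall / 1e6:.0f} million short of break-even")
```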

How about Disney adding value to Marvel? The only real example I saw in the article was the willingness to cough up extra money to hold on to talent like Whedon and Downey. Probably a good investment but old news and a case of maintaining, not adding, value.

And that incredible prediction for the Star Wars reboot? "Not credible" about covers it.

Perhaps I'm too cynical, but given the low quality but excellent timing of these analyses, I have to believe that some folks at Disney have really been working the phones.

* I'm going by the Motley Fool posts and not the videos that accompany them. If anyone out there wants to take one for the team and watch them, let me know if anything of value is said.

Thanks


Saturday, July 13, 2013

Weekend blogging -- Listen slowly

A music historian I know speaks eloquently about this very long piece (the first ten minutes of which can be heard here). He has listened to most if not all of the piece (I did mention it was long, right?) and he says the key is letting yourself give in to the pace of the music, though I suspect that his complete familiarity with the composition being adapted helps a bit.

If you're interested in classical or experimental music, take a listen at the link above then click here to see what you've just heard.

Friday, July 12, 2013

Apologist Watch

There is a danger in using small incidents to make big arguments. List enough anecdotes and you can often overwhelm by sheer weight even though, with time, someone could probably amass a comparable pile on the other side of the debate. But that doesn't mean we should ignore all information that doesn't come in big, clean, conclusive data sets. Even an old-school frequentist will acknowledge the value of a reality check, seeing how well and how often the world matches your theories.

Take, for instance, the argument that the quality of today's journalism has been damaged by the group dynamics of the profession and particularly by the tendency of media critics to adopt the role of group apologists. With that hypothesis in mind, look at these two paragraphs from Politico's Dylan Byers (expertly satirized by Esquire's Charles Pierce).
This is not the first time folks have been fooled by The Daily Currant. In February, The Washington Post picked up one of the site's stories and erroneously reported that Sarah Palin was joining Al Jazeera. In March, both the Boston Globe and Breitbart.com ran reports that Paul Krugman had filed for personal bankruptcy, again based on a Daily Currant report. Needless to say, the real journalists bear the responsibility here: They should verify the news they report or, at the very least, know the sources from which they aggregate.
As Pierce notes, Byers would be fine if he stopped here. Unfortunately he follows with this:
But these mistakes don't reflect well on The Daily Currant, either. If their stories are plausible, it's because they aren't funny enough. No one -- almost no one -- mistakes The Onion for a real news organization. That's not just because it has greater brand recognition. It's because their stories make readers laugh. Here are some recent headlines from The Onion: "Dick Cheney Vice Presidential Library Opens In Pitch-Dark, Sulfurous Underground Cave" ... "John Kerry Lost Somewhere In Gobi Desert" ... "'F--- You,' Obama Says In Hilarious Correspondents' Dinner Speech." Here are some recent headlines from The Daily Currant: "Obama Instantly Backs Off Plan to Close Guantanamo" ... "CNN Head Suggests Covering More News for Ratings" ... "Santorum Pulls Son Out of Basketball League."
There are at least a couple of critical fallacies here. The first is, as Pierce observes, confusing title and story. The other is the assumption that the humor and effectiveness of satire depend on its obviousness (has anyone at Politico read "A Modest Proposal"?).

Those would be interesting topics for another discussion, but what's relevant here is the social dynamic. Outsiders play a joke on some insiders and trick them into doing something stupid. Byers concedes that the insiders shouldn't have been stupid but complains that the joke wasn't funny. That's pretty much the expected response of a group apologist under the circumstances, made doubly problematic by the fact that a lot of people are laughing.

p.s. Sorry about being so late to get this one out.

Thursday, July 11, 2013

Data points

You could use this anecdote as a launching point for any number of interesting discussions, but I think it might be more valuable if it simply stands on its own as an example of how school policies are made and implemented.



16-year-old Kiera Wilmot is accused of mixing household chemicals in a small water bottle at Bartow High School, causing the cap to fly off and produce a bit of smoke. The experiment was conducted outdoors, no property was damaged, and no one was injured.

Not long after Wilmot’s experiment, authorities arrested her and charged her with “possession/discharge of a weapon on school property and discharging a destructive device,” according to WTSP-TV. The school district proceeded to expel Wilmot for handling the “dangerous weapon,” also known as a water bottle. She will have to complete her high school education through an expulsion program.

Friends and staffers, including the school principal, came to Wilmot’s defense, telling media that authorities arrested an upstanding student who meant no harm.

"She is a good kid," principal Ron Richard told WTSP-TV. "She has never been in trouble before. Ever."

"She just wanted to see what happened to those chemicals in the bottle," a classmate added. "Now, look what happened."

Wednesday, July 10, 2013

I have no doubt these numbers were pulled from someplace other than "thin air"

I have no opinion on the price of a share of Disney, but as of now I'm a bit more inclined to short Credit Suisse:
Here's something you don't see very often. The Walt Disney Company suffered a major bomb over the weekend, opening The Lone Ranger to a pitiful $48 million 5-day box office, forcing the studio to lose an anticipated $150 million on the big-budget Western. But if you're a Disney stockholder, things are still looking up. According to Variety the studio's stock is up 1.3%. Why? You probably already know the answer: Star Wars. 
OK, partial credit goes to Monsters University and Iron Man 3, two huge hits that help soften the blow of The Lone Ranger's failure. But Credit Suisse analyst Michael Senno estimates that Disney will make $733 million in profit-- that's $1.2 billion in global ticket sales-- from Star Wars Episode VII, which means that the company still ought to remain a solid investment. He didn't pull that number out from thin air, of course-- the final Star Wars prequel, Revenge of the Sith, made $850 million worldwide, and of course Disney's last giant hit The Avengers was a huge global success, making $1.5 billion. If anything, $1.2 billion for Star Wars Episode VII might be lowballing it. 
First, a quick belaboring of the obvious...

1. Movies are an unpredictable business, particularly movies scheduled for release two years from now.

2. Senno is predicting an extraordinarily rare event. According to Wikipedia, only five films have broken $1.2 billion in global ticket sales:

1  Avatar                                            $2,782,275,172   2009
2  Titanic                                           $2,185,372,302   1997
3  The Avengers                                      $1,511,757,910   2012
4  Harry Potter and the Deathly Hallows – Part 2     $1,341,511,219   2011
5  Iron Man 3 (currently playing)                    $1,211,010,987   2013

3. Though the data jump around quite a bit, when we correct for inflation, there's reason to believe that the box office of the Star Wars franchise is trending down. With inflation, the first film made far more than The Phantom Menace (check my numbers, but I believe almost twice as much), which in turn made considerably more than the latest installment.
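If you want to check those numbers yourself, the adjustment is simple: scale each nominal gross by the ratio of current CPI to the CPI of the release year. The grosses below are rough placeholder figures from memory (and the 1977 total is muddied by rereleases at later ticket prices), so swap in Box Office Mojo's actual numbers before leaning on the results.

```python
# Sketch of the inflation adjustment. CPI-U annual averages are
# approximate, and the grosses are rough placeholders to be replaced
# with actual totals (e.g., from Box Office Mojo).

CPI = {1977: 60.6, 1999: 166.6, 2005: 195.3, 2013: 233.0}

films = [
    ("Star Wars (1977)",           1977, 775e6),   # lifetime, incl. rereleases
    ("The Phantom Menace (1999)",  1999, 924e6),
    ("Revenge of the Sith (2005)", 2005, 849e6),
]

def to_2013_dollars(gross, year):
    """Scale a nominal worldwide gross to 2013 dollars using the CPI ratio."""
    return gross * CPI[2013] / CPI[year]

for title, year, gross in films:
    adj = to_2013_dollars(gross, year)
    print(f"{title}: ~${adj / 1e9:.1f} billion in 2013 dollars")
```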

Of course, it's entirely possible that this reboot will hit Avengers territory -- all of the films in the franchise have been enormously profitable -- but the idea that someone can forecast this with any level of confidence is simply silly.

None of this is meant to be taken as a comment on the value of Disney stock, which could be badly undervalued for all I know. It is, rather, a reminder of how often we are fed unbelievable numbers and, more troublingly, how often established institutions are willing to put their reputations behind them.

Tuesday, July 9, 2013

[Insert joke headline here]

Just when I was starting to feel bad about that Marx Brothers crack, this comes along (from Yahoo):
In an attempt to ban Internet cafes in the Sunshine State, legislators may have been a bit too broad in their language, resulting in what some legal experts say is a law that inadvertently makes it illegal to own a computer or a smartphone. 
The bill, which was signed into law by Florida governor Rick Scott in April, was meant to shut down cafes, some of which have been tied to illegal gambling in the state. And shut them down it did: over 1,000 cafes were immediately closed. 
But it's the wording that's problematic, as it defines a slot machine as "any machine or device or system or network of devices" that can be used in games of chance. Turns out the Internet is full of gambling sites, which is where the definition runs into some problems.

The show business equivalent of a patent troll -- more from the post pretense stage of intellectual property

Back on the IP beat, take a look at this item from Entertainment Weekly:
The Weinstein Company’s upcoming movie The Butler, which stars Forest Whitaker as the African-American servant who worked in the White House for more than 40 years, has tripped over an industry obstacle on the way to its Aug. 16 release in theaters. As Deadline reported Monday, Warner Bros. is claiming protective rights to the film’s title due to a 1916 silent short film with the same name that resides in its archives, and both sides are heading to arbitration to reach a resolution. 
Technically, this isn’t a legal issue, since you can’t typically copyright or trademark a movie title. But the MPAA has a voluntary Title Registration Bureau that the industry uses to self-regulate and avoid title conflicts that might confuse audiences. In this case, it’s unlikely that moviegoers are even aware of the 1916 silent film that Warner Bros. is citing, but TWC apparently never cleared the title.
Just so we're clear, my obscurity bar is alarmingly high, but The Butler (1916) clears it in street shoes. This short subject seems to have been forgotten almost upon its release. No one of any note was involved either in front of or behind the camera. It doesn't seem to have had any airings in any medium after the late Teens. There's no plot synopsis on IMDb or, as far as I can tell, anywhere else online, and though we can only guess at what the film is about, one thing we can say with near certainty is that it has nothing whatsoever to do with the Forest Whitaker movie.

None of the standard defenses of copyright apply here. This is a work completely forgotten by the general public, unseen for almost a century, created by people long dead, of no conventional value as a piece of intellectual property. There's no way Warner Bros. (which didn't even produce the 1916 film) can argue this is anything more than rent seeking and the harassment of smaller competitors.

That's what I meant when I called this the post-pretense stage of intellectual property rights. Up until a few decades ago, I believe we were in the sincere stage: patent and copyright laws were, for the most part, honest attempts at protecting creators' rights and encouraging the creation of socially and economically beneficial art and technology. Then came the pretense stage, where the arguments about protecting creators and encouraging invention were used as cover for powerful interests to hang on to valuable government-granted monopolies (or in this case industry-granted -- not that surprising since the same major players that dominate industry organizations are the ones that benefit most from and lobby hardest for these laws). Now we're entering the stage where even the pretense of fairness and social good can no longer be maintained with a straight face.

Monday, July 8, 2013

S&P defense

In a startling approach, S&P seems to be making the claim that its ratings are marketing activities and not credible estimates of creditworthiness.  I am going to outsource the obvious conclusion to Matthew Yglesias:

If talk of objectivity is just marketing hype, then the ratings are worthless. It's just a form of paid public relations for security issuers and nobody should take the ratings seriously for a minute.

In case you have trouble imagining this defense, Kevin Drum posted a picture showing the actual text.  What strains the imagination is that the US government has written these ratings into law in terms of what pension funds can invest in.  If this is just paid marketing, should we not repeal these laws?  And would that not remove what little is left of the business model?

What am I missing here?

"Don't you cry for me..."

In the recent IP thread (see here, here, here and here to get caught up), Joseph and I have generally been bemoaning regulatory capture and trying to make the case that ludicrously broad patents and endlessly extended copyrights are socially costly. We were not, however, arguing for the other extreme, a world of no intellectual property protection where "information wants to be free." We've seen how things work in a world where copyrights go unenforced:
Stephen Collins Foster (July 4, 1826 – January 13, 1864), known as the "father of American music", was an American songwriter primarily known for his parlour and minstrel music. Foster wrote over 200 songs; among his best known are "Oh! Susanna", "Camptown Races", "Old Folks at Home", "My Old Kentucky Home", "Jeanie with the Light Brown Hair", "Old Black Joe", and "Beautiful Dreamer". Many of his compositions remain popular more than 150 years after he wrote them.
...
Stephen Foster had become impoverished while living at the North American Hotel at 30 Bowery on the Lower East Side of Manhattan, New York. He was reportedly confined to his bed for days by a persistent fever; Foster tried to call a chambermaid, but collapsed, falling against the washbasin next to his bed and shattering it, which gouged his head. It took three hours to get him to Bellevue Hospital. In an era before transfusions and antibiotics, he succumbed three days after his admittance, aged 37. 
His worn leather wallet contained a scrap of paper that simply said, "Dear friends and gentle hearts," along with 38 cents in Civil War scrip and three pennies.
(According to a music historian I know, the example of Foster was very much on the minds of the people who created ASCAP fifty years later.)

Just so there's no room for confusion, Joseph and I are arguing for a middle ground on IP protection, such as what we saw with American copyrights from 1909 to 1976 (a period that represented a pretty good run).