Monday, July 16, 2018

What next for health care costs?

This is Joseph

I read with great interest this article by Edward Murphy on US health care costs.  The bottom line was quite interesting:
In March, three researchers from the Harvard T. H. Chan School of Public Health published a study in JAMA analyzing the well-known reality that the United States spends dramatically more on health care than other wealthy countries. They compared the US, where health care consumes 17.8 per cent of gross domestic product, to 10 comparable nations where the mean expenditure is 11.5 percent. Despite spending much less, the other countries provide health insurance to their entire populations and have outcomes equal to or better than ours. The researchers found that this inefficiency gap is primarily driven by two characteristics of the US system: the high cost of pharmaceuticals and inordinate administrative expenses.
and
It is not in the interest of huge profit-making corporations to restrain the overall cost of the US health care system. In fact, their interest is served by driving health care expenditures higher. When combined with the spending analysis provided by researchers, the financial data disclosed by public corporations point to a path that the country must follow to make our system more coherent and less costly. Any progress will require driving down pharmaceutical pricing and reducing administrative costs imposed by middlemen. We are not doing that yet but, ultimately, we must. 
This is an evergreen topic given that health care is expensive and there is a sense that it doesn't deliver quite the overall outcomes that you'd expect.  If it were just that a wealthier country spends more on health care to get better outcomes, then I would say that the market was functioning effectively.  It's unclear to me that being wealthy should mean an interest in higher administrative costs, although I guess hedge funds might be an example of this phenomenon.

This is relevant because there is a real issue with the laws that regulate health care, both in terms of payments and the structure of the laws themselves.  I see this as the end of the idea of a regulatory framework for the private market (the Affordable Care Act) and a prelude to a pivot to some other approach. I am waiting to see who might provide leadership on this front.  What is the vision of the Trump administration on health care policy and cost containment?  How are they going to tackle this difficult and nearly intractable issue?

Curious minds are waiting for the other shoe to drop and to see what the new plan is.

Postscript:  And in case you think these high profits are needed for innovation, Noah Smith has a great tweet here


Friday, July 13, 2018

A follow-up to yesterday's post

This is Joseph

In yesterday's post we talked about employer monopsony.  Relevant to this post is today's news about Washington state successfully suing to get fast food franchises to no longer enforce "no poaching" agreements.  The gist is:
The provisions prohibit workers at, for example, one Carl’s Jr. franchise from going to another Carl’s Jr. They do not stop those workers from taking jobs at restaurants run by a different chain.
While this does not lock workers into a single chain, it does make it harder for them to move around if one particular employment situation goes bad.  Any barriers to mobility tend to depress wages, which raises the question of the minimum wage as a policy response.

It's true that tackling the bad policy directly might be a better overall decision.  But the minimum wage is an easy-to-enforce policy that is hard to weasel around.  Once again, it is a policy that bears a closer look.

Thursday, July 12, 2018

Jodi Beggs on employer monopsony power and minimum wage

This is Joseph

Jodi Beggs has a great post on the minimum wage in the presence of employer monopsonies.  If there is only one buyer of labor in a market, that purchaser has a decided advantage.  Her main reason for doing this analysis was to make the point that:

If a labor market is a monopsony (or if employers have monopsony power), minimum wages could actually increase rather than decrease employment.
She provides a number of worked examples, which are fun to look over.
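The logic of those worked examples can be sketched numerically. Below is a minimal, hypothetical linear model (all parameter values are illustrative, chosen by me, not taken from her post): a monopsonist hires where its marginal cost of labor equals the marginal revenue product, which holds both employment and wages below the competitive level, and a minimum wage set between the monopsony wage and the competitive wage can actually raise employment.

```python
# A hedged sketch of the standard monopsony result (illustrative parameters).
# Inverse labor supply: w(L) = a + b*L, so the monopsonist's marginal cost
# of labor is a + 2*b*L (hiring one more worker raises the wage for all).
# Marginal revenue product of labor: MRP(L) = c - d*L.

a, b, c, d = 5.0, 0.1, 25.0, 0.1

# Monopsony outcome: hire until marginal labor cost equals MRP.
L_mono = (c - a) / (2 * b + d)   # about 66.7 workers
w_mono = a + b * L_mono          # wage of roughly 11.67

# Competitive benchmark: wage equals MRP along the supply curve.
L_comp = (c - a) / (b + d)       # 100 workers at a wage of 15

# Binding minimum wage between w_mono and the competitive wage:
# below that wage the firm no longer bids wages up by hiring more,
# so it hires min(workers willing to work, workers worth hiring).
w_min = 13.0
supply = (w_min - a) / b         # 80 workers willing to work at 13
demand = (c - w_min) / d         # 120 workers worth hiring at 13
L_minw = min(supply, demand)     # 80 workers: employment rises from ~66.7

print(L_mono, w_mono, L_comp, L_minw)
```

The intuition the sketch captures is that under monopsony, hiring an extra worker forces a raise for everyone already employed; a minimum wage in the right range removes that disincentive, so the firm hires more, not fewer, workers.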

What this made me wonder about is the role of non-compete agreements in labor markets.  For example, there is a sandwich company that used to enforce non-compete agreements. Insofar as actors include these agreements in employment contracts, we might be closer to the point where minimum wages have a reduced cost (or even a benefit) to employment levels.  This is also likely to occur in high skill areas where the number of employers is small.  If you are a skilled airplane builder and live in Montreal, you will find limited employer options that are not Bombardier.  Moving has costs and, in the modern world, it isn't necessarily trivial to emigrate.

These effects may also occur in small markets, where the number of competing employers is limited, which is more of what Jodi Beggs was worrying about in her post.  In order for market forces to work properly, a number of key assumptions are required.  There being multiple buyers of labor is one of these, and we shouldn't be surprised if our intuitions are reversed in some markets.

Now how that works for things like the Seattle minimum wage is more complicated.  But this could be an important layer in understanding the evidence, as different studies covered slightly different industries and it may be that employer diversity is confounding some of these associations.

But it is worth reflecting on, as it makes the arguments against the minimum wage a lot weaker if market monopsonies are present.  If so, then the benefits of the policy are actually pretty great.

(Jodi Beggs points to an article on the topic, and the evidence seems mixed, but not ignorable).






Wednesday, July 11, 2018

Another in our series of revolutionary postwar innovations -- let's go to the tape

With the advent of magnetic tape, audio recording changed more in a few years than it had in the previous few decades. For record producers and sound engineers, performances went from being something holistic that could only be presented as captured to being a collection of elements, each of which could be edited, tweaked, and recombined in infinite ways. Though digital technology has greatly increased the power and convenience of the sound engineer's toolbox, the basic paradigm is about 70 years old.

Though the effects of videotape took a bit longer to arrive, the impact was arguably even greater. Though film remained the standard for situations where high picture quality had to be maintained, the ability to play back a recording immediately ("and now let's go to the instant replay") and the ability to reuse tape radically altered the business of video production and television news.

Here's a notable example of the new audio paradigm from CBS Radio Workshop.
The premiere broadcast was a two-part adaptation of Aldous Huxley's Brave New World, introduced and narrated by Huxley. It took a unique approach to sound effects, as described in a Time (February 6, 1956) review that week:

It took three radio sound men, a control-room engineer and five hours of hard work to create the sound that was heard for less than 30 seconds on the air. The sound consisted of a ticking metronome, tom-tom beats, bubbling water, air hose, cow moo, boing! (two types), oscillator, dripping water (two types) and three kinds of wine glasses clicking against each other. Judiciously blended and recorded on tape, the effect was still not quite right. Then the tape was played backward with a little echo added. That did it. The sound depicted the manufacturing of babies in the radio version of Aldous Huxley's Brave New World.




A few years earlier, Gunsmoke (a serious contender for best show ever in two different media) was noted for its detailed sound montages, including different effects for the pouring of beer and whiskey.

Interesting bit of trivia about the show. CBS chose to run the show unsponsored for the first few seasons. They wanted a genuinely adult Western and the producers were afraid advertisers would pressure them to lighten the program. If you listen to this episode, you'll understand why.






Tuesday, July 10, 2018

Back on the urbanist beat

The following video hits a couple of points we've been making over the years about urbanization and the creative class. The first is that utopian urbanists, while not without good ideas and an abundance of fine intentions, tend to operate under a hopelessly romanticized vision of urban living and that they often embrace policies that push reality even further from their own ideal.

If you're looking for a perfect Richard Florida style neighborhood, a place where a film director, a JPL rocket scientist, and a coder might run into each other and decide to have a coffee at a sidewalk café, you'd be hard pressed to find a better spot than this section of Burbank.

You can eat on the cheap (a small coffee and a big pastry at Portos can come in under three dollars). You can watch professional makeup artists comparing prosthetics for an indie horror film and costume designers sifting through the racks of a thrift store. You can see a first-rate play or improv comedy show, go to a book signing, or pick up some groceries in one of the delis.

This is very much in line with the lifestyle many utopian urbanists fantasize about, but unfortunately it can only exist in a fragile Goldilocks zone of just the right density and property values, and ironically, the more appealing a neighborhood like this is, the more factors that make it appealing are threatened.

High density, high traffic areas are really only suitable for two types of retail businesses: the expensive and the fast-turnover. Businesses where customers take their time and don't spend a lot of money cannot survive under these conditions.

Just to be clear, I'm not taking sides or advocating certain policies here. If this stretch of Magnolia is all chain stores in a couple of years, I probably won't hang out there but I won't begrudge the property owners for seeking optimal returns either. I simply want to point out that urbanists and the YIMBY movement may not have thought through the implications of some of their proposals and rhetoric.


Via Mark Evanier.






Monday, July 9, 2018

Media Post: The Last Jedi

This is Joseph.

Since the Last Jedi came out there have been a number of critiques of it; some better than others.  It is quite clear that the attempt to deconstruct the franchise, while artistically interesting, had some issues.  In particular, I would summarize my thoughts as:

Visuals: Stunning.  Simply stunning.  What a beautiful movie
Creativity: I may not love Canto Bight, but boy was it creative
Characters: Ok.  There is a break in continuity of character between movies, but it isn't fatal
Plot: Weak, very weak.

Probably my favorite review of it is this one.  The author lists eight critiques, of which I see #3 as being the most serious:
The primary plot (of the cruiser chase) is riddled with plot holes and doesn’t make any sense. The film suffers because its backbone is broken.
Now, it isn't the case that previous movies completely avoided these issues.  Compare with the Empire Strikes Back -- the asteroid chase has similar issues and while they are more clever about dialogue, the timeline is a mess.  And some characters had big changes -- Darth Vader seems much more important than he used to be, for example, and much less deferential to senior military commanders.

But the issue was the deconstruction, I think.  The writer-director played against type, which the Empire Strikes Back did not do (at least not so completely).  Now you can shake up a franchise, but this is a delicate matter that requires great skill.  Mark Evanier tells an anecdote about a junior writer asking Ray Bradbury if it was ok to be as high maintenance as Harlan Ellison.  The rejoinder was priceless:

"I don't know if that's okay but if you try it, check first and make sure you have the talents of a Harlan Ellison."
I think that this is exactly right.  If you want to subvert expectations then it should be in the service of creating a better or more thought-provoking story.  This story was very well shot but needed a serious rewrite of the script and a goal as to how the subversion would make the story sing.

It was also early in the franchise for this kind of move.  Thor Skywalker points out that the Marvel Cinematic Universe is twenty movies in, with a fairly limited range of characters and plots, and hasn't felt the need to subvert expectations yet:


In fact his comparison of Thor Ragnarok, which is not a perfect movie by a long stretch, to the Last Jedi is quite clever.  Both movies are ultimately about failure.  They even both have the breaking of the protagonist's weapon, the loss of home, and the death of most of the protagonist's organization.  But one has a tighter script and more narrative payoffs.  The other has better visuals and more creativity.  Guess which one grossed higher.

Now it is true, I was cheering for Rey to date Finn and not a serial killer.  But I don't mind being wrong -- that is part of the fun.  Who would have guessed it would be Han and Leia?  Or that Darth Vader would be the one to defeat the emperor?  Or that Yoda actually fought Darth Sidious?  I like being surprised, so long as the surprise keeps the story fresh and interesting.


So my view: good movie, fabulous eye candy, great for kids, but one that really needed a script doctor for the fans in the audience.









Friday, July 6, 2018

"Back when fifty years was a long time ago" -- updated with new video links


I realized that the Willoughby episode fits into the late 19th century time travel genre (as well as being a good example of how people in the early sixties felt about the pace of their own lives).



And the link on this one had gone dead.



Sunday, December 30, 2012

Back when fifty years was a long time ago




I've noticed over the past year or two that people ranging from Neal Stephenson to Paul Krugman have been increasingly open about the possibility that technological progress has been under-performing lately (Tyler Cowen has also been making similar points for a while). David Graeber does perhaps the best job summing up the position (though I could do without the title).

The case that recent progress has been anemic is often backed with comparisons to the advances of the late Nineteenth and early Twentieth Centuries (for example).  There are all sorts of technological and economic metrics that show the extent of these advances but you can also get some interesting insights looking at the way pop culture portrayed these changes.

Though much has been written about pop culture attitudes toward technological change, almost all of it focuses on forward-looking attitudes (what people thought the future would be like). This is problematic since science fiction authors routinely mix the serious with the fanciful, the satirical, and even the deliberately absurd. You may well get a better read by looking at how people in the middle of the Twentieth Century looked at their own recent progress.



In the middle of the century, particularly in the Forties, there was a great fascination with the Gay Nineties. It was a period in living memory and yet in many ways it seemed incredibly distant, socially, politically, economically, artistically and most of all, technologically. In 1945, much, if not most day-to-day life depended on devices and media that were either relatively new in 1890 or were yet to be invented.  Even relatively old tech like newspapers were radically different, employing advances in printing and photography and filled with Twentieth Century innovations like comic strips.

The Nineties genre was built around the audiences' self-awareness of how rapidly their world had changed and was changing. The world of these films was pleasantly alien, separated from the viewers by cataclysmic changes.

The comparison to Mad Men is useful. We have seen an uptick in interest in the world of fifty years ago but it's much smaller than the mid-Twentieth Century fascination with the Nineties and, more importantly, shows like Mad Men, Pan Am and the Playboy Club focused almost entirely on social mores. None of them had the sense of traveling to an alien place that you often get from Gay Nineties stories.

There was even a subgenre built around that idea, travelling literally or figuratively to the world of the Nineties. Literal travel could be via magic or even less convincing devices.
 
Figurative travel involved going to towns that had for some reason abandoned the Twentieth Century. Here's a representative 1946 example from Golden Age artist Klaus Nordling:









There are numerous other comics examples from the Forties, including this from one of the true geniuses of the medium, Jack Cole.











Thursday, July 5, 2018

Images from 1896


From Scientific American 1896-11-14

This is important because it shows how people were thinking about phonographs and recorded media in general.



This is important because it illustrates the age's fascination with the massive.





This isn't important at all.


Wednesday, July 4, 2018

Fourth of July Americana from the great Jerry Goldsmith.


First, two great scores from two perfect little movies.









Next, two great scores from movies you could probably skip. (Howard Hawks should have stopped with El Dorado. He also should've hired Goldsmith for that one as well. With all due respect to Nelson Riddle, the score they ended up with pretty much sucks.)











And finally, a highly appropriate original composition.



Tuesday, July 3, 2018

Come to think of it, I'll bet we can find a venture capitalist who'll go for the discounted twenties plan


As bubbles go, the glut of video content is in many ways fairly benign. It means lots of work for people in the industry and a ton of often very good shows for viewers (so many that the main complaint is the inability to keep up). Furthermore, with the exception of Netflix (which may have some very unhappy investors at some point in the future), most of the money seems to be coming from companies with such deep pockets that a few billion here or there can go relatively unnoticed. Apple, for instance, could set up a line of street corner kiosks that sell $20 bills at 50% off and it still wouldn't make a dent in the company's profits.

But though the content bubble is probably nothing to get upset about, it is still worthy of study as an example of the mentality. In particular, this story nicely illustrates how bubble mania and the fear of missing out on the next big thing create their own kind of dream logic that overrides normal rational concerns.

As previously mentioned, even before the bubble the amount of content available was probably growing faster than the market, especially when we focus on the United States (more on that later). The introduction of smart phones and, to a lesser degree, tablets did significantly increase the potential hours of viewing or consumption, but that was a one-time bump and we are probably seeing it leveling off by now.

Content accumulates. It is true that programs tend to lose audience appeal with time, but the drop off is often quite slow. Numerous movies and TV shows maintain a shockingly stable viewership over the years. I Love Lucy and Perry Mason have maintained a following for over 60 years now. Blocks of MASH run on at least a half-dozen basic cable channels and show up in local syndication in pretty much every market. Some shows even have a resurgence. Golden Girls (no, I can't explain it either) has gone from obscure, to retro, to ironic, to genuinely cool, a status it didn't even achieve in its original run.

Even if the production of new programming had remained steady, there would be real concerns about oversupply, but instead it has exploded with more and more programs featuring bigger and bigger budgets. With each year, the unsustainability of the model has become increasingly obvious and yet the idea that you have to be producing your own content to survive has become an ever more closely held item of faith.

A couple of interesting assumptions get baked into these discussions. One is that you have to have your own content. The other is that the future of online video is the subscription all-you-can-eat model. Both are highly suspect. As for the first, there is reason to believe that, while almost all of the hype and marketing center on the Netflix originals, much if not most of their viewership comes from licensing existing programs like blockbuster movies or old shows like the aforementioned Golden Girls.

As for the second, this is basically just the old cable TV bundling strategy (which everyone hates) dressed in 21st century drag. With both models, you are paying for programming you have no interest in ever watching. With streaming services, this has been momentarily obscured by a willingness to just break even or even lose money, but that is not likely to go on indefinitely.

The Apple programming slate suggests something close to peak bubble mentality. They are spending obscene amounts of money on big-name talent, most of which has extremely limited international appeal. They have no idea how these shows fit into their existing business plan. They have no need to be producing original content at all. They are doing this strictly because of the extraordinary popular delusion that not being part of this bubble will mean being left behind in the new economy.

From the Verge:

That’s an impressive roster, but it’s important to remember that Apple is building its catalog from the ground up. It has to catch up to streaming companies that have had years-long head starts and are currently producing hundreds of titles. And while we have dedicated platforms from heavy hitters like Amazon Prime, Hulu, and Netflix, as well as exclusive systems such as Stargate Command and CBS All Access, Apple has yet to announce exactly where any of these announced shows will debut.

Monday, July 2, 2018

The world's first photo booth

Another example for the point we've been making about how the period centering around the 1890s was marked not just by amazing technological advances that made headlines but also by smaller, more personal inventions and discoveries that permeated every aspect of people's lives.

For instance, getting a photograph taken was still an elaborate and expensive process in the early 1870s. 20 years later, it was something you paid a nickel for at the local arcade.

From Scientific American 1893-03-18




AUTOMATIC PHOTOGRAPHY.
Of all the many uses to which the automatic selling machine has been put, that of taking photographs seems the most remarkable. And yet this is what is being done now in several public places in New York and Brooklyn by means of a nickel-in-the-slot photograph machine recently patented by Mr. Pierre V. W. Welsh, of New York City. The operation, so far as relates to the exposure, development and fixing of the picture, is entirely automatic, and the little picture which the machine throws out, after a momentary washing, appears to be a marked success over previous efforts in this direction, as judged by the excellence of the work and the rapidity with which it is effected.

Friday, June 29, 2018

"The monster on your TV set"

I wish I knew the exact date on this one. I assume it's some time in the early to mid-70s, late enough that cable TV and services like HBO were clearly on the horizon, but they still seemed like something that might be pushed back. I also suspect it was before or at best shortly after the multiplex model took hold.

Eventually, increasingly ginormous home video screens, mobile streaming, and other innovations will probably kill the movie theater as a mainstream venue, but I suspect the industry watchers of 50 or so years ago would have been surprised at how well the basic business model has held up.





Thursday, June 28, 2018

Thought for the day

From Paul Krugman [emphasis added]:
What the freshwater school did was to take the actual experience of business cycles and say, “We don’t see how to formalize this experience in terms of maximizing equilibrium models; therefore it doesn’t exist.” It only looks as if recessions result from inadequate demand and that monetary or fiscal expansion can create jobs; our models tell us that can’t happen, so it’s all an optical illusion.

...

Anyway, this isn’t about me (well, it sort of is, but never mind.) The important point shouldn’t be “don’t formalize”; it should be that formalism is there to open your mind, not close it, and if the real world seems to be telling you something inconsistent with your model, the problem lies in the model, not the world.

Wednesday, June 27, 2018

If you have Netflix's PR budget, your own journalistic genre is just one of the things you can buy.

The following showed up in the "Recommended by Pocket" links in my Firefox browser. To be perfectly honest, I only clicked on it because I was looking for a jumping off point from which to discuss the extraordinary PR efforts of Netflix. For that purpose, the article was even better than I had hoped.

First, a few points about what the film Evolution isn't.

It isn't new. It came out in 2015 and was widely and generally positively reviewed. If you're into this kind of art-house horror film, there's a good chance you've already seen it and a very good chance you've already heard about it.

It isn't a Netflix original.

It isn't even exclusive to Netflix. In addition to the DVD, you can buy it online for $3.99 from iTunes or Amazon or a number of other vendors.

Given all of this, why is the availability of this film on Netflix newsworthy? The short answer is that it's not. Basically it's an ad for Netflix disguised as a piece of news. Probably unpaid for and unintentional, but still an ad. What's more, it's actually part of a series of ads for Netflix running at GQ.

We have all gotten so accustomed to the what's-on-Netflix genre that the strangeness no longer registers. There is tons of great (and I mean that without hyperbole) content out there. There is no good journalistic reason why being on Netflix is any more newsworthy than being on CBS.com or Film Struck or PBS.org or the Internet Archive or even MeTV (I would actually make the case that Neil Simon's work under Nat Hiken on the Phil Silvers Show was more noteworthy than most of the Netflix films GQ chose to write about).

To be hammer-blunt, what's-on-Netflix is a genre because the company has spent hundreds of millions of dollars on PR hacks who have spent countless hours planting and pushing these stories. It was a tremendous amount of money, but it was well spent. Netflix is now worth more than Disney because it has been more successful than any of its competitors at generating hype. The editors at GQ deserve a small part of the credit for this and, should the company implode leaving a pile of badly burned investors, they also might deserve a comparable share of the blame.

Tuesday, June 26, 2018

I'm afraid even the Brothers Grimm would have found Bitcoin a little too fantastic

I'm edging closer to the notion that the tools which we would normally use to critique journalism are no longer up to the task of discussing the 21st century technology narrative. Instead, the appropriate methods are probably those of the folklorist. We are rapidly approaching the realm of the myth and the tall tale. Why not start thinking in those terms?

It is standard practice when discussing something like a Jack tale to list the Aarne–Thompson classification. For example, Jack and the Beanstalk falls under the classification AT 328 ("The Treasures of the Giant"). We could do something similar with the vast majority of tech reporting. Take Theranos. This and other accounts of college dropouts supposedly coming up with some amazing innovation can be classified under "wayward youth finds magic object."

I've been giving quite a bit of thought recently to how magical heuristics have come to dominate the conversation about technology and innovation, but the idea of actually treating the narrative as folklore didn't hit me until I read this:
The paperclip maximizer is a thought experiment described by Swedish philosopher Nick Bostrom in 2003. It illustrates the existential risk that an artificial general intelligence may pose to human beings when programmed to pursue even seemingly-harmless goals, and the necessity of incorporating machine ethics into artificial intelligence design. The scenario describes an advanced artificial intelligence tasked with manufacturing paperclips. If such a machine were not programmed to value human life, then given enough power its optimized goal would be to turn all matter in the universe, including human beings, into either paperclips or machines which manufacture paperclips.

    Suppose we have an AI whose only goal is to make as many paper clips as possible. The AI will realize quickly that it would be much better if there were no humans because humans might decide to switch it off. Because if humans do so, there would be fewer paper clips. Also, human bodies contain a lot of atoms that could be made into paper clips. The future that the AI would be trying to gear towards would be one in which there were a lot of paper clips but no humans.
    — Nick Bostrom, "Ethical Issues in Advanced Artificial Intelligence", 2003

Bostrom has emphasised that he does not believe the paperclip maximiser scenario per se will actually occur; rather, his intention is to illustrate the dangers of creating superintelligent machines without knowing how to safely program them to eliminate existential risk to human beings. The paperclip maximizer example illustrates the broad problem of managing powerful systems that lack human values.

Suddenly it struck me that this was just the magic salt mill ever so slightly veiled in cyber garb. In case you're not up on your folklore...

It is Aarne-Thompson type 565, the Magic Mill. Other tales of this type include The Water Mother and Sweet porridge.

Synopsis

A poor man begged from his brother on Christmas Eve. The brother promised him, depending on the variant, ham or bacon or a lamb if he would do something. The poor brother promised; the rich one handed over the food and told him to go to Hell (in Lang's version, the Dead Men's Hall; in the Greek, the Devil's dam). Since he promised, he set out. In the Norse variants, he meets an old man along the way. In some variants, the man begs from him, and he gives something; in all, the old man tells him that in Hell (or the hall), they will want to buy the food from him, but he must only sell it for the hand-mill behind the door, and come to him for directions to use it. It took a great deal of haggling, but the poor man succeeded, and the old man showed him how to use it. In the Greek, he merely brought the lamb and told the devils that he would take whatever they would give him, and they gave him the mill. He took it to his wife, and had it grind out everything they needed for Christmas, from lights to tablecloth to meat and ale. They ate well and on the third day, they had a great feast. His brother was astounded and when the poor man had drunk too much, or when the poor man's children innocently betrayed the secret, he showed his rich brother the hand-mill. His brother finally persuaded him to sell it. In the Norse version, the poor brother didn't teach him how to handle it. He set to grind out herrings and broth, but it soon flooded his house. His brother wouldn't take it back until he paid him as much as he paid to have it. In the Greek, the brother set out to Constantinople by ship. In the Norse, one day a skipper wanted to buy the hand-mill from him, and eventually persuaded him. In all versions, the new owner took it to sea and set it to grind out salt. It ground out salt until it sank the boat, and then went on grinding in the sea, turning the sea salty.


I realize Bostrom isn't proposing this as a likely scenario. That's not the point. What matters here is that he and other researchers and commentators tend to think about technology using the specific heuristics and motifs people have always used for thinking about magic, and it worries me when I start recognizing the Aarne–Thompson classifications for stories in the science section.