Thursday, February 12, 2015

Netflix PR

[I'm coming off of a challenging January and I'm playing catch-up. The following should have run about a month ago, but I think the points are still fairly relevant.]

This is yet another one of those posts where I may seem to be clubbing Netflix, when I'm actually using Netflix to club someone else. In this case, Netflix is doing exactly what the company is supposed to do, using PR to promote its products and brand. My issue is entirely with the people on the other side of the process.

If you have been paying attention, you may have noticed a new subgenre of entertainment journalism: the inexplicably exciting old show (that just happens to be on Netflix). On one level, this is nothing new. PR departments have been ghost-writing stories for reporters since at least the Twenties, but in recent years the flack-to-hack ratio has ticked up noticeably, and few companies make better or more aggressive use of the planted news story than Netflix.

It started getting really blatant for me around the time the company began pushing Young Indiana Jones. For the uninitiated, YIJ was one of the great crash-and-burns of network television history. Despite big budgets and high expectations (coming just three years after Last Crusade), the show flopped so decisively that ABC never even aired the last four episodes. DVDs were released in 2007/2008 to cash in on the buzz around Indiana Jones and the Kingdom of the Crystal Skull, but other than that the show was largely forgotten.

Then Netflix picked it up and we started seeing stories like these:

5 Reasons Why Young Indiana Jones Is Actually Not As Bad As You Think

What to Binge This Weekend: 'The Young Indiana Jones Chronicles'



Netflix has always been very aggressive about drumming up press coverage, particularly involving originals and new releases. Whenever you see a blogger recommending some show you can stream on the service, the chances are very good that the idea for the post originated with some PR firm. As mentioned before, this is nothing new. [I discussed this post with a friend who used to work in the publicity department of one of the major studios (welcome to LA). He pointed out that he frequently wrote press releases that appeared verbatim under reporters' bylines in Variety and the Hollywood Reporter in the Nineties.] You can also argue that the journalistic expectations for the sites mentioned above have never been that high.

However, the lines continue to grow blurrier, both between news story and press release and between puff piece and journalism. Which brings us to Esquire.

Friends is, in a sense, the opposite of Young Indiana Jones. The latter is a show that almost no one has heard of; the former is a show that everyone has seen. It is arguably the seminal romantic comedy sitcom. It was massively popular in its day, continues to sell a ton of DVDs, and has been syndicated to the point of full immersion. It will, no doubt, be a popular feature for Netflix, but it is hard to see how adding one more venue qualifies as news.

But Esquire thinks differently:




For those not familiar with the online edition of the magazine, the right side of the page is reserved for recommendations of other articles on the site (and, at the bottom of the page, for sponsored recommendations from other sites, but the ethics of the sponsored link is a topic for another day). The emphasis is not on what you'd call hard news -- men's grooming and style feature prominently and the editors manage to work in lots of pictures of Penélope Cruz and Angelina Jolie -- but it's the same sort of fluff that has always filled out the magazine.

So when "Everything You Should Watch in January 2015" came in number six on WHAT TO READ NEXT, you can reasonably consider it a relatively well-promoted news story. You can also assume that the impetus if not the actual authorship came from someone working PR for Netflix.

The subtitle of the piece is "In theaters and on video and Netflix Streaming." Note the omission of the major competitors Hulu and Amazon, and of providers of similar content such as YouTube, CBS.com, PBS.com, and a slew of smaller players. Hulu's The Wrong Mans seems a particularly obvious choice given all of the attention going to James Corden taking over the Late Late Show from Craig Ferguson.

This type of PR push is normally coordinated with a substantial ad campaign. In this case, the unlimited streaming of Friends (not to be confused with per-episode streaming, which has been available for years) prompted a large ad buy which included, among other things, disinterring a few careers.





None of this is in any way meant as a criticism of Netflix. If anything, it is an example of a company being responsible and portraying its product in the best possible light. This does raise some questions about Esquire's editorial policies, but given that this is the kind of magazine that encourages you to buy four-hundred-dollar watchbands, I think the more alert readers already suspected that manufacturers may have been exerting some influence on the journalists covering them.

All that said, it is useful to remind ourselves from time to time that when news stories work out particularly well for some corporation, that favorable result probably isn't entirely coincidental.

Wednesday, February 11, 2015

Analog recording without analogs

And now for something completely different...

Two examples of artists who manually created their works on media normally used for analog recording.


Conlon Nancarrow


Nevertheless, it was in Mexico that Nancarrow did the work he is best known for today. He had already written some music in the United States, but the extreme technical demands his compositions required meant that satisfactory performances were very rare. That situation did not improve in Mexico's musical environment, also with few musicians available who could perform his works, so the need to find an alternative way of having his pieces performed became even more pressing. Taking a suggestion from Henry Cowell's book New Musical Resources, which he bought in New York in 1939, Nancarrow found the answer in the player piano, with its ability to produce extremely complex rhythmic patterns at a speed far beyond the abilities of humans.

Cowell had suggested that just as there is a scale of pitch frequencies, there might also be a scale of tempi. Nancarrow undertook to create music which would superimpose tempi in cogent pieces and, by his twenty-first composition for player piano, had begun "sliding" (increasing and decreasing) tempi within strata. (See William Duckworth, Talking Music.) Nancarrow later said he had been interested in exploring electronic resources but that the piano rolls ultimately gave him more temporal control over his music.[6]

Temporarily buoyed by an inheritance, Nancarrow traveled to New York City in 1947 and bought a custom-built manual punching machine to enable him to punch the piano rolls. The machine was an adaptation of one used in the commercial production of rolls, and using it was very hard work and very slow. He also adapted the player pianos, increasing their dynamic range by tinkering with their mechanism and covering the hammers with leather (in one player piano) and metal (in the other) so as to produce a more percussive sound. On this trip to New York, he met Cowell and heard a performance of John Cage's Sonatas and Interludes for prepared piano (also influenced by Cowell's aesthetics), which would later lead to Nancarrow modestly experimenting with prepared piano in his Study No. 30.

Nancarrow's first pieces combined the harmonic language and melodic motifs of early jazz pianists like Art Tatum with extraordinarily complicated metrical schemes. The first five rolls he made are called the Boogie-Woogie Suite (later assigned the name Study No. 3 a-e). His later works were abstract, with no obvious references to any music apart from his own.

Many of these later pieces (which he generally called studies) are canons in augmentation or diminution (i.e. prolation canons). While most canons using this device, such as those by Johann Sebastian Bach, have the tempos of the various parts in quite simple ratios, such as 2:1, Nancarrow's canons are in far more complicated ratios. The Study No. 40, for example, has its parts in the ratio e:pi, while the Study No. 37 has twelve individual melodic lines, each one moving at a different tempo.


Norman McLaren


McLaren was born in Stirling, Scotland and studied set design at the Glasgow School of Art.[1] His early experiments with film and animation included actually scratching and painting the film stock itself, as he did not have ready access to a camera. His earliest extant film, Seven Till Five (1933), a "day in the life of an art school" was influenced by Eisenstein and displays a strongly formalist attitude.

That included painting on the optical sound track.

In the 1950s, National Film Board of Canada animators Norman McLaren and Evelyn Lambart, and film composer Maurice Blackburn, began their own experiments with graphical sound, adapting the techniques of Pfenninger and Russian artist Nikolai Voinov.[2] McLaren created a short 1951 film Pen Point Percussion, demonstrating his work.[3] The next year, McLaren completed his most acclaimed work, his Academy Award-winning anti-war film Neighbours, which combined stop-motion pixilation with a graphical soundtrack. Blinkity Blank is a 1955 animated short film by Norman McLaren, engraved directly onto black film leader, combining improvisational jazz along with graphical sounds. In 1971, McLaren created his final graphical sound film Synchromy.[4]





Tuesday, February 10, 2015

Some points on Uber

This is Joseph

A very nice piece on Uber.  It hits both of the issues with Uber.  First, the shift to lower regulation:
But Uber has little incentive to build well-paying, stable opportunities with reasonable hours at salaries of $50,000 a year. Quite the opposite: by creating part-time jobs that are the equivalent of Walmart greeters on wheels, the company can keep wages low (benefits, of course, are out of the question). It’s little wonder why Uber fights regulations that would require it to insure its drivers’ vehicles, conduct background checks, pay fees or limit its workforce: without restraints on the number of passenger-serving cars, and with a very low barrier of entry to the profession, the number of drivers will continue to grow until the market hits a point of saturation, sending costs plummeting in the process. Because Uber has few operating expenses to speak of—the investment is all made up front in developing the app, after which maintenance is minimal—the company enjoys a substantial profit no matter how many drivers flood the market.
And, second, the unexpected consequences of these changes:
But Uber has no requirement to serve the public. Indeed, there is a strong race, class and age bias as to who can utilize the service. You have to own a smartphone, which has an average cost of more than $500. Uber requires customers to pay with a credit card, cutting off those with no or poor credit. Until recently, the company had no wheelchair-accessible vehicles in Virginia, and continues to lack adequate services for the disabled in many places. 
The real issue with this sort of disruption is that regulations may have a place in markets.  Now, it isn't the business of any single corporation to deal with the inefficiencies present in the American marketplace.  But it is a matter of public concern if basic services stop being offered to key segments of the market.  We ensure, for example, that everyone has postal service (perhaps with some limitations) because the ability to accept mail is fundamental to being a member of society.

Ironically, Uber is a case where liberals often cross the aisle.  Just as conservative libertarians often see subsidies for oil companies as a good thing, middle-class progressives focus on the (real and amazing) improvements in service that Uber provides to customers.  In a real sense, it is hard to argue that better service for customers and more flexibility for workers is a bad thing.  Because it isn't.

But it would be good if we could keep the pieces of the past regulatory regime that created social benefits while helping to reduce the ossification that the industry has suffered.  It's a tricky line to draw.

Monday, February 9, 2015

Why some policy changes should be slow and measured: a case study

This is Joseph

Gary Ray nails it:
The problem with minimum wage is when it rises too fast. When it rises above a reasonable level of annual profit of a small business, you've got trouble. Using Borderlands numbers, a 20% rise over 3 years would require a nearly 7% increase in sales. Most retail businesses are not growing at 7% a year. I don't need back of the napkin math to figure this out. The Department of Commerce numbers show average retail sales growth median to be 5.04%, with 2014 at 3.17%. So minimum wage yes, but not so quickly.
I think that fast changes increase the costs of policies by increasing the penalty from disruption.  I realize this is an unpopular idea in the "age of disruption" (see Uber), but slower phasing in of policy can often permit industries and people to adapt.  It's also notable that bad industry practice, like the MSRP, was involved in this outcome by making bookstores less flexible in the face of changing costs.
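To make the quoted arithmetic concrete, here is a minimal back-of-the-napkin sketch. The labor share figure is my assumption, not Ray's (he doesn't publish Borderlands' books), and treating each extra dollar of sales as covering a dollar of new cost is a deliberate simplification; with realistic margins, the required sales growth would be even higher.

```python
# A sketch of one reading of Ray's numbers. The labor share is an
# assumed figure, not from the article.

wage_increase = 0.20  # 20% minimum wage rise phased in over 3 years
labor_share = 1 / 3   # assumed: payroll is about a third of revenue

# Extra payroll cost as a share of current revenue. If each extra
# dollar of sales is treated as covering a dollar of new cost (a
# simplification), this is the sales increase needed to absorb it.
added_cost_share = labor_share * wage_increase
print(f"required sales increase: {added_cost_share:.1%}")  # ~6.7%

# The Department of Commerce figures Ray cites, for comparison:
median_retail_growth = 0.0504  # median annual retail sales growth
growth_2014 = 0.0317
print(f"typical growth: {median_retail_growth:.1%} (2014: {growth_2014:.1%})")
```

Under those assumptions the required bump lands right at Ray's "nearly 7%," which is why a slower phase-in, spreading the same increase across more years of ordinary sales growth, is so much easier to absorb.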

Noah Smith has more on the specific situation, and his own comment on the externalities involved in fast changes. 

Does this mean a minimum wage increase is bad policy?  No, but I think it would be technically better to phase it in slowly, as Canada does, rather than make one large change all at once.

Friday, February 6, 2015

People knew smoking was a bad idea before 1964

One of the great rewards of digging into old pop culture -- not the retconned, re-edited, just-the-hits version, but the raw data -- is the way you keep running into counter-examples to the conventional wisdom about the periods that produced the art. Fortunately, with online resources such as the Internet Archive or the Digital Comics Museum (which supplied the ad below), you can dig to your heart's content.

There's a popular genre of health narrative that runs along roughly the same lines as the "last person you suspect" school of mystery fiction. In this genre, the revelation of some major effect (like smoking causing cancer) catches everyone off guard.

As far as I can tell, when you dig into these cases, you generally find that, if the magnitudes really are big, people had started to notice that something was going on. For example, this ad (from the still-extant Weider company) was running a decade before the Surgeon General's admittedly groundbreaking report. You can argue that people didn't realize just how strong the relationship was, and the report certainly added a degree of certainty that was perhaps unprecedented for this type of public health question, but people had suspected smoking was unhealthy for a long time.




From the horror comic Dark Mysteries, published in 1954.

We covered similar territory back in 2011.

Thursday, February 5, 2015

The end of public health?

This is Joseph.

In a sense, this one is just too easy to parody:
There’s a certain sort of Republican who likes to talk about the problems of big government.  The basic problem with these guys is that when you ask them for examples of how we could reduce the size of government, the best they can think of is hand-washing.  Not the drug war, not mass surveillance, not the prison system, not police abusing suspects, not the bloated defense budget, but hand-washing.

The worst thing you can say about mandating those “Employees must wash hands” signs is that anyone sensible would post those signs anyway.  That’s the absolute worst thing you can say about that rule.
There are a lot of things that infringe upon a person's liberty.  It is a precondition of living in a society that you be able to make trade-offs between absolute freedom and the benefits of numbers.  Even very libertarian societies (think of the Vikings, with their personal enforcement of legal judgments and interesting ideas like "outlaw" status) had a framework for handling disputes and enforcing an acceptable code of conduct between people. 

But these sorts of rules (which reduce the risk of infection via a passive reminder system) really do seem like a low-priority target.  There are some regulations that genuinely harm small businesses, and those are worth talking about.  Going after these signs, by contrast, is a very low-value reform, even if you did think they were a bad idea. 

[And this is a non-partisan comment on my part: it is easy for any group to set poor priorities and focus on the small and inconsequential.]

Wednesday, February 4, 2015

Intellectual Property

This is Joseph.  I read over Mark's latest post with a great deal of interest.  One issue that stood out was how many of the properties he is talking about are ancient.  But they remain highly profitable for companies because they are still under protection.  This applies not just to trademarked characters, but even to the original properties (which can make it harder to preserve them). 

Matthew Yglesias points out that this very thing is happening to Star Wars:

The ridiculous thing about the situation isn't that Lucas doesn't want to make the cut of the film that I want to watch. It's that it was illegal for Harmy to make it. And it was illegal for me to download it. And it would be illegal for me to make it available for download from Vox.com or even to put a link on this page that would let you go get it. It's illegal because of the ways that, over the years, Congress has extended and expanded the scope of copyright law in ways that have become perverse and destructive to human culture.

Of course it would be difficult to have a thriving commercial culture if films like Star Wars didn't enjoy some copyright protection.

But how many years of exclusive right to profit off a hit film do creators need to make production worthwhile? Five? Ten? Fifteen? Thirty? I'm not sure exactly what the right answer is. But obviously Star Wars made more than enough money in its first three decades of existence to satisfy any sane human being. The additional decades of copyright protection it enjoys do nothing to create meaningful financial incentives for creators. Even worse, Congress has gotten in the habit of retroactively extending copyright terms, so that in practice nothing ever loses copyright protection anymore.
With Star Wars, I am going to go out on a limb and note that it is exceedingly unlikely that the prospect of the original film moving out of copyright now, opening the door to fan versions, would have dissuaded the creators from making the film in the first place. 

This also has a negative effect in other media.  The Kindle has been amazing for making very old books available -- I've read several histories from the 1800s which have been exceedingly useful in understanding how social norms have evolved.  But once you enter the era of copyright, very few older but obscure properties are available. 

What makes this annoying is that it is the reverse of what copyright was intended to do.  Copyright was supposed to spur creative work by making it financially rewarding.  But a book from the 1950s that has been out of print for decades is not making anybody money.  Instead, copyright makes it harder to preserve the intellectual history of some key areas that might not be popular right now but could become so in the future. 

And, for the record, retroactive copyright cannot possibly induce more creative work because people judge risk/reward ratios on what the laws are and not what the laws might one day become. 

Tuesday, February 3, 2015

Anti-militarism

From Alex at Marginal Revolution:
Today the patriotic brand of anti-militarism, the brand that sees skepticism about the military and the promotion of peace and commerce as specifically American, is largely forgotten.
In the comments, people seem to focus on the question of just how historical this viewpoint is, anyway.  There are some obvious historical advocates (President Eisenhower, for example), but it may or may not have been widespread, and people's views may have evolved.

However, I think this falls into the "is/ought" fallacy, in the sense that (even if it was not especially widespread as a view) there is no reason this could not be a valid expression of patriotism.  It is unclear why a country with a history of isolationism, small standing armies, and a tradition of innovation and commerce could not focus on the economic parts of being American.

It's an idea that should get more attention. 

Before the age of the fan

The following is pulled from a much longer Mark Evanier piece on a notoriously unsuccessful attempt at bringing Archie and the Riverdale gang to prime-time television in the Seventies. TV buffs should read the whole thing (particularly the big reveal at the end), but as a business and marketing guy, I found this passage especially interesting.
This one first ran on this blog on September 27, 2003 and it's about two pilots that were done in the seventies by the company for which I was writing Welcome Back, Kotter. Contrary to what has been reported elsewhere, I was not a writer on either, though I turned down an offer to work on the second. I was an unbilled consultant on the first. What happened was that the Komack Company had obtained the rights to do these special-pilots and Jimmie Komack had this odd notion of how to approach doing an adaptation of an existing property. His view was that it should be done by writers and producers who were completely unfamiliar with the source material so their minds were uncluttered by what had been done before.

While I was working there, he also did a pilot that brought back the not-dissimilar character of Dobie Gillis. Dobie's creator Max Shulman had co-written a pilot script for the revival — a real good one, I thought, that updated the property but still captured what was great about the old series. ABC assigned the project to Komack's company and Komack used the Shulman script to attract the necessary actors from the original version — Dwayne Hickman, Bob Denver, Frank Faylen and Sheila James. Then, once they were committed, he tossed out the Shulman script and had a new one written by two writers who'd never seen the original show. (If you think I'm making this up, read Dwayne Hickman's autobiography. [Hickman went on to become a CBS programming executive from '77 to '88, so Komack may have come to regret pissing him off on this one.-- MP])

Jimmie took a similar approach to turning Archie into a TV show. The creative staff he engaged were not totally unfamiliar with the property but he urged them not to read the old comics and to instead work from a rough outline someone had written about who they were. This did not sit well with John Goldwater, who ran and co-owned the Archie company and who regarded himself as the creator of the feature. One day, Komack called me in and said, "You know all about comic books, don't you?" I said I did. He said, "Archie Comics?" I said I did. Later that day, he brought me into a meeting with Mr. Goldwater, who was visiting from New York, and introduced me as his resident Archie expert and consultant.
I suspect that Komack's attitude was fairly representative. Fan culture was still in an embryonic state in the Seventies. Other than the occasional opportunity to mine for camp, adaptations of popular pulp and nostalgia properties largely ignored the source material. The idea of sending A-list stars to woo the crowds at Comic-Con was decades away. These days, studios are extraordinarily concerned with winning the support of the fan base, even on bubble-gum properties like Archie. It's almost impossible to imagine a producer instructing writers not to read the comics they were adapting.

What changed? For starters, fandom grew, both in numbers and in economic clout. In the early Seventies, the San Diego Comic-Con was attracting a few hundred attendees. These days it exceeds 130,000, with tickets selling out within a couple of hours.

More importantly, Hollywood realized there was even more money to be made in straight adaptations than in the camp versions of the Sixties (though it should be remembered that the Adam West Batman was a massive hit for its first season). The 1978 Superman made 300 million on a budget of 55 million. The 1989 Batman cost seven million less and broke 400 million. Recently, Iron Man, the Avengers and company, though generally falling a bit short of those films on ROI, have opened up the possibility of unequaled potential franchise profits. Superheroes are the biggest thing to hit the movie industry since the big-budget musicals of the Sixties (The Sound of Music had a budget of just eight million and made almost 300 million. Along with other hits like Mary Poppins and Oliver!, this box office success inspired Hollywood to pour more and more money into big-budget musicals. The trend did not end well).

So, is this new emphasis on the fan a good thing or a bad thing? Probably a little bit of both. Certainly, the cross-fertilization, with many talented writers moving easily back and forth between the different media, is a good thing. Lots of talented people were going underutilized, as were many excellent and, as time has shown, highly profitable ideas.

However, we should remember that most of these ideas are disturbingly old. Fans have a tremendous fondness for properties with lots of history. That has been a dangerous combination with media consolidation, spiraling budgets and regulatory capture of the copyright process. Billions of dollars are going to the constant rebooting of aging franchises owned by a handful of companies while start-ups with fresh ideas struggle.

There are also risks associated with focusing too much on any segment of the audience, be it teenagers, critics, industry types or hardcore fans. As someone who's in the dangerous middle -- enough of a pop-culture geek to get the jokes but not enough of one to enjoy them -- I may be overly sensitive to nerd-pandering, particularly to self-reference, but I don't think I'm entirely unrepresentative.  J. J. Abrams has put me off Star Trek and probably Star Wars for life (If I ever hear another actor say "I'm a doctor, not a..." it will be too soon). Even in Skyfall, a movie I greatly enjoyed, the allusions to exploding pens and Aston-Martins were a distraction on first viewing and have gotten more annoying since.

Another indication of this influence can be found in the coverage of TV programs and movies that inhabit well-known fictional universes. PR is still the most valuable form of advertising, and one of the most effective ways for these shows to generate that attention is to reveal some connection to that universe.

For example, there was a great deal of coverage around the revelation that the characters played by Chloe Bennet and Kyle MacLachlan were a couple of fairly minor Marvel characters, though MacLachlan's character has been bouncing around for a long time now. For serious fans, this was a big deal, but for most viewers, the connection meant nothing. If the name "Calvin Zabo" doesn't mean anything to you, you are not the target audience.

The strategy of making most of the audience sit through allusions and in-jokes they don't get will eventually have a negative impact on ratings and box office. That makes an already unstable trend even less sustainable. We are seeing lots of records being set on the revenue side, but far more on the other side of the ledger.

Take a look at this list of the most expensive films adjusted for inflation. If we exclude the 1978 Superman (the budget of which was meant to cover the sequel as well, and which barely made the list anyway), the only movie more than twenty years old to break 200 million in today's dollars was the notorious Cleopatra, and that only comes in at number 17. Hollywood has been burned before when hot genres petered out, but it has never stood to lose this much.

Monday, February 2, 2015

"Every Tech Commercial"

College Humor does some very good work, particularly when compared with its overproduced, underwritten competition at Funny or Die. The following clip is a sharp parody of an all-too-familiar advertising genre. It is also a reminder of how much of the marketing of technology is driven by playing on associations, and how little relies on the actual functionality of the products.

Friday, January 30, 2015

Electronic voting

Another problem with single-point-of-failure solutions (and another video from the same author, who seems to have made paranoia about trusting people a personal vocation):


The idea of "trust nobody" is probably a wise decision in voting matters.  After all, we worry about voter fraud at the individual level (thus the new laws on ID being required to vote) but an electronic system gives a very targeted intervention a much better chance of success. 

Thursday, January 29, 2015

What's the big deal about an indictment?

I am a regular reader of Esquire's political blogger Charles Pierce, but I am always cautious about citing him on our blog. Pierce is openly and aggressively inflammatory. He is also smart, funny, and clear-eyed. He is largely immune to the groupthink that afflicts most of his colleagues across the political spectrum and, in a profession where most have lost their stomach for calling a spade a spade, Pierce goes out of his way to point out naked emperors. 

I may not agree with all of the specifics in the following passage, but the main points about lack of proportionality and the dangers of naive cynicism are extraordinarily important and get nowhere near the attention they deserve.
I have made light of the fact that, of the putative Republican frontrunners on their "deep bench," two of them are under criminal investigation (Walker and Chris Christie), while one of them, the guy that Costa assures us has learned from his mistakes in 2008 and who is now bringing energy to his public appearances (which is true) is actually under indictment. Now, I grew up in the Commonwealth (god save it!), where once people re-elected James Michael Curley when Himself was in prison, so I have no illusions about the traditional American taste for rogues and mountebanks in our politics. But this is also an era in which the elite political press makes every candidate jump through countless biographical and intellectual hoops to qualify as a "serious" contender. (If you don't believe me, watch what happens 30 seconds after Hillary Clinton announces.) You can be disqualified from the "top tier" if, for example, 20 years ago, you underreported your maid on your income taxes, even if you made good on it later. You can be disqualified from political life and your government job if you once voiced the opinion that the Bush Administration hid a great deal of what it knew about the circumstances surrounding the 9/11 attacks. The fact that you wind-surf, god help us, can be discussed endlessly on the campaign trail. We've got people taking Mike Huckabee's bad-mouthing of Beyonce seriously. But the fact that three of the prime cuts from the GOP have the law on their heels is somehow disappeared from relevance almost entirely. 'Ees a puzzlement, and IOKIYAR doesn't begin to explain it.

One of the factors in play unquestionably is the fact that over 40 years of empowered hatred toward government has had the very much intended consequence of creating generally a belief that all government is not only oppressive and incompetent, but also corrupt. Along with that, a more modern variation has been created whereby, if everything is political, then any investigation of a politician must needs be political, too. This began, I believe, with the delegitimization of Lawrence Walsh's probe into the crimes of Iran-Contra. (Side note: I know I'm a one-note piano on this subject but, dammit, so much of what's wrong in our politics goes back to those days, and those people, and the crimes with which they got away. Ollie North arranged the sale of missiles to a terrorist-sponsoring state. He got to be a hero. The country went bad wrong.) It got worse when the Republicans determined to use the actual criminal investigative techniques that Walsh conducted to pursue Bill Clinton on charges that actually were purely political, at least prior to those used in the impeachment kabuki, which was such a farce that it soured all but the most virulent souls on any investigations at all. Which is how we lost the special prosecutor status, and why the Bush administration was so cavalier about stonewalling Congress in the latter's pursuit of what was being done in the name of the country all over the world. The redefining of any investigation into government corruption as essentially political has so deprived any such investigation of widespread public credibility as to delegitimize any such investigation almost from jump.

That's pretty bad when it comes to our functioning as a self-governing republic, but it's halfway understandable. People have lives and problems of their own. But the elite political media has no such excuse. This is their job. That a man under criminal indictment can zip around the country, selling T-shirts off his own alleged wrongdoing, and do so full in the knowledge that his criminal indictment is treated in the coverage as less important than the fact that he wears glasses now, is a dreadful verdict on journalistic malpractice. The fact that Scott Walker is under investigation (again) for crimes in (another) office really ought to count more than the fact that he's learned how to yell at people on the stump.

Wednesday, January 28, 2015

Great man theory of history: a counter example

One of the persistent ideas that shows up again and again in history (and even in debates about modern social constructions) is the idea of the great man: somebody (like Napoleon, for example) who can transform a nation or enable a group to do exceedingly remarkable things.  The competing alternative is attributing these things to institutions.  In this era of anti-government mania, that isn't an overwhelmingly popular view. 

But one good example from history of how these two forces measured up against each other is Hannibal.  Rome was famous for good institutions and for people holding exceedingly temporary appointments (consul, for example, was a one-year post but the key post in Roman governance). 

Razib Khan has a nice discussion of these issues:
 As you may know Hannibal was the general who led the armies of Carthage in the Italian peninsula during the Second Punic War, to great effect. In fact, until the battle of Zama in North Africa, during the last phases of the war, Hannibal did not lose to a Roman army. And yet despite his record of victory in tactical engagements, he was strategically bested by the Romans and lost the war. Unsurprisingly if there is one figure who looms large in the narrative of The Fall of Carthage it is Hannibal. This is striking because almost all of what we know about these wars comes down to us thanks to the Romans, so our perceptions are coloured by their biases, and he was their great antagonist. And yet it is undeniable that Hannibal’s raw tactical genius won grudging admiration and respect from the Romans. He was a singular figure, with no equivalent among the Romans of his era, with all due apologies to Scipio Africanus. And yet Rome won, and Carthage lost.
This has a lot to do with modern theories of governance.  Should the emphasis be on inspiring leaders (like the current president) or on the institutions of the state (or corporation or university . . . ), if one wants to improve outcomes?  This is an important data point on the side of improving institutions.

[and, yes, it is always possible that the answer is both]

Tuesday, January 27, 2015

What if Google forgot about passwords?

This was an interesting thought experiment:


I thought it made a rather good point about just how interconnected everything on the web is becoming and how important large players can become to the overall ecosystem.  It's also a good engineering principle that one should avoid single points of failure when possible, because what can fail will eventually fail.
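As a toy illustration of that principle (my example, not the video's): if a service depends on a single component, that component's failure probability is the system's. Independent redundancy drives the risk down geometrically, which is exactly what a single point of failure forfeits.

```python
# Toy sketch: chance a system goes down when it relies on n independent
# replicas of a component, each failing with probability p in some
# period. Assumes independent failures, which correlated real-world
# failures (shared power, shared bugs) often violate.

def system_failure_probability(p: float, n: int) -> float:
    """All n replicas must fail for the system to fail."""
    return p ** n

p = 0.01  # assume each replica fails 1% of the time
for n in (1, 2, 3):
    print(f"{n} replica(s): {system_failure_probability(p, n):.6f}")
# 1 replica(s): 0.010000  <- the single point of failure
# 2 replica(s): 0.000100
# 3 replica(s): 0.000001
```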

Monday, January 26, 2015

Infrastructure maintenance and suburban development

One does not normally think of suburban developments as financially bad ideas.  I often see these building patterns as contributing to urban sprawl (wiping out wildlife habitat), congestion on roads, and inequality, but also as a way to make housing affordable.  But this piece takes a strong position on even the long-term financial viability of this development pattern:
In each of these mechanisms, the local unit of government benefits from the enhanced revenues associated with new growth. But it also typically assumes the long-term liability for maintaining the new infrastructure. This exchange — a near-term cash advantage for a long-term financial obligation — is one element of a Ponzi scheme.

The other is the realization that the revenue collected does not come near to covering the costs of maintaining the infrastructure. In America, we have a ticking time bomb of unfunded liability for infrastructure maintenance. The American Society of Civil Engineers (ASCE) estimates the cost at $5 trillion — but that's just for major infrastructure, not the minor streets, curbs, walks, and pipes that serve our homes.

The reason we have this gap is because the public yield from the suburban development pattern — the amount of tax revenue obtained per increment of liability assumed — is ridiculously low. Over a life cycle, a city frequently receives just a dime or two of revenue for each dollar of liability. The engineering profession will argue, as ASCE does, that we're simply not making the investments necessary to maintain this infrastructure. This is nonsense. We've simply built in a way that is not financially productive.
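To see how "a dime or two of revenue for each dollar of liability" can happen, here is a toy life-cycle calculation. Every number below is invented for illustration; the piece doesn't publish this particular example.

```python
# Toy life-cycle arithmetic for one street serving one household.
# All figures are made-up round numbers, not data from the article.

annual_revenue = 400       # tax revenue per household attributable to the street ($)
life_cycle_years = 25      # assumed years before the street must be rebuilt
replacement_cost = 50_000  # per-household share of rebuilding it ($)

revenue_over_cycle = annual_revenue * life_cycle_years
yield_per_dollar = revenue_over_cycle / replacement_cost
print(f"revenue per dollar of liability: ${yield_per_dollar:.2f}")
# -> $0.20 here; the article says real cities often see a dime or two.
```

The structure of the problem, not the particular numbers, is the point: as long as the revenue stream over a replacement cycle falls far short of the replacement cost, growth papers over the gap only until the first generation of infrastructure comes due.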
Interestingly, this hypothesis suggests that tax flight is more effective than one might think at protecting people from paying taxes (municipal taxes being the easiest to avoid by relocation), but that this has long-term consequences for infrastructure.  And, of course, this same mobility will let people leave an area once the infrastructure begins to decay and cannot easily be maintained.

While this does seem overly simplistic as a complete explanation for what is happening with US infrastructure, it certainly cannot be helping matters.  It's also pretty hard to decide what to do about it.  My best bet is to make people bear more direct costs -- but congestion and mileage taxes don't strike me as especially politically palatable. 

But it does bear thinking about.