Tuesday, March 17, 2015

The Mars conversation I'd like to be having

As a bit of an antidote for all the accumulated bile of the last few Mars One posts, I'd like to recommend this IEEE paper from Ian McNab. I'll admit it lost me on the curves a few times, but on the whole it's remarkably readable.

From the introduction:
In the past 40 years, mankind has ventured into space using well-established rocket technology involving liquid fuels and/or solid propellants. This approach has the advantage for astronauts and fragile payloads that the rocket starts slowly from the surface of the Earth with its full fuel load, and, as the fuel is burned off, the altitude and speed increase. In addition to minimizing the aerodynamic and aerothermal loads, this provides relatively modest accelerations—maximum values of a few “gees” are used for human passengers. Because only a small fraction of the initial mass reaches orbit, rockets of substantial size are required to place tens of tons into near-Earth orbit. Offsetting these remarkable successes is the very high cost of burning chemical fuel with a modest efficiency in a rocket engine to get out of the Earth’s gravitational well. Present estimates are that it costs $20 000 to get one kilogram of material into orbit. Unless alternatives can be found, it seems likely that mankind’s ventures into space will be limited to a few adventures that can only be undertaken by wealthy nations—the science-fiction writer’s dream of colonizing the planets and stars may be unaffordable.

Proposed solutions fall into four general categories: better rocket propellants; the space elevator; gun launch from the Earth’s surface; and laser launch. Although these options will not be discussed in detail, a few comments are appropriate. First, there appear to be no acceptable alternative rocket propellants that can offer substantial improvements compared with present choices. Second, although the space elevator seems to have great promise as a concept for the future, its practical realization awaits the development of a material that is strong enough to be able to carry its own weight (and that of the payloads it will lift) from the Earth’s surface to geosynchronous orbit. Third, estimates indicate that to launch payloads of less than a ton with a laser would require multigigawatt lasers far larger than any presently in existence
[Having concluded that gun launches are currently the most viable option, McNab starts drilling down into the details.] 

... 
If the launcher is sufficiently long, the acceleration can be reduced to a level that is compatible with present component technology, although the acceleration forces will not allow people or fragile payloads to be launched with feasible launcher lengths. Guns may therefore be limited to launching robust packages such as food, water, fuel, and replaceable components. This may be an important support function for the International Space Station (ISS) or other missions

A disadvantage of gun launch is that the launch package has to leave the gun barrel at a very high velocity (~7500 m/s) through the Earth’s atmosphere, leading to a very high aerothermal load on the projectile. The reentry vehicle community has successfully developed techniques to overcome this situation (when traveling in the reverse direction), and it seems possible that similar techniques can resolve this problem, either through the use of refractory or ablative nose materials or by evaporative cooling techniques. The mass of coolant required for this appears to be acceptable, as discussed below. The second concern for a gun is the size of the package that can be launched. Unless a very large gun can be built, the payload launched into orbit per launch will be a few hundred kilograms, which will require a large number of launches per year. For example, to provide 500 tons/year to orbit would require 2000 launches/year—a little over five per day on average. An infrastructure in space for handling this traffic and distributing the payloads will have to be created. Issues to be addressed will include decisions on handling or recycling the nonpayload components that reach orbit.
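McNab's launch-rate arithmetic is easy to sanity-check. Here's a back-of-envelope sketch in Python; the 250 kg per launch is my assumption, implied by dividing the 500 tons/year by the 2000 launches quoted above:

```python
# Back-of-envelope check of the launch cadence McNab describes.
# Assumption (mine): ~250 kg of payload reaches orbit per launch,
# consistent with "a few hundred kilograms" and the figures quoted.
annual_payload_kg = 500 * 1000   # 500 metric tons per year
payload_per_launch_kg = 250      # assumed per-launch payload

launches_per_year = annual_payload_kg / payload_per_launch_kg
launches_per_day = launches_per_year / 365

print(launches_per_year)            # 2000.0
print(round(launches_per_day, 1))   # 5.5 -- "a little over five per day"
```

The numbers line up with the paper's own: 2000 launches a year works out to about five and a half per day.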
I don't want to get into whether or not we should be spending more on space exploration, and I certainly don't want to argue the merits of this proposal (that topic would take me out of my depth almost immediately). For now, I want to stay meta and discuss the discussion.

Let's think about the question of why so many reputable news organizations are devoting so much coverage to Mars One and so little to other, better aerospace stories such as this one.

What do I mean by better?

For starters, this is a credible proposal from a well-established authority published in an IEEE journal, and, based on my experience, it's an idea that other engineers in the field are not willing to dismiss out of hand; they may not consider it practical, but they do take it seriously. (For example, JPL was looking into using orbiting railguns to launch small interplanetary probes as far back as 1988.)

And we really are talking about a game-changer here. If McNab's estimates hold up, we're talking about reducing launch costs by considerably more than an order of magnitude. Even if we factor in the need to use traditional rockets for people and other delicate cargo, that cost reduction is still enough to shift the underlying economics of all space-based enterprises, ranging from asteroid mining to tourism to, yes, interplanetary colonies.

Finally, rail guns are cool. All Mars One has to offer is cheesy artist conceptions. With railguns you get video like this:



This is, of course, just one example. There are any number of fascinating stories about aerospace research. Why do they go unnoticed while Mars One continues to make the cut? Here are my guesses:

1. Bullshit does not count against you. As Elmo Keep spelled out in painful detail, every aspect of this story collapses under inspection, but even after Keep's exposé, the stories kept coming;

2. People love a bargain (i.e. there's a sucker born every minute). I've noticed a number of cases recently where an unrealistically low price seems to make proposals more newsworthy (this example jumps to mind);

3. Everybody loves a messiah (even a Galtian one). Entrepreneurs and market forces are also easy pitches these days, while stories of public action and collective sacrifice fall out of favor. Of course, even in the Sixties, space was a tough sell (even as we were sending men to the moon, people were suggesting that the money would have been better spent down here), but now even the suggestion that we as a society would take on something expensive and challenging seems oddly quaint.

Monday, March 16, 2015

Corporate PR vs. Beat Sweetening -- different but not that different

Brad DeLong recently provided an interesting complement to our ongoing flack-n-hacks thread (which Andrew just joined). Just to review, we were talking about how PR firms (the 'flacks' in question) provide leads, leg-work and even finished copy in exchange for favorable coverage. DeLong uses this embarrassing Politico puff piece on Stephen Moore to examine the way journalists trade favorable coverage for access and scoops (which is, more or less, Politico's unofficial mission statement).

This piece is what my old next-door office neighbor Jack DeVore, then Treasury Secretary Lloyd Bentsen's Assistant Secretary for Public Relations, called a "beat sweetener": the point of such an article is not to inform the article's readers about some person regarded as either being influential or typical in an interesting way, but rather to burnish the reputation of the subject. As such, it omits key parts of the story and so misleads the readers in the interest of achieving that goal. The hope is that the subject of the article will at some future point then open up and preferentially dish to the reporter who has done him the favor of burnishing his reputation.
...

That is it. No observations about publishing the wrong numbers. No observations about how Stephen Moore has been a huge backer of Sam Brownback's Kansas tax-cut state revenue disaster. Nothing about how Herman Cain's 9-9-9 plan blew up in his face because no analyst who could add could get it to work arithmetically no matter how many thumbs they put on the scale. No critical quotes from anybody about the quality of Stephen Moore's analytical work--which would have been the easiest thing in the world to get. In fact, no positive quotes at all from any economist about Stephen Moore as an economist or an analyst.
...

So the message I get from this is that there is, still, an enormous need for publications and platforms that will call a spade a goddam shovel, afflict the comfortable, entertain-along-with-informing rather than entertain-instead-of-informing, and be trusted information intermediaries in which the words on the page are there to inform you about what is what rather than to mislead you in the hope that those in whose interest you have been misled will at some point in the future dish the writer a scoop.

Sunday, March 15, 2015

I don't have time to discuss it today...

but you should definitely check out this post from Dean Dad if you've been following the higher ed side of the education reform debate.

Public Matters: A Response to Kevin Carey

Friday, March 13, 2015

More on the rise of PR

Living in North Hollywood, one gets plenty of notice of the upcoming Emmy season. Perhaps even more than with the Oscars, this award show is preceded by a blanketing of the neighborhood in "For your consideration" billboards. You can get a rough but reasonable idea of who is spending what by looking at how many billboards you see for different shows and different networks.

Per show, at least, there seems to be a huge disparity between Netflix originals and virtually everybody else. Billboards for House of Cards or Orange Is the New Black are all over the place. By contrast, I don't recall seeing an Emmy ad for Orphan Black or Justified or even The Good Wife. On this per-show basis, it would certainly appear that the Emmy-season outdoor advertising budget for Netflix is many times larger than that of its nearest competitor.

I don't want to get into the question of whether or not this is a good business decision on the part of Netflix and I certainly don't want to open the topic of which awards were deserved. Instead, I want to tie this into the previous post on the rise of PR and the decline of journalism.

I read any number of pieces about how winning Emmys meant that Netflix had "arrived." As far as I can remember, none of these articles mentioned the disproportionate level of marketing it took to win these awards. Of course, omitting context is a common sin, particularly when the details undercut the standard narrative (adherence to the standard narrative is pretty much the prime directive of modern journalism), but there is an added layer of conflict of interest here.

The practice of letting interested parties research stories and even write copy is as old as typesetting, but there is reason to believe things have gotten much worse. What was once an occasional lapse now appears to be the norm.

Modern journalism is now basically row upon row of glass houses. Stone-throwers have become decidedly unpopular (check out the NYT's attitude toward Nate Silver). Even if a reporter wasn't beholden to some publicist, he or she would still face considerable pressure from colleagues and editors not to make a big deal of these questionable relationships.

I realize I seem to pick on Netflix a lot, but I really don't have a serious problem with the company.  My problem is with the way today's journalists cover business, neglecting due diligence, allowing conventional wisdom to outweigh facts, and letting companies write their own version of reality.

Netflix just happens to be a great example.

Thursday, March 12, 2015

A premature diagnosis of cost disease?

After cuts in state funding, the most popular theory to explain the rapid increase in college tuition seems to be cost disease:
Baumol's cost disease (also known as the Baumol Effect) is a phenomenon described by William J. Baumol and William G. Bowen in the 1960s.[1] It involves a rise of salaries in jobs that have experienced no increase of labor productivity in response to rising salaries in other jobs which did experience such labor productivity growth. This seemingly goes against the theory in classical economics that wages are closely tied to labor productivity changes.
I've never felt entirely comfortable with the way this explanation fits (or fails to fit) the data. It always seemed to me that the tremendous increase in very low-priced adjunct labor would more than balance out the effect of flat productivity.

If this post by Paul Campos of Lawyers, Guns and Money is accurate, the theory is even more at variance with the facts than I thought. (Campos also has some interesting things to say about the drop-in-state-funding explanation.)
Everyone is aware that the cost of going to college has skyrocketed since [fill in any date going back to the middle of the last century]. Why has this happened? This post is about one possible explanation, that turns out not to have any validity at all: increases in faculty salaries. In fact, over the past 40+ years, average salaries for college and university faculty have dropped dramatically.

Salaries have increased, sometimes substantially, for a tiny favored slice of academia, made up of tenured professors at elite institutions, some professional school faculty (business, law, medicine), and most especially faculty who have moved into the higher echelons of university administration. Such examples merely emphasize the extent to which the economics of the New Gilded Age have infiltrated the academic world: the one percent are doing fabulously well, and the ten percenters are doing fine, while the wretched refuse of our teeming shores will adjunct for food.

Numbers:

Average salary for all full-time faculty in degree-granting post-secondary institutions (this category includes instructors and lecturers, as well as all ranks of professors) in constant 2012-13 dollars:

1970: $74,019

2012: $77,301

These figures, of course, give a very incomplete picture of the economic circumstances of the actual teaching faculty in America’s institutions of higher education.

One of the more astonishing statistics regarding the economics of our colleges and universities is that, despite the fantastic increase in the cost of attending them, there are now on a per-student basis far fewer full-time faculty employed by these institutions than was the case 40 years ago. Specifically, in 1970 nearly 80% of all faculty were full-time; by 2011, more part-time than full-time faculty were employed by American institutions of higher learning (note that the former category does not include graduate students who teach).

While comprehensive salary figures for part-time faculty aren’t available, it’s clear that their salaries are on average vastly lower than those of full-time faculty (and of course when it comes to who does the bulk of the actual teaching at many schools, the designations “full-time” and “part-time” have a distinctly Orwellian flavor). If we assume that “part-time” faculty earn one-third as much as their full-time counterparts — and this seems improbably optimistic, given that the average compensation for part-time faculty for teaching a three-credit course is around $2,700 — that would mean that in 1970 average salaries for college and university faculty were nearly 30% higher, in real dollars, than they are today.

This is an astonishing figure, given that, in the last 40 years, tuition at private colleges has more than tripled, while resident tuition at public institutions has nearly quadrupled.
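Campos's "nearly 30%" can be roughly reconstructed as a weighted average of full-time and part-time salaries. The shares below are my illustrative guesses, not his exact inputs (roughly 80% full-time in 1970, a 50/50 split by 2011, plus his one-third pay assumption); pushing the 2011 part-time share higher moves the gap closer to his 30%:

```python
# Rough reconstruction of Campos's weighted-average salary comparison.
# Assumptions (mine, not his exact inputs): 80% full-time in 1970,
# 50% full-time in 2011, part-timers paid 1/3 of full-time salary.
ft_1970, ft_2012 = 74019, 77301          # constant 2012-13 dollars, from the quote
share_ft_1970, share_ft_2011 = 0.80, 0.50
pt_ratio = 1 / 3                          # part-time pay as a fraction of full-time

avg_1970 = share_ft_1970 * ft_1970 + (1 - share_ft_1970) * ft_1970 * pt_ratio
avg_2012 = share_ft_2011 * ft_2012 + (1 - share_ft_2011) * ft_2012 * pt_ratio

gap = avg_1970 / avg_2012 - 1
print(f"{gap:.0%}")   # roughly 24% higher in 1970 under these assumptions
```

Even with these conservative guesses, the average salary for all faculty, full- and part-time, comes out well over 20% higher in 1970 than in 2012.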

You guys can write this post yourselves -- I'm tired

http://www.nytimes.com/2015/03/10/opinion/david-brooks-the-cost-of-relativism.html






Wednesday, March 11, 2015

More Martian musings -- reality shows and diet pills

Given all of the renewed attention to the Mars One project, this might be a good time for a quick little catch-up essay.

Maybe it's just me, but there are a few extremely salient points that have a way of being neglected in this conversation.

First, manned interplanetary spaceflight is almost certain to be very expensive and the cost for setting up permanent self-sustaining colonies is almost certain to be many times more so.

Second, the superiority of manned versus unmanned spaceflight is, for now, almost entirely symbolic. This does not mean that there are not certain specific economic and scientific benefits associated with manned spaceflight nor does it mean that manned spaceflight is a bad idea. It just means that, given current technology, sending explorers to Mars is something that, in the final analysis, we would do because we choose to as a society. This is even more true with sending colonists.

I actually don't have a problem with this kind of argument. At the risk of some muddleheaded nostalgia, I like the idea of leaders standing up and asking the people what kind of country we'd like to live in. Though I am not a huge fan of JFK, I greatly admire both the rhetoric and the sentiment behind "not because they are easy, but because they are hard."

Which brings me to my main objection to Mars One.

Without delving too deeply into the promise and the limitations of businesses like SpaceX, when it comes to the kind of massive operations we're talking about here, we really only have two choices:

The first is to decide this is something we want to do and that we are willing to spend the considerable amount of money it would take to do it;

The second is to wait for a technological breakthrough which will change the underlying economics, with the understanding that this breakthrough may not come in our lifetime.

I don't want to wade into that debate right now but, if landing on Mars is important, then it is a debate we need to have.

Though every major aspect of the Mars One proposal is laughably unrealistic, it resonates with people because it gives us an out. We can sit around and enjoy dreaming about how exciting the future will be without actually having to make any of the tough choices or do any of the hard work to make it exciting.

The idea of sending people to Mars just by watching a reality show is analogous to the idea that you can solve a lifelong problem with obesity by taking a miracle diet pill. I suspect that most of the people who try these products know on some level that it is foolish to trust the unlikely and unverified claims of late night TV pitchmen, but the desire to believe outweighs their judgment.

Buying a proposal for a space program from a reality show producer isn't all that different.

Tuesday, March 10, 2015

The Decline of Journalism and The Rise of Public Relations

In comments to a recent post, Andrew Gelman brought up a point that I want to dig into a bit more, at least briefly: the connection between the decline of journalism and the rise of public relations.

Here's my take: there is clearly a powerful relationship here though the direction of causality gets a bit complicated and goes both ways. For a variety of reasons, including but not limited to downsizing, an increasingly insular culture, and a shift to a star system that serves to hollow out the middle of the profession, journalism became both less diligent about maintaining quality and hungrier for free content (an appetite greatly expanded by online forums). While these changes were happening, companies were also growing more experienced at measuring and manipulating public opinion.

The decline in journalism created an extraordinary opportunity for corporate PR departments. News stories that portrayed products and companies in a favorable light were both more persuasive than traditional advertising and considerably cheaper.

While we are on the subject, my biggest concerns about the role of PR in modern journalism are not the question of accuracy or bias, though both of those are important. What really concerns me is the way these outside influences determine what does and does not get covered and the lack of awareness (or at least acknowledgement) on the part of the press. More on that coming later.

Monday, March 9, 2015

MOOCs and the Eugen Weber Paradox

As always seems to happen when I have other things I need to be doing, all sorts of interesting threads have started popping up and saying "Blog me! Blog me!"

Case in point, Erik Loomis of LGM has gotten back on the MOOC beat. I've got a couple of original posts on the subject in the works, but first I want to bring an old post from the teaching blog back into the conversation. It addresses what I think may be the fundamental questions of the ed-reform-through-technology debate:

After over a century of experimenting with educational technology, why have the results up until now been so underwhelming?;

And how will the new approaches being proposed fix the problems that plagued all of those previous attempts?


The Eugen Weber Paradox


If you follow education at all, you've probably heard about the rise of online courses and their potential for reinventing the way we teach. The idea is that we can make lectures from the best schools in the world available through YouTube or some similar platform. It's not a bad idea, but before we start talking about how much this can change the world, consider the following more-serious-than-it-sounds point.

Let's say, if we're going to do this, we do it right. Find a world-renowned historian who's also a skilled and popular lecturer, shoot the series with decent production values (a couple of well-operated cameras, simple but professional pan and zoom), just polished enough not to distract from the content.

And if we're going to talk about democratizing education, let's not spend our time on some tiny niche course like "Building a Search Engine." Instead, let's do a general ed class with the widest possible audience.

If you'll hold that thought for a moment...

A few years ago, while channel surfing in the middle of the night, I came across what looked like Harvey Korman giving a history lesson. It turned out not to be Korman, but it was a history lesson, and an extraordinarily good one by a historian named Eugen Weber, described by the New York Times as "one of the world’s foremost interpreters of modern France." Weber was also a formidable teacher known for popular classes at UCLA.



The program I was watching was “The Western Tradition,” a fifty-two part video course originally produced for public television in 1989. If you wanted to find the ideal lecturer for a Western Civ class, it would probably be Eugen Weber. Like Polya, Weber combined intellectual standing of the first order with an exceptional gift and passion for teaching. On top of that, the Annenberg Foundation put together a full set of course materials to go with it. This is about as good as video instruction gets.

All of which raises a troubling question. As far as I know, relatively few schools have set up a Western Civ course around "The Western Tradition." Given the high quality and low cost of such a course, why isn't it a standard option at more schools?

Here are a few possible explanations:

1. Medium is the message

There are certain effects that only work on stage, that fall strangely flat when there's not an audience physically present in the room. Maybe something similar holds with lectures -- something is inevitably lost when moved to another medium.

2. Lecturers already work for kind words and Pez

Why should administrators go to the trouble of developing new approaches when they can get adjuncts to work for virtually nothing?

3. It's that treadmill all over again

You probably know people who have pinned great hopes on home exercise machines, people who showed tremendous excitement about getting fit then lost all interest when they actually brought the Bowflex home and talking about exercise had to be replaced by doing it. Lots of technological solutions are like that. The anticipation is fun; the work required once you get it isn't.

This is not a new story. One of the original missions of educational TV back in the NET days was to provide actual classroom instruction, particularly for rural schools.* The selection was limited and it was undoubtedly a pain for administrators to match class schedules with broadcast schedules but the basic idea (and most of the accompanying rhetoric) was the same as many of the proposals we've been hearing recently.

Of course, educational television was just one of a century of new media and manipulatives that were supposed to revolutionize education. Film, radio, mechanical teaching machines, film strips and other mixed media, visual aids, television, videotape, distance learning, computer aided instruction, DVDs, the internet, tablet computing. All of these efforts had some good ideas behind them and many actually did real good in the classroom, but none of them lived up to expectations.

Is this time different? Perhaps. It's possible that greatly expanded quantity and access may push us past some kind of a tipping point, but I'm doubtful. We still haven't thought through the deeper questions about what makes for effective instruction and why certain educational technologies tend to under-perform. Instead we get the standard ddulite boilerplate, made by advocates who are blissfully unaware of how familiar their claims are to anyone reasonably versed in the history of education.

* From Wikipedia
 The Arkansas Educational Television Commission was created in 1961, following a two-year legislative study to assess the state’s need for educational television. KETS channel 2 in Little Rock, the flagship station, signed on in 1966 as the nation's 124th educational television station. In the early years, KETS was associated with National Educational Television, the forerunner of the current PBS. The early days saw black-and-white broadcasting only, with color capabilities beginning in 1972. Limited hours of operation in the early years focused primarily on instructional programming for use in Arkansas classrooms


More on Mars One -- I expect this from ABC News but Sheldon?

There's been another wave of PR in support of the privately funded "Mars mission" Mars One (and yes, I do need to use quotation marks). There have been news stories, interviews with applicants who did or didn't qualify for the "mission" (NPR, how could you?), and even fictional characters like Castle and, sadly, Sheldon Cooper ("The Colonization Application").

Just to review, not only is this mission almost certain never to happen, but every major aspect of it collapses under scrutiny.

The funding goals are wildly unrealistic, the budget estimates are comically optimistic, and what little technology has actually been proposed is so badly designed that, according to an MIT study, it would be likely to kill all the colonists within a few months. I am pretty sure Howard would have pointed all of these things out to Sheldon.

You could also find some of these objections in this piece from ABC, but you'd have to look closely because the reporters buried them as deep as possible, just far enough from the end to allow Mars One CEO/confidence man Bas Lansdorp to have the last word.

Obviously this is a fun story, a lottery where anyone can become a colonist to Mars, made even more dramatic by the twist of being a one-way trip. I also get that this is a story many, probably most, of us would like to believe. That is a high enough standard to justify a hook on a TV episode, but it is an embarrassingly low one for major news outlets.

Friday, March 6, 2015

The Golden Age of PR

I usually check out Jonathan Chait's blog once or twice a week and I usually ignore the "Most Viewed Stories" column to the right of the page. Recently, though, one of the items caught my eye.

"Who Was That at the End of the New Avengers Trailer, and Why Should You Be So Excited?"


The link led to a Vulture.com post about the comic book character the Vision, followed by some speculation about what part he might play in the upcoming Avengers movie. Looking over the article, it struck me that this is an amazing time to be a publicist.  We have gone from publications hyping movies to hyping trailers to hyping two-second shots in trailers.

Thursday, March 5, 2015

Cartoon Metalogic

Bud Grace, creator of the comic strip the Piranha Club, is a former nuclear physicist -- No, really, look it up -- and every now and then a bit of STEM humor pops up.

For investors, all money isn't created equal

In our most recent discussion of driverless cars, I made the following assertion about Google:
Google has a lot of reasons to want to be seen as a diversified, broadly innovative technology company, rather than as a very good one-trick pony cashing in on a monopoly (possibly two monopolies depending on how you want to count YouTube). A shiny reputation helps to keep stock prices high and regulators at bay.
I didn't really think of it at the time, but this concern is a point we have hit tangentially in the past and which probably deserves a bit more direct scrutiny. Investors often care a great deal about where a company's money comes from. This concern is often neither rational nor consistent, and it frequently leads companies to mislead the public about the makeup of their revenues.

Here are a couple of examples. I am going to be rather vague about some details because, you know, lawyers, but the broad outlines are accurate and the circumstances are common enough that I could always find other cases if pressed. The first involved a financial services company that had products for customers at both ends of the economic spectrum. If you were to look at the company as an outsider or even as a new employee, you might very well assume that the two divisions were roughly equal. You might even suspect that the upscale division was more profitable.

In reality, a large majority of the company's profits came from the low end. It turned out that the profit margin for providing services for poor people in this case was much higher. Stockholders, however, did not particularly like products that served this demographic. Also, having a heavily promoted line of products for upper-class people did wonders for the stock price.

Here's another example:
The bank in question was in the middle of a very good run, making a flood of money from its credit card line, but investors kept complaining that the bank was making all that money the wrong way. This was the height of the Internet boom but the bank was booking all of these profitable accounts through old-fashioned direct mail. If it wanted to maximize its stock price, the bank needed to start booking accounts online.

The trouble was that (at least at the time) issuing credit cards over the Internet was a horrible idea. The problem was fraud. With direct mail, the marketer decides who to contact and has various ways to check that a customer's card is in fact going to that customer. With a website, it was the potential customers who initiated contact and a stunning number of those potential customers were identity thieves.

The Internet was an excellent tool for account management, but the big institutional investors were adamant; they wanted to see the bank booking accounts online. Faced with the choice between unhappy investors and a disastrous business move, the company came up with a truly ingenious solution: they added a feature that let people who received a pre-approved credit card offer fill out the application online.

Just to be absolutely clear, this service was limited to people who had been solicited by the bank and based on the response rates, the people who went online were basically the same people who would have applied anyway. From a net acquisitions standpoint, it had little or no impact.

From an investor relations standpoint, however, it accomplished a great deal. Everyone who filled out one of those applications and was approved* was counted as an online acquisition. Suddenly the bank was using this metric to bill itself as one of the leading Internet providers. This satisfied the investors (who had no idea how cosmetic the change was) and allowed the bank to continue to follow its highly profitable business plan (which was actually a great deal more sophisticated than the marketing techniques of many highly-touted Internet companies).

*'pre-approved' actually means 'almost pre-approved.'

Put bluntly, companies will often pursue strategies or introduce products that are profit-neutral or worse because these strategies and products make the companies look diversified or forward-thinking or poised to take advantage of some major opportunity. Investors reward these perceptions. With this fact in mind, you can make sense of all sorts of strange business decisions.

For example, Amazon is an innovative, well-run, forward-thinking company, but its P/E ratio (when it turns a profit) is often in the hundreds,* meaning the company has to be seen as being on the cusp of explosive growth. When you read about the company's online grocery service or its proposed drone-to-door deliveries** and you ask yourself how they can ever make a profit doing this on a large scale, the answer may be that they don't expect to.


* There may be some controversy over how the P/E ratio is calculated for Amazon, but that's a topic for another post and probably another blogger.

** I added "drone-to-door" to emphasize the distinction between that proposed technology and large cargo drones. The latter actually does make business sense but would face huge regulatory hurdles.
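To make the P/E point concrete, here's a small sketch. The figures are hypothetical, not Amazon's actual financials: a price-to-earnings ratio of several hundred only makes sense if earnings are expected to multiply many times over.

```python
# Illustrative arithmetic only -- the numbers below are hypothetical,
# not Amazon's actual financials.

def implied_earnings_growth(current_pe, normal_pe=20.0):
    """How many times over earnings must grow (price held constant)
    for a stock to come down to a 'normal' P/E."""
    return current_pe / normal_pe

# A hypothetical stock trading at a P/E of 300 needs its earnings to
# grow 15-fold just to reach a P/E of 20 at today's price.
print(implied_earnings_growth(300))
```

In other words, a triple-digit P/E is less a statement about today's business than a bet on tomorrow's; if the expected growth never materializes, the price has to fall instead.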

Wednesday, March 4, 2015

Elegant theories and headless clowns -- more bad tech reporting from the New York Times

The previously mentioned Paul Krugman piece on opera singer Jenny Lind included a link to this NYT article "adapted from The Price of Everything: Solving the Mystery of Why We Pay What We Do, by Eduardo Porter, an editorial writer for The New York Times." Krugman was criticizing the reliance on simple economic stories that don't fit the facts. Porter was telling one.
Baseball aficionados might conclude that all of this points to some pernicious new trend in the market for top players. But this is not specific to baseball, or even to sport. Consider the market for pop music. In 1982, the top 1 percent of pop stars, in terms of pay, raked in 26 percent of concert ticket revenue. In 2003, that top percentage of stars — names like Justin Timberlake, Christina Aguilera or 50 Cent — was taking 56 percent of the concert pie.
...

But broader forces are also at play. Nearly 30 years ago, Sherwin Rosen, an economist from the University of Chicago, proposed an elegant theory to explain the general pattern. In an article entitled “The Economics of Superstars,” he argued that technological changes would allow the best performers in a given field to serve a bigger market and thus reap a greater share of its revenue. But this would also reduce the spoils available to the less gifted in the business.

The reasoning fits smoothly into the income dynamics of the music industry, which has been shaken by many technological disruptions since the 1980s. First, MTV put music on television. Then Napster took it to the Internet. Apple allowed fans to buy single songs and take them with them. Each of these breakthroughs allowed the very top acts to reach a larger fan base, and thus command a larger audience and a bigger share of concert revenue.
Putting aside the fact that, as Krugman pointed out, we have examples of superstar musicians that predate both recording and broadcasting, this paragraph is still stunningly incomplete and comically ill-informed.

The 1980s cutoff is arbitrary and misleading. The 1880s would make more sense, though things didn't really take off until the 1890s, and there has been a fairly steady stream of technological innovations ever since.

Here's a brief, roughly chronological view of some of the highlights:

Disc records

Amplification

Radio

Optical sound tracks on film

Stereo

FM

LPs

HiFi

Television (which brought with it everything from Hit Parade to American Bandstand, Sullivan, Midnight Special and countless shows like this)

Cassettes

CDs

Recordable CDs

Affordable digital audio editing

And then, of course, a whole family of internet-based innovations.

The past 125 years have been one long stream of "technological disruptions" for the music industry, but the innovation of the past couple of decades has mainly broadened the market by increasing selection and lowering production costs. In terms of "allow[ing] the very top acts to reach a larger fan base, and thus command a larger audience and a bigger share of concert revenue," at least for North American and European audiences, the top acts have been near saturation since the Sixties. (Check out the ratings for Elvis or the Beatles on Sullivan.)

By looking at the past thirty years of advances and ignoring the previous ninety, Porter gives us a blatant example of headless-clown causal reasoning: arguing that x explains the difference between A and B because x is present in A, while ignoring the fact that x is also present in B. Data journalism has fully embraced the idea that two numbers briefly moving in sync constitutes a causal argument.



The phrase "elegant theory" should have set off the red flags and warning lights. Elegance in these books pretty much always means "simplistic and unrealistic." The theories are aesthetically and emotionally appealing, but they just barely fit the data in their examples and they usually fall apart completely when taken out on the road.

As previously mentioned, this goes back to what George Pólya called (in a quote I really need to dig up) thinking like a mathematician. Pólya suggested that the default setting of most people, when presented with a rule, is to look for examples, while the default setting of mathematicians and scientists is to look for exceptions. Mathematical ideas get a tremendous amount of press these days, but very few of the people covering them think like mathematicians.

Tuesday, March 3, 2015

Epidemiology Research

I am a big fan of Aaron Carroll, who often blogs at The Incidental Economist. However, in his latest New York Times column he says:
Most of the evidence for that recommendation has come from epidemiologic studies, which can be flawed.

Use of these types of studies happens far more often than we would like, leading to dietary guidelines that may not be based on the best available evidence. But last week, the government started to address that problem, proposing new guidelines that in some cases are more in line with evidence from randomized controlled trials, a more rigorous form of scientific research.
So when did randomized controlled trials stop being a part of epidemiology? It comes as news to me, someone who has done this type of work as an epidemiologist. In particular, there are threats to validity in trials as well, and a lot of smart causal inference research has looked at those too. Trials also have concerns about cross-over, attrition, and even basic design validity. These elements are all part of a typical epidemiological education and are an important part of public health practice. Even meta-analyses, in which trials (and now sometimes observational studies) are pooled, are typical parts of epidemiology.

It seems like he wants to equate observational research with epidemiology.

There is also a difference of estimands.  The trials can only assess interventions in diet and how they perform.  The real (true) intake of the participants is always approximated, except perhaps in a metabolic ward.  Even doubly labeled water studies need to make assumptions. 

The real bugbear of nutritional research in humans is measurement error. It is present in all studies (even trials, which are much less susceptible to bias than cohort or case-control studies). That is a lot of what we struggle with in this research area.
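To illustrate why measurement error is such a bugbear, here's a toy simulation (every number in it is invented for illustration): classical error in self-reported intake attenuates an estimated diet-outcome association toward zero, even with a large sample.

```python
import random

random.seed(1)

n = 10_000
# Hypothetical true daily caloric intake.
true_intake = [random.gauss(2000, 300) for _ in range(n)]
# Outcome depends linearly on true intake (true slope 0.01, plus noise).
outcome = [0.01 * x + random.gauss(0, 2) for x in true_intake]
# Self-reported intake = true intake + classical measurement error.
reported = [x + random.gauss(0, 300) for x in true_intake]

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

print(round(slope(true_intake, outcome), 4))   # close to the true 0.01
print(round(slope(reported, outcome), 4))      # attenuated toward zero
```

With error variance equal to the true-intake variance, the expected attenuation factor is one half, so the association estimated from self-reports comes out at roughly half its true size. No amount of sample size fixes this; it is a bias, not noise.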

Now, it is true (and I agree with Aaron Carroll completely) that the trials tell us a lot of what we want to know.  In a real sense we want to know how dietary interventions, as they will actually work out in reality, will change outcomes.  So I share his concern that the trials seemed to be overlooked by people writing guidelines.  Or, in other words, I think his main conclusion is quite sensible.

But let's not forget that observational research is critical to understanding the patterns of diet that generate the hypotheses and interventions that can actually be tested in trials. It also gives us a lot of insight into how people consume food in the state of nature. I am never going to stop focusing on high-quality data and the most rigorous possible study designs. But I think it would be wiser to represent the ecosystem more completely.

On the other hand, I am not an expert in health care communications, and it might be that such broad strokes are necessary when focusing on the general public.  After all, improving public health is everyone's goal, and I am happy to take a few "hits" if that is the ultimate outcome.  But I think it's also a challenge to think about how to make this type of research, and the nuances in it, better understood in general.

I have a lot of thinking to do. 

Monday, March 2, 2015

Defining away concerns about charter school attrition

[New information has come in and we may be making some changes to this post.]

After what seems like a long time, we are back on the bad education statistics beat. Joseph kicked things off with this post discussing some recent charter school research, particularly this paper by Angrist et al. I followed by reposting a couple of earlier pieces on attrition.

If you didn't see them when they came out, I strongly recommend you take a minute and read those two reposts (Selection effects on steroids and Selection on Spinach*). This is a big, incredibly complex story and it makes much more sense if you come in with some context.

I also want to say explicitly that I am not singling out the Angrist paper for criticism. It is, if anything, above average for the field; that's the scary part. I have a number of concerns about this study, but they are all problems that you find in much, if not most, of the research on charter schools.

Let's start with attrition and this passage from the paper. The first half of the paragraph mostly seems to be pretty good, except for one red flag [emphasis added].
A second potential threat to the validity of lottery-based estimates is the differential loss to follow-up between winners and losers (also called differential attrition). Students in our study are lost to follow-up if they are missing the MCAS score data we use to measure charter achievement effects. This usually happens when a student moves out of state or to a private school. Attrition can bias lottery-based estimates if different types of students are more likely to leave the sample depending on lottery results.
There are a couple of fairly subtle points here (since I'm not an expert on this research, you might want to dig up a copy of the original paper -- I believe mine is behind a paywall -- and check my work). The first centers around the various reasons why a student might miss one or more standardized tests. The researchers do deserve some credit for mentioning the private school option, but they don't seem to quantify it, nor do they mention reasons like changing schools, which are much more likely than interstate moves to interact in a problematic way.

Easier to miss but far more important is the definition of attrition as leaving the data set rather than leaving the program. This isn't necessarily wrong, but it's incomplete and worrisome in at least two ways: first, because it differs from what we might call the standard definition (if you Google "charter school student attrition," you will generally find stories about students leaving charter schools and moving to other schools); second, because the more common definition of attrition is far more likely to cause problems that can invalidate this study.

The rest of the paragraph is more troubling.
 For instance, losers might be more likely to leave than winners, and highly motivated students might be more likely to opt for private school if they lose. We therefore compare the likelihood that winners and losers have an outcome test score in our data. There are no statistically significant differences in follow-up rates in the lottery sample schools, a result documented in Appendix Table A3. It therefore seems unlikely that differential attrition has an impact on the lottery-based results.
That "seems unlikely" is very hard to justify. Putting aside for a moment the issue of definitions, you can't control for differential attrition this way. It is entirely possible for two groups to have roughly the same level of attrition and yet have selection effects going in opposite directions. Furthermore, the kind of highly selective attrition we're talking about here is very powerful (particularly if you throw in peer effects). Even if the selective attrition is limited to one group, it is entirely possible for a statistically insignificant difference in attrition rates to lead to a substantial difference in outcomes.
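A small simulation makes the point. Every number here is made up (nothing is drawn from the Angrist data): both groups lose exactly the same fraction of students, but the attrition selects opposite tails of the score distribution, and the observed means diverge anyway.

```python
import random

random.seed(0)

def observed_mean(scores, drop_low):
    """Drop 10% of students, selectively from the low or the high end.
    The attrition *rate* is identical either way; only the selection differs."""
    ordered = sorted(scores)
    n_drop = len(scores) // 10
    kept = ordered[n_drop:] if drop_low else ordered[:-n_drop]
    return sum(kept) / len(kept)

# Two groups drawn from the same hypothetical score distribution.
group_a = [random.gauss(50, 10) for _ in range(1000)]
group_b = [random.gauss(50, 10) for _ in range(1000)]

# Identical 10% attrition in both groups, selecting opposite tails:
# in group A the weakest students leave; in group B the strongest do.
mean_a = observed_mean(group_a, drop_low=True)
mean_b = observed_mean(group_b, drop_low=False)

print(round(mean_a - mean_b, 1))  # a gap of several points, from attrition alone
```

A comparison of attrition rates, like the one in the paper's appendix table, would find no difference at all between these two groups, yet the selection alone opens a gap of several points between their observed means.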

(Perhaps it is just a coincidence but it seems that, as economists have played more and more the role of statisticians-at-large, we seem to be seeing more of these "don't worry, everything will balance out" assumptions.)

I want to be careful with the next part because, as mentioned before, I'm not an expert in this field, nor have I gone through the paper in great detail, but think about the following line from the paper:

"The effects of charter attendance are modeled as a function of years spent attending a charter school."

Keep in mind that we appear to have a lot of cases of charters (particularly those with the 'no-excuses' model) singling out students who are likely to do poorly and either forcing them out of the program or encouraging them to leave voluntarily. This probably means that a lot of students who would have been low-score/high-charter-years had they stayed where they were assigned by the lottery have been shifted to the low-score/low-charter-years category.

This isn't my biggest concern with this research -- it isn't even my second biggest -- but it is enough to potentially undermine the validity of the research.


Sunday, March 1, 2015

Nimoy tribute on MeTV

One of these days, I would love to spend some time discussing the many clever ideas of Weigel Broadcasting. (Keep in mind, Carl Reiner has called MeTV's promos "brilliant.") The company provides a fascinating case study of a well-run business.

Unfortunately, this post is time-sensitive, so I'll limit myself to a quick DVR alert.



I particularly recommend the Man From UNCLE episode, which also features William Shatner and Werner Klemperer and is simply a lot of fun.

MeTV also ran the Star Trek episode "Amok Time" last night and dusted off this irreverent but affectionate spot.





It's the kick that sells it.

Friday, February 27, 2015

I was about to slam Krugman for ignoring meaningful counter-examples...

I generally like Paul Krugman a great deal, partially because I have a high tolerance for quality snark and partially because... well, let's save that for later. Sometimes, though, for lack of a better description, he writes like an economist. By this I (somewhat unfairly) mean that he is occasionally too quick to embrace the sweeping and aesthetically pleasing theory that collapses under scrutiny. I have mainly noticed this tendency when he ventures out of econ or when he is summarizing the work of colleagues.

Recent case in point (or so I thought).
[Sherwin] Rosen’s argument, more than 30 years ago, was that technology was leading to sharp increases in inequality among performers of any kind, because the reach of talented individuals was vastly increased by mass media. Once upon a time, he said, all comedians had to entertain live audiences in the Borscht Belt; some drew bigger, better-paying crowds than others, but there were limits to the number of people one comic could reach, and hence limits on the disparity in comedian incomes. In modern times, however, an especially funny guy can reach millions on TV; an especially talented band can sell records around the world; hence the emergence of a skewed income distribution with huge rewards for a few.
There is undoubtedly some truth to this, but there are huge counter-examples as well, and substantial parts of the entertainment industry where the hypothesized relationships don't hold at all. I was all set to skewer Krugman over these problems when he had to go and say this:
But the more I look into this, the less I think this story works, at least for music.
He then goes on to show how the theory breaks down, particularly when placed in the context of the general economy.

Here's my favorite example.
But are the big incomes of music superstars something new, or at least a late 20th-century development? Well, let’s take an example where there are pretty good numbers: Jenny Lind, the famous soprano, who toured America from 1850 to 1852.

Tickets at Lind’s first concert sold for an average of about 6 dollars, which seems to have been more or less typical during the tour. Adjusting for inflation, that’s the equivalent of around $180 today, which isn’t too shabby (a lot of the indie concerts I go to are $15-20, although they also make money on beer). But you also want to bear in mind that real incomes and wages were much lower, so that these were actually huge ticket prices relative to typical incomes.

Overall, Lind was paid about $350,000 for 93 concerts, or a bit less than $4,000 a concert. If we adjust for the rise in GDP per capita since then, this was the equivalent of around $2 million a concert today. In other words, to a first approximation Jenny Lind = Taylor Swift. And this was in an era not only without recordings, but without amplification, so that the size of audiences was limited by the acoustics of the halls and the performer’s voice projection.
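Krugman's back-of-the-envelope conversion is easy to check. The GDP-per-capita ratio below is my own rough assumption (nominal US GDP per capita of very roughly $100 circa 1850 versus roughly $53,000 today), not a figure from his post:

```python
# Rough check of the Jenny Lind numbers quoted above.
total_pay = 350_000                   # dollars, for the 1850-52 American tour
concerts = 93
per_concert = total_pay / concerts    # a bit less than $4,000 per concert

# Assumed ratio of US GDP per capita today to circa 1850 -- an
# order-of-magnitude guess, not a figure from Krugman's post.
gdp_per_capita_ratio = 530

modern_equivalent = per_concert * gdp_per_capita_ratio
print(f"${per_concert:,.0f} then is roughly ${modern_equivalent / 1e6:.1f} million per concert today")
```

Under that assumed ratio, the arithmetic lands right around the $2 million per concert Krugman cites; the conclusion is robust to the exact ratio chosen, since anything in the neighborhood puts Lind firmly in modern-superstar territory.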
Which brings me around to that other reason I like Krugman.

I believe it was in one of the plausible reasoning books that George Pólya observed that, as a general principle, if you gave most people a rule they would usually start trying to think of examples; if you gave a mathematician a rule, he or she would generally start trying to think of exceptions.

At the risk of making a sweeping statement as part of an attack on sweeping statements, one of my biggest problems with the economist-as-statistician-at-large trend (see Levitt et al.) is that so few of them think like one of Pólya's mathematicians. Krugman, for all his other flaws, is the kind of writer who tends to notice exceptions.

Thursday, February 26, 2015

Bondholders as Stakeholders

I agree with Dean Dad that this is a really major development -- the idea that bondholders will directly be able to act as stakeholders in higher education is a very big deal.  Consider: 
Which is where a financial issue becomes a governance issue.  Suddenly, “shared” governance isn’t just shared with people on campus, or in the legislature.  Now it’s shared with bondholders, and those bondholders have different priorities and varying degrees of patience.  Unlike the other participants in shared governance, they may not have any particular obligation to the other parties at hand.  It might not be worth their while to go for the quick kill, but that’s prudence, rather than deference.  They aren’t big on deference, as a group.
It also means that institutions will become subject, even more so, to all of the economic pressures of the corporate world.  One of the few things that made higher education uniquely valuable was the ability to resist institutional change.  It seems paradoxical that this would be the case in an organization devoted to innovation, but higher education has always been focused on the long game and not the short game. 

Now this one case could well be an outlier and this could all blow over.  But it is worth thinking very carefully about how this will play out in an environment where schools are strapped for cash. 

Wednesday, February 25, 2015

Quick Post: Financial Advice

Just a quick hit today, from Matt Yglesias, discussing whether investment advisors should be regulated to give helpful advice:
This of course raises the question of what it is that brokers who serve the middle class — people at mass market brokerages who pick up the phone when you dial the number on your company's 401(k) site — are doing to make money. The answer is that they are earning a living marketing financial products that are profitable to their employer and disguising the marketing as advice.
I think that this is entirely correct.  The idea that this sort of regulation could eliminate or reduce the number of financial advisors is not surprising.  It'd replace them with salespeople, which would limit the amount of potential confusion. 

I also think that the quote Matt includes at the top of his post is telling in a very different dimension:
"While concerns about improper actions by investment advisors should certainly be addressed, an overly broad proposal could price professional financial advice beyond the reach of many modest income families."
The theory behind things like the 401(k) is that people will be able to make better investment decisions than, for example, the state, and thus that private savings will work better than, for example, Social Security. However, if the advice needed to save successfully using financial instruments is out of the reach of the middle class (when regulated so that advisors need to act in the best interests of their clients), then that rather undermines the entire thesis.

It's been a quiet story, but the implications for policy are enormous. 


CBS joins the terrestrial superstation ranks

[There's a Car 54 marathon coming up on Decades. Every episode of William Faulkner's favorite show starting March 3rd. Just wanted to get that out of the way.]

We haven't hit this one for a while so perhaps a bit of review is in order.

Back in 2009, the US finally caught up with the rest of the world and switched over to digital broadcast television. One of the many largely unreported results was that, since over-the-air broadcasters could carry multiple channels on the same signal, the satellite superstation model could be extended to terrestrial television.

At first, the field was limited to one well-respected but minor regional player called Weigel Broadcasting, which in rapid succession launched the TBS-style movie channel ThisTV and the TVLand-style retro channel MeTV. Weigel had what appeared to be no external marketing budget, instead relying on walk-ins and word-of-mouth (their internal marketing was a different story, with no less an authority than Carl Reiner calling their station promos 'brilliant').

Terrestrial superstations received almost no coverage outside of trade publications and a few industry-heavy towns like Chicago. The lack of coverage was perhaps not surprising given the absence of promotion and the downscale demographics of the market, but it raised a potentially troubling issue. The broadcast television industry occupies a valuable piece of virtual real estate. The telecom industry was lobbying hard for a chance to grab that portion of the spectrum. The national press (particularly in the Northeast) was discussing the possibility of shutting down terrestrial TV while being completely unaware of what was going on in the medium.

The debate over what to do with the spectrum quickly came down to two narratives. The first was that the over-the-air market was tiny and rapidly shrinking and that its resources could be better used elsewhere. This argument, supported by Nielsen data, had lots of powerful friends and was widely promoted. The counterargument, supported by the market research firm GfK, was that the market had grown sharply since the conversion to digital. Under this scenario, selling off the television spectrum would kill a fledgling industry, reduce media diversification, and remove a service that greatly improves the quality of life for the bottom quartile in order to slightly improve things for the top. Rajiv Sethi may have been the first major blogger to take the OTA side. Our blog also jumped in early in the debate.

(You can find a summary of the argument here. Make sure to check the comment section.)

Given the huge discrepancy between the Nielsen and GfK numbers, I suggested that we should watch what companies with high-quality proprietary data (particularly ad revenue) were doing. Two early indicators were NBC's terrestrial superstation COZI and the Fox/Weigel joint venture Movies!.

The comically inept COZI was of interest primarily because it is part of the same corporate family as the cable company Comcast. Movies! was far more notable, both for its quality and innovation and for the business arrangement that spawned it.

The Fox/Weigel deal was really something unusual, perhaps even unique at the time (more on that in a minute). At first glance, Weigel seemed to bring nothing to the table. Fox had the money, the stations, the library and at least as much experience putting together channels. If Fox were treating this as just another cable station, the deal would make no sense, but Movies! launched as a terrestrial superstation, and in that area Weigel had an unmatched track record.

Since then, the number of terrestrial superstations has continued to grow. In addition to numerous smaller players, major studios like Sony and MGM entered the market, and now one of the biggest, smartest, and most cautious majors has decided to give the model a try.
NEW YORK and CHICAGO – The CBS Television Stations group, a division of CBS Corporation (NYSE: CBS.A and CBS), and Weigel Broadcasting today announced plans to launch DECADES, a new national entertainment programming service for distribution across local television stations’ digital subchannels – broadcast channels that utilize a local station’s available spectrum to provide a companion to that station’s primary channel.  For example, in the New York market, WCBS-TV will continue to be available digitally as Channel 2.1 and DECADES will be available as Channel 2.2. In addition to being available as an over-the-air broadcast channel, DECADES will appear on numerous local cable systems and other multichannel video programming distribution services along with the stations’ primary channels.

Utilizing a library of more than 100 classic television series, including select titles from the CBS library such as I LOVE LUCY from the 1950s, STAR TREK from the 1960s, HAPPY DAYS from the 1970s and CHEERS from the 1980s, as well as a wide selection of theatrical and made-for-television movies and footage of historical news events from the archives of CBS News and ENTERTAINMENT TONIGHT, DECADES will provide viewers with a new way to experience our shared historical and cultural past.

As the ultimate TV time machine, DECADES will differentiate itself from other subchannel programming services by varying the classic series and movies that appear on the network every day.     
“DECADES is the most ambitious and creative subchannel programming service that has ever been created,” said Peter Dunn, President, CBS Television Stations. “We are thrilled to partner with Weigel Broadcasting, the leaders in this space, to make smart use of our stations’ spectrums and our companies’ considerable programming assets. This service will be a tremendous new business for CBS and all of the other stations across the country that participate, regardless of their primary network affiliation.”
...
DECADES will take viewers into a daily time capsule presentation of entertainment, popular culture and news. The service will feature DECADES RETROSPECTICAL (SM), a daily one-hour program that will be produced around the news events and cultural touchstones of a specific day, week or other time frame or theme. The TV series and movies presented each day will reflect that day’s theme or commemorative event.

For example, DECADES will look back at classic series such as HAPPY DAYS and its “jump the shark” episode, explain its historical significance and then broadcast that episode. Viewers will also be taken back in time to rediscover events that shaped our world, such as the assassination of President John F. Kennedy, Neil Armstrong walking on the moon, the Beatles’ U.S. debut on THE ED SULLIVAN SHOW and the birth of software and technology companies like Microsoft and Apple. DECADES will connect these events to what people were watching on television, seeing at the movies and experiencing as a nation.
Even more than the Fox Movies! deal, Decades shows how much Weigel has come to be recognized as the dominant player in the terrestrial television market. As with the earlier collaboration, CBS would seem to be the one bringing everything to the table: the name, the money, the stations, the library, even expertise (keep in mind that in an earlier incarnation, CBS/Viacom* virtually invented the retro genre in the Eighties with Nick at Nite, followed by TVLand).

The decision not only to start a MeTV-style station but actually to bring in a competitor to run it is enormously telling: first, as an indication of Weigel's standing, and second, as an illustration of how much the terrestrial subchannel market is seen as both distinct and important.

We can probably never say whether Nielsen or GfK got it right, but we can say that the companies with the best proprietary data seem to see a future in rabbit ears.

* CBS and Viacom are not exactly the same company these days, but they are basically owned by the same people.

Tuesday, February 24, 2015

Skimming the cream -- a history lesson from Charles Pierce

This could be the starting point for all sorts of interesting discussions, from the role of government sponsored research to the profound and ubiquitous technological advances that clustered around the end of the Nineteenth and the beginning of the Twentieth Century.

For liberal political blogger Charles Pierce (the source of the following passage), it's another reason to object to Scott Walker's approach to higher education.
Up until the 1890's, dairy farming was a sucker's game. Milk was sold to the factories by volume; farmers could cheat by skimming the cream, or by watering down the product. Honest dairy farmers producing good milk got cheated pretty badly in this system. In 1890, however, a man named Stephen Babcock developed a simple test by which, through the use of sulfuric acid and a centrifuge, any farmer could measure the butterfat content of his milk. This caused such a boom in the dairy industry that Wisconsin did indeed become America's Dairyland. In collaboration with another scientist, Babcock also developed a method for cold-curing cheese that helped the state become so prolific at producing cheesy comestibles that people now wear mock-ups on their heads at football games. He also did some revolutionary work with cattle feed that became the basis for the development of the concept of vitamins.

Babcock did all of this because he worked for the Wisconsin Agricultural Experiment Station, which had been founded in 1883 as part of the University of Wisconsin's land-grant mission under the Morrill Act. This was a precursor to the agricultural extension services that were developed at other land-grant institutions after the passage of the Smith-Lever Act in 1914. The land-grant mission, which was to provide an education that would be useful to the public at large, dovetailed perfectly with what became known as The Wisconsin Idea -- that the boundaries of the university are the boundaries of the state, an idea that Scott Walker has dedicated himself to tossing into the wood chipper. And thus it is that butterfat undermines the very raison d'etre of Scott Walker's entire political career and the very basis of his political philosophy. QED.

Also, moo.

Monday, February 23, 2015

Driverless cars may actually be getting closer

This announcement has me intrigued.


Today, Volvo announced a real, on-the-streets test of 100 of its self-driving cars — a first in the world, and one that will put regular owners in the seats of what it says are production-ready autonomous vehicles, by 2017.

Doing so requires far more than the 28 cameras, sensors and lasers Volvo says its system uses, along with a complex set of software rules, to tackle nearly 100 percent of all driving situations. It also required the approval of lawmakers in Sweden and Gothenburg, the city which will allow owners of these Volvos to legally cruise the streets while reading or chatting away on their phones from behind the wheel.

Making it possible for computers to understand everyday driving situations requires multiple types of radars, several cameras, a multiple-beam laser scanner in the front bumper and 12 ultrasonic sensors — the kind normally used to tell you if you're about to back into a pole. All of these are permanently linked to a special high-definition 3D map, refined GPS sensors and the local traffic control office — which can not only warn of jams, but command inattentive drivers to shut off their autopilots and drive themselves if necessary. And all of the systems have fail-safe modes and backups in case something goes wrong.
It is always risky to say "this is the right way to do this." With that in mind, the right way to talk about technology pretty much always revolves around the following:

Functionality;

Costs;

Implementation and infrastructure;

And the new technology's place in the existing technology landscape.

Most technology reporters (and I mean the vast majority) don't grasp these fundamental principles, which leads them, more often than not, to get their stories backwards. In this case, the reporter, Justin Hyde, takes the attitude "wow, it has a special high-definition 3-D map" when the appropriate response would have been "damn, it still needs a special high-definition 3-D map." Requiring special infrastructure, even really cool special infrastructure, is a bug, not a feature.

That said, this announcement does make me a bit more optimistic about the technology, at least in part because it didn't come from Google.

Google has a lot of reasons to want to be seen as a diversified, broadly innovative technology company, rather than as a very good one-trick pony cashing in on a monopoly (possibly two monopolies depending on how you want to count YouTube). A shiny reputation helps to keep stock prices high and regulators at bay.

Google has always been good at branding and they do have an extraordinary track record of innovation, but their really impressive advances (natural language processing, mapping, data mining) are closely related to their core business. The further away you move from search engines, the bigger the hype-to-substance ratio gets. This is nowhere more true than with driverless cars. The last round of publicity showed that the company could get as much buzz out of a cosmetic change (removing the steering wheel years after having demonstrated hands-free driving) as it did with the genuine breakthroughs of its earlier model.

Volvo's core competency is making not only cars, but very safe cars. They have tons of relevant experience and engineering talent and a much larger stake in getting a viable product on the road. What's more, they seem more serious about getting the legal barriers out of the way. I still think that having a fully autonomous car generally available by the end of the decade is a long shot, but those odds might be getting a little better.

Friday, February 20, 2015

Checking in with MovieBob

I've been working on some video projects lately and putting quite a bit of thought into what makes a video podcast good. This has given me another excuse to spend too much time going through Bob Chipman videos. Chipman, a.k.a. MovieBob, has the obsessive love for and knowledge of pop culture that marks the ultimate nerd, but unlike, say, virtually all of the writers for the Onion's A.V. Club, he somehow has managed to maintain a sense of perspective on the subject accompanied by a refreshing amount of common sense.

In addition to keeping his sense of perspective about the fan-boy fodder, Chipman also does the same with the business of entertainment. He understands how things like intellectual property and antitrust laws...

marketing seasons...

bad accounting...

and technical limitations can affect our culture in subtle and interesting ways.

Chipman also displays that same sense of perspective and common sense when discussing more controversial issues.

Thursday, February 19, 2015

I don't think you want to go with the "handful" defense

Before we go on, a quick caveat. There is tremendous variation in charter school models and philosophies. That's a big part of the story below and the reporter does a poor job addressing it. I can't say for certain, but I suspect that most of the worst offenders in the story follow the popular "no excuses" model.

From the New York Times:
The Advocates for Children report cites complaints from parents who said their children had been suspended from charter schools over minor offenses such as wearing the wrong shoes or laughing while serving detention. Ultimately, though, the group said the main issue was legal.

Half of the policies examined by Advocates for Children let charter schools suspend or expel students for being late or cutting class — punishments the group said violated state law. At three dozen schools, there were no special rules covering the suspension or expulsion of children with disabilities, which the group said violated federal law. And in 25 instances, charter schools could suspend students for long periods without a hearing, which the group said violated the United States and New York State Constitutions, as well as state law.

James D. Merriman, chief executive of the New York City Charter School Center, an advocacy group for charter schools, questioned how frequently the incidents cited by Advocates for Children occur.

“No one can disagree that those policies that do not fully meet applicable law should be amended,” he said in an email. “But it is tremendously unfair to suggest, as A.F.C. does, that a handful of one-sided anecdotes compiled over a long time are any evidence that charter schools are wholesale violating civil rights laws.”
I know I've made this point before but it bears repeating: excessively harsh disciplinary policies can make incompetent administrators look good while taking a horrible toll on kids. By locking out or chasing away the kids they can't handle (who also tend to be the kids who most need our help), administrators can pump up virtually all of a school's metrics.

Fortunately, in my experience, most administrators are too ethical to rely on these methods. Unfortunately, we have started setting up a system of incentives that encourages unethical behavior, and if we continue, that balance will shift.

Wednesday, February 18, 2015

The politics of that pile of old comics

As mentioned before, writer and historian Mark Evanier is arguably the go-to guy for pop culture when it comes to both comics and television. One of his areas of particular expertise is the career of his friend, Jack Kirby.

The following excerpt confirms some assumptions I've had about the politics of Silver Age Marvel.
So when someone asks what Captain America would have felt about some topic, the first question is, "Which Captain America?" If the character's been written by fifty writers, that makes fifty Captain Americas, more or less…some closely in sync with some others, some not. And even a given run of issues by one creator or team is not without its conflicts. When Jack was plotting and pencilling the comic and Stan Lee was scripting it, Stan would sometimes write dialogue that did not reflect what Jack had in mind. The two men occasionally had arguments so vehement that Jack's wife made him promise to refrain. As she told me, "For a long time, whenever he was about to take the train into town and go to Marvel, I told him, 'Remember…don't talk politics with Stan.' Neither one was about to change the other's mind, and Jack would just come home exasperated." (One of Stan's associates made the comment that he was stuck in the middle, vis-a-vis his two main collaborators. He was too liberal for Steve Ditko and too conservative for Kirby.)

Jack's own politics were, like most Jewish men of his age who didn't own a big company, pretty much Liberal Democrat. He didn't like Richard Nixon and he really didn't like the rumblings in the early seventies of what would later be called "The Religious Right." At the same time, he thought Captain America represented a greater good than the advancement of Jack Kirby's worldview.

During the 1987 Iran-Contra hearings, Jack was outraged when Ollie North appeared before Congress and it wasn't just because North lied repeatedly or tried to justify illegal actions. Jack thought it was disgraceful that North wore his military uniform while testifying. The uniform, Jack said, belonged to every man and woman who had ever worn it (including former Private First Class Jack Kirby) and North had no right to exploit it the way he did. I always thought that comment explained something about the way Kirby saw Captain America. Cap, obviously, should stand for the flag and the republic for which it stands but — like the flag — for all Americans, not merely those who wish to take the nation in some exclusionary direction.
We've already been over Ditko's Randian views.

I also knew that Lee, who is a bit of a revisionist, had overstated some of the progressive positions he had taken on issues like racism while downplaying the red-baiting and sexism. Marvel apologists have also tried to explain away the more reactionary aspects of these stories but they are pretty difficult to ignore and it appears that most of them can be credited to Lee. (Kirby never had Lee's gift for self-promotion or reinvention and he preferred to let his work speak for itself -- always a risky approach in a collaborative medium.)

For more thoughts on the subject, check out this piece by one of my favorite critics/pop historians, Bob Chipman (more from Chipman later).


You should note that the red-baiting version of the character was done by Lee with no involvement from Kirby.

Tuesday, February 17, 2015

Secret Origins -- College Humor

As mentioned before, I'm a long time fan of the site College Humor.

[Slightly NSFW]

But I always found the name a bit odd until I came across this post from Dick Cavett's much lamented NYT blog:

Woody Allen has said that of the greats, Groucho had the richest number of gifts. He could sing, dance and act, and beyond those fairly common gifts, when you add the distinctive voice, faultless instinct for wording, genius wit, hilarious physical movement, rich supply of expressions and physical “takes” — and the list goes on — it arguably adds up to the most supremely gifted comedian of our time.

And there’s one thing more. He could write. A born scribe. And many a Groucho fan is unaware of the degree to which this was true.

This problem has been put to bed by Bader’s book. (Full disclosure: I know Rob from the masterful job he did putting together the “Dick Cavett Show” DVD sets.) Bader, too, can write, and in a fresh, humorous, scholarly and entertaining way, with shrewd analysis and observations about the products of Groucho’s pen and typewriter.

If your reaction to this is, “So what did he write?” this book holds the answer. In his early years, and aside from his books, Groucho’s written pieces appeared widely, including in the beloved magazine College Humor and, yes, The New Yorker. Bader has found and retrieved priceless specimens of Groucho’s impressively large output from all over, some of the pieces early enough to have been bylined “Julius H. Marx,” Groucho’s vrai nom. Open the book to any page and try not to laugh.
A quick trip to Wikipedia filled in the details:

College Humor was a popular American humor magazine from the 1920s to the 1940s. Published monthly by Collegiate World Publishing, it began in 1920 with reprints from college publications and soon introduced new material, including fiction. Contributors included Robert Benchley, Heywood Broun, Groucho Marx, Ellis Parker Butler, Katherine Brush, F. Scott Fitzgerald and Zelda Fitzgerald. Editor H.N. Swanson later became Fitzgerald's Hollywood agent.

The magazine featured cartoons by Sam Berman, Ralph Fuller, John Held Jr., Otto Soglow and others.
I suppose this could be a coincidence, but if not, that's awfully good company.


Monday, February 16, 2015

Repost -- Selection on Spinach*

As part of a follow-up to this recent post by Joseph, I'm going to be discussing the role of attrition in the charter school debate. To get the conversation started, I'll be reposting a couple of earlier entries.

______________________________________________

[I have the nagging feeling that I'm not using the proper terminology with the following but the underlying concepts should be clear enough. At least for a blog post.]

Let's talk about three levels of selection effects:

The first is initial selection. At this level, certain traits of potential subjects influence the likelihood of their being included in the study. If you ask for volunteers in person, you will end up underrepresenting shy people. If you use mail surveys, you will underrepresent the homeless.

The second level comes after a study starts. You will frequently lose subjects over time. This type of selection is particularly dangerous because you cannot assume that the likelihood of dropping out is independent of the target variable. The issue comes up all the time in medical studies. For serious conditions, a turn for the worse can make it extremely difficult to continue treatment. The result is that the people who stick around till the end of the study are far more likely to be those who were getting better.

(Up until now, the types of selection bias we have discussed, though potentially serious, are generally not deliberate. Their consequences are unpredictable and they happen to even the best and most conscientious of researchers. That is no longer the case with level three.)

The third level concerns attempts to manipulate attrition so as to affect the results of a study. In these cases, researchers will attempt to get rid of those subjects who are likely to drag down the average. This is blatant data cooking and it can be remarkably effective. In school administration, the term of art is "counseling out." It is shockingly widespread, particularly among the "no excuses" charter schools.

The effect of this practice on kids can be brutal but that is a topic for another post. What interests us here are the statistical concerns; what are the analytic implications of this policy? In terms of direction, the answer is simple: schools that engage in these policies will see their test scores artificially inflated. In terms of magnitude, there is really no telling. The potential for distortion here is huge, particularly when you take into account the possibility of peer effects.
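The direction of this distortion can be seen in a toy simulation. The numbers below are entirely hypothetical (a made-up cohort of 100 students with normally distributed scores); the point is only to show that dropping the bottom of a class inflates the reported average even when no individual student improves at all:

```python
import random

random.seed(42)

# Hypothetical cohort: 100 students with "true" test scores,
# mean 50, standard deviation 10.
scores = [random.gauss(50, 10) for _ in range(100)]

# Level-three selection: the school "counsels out" the bottom third.
# Attrition here is driven directly by the outcome variable itself.
survivors = sorted(scores)[len(scores) // 3:]

full_mean = sum(scores) / len(scores)
survivor_mean = sum(survivors) / len(survivors)

print(f"full cohort mean: {full_mean:.1f}")
print(f"survivor mean:    {survivor_mean:.1f}")
# The reported (survivor) mean is higher than the true cohort mean,
# even though not a single student's score changed.
```

And this sketch actually understates the problem, since it ignores the peer effects mentioned above: in a real school, removing the lowest performers can also change the scores of the students who remain.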

Put bluntly, in a case like this one ("The first Success graduating class, for example, had just 32 students. When they started first grade in August 2006, those pupils were among 73 enrolled at the school"), data showing above-average results are almost meaningless.

[A few weeks ago, I put out a collection of our early posts on education (Things I Saw at the Counter-Reformation).  The impact of attrition is one of the big running themes.]



*Spinach being, in this case, a substance that greatly increases the power of a given effect.

Repost -- Selection effects on steroids

As part of a follow-up to this recent post by Joseph, I'm going to be discussing the role of attrition in the charter school debate. To get the conversation started, I'll be reposting a couple of earlier entries.

______________________________________________

I'm about to have a lot more to say about the various ways high attrition can pump up a school's performance metrics, some directly through removing low performers, some indirectly through peer effects, treatment interactions and accounting tricks. At the risk of spoiling the punchline of those future posts, it is next to impossible to perform meaningful analyses of the academic quality of high-attrition schools. About the only safe conclusion is that those schools are worse than they look.

If charter schools are going to have a future (and I hope that they will, though my reasons will have to wait for another post), they will have to overcome two existential threats, both of which originated not with their critics but with their supporters. It was supporters who pushed a radical deregulation agenda that led to massive looting of the system, and it was supporters who advocated for a flawed system where success was defined solely by metrics and those metrics were easily cooked by methods which took a brutal toll on kids.

In a devastating post, Diane Ravitch spells out just how bad the problem has gotten.
Reformers tend to make two very different arguments about charter schools. Argument #1 is that charter schools serve the same students as public schools and manage to put public schools to shame by producing amazingly better results on standardized exams. Therefore, reformers claim, if only public schools did what charter schools do (or better yet, if all public schools were closed and charter schools took over), student learning would dramatically increase and America might even beat South Korea or Finland on international standardized tests. When it is pointed out that, as a whole, charters do no better than public schools on standardized tests [2], reformers will quickly turn their attention to specific charter chains that, they claim, do indeed produce much better standardized test results. So what’s the deal with these chains? Well, in every case that has been subjected to scrutiny their results are extremely suspicious. Here is a short list of examples:

1. Achievement First in New Haven had a freshman class of 64 students (2 students enrolled later), and only 25 graduated- a 38% graduation rate- yet the school claimed a 100% graduation rate by ignoring the 62% attrition rate. [3]

2. Denver School of Science and Technology (DSST) had a freshman class of 144 students and only 89 12th graders- a 62% graduation rate- yet the school (and Arne Duncan) claimed a 100% graduation rate by ignoring the 38% attrition rate. [4] As a 6-12 charter chain, DSST also manages to attrite vast numbers of their middle school students before they even enter the high school.

3. Uncommon Schools in Newark disappears 38% of its general test takers from 6th to 8th grade.[5] Another analysis found that through high school the attrition rate was, alarmingly, much higher “Uncommon loses 62 to 69% of all males and up to 74% of Black males.”[6]

4. BASIS in Arizona- “At…BASIS charter school in Tucson, the class of 2012 had 97 students when they were 6th graders. By the time those students were seniors, their numbers had dwindled to 33, a drop of 66%. At BASIS Scottsdale…its class of 2012 fell from 53 in the 6th grade to 19 in its senior year, a drop of 64%.” [7]

5. The Noble Network in Chicago- “Every year, the graduating class of Noble Charter schools matriculates with around 30 percent fewer students than they started with in their freshman year.” [8]

6. Harmony Charters in Texas- “Strikingly, Harmony lost more than 40% of 6th grade students over a two-year time.” [9]

7. KIPP in San Francisco- “A 2008 study of the (then-existing) Bay Area KIPP schools by SRI International showed a 60% attrition rate…the students who left were overwhelmingly the lower achievers.” [10]

8. KIPP in Tennessee had 18% attrition in a single year! “In fact, the only schools that have net losses of 10 to 33 percent are charter schools.” [11]

In every case these charter chains accepted students that were significantly more advantaged than the typical student in the district, and then the charters attrited a significant chunk of those students.

Success Academy in New York City plays the same game. It accepts many fewer high needs special education students, English Language Learners, and poor students. [12] It attrites up to 1/3 of its students before they even get to testing grades and then loses students at an even faster pace. It selectively attrites those students most likely to get low scores on standardized tests. [13] It is legally permitted to mark its own exams (as are all New York City charter schools) while public schools cannot. It loses 74% of its teachers in a single year at some of its schools. [14] The author of the Daily News editorial that sparked the initial blog commented “even in the aggregate that wouldn’t seem to account for” the results. It is entirely unclear what he means by “in the aggregate.” But it is clear that he has his arithmetic wrong. A charter chain that starts with an entering class that is likely to score well on standardized tests, then selectively prunes 50% or more of the students who don’t score well on standardized tests and refuses to replace the disappeared students with others, can easily show good standardized test results with the remaining students. Any school could do this. It’s really not rocket science.
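The arithmetic behind the graduation-rate claims in Ravitch's examples is simple enough to spot-check. The cohort sizes below come straight from the excerpt above; the function is just the obvious ratio, rounded to a whole percent:

```python
def grad_rate(entering, graduating):
    """Graduation rate as a percentage of the entering cohort."""
    return round(100 * graduating / entering)

# Achievement First, New Haven: 64 freshmen plus 2 late enrollees,
# 25 graduates -- the school's claimed rate was 100%.
print("Achievement First:", grad_rate(64 + 2, 25), "%")

# Denver School of Science and Technology: 144 freshmen, 89 seniors.
print("DSST:", grad_rate(144, 89), "%")
```

Counting from the entering class rather than the surviving one turns the claimed 100% rates into 38% and 62%, matching the figures in the quote.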


And here are the footnotes



[1] https://dianeravitch.net/2014/08/22/is-eva-moskowitz-the-lance-armstrong-of-education/
[2] http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/09/24/the-bottom-line-on-charter-school-studies/
[3] http://jonathanpelto.com/2013/05/30/another-big-lie-from-achievement-first-100-percent-college-acceptance-rate/
[4] http://garyrubinstein.wordpress.com/2014/04/16/arne-debunkin/
[5] http://schoolfinance101.wordpress.com/2010/12/02/truly-uncommon-in-newark/
[6] http://danley.rutgers.edu/2014/08/11/guest-post-where-will-all-the-boys-go/
[7] http://blogforarizona.net/basis-charters-education-model-success-by-attrition/
[8] http://jerseyjazzman.blogspot.com/2012/04/no-bull-in-chicago.html
[9] http://fullerlook.wordpress.com/2012/08/23/tx_ms_charter_study/
[10] http://parentsacrossamerica.org/high-kipp-attrition-must-be-part-of-san-francisco-discussion/
[11] http://www.wsmv.com/story/22277105/charter-schools-losing-struggling-students-to-zoned-schools
[12] https://dianeravitch.net/2014/03/12/fact-checking-evas-claims-on-national-television/
[13] https://dianeravitch.net/2014/02/28/a-note-about-success-academys-data/. The high attrition rate before testing in 3rd grade may explain the data pattern noted in this http://shankerblog.org/?p=10346#more-10346 analysis.
[14] http://www.citylimits.org/news/articles/5156/why-charter-schools-have-high-teacher-turnover#.U_gqR__wtMv
[15] http://edexcellence.net/commentary/education-gadfly-daily/flypaper/2013/the-charter-expulsion-flap-who-speaks-for-the-strivers.html
[16] http://schoolfinance101.wordpress.com/2012/12/03/when-dummy-variables-arent-smart-enough-more-comments-on-the-nj-credo-study/