Monday, March 9, 2015

MOOCs and the Eugen Weber Paradox

As always seems to happen when I have other things I need to be doing, all sorts of interesting threads have started popping up and saying "Blog me! Blog me!"

Case in point, Erik Loomis of LGM has gotten back on the MOOC beat. I've got a couple of original posts on the subject in the works, but first I want to bring an old post from the teaching blog back into the conversation. It addresses what I think may be the fundamental questions of the ed-reform-through-technology debate:

After over a century of experimenting with educational technology, why have the results up until now been so underwhelming?

And how will the new approaches being proposed fix the problems that plagued all of those previous attempts?


The Eugen Weber Paradox


If you follow education at all, you've probably heard about the rise of online courses and their potential for reinventing the way we teach. The idea is that we can make lectures from the best schools in the world available through YouTube or some similar platform. It's not a bad idea, but before we start talking about how much this can change the world, consider the following more-serious-than-it-sounds point.

Let's say, if we're going to do this, we do it right. Find a world-renowned historian who's also a skilled and popular lecturer, shoot the series with decent production values (a couple of well-operated cameras, simple but professional pan and zoom), just polished enough not to distract from the content.

And if we're going to talk about democratizing education, let's not spend our time on some tiny niche course like "Building a Search Engine." Instead, let's do a general ed class with the widest possible audience.

If you'll hold that thought for a moment...

A few years ago, while channel surfing in the middle of the night, I came across what looked like Harvey Korman giving a history lesson. It turned out not to be Korman, but it was a history lesson, and an extraordinarily good one by a historian named Eugen Weber, described by the New York Times as "one of the world's foremost interpreters of modern France." Weber was also a formidable teacher known for popular classes at UCLA.



The program I was watching was "The Western Tradition," a fifty-two-part video course originally produced for public television in 1989. If you wanted to find the ideal lecturer for a Western Civ class, it would probably be Eugen Weber. Like Pólya, Weber combined intellectual standing of the first order with an exceptional gift and passion for teaching. On top of that, the Annenberg Foundation put together a full set of course materials to go with it. This is about as good as video instruction gets.

All of which raises a troubling question. As far as I know, relatively few schools have set up a Western Civ course around "The Western Tradition." Given the high quality and low cost of such a course, why isn't it a standard option at more schools?

Here are a few possible explanations:

1. Medium is the message

There are certain effects that only work on stage, that fall strangely flat when there's no audience physically present in the room. Maybe something similar holds with lectures -- something is inevitably lost when they're moved to another medium.

2. Lecturers already work for kind words and Pez

Why should administrators go to the trouble of developing new approaches when they can get adjuncts to work for virtually nothing?

3. It's that treadmill all over again

You probably know people who have pinned great hopes on home exercise machines, people who showed tremendous excitement about getting fit, then lost all interest when they actually brought the Bowflex home and talking about exercise had to be replaced by doing it. Lots of technological solutions are like that. The anticipation is fun; the work required once you get the thing isn't.

This is not a new story. One of the original missions of educational TV back in the NET days was to provide actual classroom instruction, particularly for rural schools.* The selection was limited, and it was undoubtedly a pain for administrators to match class schedules with broadcast schedules, but the basic idea (and most of the accompanying rhetoric) was the same as many of the proposals we've been hearing recently.

Of course, educational television was just one of a century of new media and manipulatives that were supposed to revolutionize education. Film, radio, mechanical teaching machines, film strips and other mixed media, visual aids, television, videotape, distance learning, computer-aided instruction, DVDs, the internet, tablet computing. All of these efforts had some good ideas behind them, and many actually did real good in the classroom, but none of them lived up to expectations.

Is this time different? Perhaps. It's possible that greatly expanded quantity and access may push us past some kind of a tipping point, but I'm doubtful. We still haven't thought through the deeper questions about what makes for effective instruction and why certain educational technologies tend to under-perform. Instead we get the standard ddulite boilerplate, made by advocates who are blissfully unaware of how familiar their claims are to anyone reasonably versed in the history of education.

* From Wikipedia
 The Arkansas Educational Television Commission was created in 1961, following a two-year legislative study to assess the state's need for educational television. KETS channel 2 in Little Rock, the flagship station, signed on in 1966 as the nation's 124th educational television station. In the early years, KETS was associated with National Educational Television, the forerunner of the current PBS. The early days saw black-and-white broadcasting only, with color capabilities beginning in 1972. Limited hours of operation in the early years focused primarily on instructional programming for use in Arkansas classrooms.


More on Mars One -- I expect this from ABC News but Sheldon?

There's been another wave of PR in support of the privately funded "Mars mission" Mars One (and yes, I do need to use quotation marks). There have been news stories, interviews with applicants who did or didn't qualify for the "mission," (NPR, how could you?) and even fictional characters like Castle and, sadly, Sheldon Cooper ("The Colonization Application").

Just to review, not only is this mission almost certain never to happen, but every major aspect of it collapses under scrutiny.

The funding goals are wildly unrealistic, the budget estimates are comically optimistic, and what little technology has actually been proposed is so badly designed that, according to an MIT study, it would be likely to kill all the colonists within a few months. I am pretty sure Howard would have pointed all of these things out to Sheldon.

You could also find some of these objections in this piece from ABC, but you'd have to look closely because the reporters buried them as deep as possible, just far enough from the end to allow Mars One CEO/confidence man Bas Lansdorp to have the last word.

Obviously this is a fun story, a lottery where anyone can become a colonist to Mars, made even more dramatic by the twist of being a one-way trip. I also get that this is a story many, probably most, of us would like to believe. That is a high enough standard to justify a hook on a TV episode, but it is an embarrassingly low one for major news outlets.

Friday, March 6, 2015

The Golden Age of PR

I usually check out Jonathan Chait's blog once or twice a week and I usually ignore the "Most Viewed Stories" column to the right of the page. Recently, though, one of the items caught my eye.

"Who Was That at the End of the New Avengers Trailer, and Why Should You Be So Excited?"


The link led to a Vulture.com post about the comic book character the Vision, followed by some speculation about what part he might play in the upcoming Avengers movie. Looking over the article, it struck me that this is an amazing time to be a publicist.  We have gone from publications hyping movies to hyping trailers to hyping two-second shots in trailers.

Thursday, March 5, 2015

Cartoon Metalogic

Bud Grace, creator of the comic strip the Piranha Club, is a former nuclear physicist -- No, really, look it up -- and every now and then a bit of STEM humor pops up.

For investors, all money isn't created equal

In our most recent discussion of driverless cars, I made the following assertion about Google:
Google has a lot of reasons to want to be seen as a diversified, broadly innovative technology company, rather than as a very good one-trick pony cashing in on a monopoly (possibly two monopolies depending on how you want to count YouTube). A shiny reputation helps to keep stock prices high and regulators at bay.
I didn't really think of it at the time, but this concern is a point we have touched on tangentially in the past and which probably deserves more direct scrutiny. Investors often care a great deal about where a company's money comes from. This concern is often neither rational nor consistent, and it frequently leads companies to mislead the public about the makeup of their revenues.

Here are a couple of examples. I am going to be rather vague about some details because, you know, lawyers, but the broad outlines are accurate and the circumstances are common enough that I could always find other cases if pressed. The first involved a financial services company that had products for customers at both ends of the economic spectrum. If you were to look at the company as an outsider or even as a new employee, you might very well assume that the two divisions were roughly equal. You might even suspect that the upscale division was more profitable.

In reality, a large majority of the company's profits came from the low end. It turned out that the profit margin on services for poor customers was, in this case, much higher. Stockholders, however, did not particularly like products that served this demographic, and having a heavily promoted line of products for upper-class customers did wonders for the stock price.

Here's another example:
The bank in question was in the middle of a very good run, making a flood of money from its credit card line, but investors kept complaining that the bank was making all that money the wrong way. This was the height of the Internet boom but the bank was booking all of these profitable accounts through old-fashioned direct mail. If it wanted to maximize its stock price, the bank needed to start booking accounts online.

The trouble was that (at least at the time) issuing credit cards over the Internet was a horrible idea. The problem was fraud. With direct mail, the marketer decides who to contact and has various ways to check that a customer's card is in fact going to that customer. With a website, it was the potential customers who initiated contact and a stunning number of those potential customers were identity thieves.

The Internet was an excellent tool for account management, but the big institutional investors were adamant; they wanted to see the bank booking accounts online. Faced with the choice between unhappy investors and a disastrous business move, the company came up with a truly ingenious solution: they added a feature that let people who received a pre-approved credit card offer fill out the application online.

Just to be absolutely clear, this service was limited to people who had been solicited by the bank and, based on the response rates, the people who went online were basically the same people who would have applied anyway. From a net acquisitions standpoint, it had little or no impact.

From an investor relations standpoint, however, it accomplished a great deal. Everyone who filled out one of those applications and was approved* was counted as an online acquisition. Suddenly the bank was using this metric to bill itself as one of the leading Internet providers. This satisfied the investors (who had no idea how cosmetic the change was) and allowed the bank to continue to follow its highly profitable business plan (which was actually a great deal more sophisticated than the marketing techniques of many highly-touted Internet companies).

*'pre-approved' actually means 'almost pre-approved.'
Put bluntly, companies will often pursue strategies or introduce products that are profit neutral or worse because these strategies and products make the companies look diversified or forward thinking or poised to take advantage of some major opportunity. Investors reward these perceptions. With this fact in mind, you can make sense of all sorts of strange business decisions.

For example, Amazon is an innovative, well-run, forward-thinking company, but its P/E ratio (when it turns a profit) is often in the hundreds,* meaning the company has to be seen as being on the cusp of explosive growth. When you read about the company's online grocery service or its proposed drone-to-door deliveries** and you ask yourself how they can ever make a profit doing this on a large scale, the answer may be that they don't expect to.


* There may be some controversy over how the P/E ratio is calculated for Amazon, but that's a topic for another post and probably another blogger.

** I added "drone-to-door" to emphasize the distinction between that proposed technology and large cargo drones. The latter actually does make business sense but would face huge regulatory hurdles.
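To get a feel for what a triple-digit P/E implies, here's a hypothetical back-of-the-envelope calculation (all numbers are invented for illustration, not estimates of Amazon's actual prospects): assume earnings grow at a constant rate for a decade, after which the stock trades at an ordinary multiple of 15, and discount at 8%.

```python
# Back-of-the-envelope: what annual earnings growth would justify a
# triple-digit P/E? All parameters here are hypothetical illustrations.

def implied_pe(growth, years=10, discount=0.08, terminal_pe=15):
    """Price-to-current-earnings ratio if earnings grow at `growth`
    for `years`, then the stock trades at a normal terminal multiple,
    with the future value discounted back to today."""
    future_earnings = (1 + growth) ** years   # earnings per $1 of today's
    price = terminal_pe * future_earnings / (1 + discount) ** years
    return price

for g in (0.05, 0.25, 0.50):
    print(f"{g:.0%} annual growth -> implied P/E of about {implied_pe(g):.0f}")
```

Under these toy assumptions, ordinary 5% growth supports a P/E of only about 11, while a P/E around 400 requires earnings to grow roughly 50% a year for a decade -- which is why a company priced that way has to keep looking poised for explosive growth.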

Wednesday, March 4, 2015

Elegant theories and headless clowns -- more bad tech reporting from the New York Times

The previously mentioned Paul Krugman piece on opera singer Jenny Lind included a link to this NYT article "adapted from The Price of Everything: Solving the Mystery of Why We Pay What We Do, by Eduardo Porter, an editorial writer for The New York Times." Krugman was criticizing the reliance on simple economic stories that don't fit the facts. Porter was telling one.
Baseball aficionados might conclude that all of this points to some pernicious new trend in the market for top players. But this is not specific to baseball, or even to sport. Consider the market for pop music. In 1982, the top 1 percent of pop stars, in terms of pay, raked in 26 percent of concert ticket revenue. In 2003, that top percentage of stars — names like Justin Timberlake, Christina Aguilera or 50 Cent — was taking 56 percent of the concert pie.
...

But broader forces are also at play. Nearly 30 years ago, Sherwin Rosen, an economist from the University of Chicago, proposed an elegant theory to explain the general pattern. In an article entitled “The Economics of Superstars,” he argued that technological changes would allow the best performers in a given field to serve a bigger market and thus reap a greater share of its revenue. But this would also reduce the spoils available to the less gifted in the business.

The reasoning fits smoothly into the income dynamics of the music industry, which has been shaken by many technological disruptions since the 1980s. First, MTV put music on television. Then Napster took it to the Internet. Apple allowed fans to buy single songs and take them with them. Each of these breakthroughs allowed the very top acts to reach a larger fan base, and thus command a larger audience and a bigger share of concert revenue.
Putting aside the fact that, as Krugman pointed out, we have examples of superstar musicians that predate both recording and broadcasting, this paragraph is still stunningly incomplete and comically ill-informed.

The 1980s cutoff is arbitrary and misleading. The 1880s would make more sense, though it wasn't until the 1890s that things really took off, and it has been a fairly steady stream of technological innovations ever since.

Here's a brief, roughly chronological view of some of the highlights:

Disc records

Amplification

Radio

Optical sound tracks on film

Stereo

FM

LPs

HiFi

Television (which brought with it everything from Hit Parade to American Bandstand, Sullivan, Midnight Special and countless shows like this)

Cassettes

CDs

Recordable CDs

Affordable digital audio editing

And then, of course, a whole family of internet-based innovations.

The past 125 years have been one long stream of "technological disruptions" for the music industry, but most of the innovation over the past couple of decades has mainly broadened the market by increasing selection and lowering production costs. In terms of "allow[ing] the very top acts to reach a larger fan base, and thus command a larger audience and a bigger share of concert revenue," at least for the North American and European audience, the top acts have been near saturation since the Sixties. (Check out the ratings for Elvis or the Beatles on Sullivan.)

By looking at the past thirty years of advances and ignoring the previous ninety, Porter gives us a blatant example of headless clown causal reasoning, arguing that x explains the difference in A and B because x is present in A while ignoring the fact that x is also present in B. Data journalism has fully embraced the idea that two numbers briefly moving in sync constitutes a causal argument.



The phrase "elegant theory" should have set off the red flags and warning lights. Elegance in these books pretty much always means "simplistic and unrealistic." The theories are aesthetically and emotionally appealing but they just barely fit the data in their examples and they usually fall apart completely when taken out on the road.

As previously mentioned, this goes back to what George Pólya called (in a quote I really need to dig up) thinking like a mathematician. Pólya suggested that the default setting of most people when presented with a rule is to look for examples, while the default setting of mathematicians and scientists is to look for exceptions. Mathematical ideas get a tremendous amount of press these days, but very few of the people covering them think like mathematicians.

Tuesday, March 3, 2015

Epidemiology Research

I am a big fan of Aaron Carroll, who often blogs at The Incidental Economist.  However, in his latest New York Times column he says:
Most of the evidence for that recommendation has come from epidemiologic studies, which can be flawed.

Use of these types of studies happens far more often than we would like, leading to dietary guidelines that may not be based on the best available evidence. But last week, the government started to address that problem, proposing new guidelines that in some cases are more in line with evidence from randomized controlled trials, a more rigorous form of scientific research.
So when did randomized controlled trials stop being a part of epidemiology?  It comes as news to me, as someone who has done this type of work as an epidemiologist.  In particular, there are threats to validity in trials too, and a lot of smart causal inference research has looked at them as well.  Trials also have concerns about cross-over, attrition, and even valid design.  These elements are all part of a typical epidemiological education and are an important part of public health practice.  Even meta-analyses, in which trials (and now sometimes observational studies) are pooled, are typical parts of epidemiology.

He seems to want to equate epidemiology with observational research alone.

There is also a difference of estimands.  The trials can only assess interventions in diet and how they perform.  The real (true) intake of the participants is always approximated, except perhaps in a metabolic ward.  Even doubly labeled water studies need to make assumptions. 

The real bugbear of nutritional research in humans is measurement error.  It is present in all studies (even trials, which are much less susceptible to bias than cohort or case-control studies).  That is a lot of what we struggle with in this research area.

Now, it is true (and I agree with Aaron Carroll completely) that the trials tell us a lot of what we want to know.  In a real sense we want to know how dietary interventions, as they will actually work out in reality, will change outcomes.  So I share his concern that the trials seemed to be overlooked by people writing guidelines.  Or, in other words, I think his main conclusion is quite sensible.

But let's not forget that observational research is critical to understanding the patterns of diet that generate the hypotheses and interventions that can actually be tested in trials.  It also gives a lot of insight into how people consume food in the state of nature.  I am never going to stop focusing on high-quality data and the most rigorous possible study designs.  But I think it would be wiser to represent the ecosystem more completely.

On the other hand, I am not an expert in health care communications, and it might be that such broad strokes are necessary when focusing on the general public.  After all, improving public health is everyone's goal, and I am happy to take a few "hits" if that is the ultimate outcome.  But I think it's also a challenge to think about how to make this type of research, and the nuances in it, better understood in general.

I have a lot of thinking to do. 

Monday, March 2, 2015

Defining away concerns about charter school attrition

[New information has come in and we may be making some changes to this post.]

After what seems like a long time, we are back on the bad education statistics beat. Joseph kicked things off with this post discussing some recent charter school research, particularly this paper by Angrist et al. I followed by reposting a couple of earlier pieces on attrition.

If you didn't see them when they came out, I strongly recommend you take a minute and read those two reposts (Selection effects on steroids and Selection on Spinach*). This is a big, incredibly complex story and it makes much more sense if you come in with some context.

I also want to say explicitly that I am not singling out the Angrist paper for criticism. It is, if anything, above average for the field; that's the scary part. I have a number of concerns about this study but they are all problems that you find in much, if not most of the research on charter schools.

Let's start with attrition and this passage from the paper. The first half of the paragraph mostly seems to be pretty good, except for one red flag [emphasis added].
A second potential threat to the validity of lottery-based estimates is the differential loss to follow-up between winners and losers (also called differential attrition). Students in our study are lost to follow-up if they are missing the MCAS score data we use to measure charter achievement effects. This usually happens when a student moves out of state or to a private school. Attrition can bias lottery-based estimates if different types of students are more likely to leave the sample depending on lottery results.
There are a couple of fairly subtle points here (since I'm not an expert on this research, you might want to dig up a copy of the original paper -- I believe mine is behind a paywall -- and check my work). The first centers on the various reasons why a student might miss one or more standardized tests. The researchers deserve some credit for mentioning the private school option, but they don't seem to quantify it, nor do they mention reasons like changing schools, which are much more likely than interstate moves to interact in a problematic way.

Easier to miss but far more important is the decision to define attrition as leaving the data set rather than leaving the program. This isn't necessarily wrong, but it's incomplete and worrisome in at least two ways: first, because it differs from what we might call the standard definition -- if you Google "charter school student attrition," you will generally find stories about students leaving charter schools and moving to other schools; second, because the more common definition of attrition is far more likely to cause problems that can invalidate this study.

The rest of the paragraph is more troubling.
 For instance, losers might be more likely to leave than winners, and highly motivated students might be more likely to opt for private school if they lose. We therefore compare the likelihood that winners and losers have an outcome test score in our data. There are no statistically significant differences in follow-up rates in the lottery sample schools, a result documented in Appendix Table A3. It therefore seems unlikely that differential attrition has an impact on the lottery-based results.
That "seems unlikely" is very hard to justify. Putting aside for a moment the issue of definitions, you can't control for differential attrition this way. It is entirely possible for two groups to have roughly the same level of attrition and yet have the selection effects going in opposite directions. Furthermore, the kind of highly selective attrition we're talking about here is very powerful (particularly if you throw in peer effects). Even if the selective attrition is limited to one group, it is entirely possible for a statistically insignificant difference in attrition rates to lead to a substantial difference in outcomes.
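A toy simulation makes the point concrete. The numbers below are entirely hypothetical: both lottery groups lose exactly the same share of students, and the true charter effect is zero by construction, but because the selection runs in opposite directions (weak students leave the winners' group, strong students leave the losers' group), the observed gap is large.

```python
import random

random.seed(0)

N = 10_000
# Latent test-score "ability"; true charter effect is zero by construction.
winners = sorted(random.gauss(0, 1) for _ in range(N))  # won the lottery
losers = sorted(random.gauss(0, 1) for _ in range(N))   # lost the lottery

drop = int(0.20 * N)  # identical 20% attrition in both groups

# Winners: the weakest students leave (e.g., counseled out of the charter).
winners_observed = winners[drop:]
# Losers: the strongest students leave (e.g., opt for private school).
losers_observed = losers[:-drop]

mean = lambda xs: sum(xs) / len(xs)
gap = mean(winners_observed) - mean(losers_observed)
print(f"Attrition rate: 20% in both groups")
print(f"Apparent 'charter effect': {gap:.2f} standard deviations")
```

With these made-up numbers, a comparison of follow-up rates would show no difference at all between the groups, yet the estimated effect comes out around two-thirds of a standard deviation when the real effect is zero.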

(Perhaps it is just a coincidence, but as economists have increasingly played the role of statisticians-at-large, we seem to be seeing more of these "don't worry, everything will balance out" assumptions.)

I want to be careful with the next part because, as mentioned before, I'm not an expert in this field, nor have I gone through the paper in great detail, but think about the following line from the paper:

"The effects of charter attendance are modeled as a function of years spent attending a charter school."

Keep in mind that we appear to have a lot of cases of charters (particularly those with the 'no-excuses' model) singling out students who are likely to do poorly and either forcing them out of the program or encouraging them to leave voluntarily. This probably means that a lot of students who would have been low-score/high-charter-years had they stayed where they were assigned by the lottery have been shifted to the low-score/low-charter-years category.

This isn't my biggest concern with this research -- it isn't even my second biggest -- but it is enough to potentially undermine the validity of the research.


Sunday, March 1, 2015

Nimoy tribute on MeTV

One of these days, I would love to spend some time discussing the many clever ideas of Weigel Broadcasting. (Keep in mind, Carl Reiner has called MeTV's promos "brilliant.") The company provides a fascinating case study of a well-run business.

Unfortunately, this post is time-sensitive, so I'll limit myself to a quick DVR alert.



I particularly recommend the Man From UNCLE episode, which also features William Shatner and Werner Klemperer and is simply a lot of fun.

MeTV also ran the Star Trek episode "Amok Time" last night and dusted off this irreverent but affectionate spot.





It's the kick that sells it.

Friday, February 27, 2015

I was about to slam Krugman for ignoring meaningful counter-examples...

I generally like Paul Krugman a great deal, partially because I have a high tolerance for quality snark and partially because... well, let's save that for later. Sometimes, though, for lack of a better description, he writes like an economist. By this I (somewhat unfairly) mean that he is occasionally too quick to embrace the sweeping and aesthetically pleasing theory that collapses under scrutiny. I have mainly noticed this tendency when he ventures out of econ or when he is summarizing the work of colleagues.

Recent case in point (or so I thought).
[Sherwin] Rosen’s argument, more than 30 years ago, was that technology was leading to sharp increases in inequality among performers of any kind, because the reach of talented individuals was vastly increased by mass media. Once upon a time, he said, all comedians had to entertain live audiences in the Borscht Belt; some drew bigger, better-paying crowds than others, but there were limits to the number of people one comic could reach, and hence limits on the disparity in comedian incomes. In modern times, however, an especially funny guy can reach millions on TV; an especially talented band can sell records around the world; hence the emergence of a skewed income distribution with huge rewards for a few.
There is undoubtedly some truth to this, but there are huge counter-examples as well, and substantial parts of the entertainment industry where the hypothesized relationships don't hold at all. I was all set to skewer Krugman over these problems when he had to go and say this:
But the more I look into this, the less I think this story works, at least for music.
He then goes on to show how the theory breaks down, particularly when placed in the context of the general economy.

Here's my favorite example.
But are the big incomes of music superstars something new, or at least a late 20th-century development? Well, let’s take an example where there are pretty good numbers: Jenny Lind, the famous soprano, who toured America from 1850 to 1852.

Tickets at Lind’s first concert sold for an average of about 6 dollars, which seems to have been more or less typical during the tour. Adjusting for inflation, that’s the equivalent of around $180 today, which isn’t too shabby (a lot of the indie concerts I go to are $15-20, although they also make money on beer). But you also want to bear in mind that real incomes and wages were much lower, so that these were actually huge ticket prices relative to typical incomes.

Overall, Lind was paid about $350,000 for 93 concerts, or a bit less than $4,000 a concert. If we adjust for the rise in GDP per capita since then, this was the equivalent of around $2 million a concert today. In other words, to a first approximation Jenny Lind = Taylor Swift. And this was in an era not only without recordings, but without amplification, so that the size of audiences was limited by the acoustics of the halls and the performer’s voice projection.
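The arithmetic in the quoted passage is easy to reproduce. The deflators below are rough round numbers chosen to match Krugman's figures, not official price or GDP series:

```python
# Reproducing the back-of-the-envelope Jenny Lind numbers.
# The two deflators are rough approximations, not official series.

ticket_1851 = 6.0
cpi_factor = 30          # ~1850s -> today consumer-price inflation
print(f"Ticket in today's dollars: ${ticket_1851 * cpi_factor:.0f}")

total_fee, concerts = 350_000, 93
per_concert = total_fee / concerts
gdp_pc_factor = 530      # approximate growth in nominal GDP per capita
modern_equiv = per_concert * gdp_pc_factor
print(f"Per concert: ${per_concert:,.0f} then, or roughly "
      f"${modern_equiv / 1e6:.1f} million in GDP-per-capita terms")
```

The key choice is the deflator: scaling the $6 ticket by consumer prices gives about $180, while scaling the per-concert fee by GDP per capita (appropriate when comparing incomes relative to the society around them) gives something in the neighborhood of $2 million.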
Which brings me around to that other reason I like Krugman.

I believe it was in one of the plausible reasoning books that George Pólya observed that, as a general principle, if you gave most people a rule they would usually start trying to think of examples; if you gave a mathematician a rule, he or she would generally start trying to think of exceptions.

At the risk of making a sweeping statement as part of an attack on sweeping statements, one of my biggest problems with the economist-as-statistician-at-large trend (see Levitt et al.) is that so few of them think like one of Pólya's mathematicians. Krugman, for all his other flaws, is the kind of writer who tends to notice exceptions.

Thursday, February 26, 2015

Bondholders as Stakeholders

I agree with Dean Dad that this is a really major development -- the idea that bondholders will directly be able to act as stakeholders in higher education is a very big deal.  Consider: 
Which is where a financial issue becomes a governance issue.  Suddenly, “shared” governance isn’t just shared with people on campus, or in the legislature.  Now it’s shared with bondholders, and those bondholders have different priorities and varying degrees of patience.  Unlike the other participants in shared governance, they may not have any particular obligation to the other parties at hand.  It might not be worth their while to go for the quick kill, but that’s prudence, rather than deference.  They aren’t big on deference, as a group.
It also means that institutions will become subject, even more so, to all of the economic pressures of the corporate world.  One of the few things that made higher education uniquely valuable was the ability to resist institutional change.  It seems paradoxical that this would be the case in an organization devoted to innovation, but higher education has always been focused on the long game and not the short game. 

Now this one case could well be an outlier and this could all blow over.  But it is worth thinking very carefully about how this will play out in an environment where schools are strapped for cash. 

Wednesday, February 25, 2015

Quick Post: Financial Advice

Just a quick hit today, from Matt Yglesias, discussing whether investment advisors should be regulated to give helpful advice:
This of course raises the question of what it is that brokers who serve the middle class — people at mass market brokerages who pick up the phone when you dial the number on your company's 401(k) site — are doing to make money. The answer is that they are earning a living marketing financial products that are profitable to their employer and disguising the marketing as advice.
I think that this is entirely correct.  The idea that this sort of regulation could eliminate or reduce the number of financial advisors is not surprising.  It would, in effect, replace them with people openly acting as salespeople, which would at least limit the potential for confusion. 

I also think that the quote Matt includes at the top of his post is telling in a very different dimension:
"While concerns about improper actions by investment advisors should certainly be addressed, an overly broad proposal could price professional financial advice beyond the reach of many modest income families."
The theory behind things like the 401(k) is that people will be able to make better investment decisions than, for example, the state.  Thus private savings would work better than, for example, Social Security.  However, if the advice needed to save successfully using financial instruments is out of the reach of the middle class (once advisors are regulated so that they must act in the best interests of their clients), then that rather undermines the entire thesis. 

It's been a quiet story, but the implications for policy are enormous. 


CBS joins the terrestrial superstation ranks

[There's a Car 54 marathon coming up on Decades. Every episode of William Faulkner's favorite show starting March 3rd. Just wanted to get that out of the way.]

We haven't hit this one for a while so perhaps a bit of review is in order.

Back in 2009, the US finally caught up with the rest of the world and switched over to digital broadcast television. One of the many largely unreported results was that, since over-the-air broadcasters could carry multiple channels on the same signal, the satellite superstation model could be extended to terrestrial television.

At first, the field was limited to one well-respected but minor regional player called Weigel Broadcasting, which in rapid succession launched the TBS-style movie channel ThisTV and the TVLand-style retro channel MeTV. Weigel had what appeared to be no external marketing budget, instead relying on walk-ins and word of mouth (their internal marketing was a different story, with no less an authority than Carl Reiner calling their station promos 'brilliant').

Terrestrial superstations received almost no coverage outside of trade publications and a few industry-heavy towns like Chicago. The lack of coverage was perhaps not surprising given the absence of promotion and the downscale demographics of the market, but it raised a potentially troubling issue. The broadcast television industry occupies a valuable piece of virtual real estate. The telecom industry was lobbying hard for a chance to grab that portion of the spectrum. The national press (particularly in the Northeast) was discussing the possibility of shutting down terrestrial TV while being completely unaware of what was going on in the medium.

The debate over what to do with the spectrum quickly came down to two narratives. The first was that the over-the-air market was tiny and rapidly shrinking and that its resources could be better used elsewhere; this argument, supported by Nielsen data, had lots of powerful friends and was widely promoted. The counterargument, supported by the market research firm GfK, was that the market had grown sharply since the conversion to digital. Under this scenario, selling off the television spectrum would kill a fledgling industry, reduce media diversification and remove a service that greatly improves the quality of life for the bottom quartile in order to slightly improve things for the top. Rajiv Sethi may have been the first major blogger to take the OTA side.  Our blog also jumped in early in the debate.

(You can find a summary of the argument here. Make sure to check the comment section.)

Given the huge discrepancy between the Nielsen and GfK numbers, I suggested that we should watch what companies with high-quality proprietary data (particularly ad revenue) were doing. Two early indicators were NBC's terrestrial superstation COZI and the Fox/Weigel joint venture Movies!.

The comically inept COZI was of interest primarily because it is part of the same corporate family as the cable company Comcast. Movies! was far more notable, both for its quality and innovation and for the business arrangement that spawned it.

The Fox/Weigel deal was really something unusual, perhaps even at the time unique (more on that in a minute). At first glance, Weigel seemed to bring nothing to the table. Fox had the money, the stations, the library and at least as much experience putting together channels. If Fox were treating this as just another cable station, the deal would make no sense, but Movies! launched as a terrestrial superstation, and in that area Weigel had an unmatched track record.

Since then the number of terrestrial superstations has continued to grow. In addition to numerous smaller players, major studios like Sony and MGM have entered the market, and now one of the biggest, smartest and most cautious majors has decided to give the model a try.
NEW YORK and CHICAGO – The CBS Television Stations group, a division of CBS Corporation (NYSE: CBS.A and CBS), and Weigel Broadcasting today announced plans to launch DECADES, a new national entertainment programming service for distribution across local television stations’ digital subchannels – broadcast channels that utilize a local station’s available spectrum to provide a companion to that station’s primary channel.  For example, in the New York market, WCBS-TV will continue to be available digitally as Channel 2.1 and DECADES will be available as Channel 2.2. In addition to being available as an over-the-air broadcast channel, DECADES will appear on numerous local cable systems and other multichannel video programming distribution services along with the stations’ primary channels.

Utilizing a library of more than 100 classic television series, including select titles from the CBS library such as I LOVE LUCY from the 1950s, STAR TREK from the 1960s, HAPPY DAYS from the 1970s and CHEERS from the 1980s, as well as a wide selection of theatrical and made-for-television movies and footage of historical news events from the archives of CBS News and ENTERTAINMENT TONIGHT, DECADES will provide viewers with a new way to experience our shared historical and cultural past.

As the ultimate TV time machine, DECADES will differentiate itself from other subchannel programming services by varying the classic series and movies that appear on the network every day.     
“DECADES is the most ambitious and creative subchannel programming service that has ever been created,” said Peter Dunn, President, CBS Television Stations. “We are thrilled to partner with Weigel Broadcasting, the leaders in this space, to make smart use of our stations’ spectrums and our companies’ considerable programming assets. This service will be a tremendous new business for CBS and all of the other stations across the country that participate, regardless of their primary network affiliation.”
...
DECADES will take viewers into a daily time capsule presentation of entertainment, popular culture and news. The service will feature DECADES RETROSPECTICAL (SM), a daily one-hour program that will be produced around the news events and cultural touchstones of a specific day, week or other time frame or theme. The TV series and movies presented each day will reflect that day’s theme or commemorative event.

For example, DECADES will look back at classic series such as HAPPY DAYS and its “jump the shark” episode, explain its historical significance and then broadcast that episode. Viewers will also be taken back in time to rediscover events that shaped our world, such as the assassination of President John F. Kennedy, Neil Armstrong walking on the moon, the Beatles’ U.S. debut on THE ED SULLIVAN SHOW and the birth of software and technology companies like Microsoft and Apple. DECADES will connect these events to what people were watching on television, seeing at the movies and experiencing as a nation.
Even more than the Fox Movies! deal, Decades shows how much Weigel has come to be recognized as the dominant player in the terrestrial television market. As with the earlier collaboration, CBS would seem to be the one bringing everything to the table: the name, the money, the stations, the library, even expertise (keep in mind that in an earlier incarnation, CBS/Viacom* virtually invented the retro-genre in the Eighties with Nick-at-Nite, followed by TVLand).

The decision not only to start a MeTV-style station but actually to bring in a competitor to run it is enormously telling, first as an indication of Weigel's standing and second as an illustration of how much the terrestrial subchannel market is seen as both distinct and important.

We can probably never say whether Nielsen or GfK got it right, but we can say that the companies with the best proprietary data seem to see a future in rabbit ears.

* CBS and Viacom are not exactly the same company these days, but they are basically owned by the same people.

Tuesday, February 24, 2015

Skimming the cream -- a history lesson from Charles Pierce

This could be the starting point for all sorts of interesting discussions, from the role of government sponsored research to the profound and ubiquitous technological advances that clustered around the end of the Nineteenth and the beginning of the Twentieth Century.

For liberal political blogger Charles Pierce (the source of the following passage), it's another reason to object to Scott Walker's approach to higher education.
Up until the 1890's, dairy farming was a sucker's game. Milk was sold to the factories by volume; farmers could cheat by skimming the cream, or by watering down the product. Honest dairy farmers producing good milk got cheated pretty badly in this system. In 1890, however, a man named Stephen Babcock developed a simple test by which, through the use of sulfuric acid and a centrifuge, any farmer could measure the butterfat content of his milk. This caused such a boom in the dairy industry that Wisconsin did indeed become America's Dairyland. In collaboration with another scientist, Babcock also developed a method for cold-curing cheese that helped the state become so prolific at producing cheesy comestibles that people now wear mock-ups on their heads at football games. He also did some revolutionary work with cattle feed that became the basis for the development of the concept of vitamins.

Babcock did all of this because he worked for the Wisconsin Agricultural Experiment Station, which had been founded in 1883 as part of the University of Wisconsin's land-grant mission under the Morrill Act. This was a precursor to the agricultural extension services that were developed at other land-grant institutions after the passage of the Smith-Lever Act in 1914. The land-grant mission, which was to provide an education that would be useful to the public at large, dovetailed perfectly with what became known as The Wisconsin Idea -- that the boundaries of the university are the boundaries of the state, an idea that Scott Walker has dedicated himself to tossing into the wood chipper. And thus it is that butterfat undermines the very raison d'etre of Scott Walker's entire political career and the very basis of his political philosophy. QED.

Also, moo.

Monday, February 23, 2015

Driverless cars may actually be getting closer

This announcement has me intrigued.

Today, Volvo announced a real, on-the-streets test of 100 of its self-driving cars — a first in the world, and one that will put regular owners in the seats of what it says are production-ready autonomous vehicles, by 2017.

Doing so requires far more than the 28 cameras, sensors and lasers Volvo says its system uses, along with a complex set of software rules, to tackle nearly 100 percent of all driving situations. It also required the approval of lawmakers in Sweden and Gothenburg, the city which will allow owners of these Volvos to legally cruise the streets while reading or chatting away on their phones from behind the wheel.

Making it possible for computers to understand everyday driving situations requires multiple types of radars, several cameras, a multiple-beam laser scanner in the front bumper and 12 ultrasonic sensors — the kind normally used to tell you if you're about to back into a pole. All of these are permanently linked to a special high-definition 3D map, refined GPS sensors and the local traffic control office — which can not only warn of jams, but command inattentive drivers to shut off their autopilots and drive themselves if necessary. And all of the systems have fail-safe modes and backups in case something goes wrong.
It is always risky to say "this is the right way to do this." With that in mind, the right way to talk about technology pretty much always revolves around the following:

Functionality;

Costs;

Implementation and infrastructure;

And the new technology's place in the existing technology landscape.

Most technology reporters (and I mean the vast majority) don't grasp these fundamental principles, which leads them, more often than not, to get their stories backwards. In this case, the reporter, Justin Hyde, takes the attitude "wow, it has a special high-definition 3-D map" when the appropriate response would've been "damn, it still needs a special high-definition 3-D map." Requiring special infrastructure, even really cool special infrastructure, is a bug, not a feature.

That said, this announcement does make me a bit more optimistic about the technology, at least in part because it didn't come from Google.

Google has a lot of reasons to want to be seen as a diversified, broadly innovative technology company, rather than as a very good one-trick pony cashing in on a monopoly (possibly two monopolies depending on how you want to count YouTube). A shiny reputation helps to keep stock prices high and regulators at bay.

Google has always been good at branding and they do have an extraordinary track record of innovation, but their really impressive advances (natural language processing, mapping, data mining) are closely related to their core business. The further away you move from search engines, the bigger the hype-to-substance ratio gets. This is nowhere more true than with driverless cars. The last round of publicity showed that the company could get as much buzz out of a cosmetic change (removing the steering wheel years after having demonstrated hands-free driving) as it did with the genuine breakthroughs of its earlier model.

Volvo's core competency is making, not only cars, but very safe cars. They have tons of relevant experience and engineering talent and a much larger stake in getting a viable product on the road. What's more, they seem more serious about getting the legal barriers out of the way. I still think that having a fully autonomous car generally available by the end of the decade is a long shot, but those odds might be getting a little better.

Friday, February 20, 2015

Checking in with MovieBob

I've been working on some video projects lately and putting quite a bit of thought into what makes a video podcast good. This has given me another excuse to spend too much time going through Bob Chipman videos. Chipman, a.k.a. MovieBob, has the obsessive love for and knowledge of pop culture that marks the ultimate nerd, but unlike, say, virtually all of the writers for the Onion's A.V. Club, he somehow has managed to maintain a sense of perspective on the subject accompanied by a refreshing amount of common sense.

In addition to keeping his sense of perspective about the fan-boy fodder, Chipman also does the same with the business of entertainment. He understands how things like intellectual property and antitrust laws...

marketing seasons...



bad accounting...



and technical limitations can affect our culture in subtle and interesting ways.



Chipman also displays that same sense of perspective and common sense when discussing more controversial issues.

Thursday, February 19, 2015

I don't think you want to go with the "handful" defense

Before we go on, a quick caveat. There is tremendous variation in charter school models and philosophies. That's a big part of the story below and the reporter does a poor job addressing it. I can't say for certain, but I suspect that most of the worst offenders in the story follow the popular "no excuses" model.

From the New York Times:
The Advocates for Children report cites complaints from parents who said their children had been suspended from charter schools over minor offenses such as wearing the wrong shoes or laughing while serving detention. Ultimately, though, the group said the main issue was legal.

Half of the policies examined by Advocates for Children let charter schools suspend or expel students for being late or cutting class — punishments the group said violated state law. At three dozen schools, there were no special rules covering the suspension or expulsion of children with disabilities, which the group said violated federal law. And in 25 instances, charter schools could suspend students for long periods without a hearing, which the group said violated the United States and New York State Constitutions, as well as state law.

James D. Merriman, chief executive of the New York City Charter School Center, an advocacy group for charter schools, questioned how frequently the incidents cited by Advocates for Children occur.

“No one can disagree that those policies that do not fully meet applicable law should be amended,” he said in an email. “But it is tremendously unfair to suggest, as A.F.C. does, that a handful of one-sided anecdotes compiled over a long time are any evidence that charter schools are wholesale violating civil rights laws.”
I know I've made this point before but it bears repeating: excessively harsh disciplinary policies can make incompetent administrators look good while taking a horrible toll on kids. By locking out or chasing away the kids they can't handle (who also tend to be the kids who most need our help), administrators can pump up virtually all of a school's metrics.

Fortunately, in my experience, most administrators are too ethical to rely on these methods. Unfortunately, we have started setting up a system of incentives that encourages unethical behavior, and if we continue, that balance will shift.

Wednesday, February 18, 2015

The politics of that pile of old comics

As mentioned before, writer and historian Mark Evanier is arguably the go-to guy for pop culture when it comes to both comics and television. One of his areas of particular expertise is the career of his friend, Jack Kirby.

The following excerpt confirms some assumptions I've had about the politics of Silver Age Marvel.
So when someone asks what Captain America would have felt about some topic, the first question is, "Which Captain America?" If the character's been written by fifty writers, that makes fifty Captain Americas, more or less…some closely in sync with some others, some not. And even a given run of issues by one creator or team is not without its conflicts. When Jack was plotting and pencilling the comic and Stan Lee was scripting it, Stan would sometimes write dialogue that did not reflect what Jack had in mind. The two men occasionally had arguments so vehement that Jack's wife made him promise to refrain. As she told me, "For a long time, whenever he was about to take the train into town and go to Marvel, I told him, 'Remember…don't talk politics with Stan.' Neither one was about to change the other's mind, and Jack would just come home exasperated." (One of Stan's associates made the comment that he was stuck in the middle, vis-a-vis his two main collaborators. He was too liberal for Steve Ditko and too conservative for Kirby.)

Jack's own politics were, like most Jewish men of his age who didn't own a big company, pretty much Liberal Democrat. He didn't like Richard Nixon and he really didn't like the rumblings in the early seventies of what would later be called "The Religious Right." At the same time, he thought Captain America represented a greater good than the advancement of Jack Kirby's worldview.

During the 1987 Iran-Contra hearings, Jack was outraged when Ollie North appeared before Congress and it wasn't just because North lied repeatedly or tried to justify illegal actions. Jack thought it was disgraceful that North wore his military uniform while testifying. The uniform, Jack said, belonged to every man and woman who had ever worn it (including former Private First Class Jack Kirby) and North had no right to exploit it the way he did. I always thought that comment explained something about the way Kirby saw Captain America. Cap, obviously, should stand for the flag and the republic for which it stands but — like the flag — for all Americans, not merely those who wish to take the nation in some exclusionary direction.
We've already been over Ditko's Randian views.

I also knew that Lee, who is a bit of a revisionist, had overstated some of the progressive positions he had taken on issues like racism while downplaying the red-baiting and sexism. Marvel apologists have also tried to explain away the more reactionary aspects of these stories but they are pretty difficult to ignore and it appears that most of them can be credited to Lee. (Kirby never had Lee's gift for self-promotion or reinvention and he preferred to let his work speak for itself -- always a risky approach in a collaborative medium.)

For more thoughts on the subject, check out this piece by one of my favorite critics/pop historians, Bob Chipman (more from Chipman later).


You should note that the red-baiting version of the character was done by Lee with no involvement from Kirby.

Tuesday, February 17, 2015

Secret Origins -- College Humor

As mentioned before, I'm a long time fan of the site College Humor.

[Slightly NSFW]



But I always found the name a bit odd until I came across this post from Dick Cavett's much lamented NYT blog:

Woody Allen has said that of the greats, Groucho had the richest number of gifts. He could sing, dance and act, and beyond those fairly common gifts, when you add the distinctive voice, faultless instinct for wording, genius wit, hilarious physical movement, rich supply of expressions and physical “takes” — and the list goes on — it arguably adds up to the most supremely gifted comedian of our time.

And there’s one thing more. He could write. A born scribe. And many a Groucho fan is unaware of the degree to which this was true.

This problem has been put to bed by Bader’s book. (Full disclosure: I know Rob from the masterful job he did putting together the “Dick Cavett Show” DVD sets.) Bader, too, can write, and in a fresh, humorous, scholarly and entertaining way, with shrewd analysis and observations about the products of Groucho’s pen and typewriter.

If your reaction to this is, “So what did he write?” this book holds the answer. In his early years, and aside from his books, Groucho’s written pieces appeared widely, including in the beloved magazine College Humor and, yes, The New Yorker. Bader has found and retrieved priceless specimens of Groucho’s impressively large output from all over, some of the pieces early enough to have been bylined “Julius H. Marx,” Groucho’s vrai nom. Open the book to any page and try not to laugh.
A quick trip to Wikipedia filled in the details:

College Humor was a popular American humor magazine from the 1920s to the 1940s. Published monthly by Collegiate World Publishing, it began in 1920 with reprints from college publications and soon introduced new material, including fiction. Contributors included Robert Benchley, Heywood Broun, Groucho Marx, Ellis Parker Butler, Katherine Brush, F. Scott Fitzgerald and Zelda Fitzgerald. Editor H.N. Swanson later became Fitzgerald's Hollywood agent.

The magazine featured cartoons by Sam Berman, Ralph Fuller, John Held Jr., Otto Soglow and others.
I suppose this could be a coincidence, but if not, that's awfully good company.


Monday, February 16, 2015

Repost -- Selection on Spinach*

As part of a follow-up to this recent post by Joseph, I'm going to be discussing the role of attrition in the charter school debate. To get the conversation started, I'll be reposting a couple of earlier entries.

______________________________________________

[I have the nagging feeling that I'm not using the proper terminology with the following but the underlying concepts should be clear enough. At least for a blog post.]

Let's talk about three levels of selection effects:

The first is initial selection. At this level, certain traits of potential subjects influence the likelihood of their being included in the study. If you ask for volunteers in person, you will end up underrepresenting shy people. If you use mail surveys, you will underrepresent the homeless;

The second level comes after a study starts. You will frequently lose subjects over time. This type of selection is particularly dangerous because you cannot assume that the likelihood of dropping out is independent of the target variable. The issue comes up all the time in medical studies. For serious conditions, a turn for the worse can make it extremely difficult to continue treatment. The result is that the people who stick around till the end of the study are far more likely to be those who were getting better;

(Up until now, the types of selection bias we have discussed, though potentially serious, are generally not deliberate. Their consequences are unpredictable and they happen to even the best and most conscientious of researchers. That is no longer the case with level three.)

The third level concerns attempts to manipulate attrition so as to affect the results of a study. In these cases, researchers will attempt to get rid of those subjects who are likely to drag down the average. This is blatant data cooking and it can be remarkably effective. In school administration, the term of art is "counseling out." It is shockingly widespread, particularly among the "no excuses" charter schools.

The effect of this practice on kids can be brutal but that is a topic for another post. What interests us here are the statistical concerns; what are the analytic implications of this policy? In terms of direction, the answer is simple: schools that engage in these policies will see their test scores artificially inflated. In terms of magnitude, there is really no telling. The potential for distortion here is huge, particularly when you take into account the possibility of peer effects.
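A toy simulation makes the direction and rough magnitude of the effect concrete. All the numbers here are hypothetical, chosen only for illustration, not drawn from any real school's data:

```python
import random
import statistics

random.seed(42)

# Hypothetical cohort: 1,000 students with "true" test scores
# drawn from a normal distribution (mean 70, sd 10).
cohort = [random.gauss(70, 10) for _ in range(1000)]

# "Counseling out": suppose the school sheds the bottom 30% of
# performers before the tests that determine its metrics.
cutoff = int(0.3 * len(cohort))
survivors = sorted(cohort)[cutoff:]

true_mean = statistics.mean(cohort)
reported_mean = statistics.mean(survivors)

print(f"True cohort mean:  {true_mean:.1f}")
print(f"Reported mean:     {reported_mean:.1f}")
print(f"Inflation:         {reported_mean - true_mean:.1f} points")
```

Note that this sketch captures only the direct effect of removing low scorers; peer effects would push the reported numbers up even further, which is why the true magnitude of the distortion is so hard to pin down.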

Put bluntly, in cases like this ("The first Success graduating class, for example, had just 32 students. When they started first grade in August 2006, those pupils were among 73 enrolled at the school"), data showing above-average results are almost meaningless.

[A few weeks ago, I put out a collection of our early posts on education (Things I Saw at the Counter-Reformation).  The impact of attrition is one of the big running themes.]



*Spinach being, in this case, a substance that greatly increases the power of a given effect.

Repost -- Selection effects on steroids

As part of a follow-up to this recent post by Joseph, I'm going to be discussing the role of attrition in the charter school debate. To get the conversation started, I'll be reposting a couple of earlier entries.

______________________________________________

I'm about to have a lot more to say about the various ways high attrition can pump up a school's performance metrics, some directly through removing low performers, some indirectly through peer effects, treatment interactions and accounting tricks. At the risk of spoiling the punchline of those future posts, it is next to impossible to perform meaningful analyses of the academic quality of high-attrition schools. About the only safe conclusion is that those schools are worse than they look.

If charter schools are going to have a future (and I hope that they will, though my reasons will have to wait for another post), they will have to overcome two existential threats, both of which originated not with their critics but with their supporters. It was supporters who pushed a radical deregulation agenda that led to massive looting of the system and it was supporters who advocated for a flawed system where success was defined solely by metrics and those metrics were easily cooked by methods which took a brutal toll on kids.

In a devastating post, Diane Ravitch spells out just how bad the problem has gotten.
Reformers tend to make two very different arguments about charter schools. Argument #1 is that charter schools serve the same students as public schools and manage to put public schools to shame by producing amazingly better results on standardized exams. Therefore, reformers claim, if only public schools did what charter schools do (or better yet, if all public schools were closed and charter schools took over), student learning would dramatically increase and America might even beat South Korea or Finland on international standardized tests. When it is pointed out that, as a whole, charters do no better than public schools on standardized tests [2], reformers will quickly turn their attention to specific charter chains that, they claim, do indeed produce much better standardized test results. So what’s the deal with these chains? Well, in every case that has been subjected to scrutiny their results are extremely suspicious. Here is a short list of examples:

1. Achievement First in New Haven had a freshman class of 64 students (2 students enrolled later), and only 25 graduated- a 38% graduation rate- yet the school claimed a 100% graduation rate by ignoring the 62% attrition rate. [3]

2. Denver School of Science and Technology (DSST) had a freshman class of 144 students and only 89 12th graders- a 62% graduation rate- yet the school (and Arne Duncan) claimed a 100% graduation rate by ignoring the 38% attrition rate. [4] As a 6-12 charter chain, DSST also manages to attrite vast numbers of their middle school students before they even enter the high school.

3. Uncommon Schools in Newark disappears 38% of its general test takers from 6th to 8th grade.[5] Another analysis found that through high school the attrition rate was, alarmingly, much higher “Uncommon loses 62 to 69% of all males and up to 74% of Black males.”[6]

4. BASIS in Arizona- “At…BASIS charter school in Tucson, the class of 2012 had 97 students when they were 6th graders. By the time those students were seniors, their numbers had dwindled to 33, a drop of 66%. At BASIS Scottsdale…its class of 2012 fell from 53 in the 6th grade to 19 in its senior year, a drop of 64%.” [7]

5. The Noble Network in Chicago- “Every year, the graduating class of Noble Charter schools matriculates with around 30 percent fewer students than they started with in their freshman year.” [8]

6. Harmony Charters in Texas- “Strikingly, Harmony lost more than 40% of 6th grade students over a two-year time.” [9]

7. KIPP in San Francisco- “A 2008 study of the (then-existing) Bay Area KIPP schools by SRI International showed a 60% attrition rate…the students who left were overwhelmingly the lower achievers.” [10]

8. KIPP in Tennessee had 18% attrition in a single year! “In fact, the only schools that have net losses of 10 to 33 percent are charter schools.” [11]

In every case these charter chains accepted students that were significantly more advantaged than the typical student in the district, and then the charters attrited a significant chunk of those students.

Success Academy in New York City plays the same game. It accepts far fewer high-needs special education students, English Language Learners, and poor students. [12] It attrites up to 1/3 of its students before they even reach the testing grades and then loses students at an even faster pace. It selectively attrites those students most likely to get low scores on standardized tests. [13] It is legally permitted to mark its own exams (as are all New York City charter schools), while public schools are not. It loses 74% of its teachers in a single year at some of its schools. [14] The author of the Daily News editorial that sparked the initial blog post commented that “even in the aggregate that wouldn’t seem to account for” the results. It is entirely unclear what he means by “in the aggregate,” but it is clear that he has his arithmetic wrong. A charter chain that starts with an entering class likely to score well on standardized tests, then selectively prunes 50% or more of the students who don’t score well and refuses to replace the disappeared students with others, can easily show good standardized test results with the remaining students. Any school could do this. It’s really not rocket science.
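To make the mechanism concrete, here is a toy simulation (invented scores, not data from any actual school): start with an entering class, attrite the bottom fifth of scorers each year without backfilling, and watch the survivors' average rise all on its own:

```python
import random

random.seed(0)

# Toy simulation of the pruning mechanism described above: drop the
# lowest scorers each year and never replace them. No teaching effect
# is modeled at all -- the average rises purely through selection.
cohort = [random.gauss(500, 100) for _ in range(100)]  # entering test scores
initial_avg = sum(cohort) / len(cohort)

for year in range(1, 5):
    cohort.sort()
    cohort = cohort[len(cohort) // 5:]  # bottom 20% leave, no backfill
    avg = sum(cohort) / len(cohort)
    print(f"Year {year}: {len(cohort)} students left, mean score {avg:.0f}")
```

After four years, well over half the class is gone and the remaining students' mean score is substantially higher than the entering mean, with zero actual learning gains built into the model.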


And here are the footnotes



[1] https://dianeravitch.net/2014/08/22/is-eva-moskowitz-the-lance-armstrong-of-education/
[2] http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/09/24/the-bottom-line-on-charter-school-studies/
[3] http://jonathanpelto.com/2013/05/30/another-big-lie-from-achievement-first-100-percent-college-acceptance-rate/
[4] http://garyrubinstein.wordpress.com/2014/04/16/arne-debunkin/
[5] http://schoolfinance101.wordpress.com/2010/12/02/truly-uncommon-in-newark/
[6] http://danley.rutgers.edu/2014/08/11/guest-post-where-will-all-the-boys-go/
[7] http://blogforarizona.net/basis-charters-education-model-success-by-attrition/
[8] http://jerseyjazzman.blogspot.com/2012/04/no-bull-in-chicago.html
[9] http://fullerlook.wordpress.com/2012/08/23/tx_ms_charter_study/
[10] http://parentsacrossamerica.org/high-kipp-attrition-must-be-part-of-san-francisco-discussion/
[11] http://www.wsmv.com/story/22277105/charter-schools-losing-struggling-students-to-zoned-schools
[12] https://dianeravitch.net/2014/03/12/fact-checking-evas-claims-on-national-television/
[13] https://dianeravitch.net/2014/02/28/a-note-about-success-academys-data/. The high attrition rate before testing in 3rd grade may explain the data pattern noted in this http://shankerblog.org/?p=10346#more-10346 analysis.
[14] http://www.citylimits.org/news/articles/5156/why-charter-schools-have-high-teacher-turnover#.U_gqR__wtMv
[15] http://edexcellence.net/commentary/education-gadfly-daily/flypaper/2013/the-charter-expulsion-flap-who-speaks-for-the-strivers.html
[16] http://schoolfinance101.wordpress.com/2012/12/03/when-dummy-variables-arent-smart-enough-more-comments-on-the-nj-credo-study/

Friday, February 13, 2015

Estimands and targets of inference

This article by Mike the Mad Biologist is worth reading in full (warning: some strong language). It engages with the issues of charter schools in a very detailed way, with a focus on Massachusetts, a state that seemed to show fewer issues than many other regions of the country.

But he also pointed out a very subtle and important point about one of the key findings of the charter school research area:
For long-time readers, I’m going to try (and probably fail) to be polite–just this once, even though I was somehow supposed to craft a coherent response on Twitter after carefully reading technical articles while at work (by the way, never assume someone is unfamiliar with the literature. But I’m getting ahead of myself). The charge, led by Adam Ozimek, focused on a couple of papers by Angrist et al.–though not all of the papers (again, getting ahead of things…). They’re good papers*, and essentially compare students who wanted to attend charters but lost the lottery and wound up in regular public schools (note that ‘unpopular’ charters can’t be assessed with this method).
But that last fragment brings up a really important point. In the context of popular charter schools, the randomized admissions process is a very good instrument for estimating the average causal effect of the charter school intervention. There are some class-composition issues, which Mark may well comment on, that lead to peer effects. But this is decent evidence.

However, it only applies in regions where the charters actually have a waiting list.  This was also really interesting:
However, students in non-urban charter schools do worse than the ‘lottery losers’ in regular public non-urban schools–and with the same effect as the gains in the urban schools. 
Yes, the average causal effect flips for non-urban environments (the suburbs, for example). It's obviously not appropriate to take the average causal effect from urban schools and apply it to populations where the evidence shows a reversal of that effect. It also means we need to do a lot more study to understand when this intervention leads to better (versus worse) outcomes. And it means that I am now loath to generalize from the places with waiting lists to those without them (where this randomized experiment cannot happen).

So unless there is an issue here that I am missing, this sounds an awful lot like "go slowly and carefully" due to some seriously confusing patterns in the average causal effect. Table 8 is the amazing one, and the effect measure modification is quite startling. So I think this adds to the growing body of reasons one can be pro-children and still skeptical about charter schools.
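A toy calculation (with invented effect sizes, not the paper's actual estimates) shows how badly a pooled average can mislead when the effect flips sign by setting:

```python
# Illustrative numbers only: a positive urban effect and an equally
# large negative non-urban effect, in test-score standard deviations.
urban_effect, nonurban_effect = 0.2, -0.2
urban_share = 0.5  # assumed fraction of lottery students in urban charters

pooled = urban_share * urban_effect + (1 - urban_share) * nonurban_effect
print(f"Pooled average effect: {pooled:+.2f} SD")  # looks like 'no effect'

# Applying the urban estimate to a suburban expansion gets the sign wrong:
print(f"Urban estimate applied to suburbs: {urban_effect:+.2f} SD")
print(f"Suburban estimate itself:          {nonurban_effect:+.2f} SD")
```

With these made-up numbers, the pooled effect is exactly zero, which correctly describes neither setting; that is the sense in which generalizing from the urban lottery results is hazardous.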


Thursday, February 12, 2015

Netflix PR

[I'm coming off of a challenging January and I'm playing catch-up. The following should have run about a month ago, but I think the points are still fairly relevant.]

This is yet another one of those posts where I may seem to be clubbing Netflix, when I'm actually using Netflix to club someone else. In this case, Netflix is doing exactly what the company is supposed to do, using PR to promote its products and brand. My issue is entirely with the people on the other side of the process.

If you have been paying attention, you may have noticed a new subgenre of entertainment journalism: the inexplicably exciting old show (that just happens to be on Netflix). On one level, this is nothing new. PR departments have been ghost-writing stories for reporters since at least the Twenties, but in recent years the flack-to-hack ratio has ticked up noticeably, and few companies make better or more aggressive use of the planted news story than Netflix.

It started getting really blatant for me about the time the company started pushing Young Indiana Jones. For the uninitiated, YIJ was one of the great crash-and-burns of network television history. Despite big budgets and high expectations (coming just three years after Last Crusade), the show flopped so decisively that ABC never even aired the last four episodes. DVDs were released in 2007/2008 to cash in on the buzz around Indiana Jones and the Kingdom of the Crystal Skull, but other than that the show was largely forgotten.

Then Netflix picked it up and we started seeing stories like these:

5 Reasons Why Young Indiana Jones Is Actually Not As Bad As You Think

What to Binge This Weekend: 'The Young Indiana Jones Chronicles'



Netflix has always been very aggressive about drumming up press coverage, particularly involving originals and new releases. Whenever you see a blogger recommending some show you can stream on the service, the chances are very good that the idea for the post originated with some PR firm. As mentioned before, this is nothing new. [I discussed this post with a friend who used to work in the publicity department of one of the major studios (welcome to LA). He pointed out that in the Nineties he frequently wrote press releases that appeared verbatim under reporters' bylines in Variety and the Hollywood Reporter.] You can also argue that the journalistic expectations for the sites mentioned above have never been that high.

However, the lines continue to blur, both between news story and press release and between puff piece and journalism. Which brings us to Esquire.

Friends is, in a sense, the opposite of Young Indiana Jones. The latter is a show that almost no one has heard of; the former is a show that everyone has seen. It is arguably the seminal romantic comedy sitcom. It was massively popular in its day, continues to sell a ton of DVDs, and has been syndicated to the point of full immersion. It will, no doubt, be a popular feature for Netflix, but it is hard to see how adding one more venue qualifies as news.

But Esquire thinks differently:




For those not familiar with the online edition of the magazine, the right side of the page is reserved for recommendations for other articles on the site (and, at the bottom of the page, for sponsored recommendations from other sites, but the ethics of sponsored links are a topic for another day). The emphasis is not on what you'd call hard news -- men's grooming and style feature prominently and the editors manage to work in lots of pictures of Penélope Cruz and Angelina Jolie -- but it's the same sort of fluff that has always filled out the magazine.

So when "Everything You Should Watch in January 2015" came in number six on WHAT TO READ NEXT, you can reasonably consider it a relatively well-promoted news story. You can also assume that the impetus, if not the actual authorship, came from someone working PR for Netflix.

The subtitle of the piece is "In theaters and on video and Netflix Streaming." Note the omission of the major competitors Hulu and Amazon and providers of similar content such as YouTube, CBS.com, PBS.com and a slew of smaller players. Hulu's Wrong Mans seems a particularly obvious choice given all of the attention going to James Corden taking over the Late Late Show from Craig Ferguson.

This type of PR push is normally coordinated with a substantial ad campaign. In this case, the unlimited streaming of Friends (not to be confused with per-episode streaming, which has been available for years) prompted a large ad buy which included, among other things, disinterring a few careers.





None of this is in any way meant as a criticism of Netflix. If anything, it is an example of a company being responsible and portraying its product in the best possible light. It does raise some questions about Esquire's editorial policies, but given that this is the kind of magazine that encourages you to buy four-hundred-dollar watchbands, I think the more alert readers already suspected that these manufacturers might be exerting some influence on the journalists covering them.

All that said, it is useful to remind ourselves from time to time that when news stories work out particularly well for some corporation, that favorable result probably isn't entirely coincidental.

Wednesday, February 11, 2015

Analog recording without analogs

And now for something completely different...

Two examples of artists who manually created their works on media normally used for analog recording.


Conlon Nancarrow


Nevertheless, it was in Mexico that Nancarrow did the work he is best known for today. He had already written some music in the United States, but the extreme technical demands his compositions required meant that satisfactory performances were very rare. That situation did not improve in Mexico's musical environment, also with few musicians available who could perform his works, so the need to find an alternative way of having his pieces performed became even more pressing. Taking a suggestion from Henry Cowell's book New Musical Resources, which he bought in New York in 1939, Nancarrow found the answer in the player piano, with its ability to produce extremely complex rhythmic patterns at a speed far beyond the abilities of humans.

Cowell had suggested that just as there is a scale of pitch frequencies, there might also be a scale of tempi. Nancarrow undertook to create music which would superimpose tempi in cogent pieces and, by his twenty-first composition for player piano, had begun "sliding" (increasing and decreasing) tempi within strata. (See William Duckworth, Talking Music.) Nancarrow later said he had been interested in exploring electronic resources but that the piano rolls ultimately gave him more temporal control over his music.[6]

Temporarily buoyed by an inheritance, Nancarrow traveled to New York City in 1947 and bought a custom-built manual punching machine to enable him to punch the piano rolls. The machine was an adaptation of one used in the commercial production of rolls, and using it was very hard work and very slow. He also adapted the player pianos, increasing their dynamic range by tinkering with their mechanism and covering the hammers with leather (in one player piano) and metal (in the other) so as to produce a more percussive sound. On this trip to New York, he met Cowell and heard a performance of John Cage's Sonatas and Interludes for prepared piano (also influenced by Cowell's aesthetics), which would later lead to Nancarrow modestly experimenting with prepared piano in his Study No. 30.

Nancarrow's first pieces combined the harmonic language and melodic motifs of early jazz pianists like Art Tatum with extraordinarily complicated metrical schemes. The first five rolls he made are called the Boogie-Woogie Suite (later assigned the name Study No. 3 a-e). His later works were abstract, with no obvious references to any music apart from his own.

Many of these later pieces (which he generally called studies) are canons in augmentation or diminution (i.e. prolation canons). While most canons using this device, such as those by Johann Sebastian Bach, have the tempos of the various parts in quite simple ratios, such as 2:1, Nancarrow's canons are in far more complicated ratios. The Study No. 40, for example, has its parts in the ratio e:pi, while the Study No. 37 has twelve individual melodic lines, each one moving at a different tempo.
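For the curious, a few lines of Python illustrate why an irrational tempo ratio like e:pi is so disorienting. Modeling beat k of the fast voice as falling at time k/e and beat m of the slow voice at m/pi (my own simplification, not Nancarrow's notation), the two voices never realign after the shared downbeat:

```python
import math

# Beat times for two canonic voices at the e:pi tempo ratio of Study No. 40.
# Because e/pi is irrational, no later beats of the two voices ever coincide.
fast = [k / math.e for k in range(1, 200)]
slow = [m / math.pi for m in range(1, 200)]

closest = min(abs(a - b) for a in fast for b in slow)
print(f"Closest near-coincidence in the first ~200 beats: {closest:.4f}")
```

The beats drift arbitrarily close together but never land exactly in sync, which is what gives these canons their perpetually-out-of-phase quality; a simple 2:1 Bach-style ratio, by contrast, realigns every other beat.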


Norman McLaren


McLaren was born in Stirling, Scotland and studied set design at the Glasgow School of Art.[1] His early experiments with film and animation included actually scratching and painting the film stock itself, as he did not have ready access to a camera. His earliest extant film, Seven Till Five (1933), a "day in the life of an art school" was influenced by Eisenstein and displays a strongly formalist attitude.

That included painting on the optical sound track.

In the 1950s, National Film Board of Canada animators Norman McLaren and Evelyn Lambart, and film composer Maurice Blackburn, began their own experiments with graphical sound, adapting the techniques of Pfenninger and Russian artist Nikolai Voinov.[2] McLaren created a short 1951 film Pen Point Percussion, demonstrating his work.[3] The next year, McLaren completed his most acclaimed work, his Academy Award-winning anti-war film Neighbours, which combined stop-motion pixilation with a graphical soundtrack. Blinkity Blank is a 1955 animated short film by Norman McLaren, engraved directly onto black film leader, combining improvisational jazz along with graphical sounds. In 1971, McLaren created his final graphical sound film Synchromy.[4]





Tuesday, February 10, 2015

Some points on Uber

This is Joseph

A very nice piece on Uber.  It hits both of the issues with Uber, the shift to lower regulation:
But Uber has little incentive to build well-paying, stable opportunities with reasonable hours at salaries of $50,000 a year. Quite the opposite: by creating part-time jobs that are the equivalent of Walmart greeters on wheels, the company can keep wages low (benefits, of course, are out of the question). It’s little wonder why Uber fights regulations that would require it to insure its drivers’ vehicles, conduct background checks, pay fees or limit its workforce: without restraints on the number of passenger-serving cars, and with a very low barrier of entry to the profession, the number of drivers will continue to grow until the market hits a point of saturation, sending costs plummeting in the process. Because Uber has few operating expenses to speak of—the investment is all made up front in developing the app, after which maintenance is minimal—the company enjoys a substantial profit no matter how many drivers flood the market.
And the unexpected consequences of these new changes:
But Uber has no requirement to serve the public. Indeed, there is a strong race, class and age bias as to who can utilize the service. You have to own a smartphone, which has an average cost of more than $500. Uber requires customers to pay with a credit card, cutting off those with no or poor credit. Until recently, the company had no wheelchair-accessible vehicles in Virginia, and continues to lack adequate services for the disabled in many places. 
The real issue with this sort of disruption is that regulations may have a place in markets. Now, it isn't the business of any single corporation to deal with the inefficiencies present in the American marketplace. But it is a matter of public concern if basic services stop being offered to key segments of the market. We ensure, for example, that everyone has postal service (perhaps with some limitations) because the ability to receive mail is fundamental to being a member of society.

Ironically, Uber is a case where liberals often cross the aisle. Just as conservative libertarians often see subsidies for oil companies as a good thing, middle-class progressives focus on the (real and amazing) improvements in service that Uber provides to customers. In a real sense, it is hard to argue that better service for customers and more flexibility for workers is a bad thing. Because it isn't.

But it would be good if we could keep the pieces of the past regulatory regime that created social benefits while helping to reduce the ossification that the industry has suffered.  It's a tricky line to draw.

Monday, February 9, 2015

Why some policy changes should be slow and measured: a case study

This is Joseph

Gary Ray nails it:
The problem with minimum wage is when it rises too fast. When it rises above a reasonable level of annual profit of a small business, you've got trouble. Using Borderlands numbers, a 20% rise over 3 years would require a nearly 7% increase in sales. Most retail businesses are not growing at 7% a year. I don't need back of the napkin math to figure this out. The Department of Commerce numbers show average retail sales growth median to be 5.04%, with 2014 at 3.17%. So minimum wage yes, but not so quickly.
I think that fast changes increase the costs of policies by increasing the penalty from disruption. This is an unpopular idea in the "age of disruption" (see Uber), but slower phasing-in of policy can often permit industries and people to adapt. It's also notable that bad industry practice, like the MSRP, was involved in this outcome by making bookstores less flexible in the face of changing costs.
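With illustrative numbers (not Borderlands' actual books), the back-of-the-napkin math runs something like this: the sales growth needed to absorb a wage increase depends on what share of sales goes to payroll and the margin earned on each additional dollar of sales.

```python
# Illustrative figures only -- labor share and margin are assumptions,
# not any particular bookstore's numbers.
sales = 1_000_000   # annual sales
labor_share = 0.20  # payroll as a fraction of sales (assumed)
gross_margin = 0.45 # margin on each additional dollar of sales (assumed)
wage_rise = 0.20    # a 20% increase in wage costs

extra_cost = wage_rise * labor_share * sales
required_growth = extra_cost / (gross_margin * sales)
print(f"Sales growth needed to absorb the raise: {required_growth:.1%}")
```

With these assumptions the required growth lands in the high single digits, the same ballpark as the ~7% figure quoted above, and well above typical retail sales growth; phase the raise in over several years, though, and the annual gap shrinks to something an ordinary business can absorb.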

Noah Smith has more on the specific situation, and his own comment on the externalities involved in fast changes. 

Does this mean a minimum wage increase is bad policy? No, but I think it would be technically better to phase it in slowly, as Canada has done, rather than making a large change all at once.