Thursday, March 20, 2014

When threads collide -- David Coleman vs. Prof. Feynman

In all the coverage and controversy over the recent changes in the SAT, one of the aspects that troubles me the most is the one that seems to bother most people the least (emphasis added):
[David] COLEMAN: The new math section will focus on three things: Problem solving and data analysis, algebra and real world math related to science, technology and engineering fields.
The response from most journalists and pundits to this push for applicability has been either indifference or mild approval, but if you dig into the underlying statistics and look into the history of similar educational initiatives, it's hard not to come away with the conclusion that this change pretty much has got to be bad (with a better than even chance of terrible).

The almost inevitable bad outcome is the hit taken by orthogonality. As discussed earlier, the value of a variable (such as an SAT score) in a model lies not in how much information it brings to the model but in how much new information it brings given what the other variables in the model have already told us. The models that colleges use to assess students (perhaps with trivial exceptions) include courses taken and grades earned. We want any additional variables in that model to be as uncorrelated as possible with those transcript variables. The math section of the SAT does this by basing its questions on logic, problem solving, and the basic math classes that everyone should have taken before taking the SAT. Students whose math education stopped at Algebra I should be on roughly equal footing with students who took AP calculus, as long as they understood and retained what they learned.

Rather than making the SAT a more effective instrument, "real world" problems only serve to undercut its orthogonality. Meaningful applied questions will strongly tend to favor students who have taken relevant courses. It might be possible to avoid this trap, but it would be extremely difficult and there's no apparent reason for making the change other than the vague but positive connotations of the phrase. (It's important to note here that Coleman's background is in management consulting and the ability to work positive-sounding phrases into presentations is very much a core skill in that field.)

Even more worrisome is the potential for the really bad question, bad enough to have the perverse effect of actually causing more problems (in stress and lost time) for those kids who understand the material. Nothing throws a good student off track worse than a truly stupid question.

Even if the test-makers know what they're doing, writing good, situation-appropriate problems using real situations and data is extraordinarily difficult. The vast majority of the time, real life has to be simplified to an unrealistic degree to make it suitable for a brief math problem. The end result is usually just an old problem with new nouns: take a rate problem and substitute "computer programmer" for "ditch digger."

You can make a fairly good case for real world questions based on teaching-across-the-curriculum -- for example, using the Richter scale in a homework problem is a good way of working in some earth science -- but since the purpose of the SAT is to measure, not to instruct, that argument doesn't hold here.

The even bigger concern is what can happen when the authors don't know what they're doing.

From Richard Feynman's "Judging Books by their Covers":
Finally I come to a book that says, "Mathematics is used in science in many ways. We will give you an example from astronomy, which is the science of stars." I turn the page, and it says, "Red stars have a temperature of four thousand degrees, yellow stars have a temperature of five thousand degrees . . ." -- so far, so good. It continues: "Green stars have a temperature of seven thousand degrees, blue stars have a temperature of ten thousand degrees, and violet stars have a temperature of . . . (some big number)." There are no green or violet stars, but the figures for the others are roughly correct. It's vaguely right -- but already, trouble! That's the way everything was: Everything was written by somebody who didn't know what the hell he was talking about, so it was a little bit wrong, always!

Anyway, I'm happy with this book, because it's the first example of applying arithmetic to science. I'm a bit unhappy when I read about the stars' temperatures, but I'm not very unhappy because it's more or less right -- it's just an example of error. Then comes the list of problems. It says, "John and his father go out to look at the stars. John sees two blue stars and a red star. His father sees a green star, a violet star, and two yellow stars. What is the total temperature of the stars seen by John and his father?" -- and I would explode in horror.
Keep in mind, Feynman's example was picked to be amusing but representative ("That's the way everything was...a little bit wrong, always"). The post-Sputnik education reformers of his day were making pretty much the same demands that today's reformers are making. There's no reason to expect a better result this time.

Of course, there are good questions that do use real-world data (you can even find some on the SAT), but in order to write them you need a team that understands both the subtleties of the material and the statistical issues involved in testing it.

The more I hear from David Coleman, whether it concerns the College Board or Common Core, the less confidence I have in his ability to head these initiatives.

Wednesday, March 19, 2014

Differential growth in life expectancy?

There has been a lot of discussion about Annie Lowrey's article on changes in life expectancy, documenting how most of the recent rise in life expectancy is among Americans of higher socio-economic status.  I did find the question of causality to be less compelling:

It is hard to prove causality with the available information. County-level data is the most detailed available, but it is not perfect. People move, and that is a confounding factor. McDowell’s population has dropped by more than half since the late 1970s, whereas Fairfax’s has roughly doubled. Perhaps more educated and healthier people have been relocating from places like McDowell to places like Fairfax. In that case, life expectancy would not have changed; how Americans arrange themselves geographically would have.

“These things are not nearly as clear as they seem, or as clear as epidemiologists seem to think,” said Angus Deaton, an economist at Princeton.
It is possible that there is a process of re-arrangement going on.  But that still doesn't make charts like the second one in this Aaron Carroll blog post easier to explain.  If the higher-earning recipients of Social Security live longer than the lower-earning recipients, then this association is not simple to explain with a direct appeal to the ecological fallacy.

This is the sort of case where data is limited but we still need to make decisions.  It is odd that with some decisions we are desperately worried about getting things wrong when it advantages the affluent, yet we seem equally worried about over-interpreting the data when redistribution would be the obvious policy solution.

“The best thing that happened to the education system in New Orleans was Hurricane Katrina”

Joseph has already commented on one aspect of this Valerie Strauss article on Netflix CEO Reed Hastings, but a different passage caught my eye.
He appears to be presenting a vision of education in the United States where nearly all students are educated in collections of charter schools: “So what we have to do is to work with school districts to grow steadily, and the work ahead is really hard because we’re at 8% of students in California, whereas in New Orleans they’re at 90%, so we have a lot of catchup to do.”
As indicated by the Arne Duncan quote I used as a title, the notion of New Orleans as the educational ideal is strongly established in the reform movement. New Orleans has implemented the major tenets of the reform pedagogy to an extraordinary degree, particularly the rigid, metric-driven, no-excuses attitude. On this much, everyone can pretty much agree.

When we get to effects, however, the picture gets murkier. There has been some improvement in test scores, but the 'reforms' coincided with increased spending, which would be expected to boost scores. In addition, some of the increase can be assigned to considerably increased pressure on students to take the tests seriously. Even putting all that aside, the improvements still don't look that impressive when compared to demographically similar schools in other states. Bruce Baker of Rutgers did the heavy lifting.

The bigger story for me, though, is in the details of the now dominant culture of New Orleans schools and in how parents and students have reacted to the new regime. It's apparent that quite a few people are extremely unhappy.

A previous post mentioned students from one New Orleans high school walking out in a mass protest.



This was not an isolated incident.
Sci Academy, the flagship of the Collegiate Academies charter group, is known for high test scores and stringent discipline policies, such as requiring students to walk between lines taped on the floor. School leaders say the two go hand-in-hand: You don't have to walk on the right side of the hallway in college, but the discipline will serve you well.

But students at the group's two new schools, George Washington Carver Collegiate Academy and George Washington Carver Preparatory Academy, walked out the week before Thanksgiving, angry about such rules. On Wednesday (Dec. 18), about 60 students attended a rally. A letter of demands written by some students said kids were being suspended "for every little thing."

Recent state data show there are grounds for that claim. The three Collegiate schools had the city's highest suspension rates in the 2012-13 academic year. A full 69 percent of Carver Collegiate's student body was sent home at least once. Carver Prep suspended 61 percent of its student body. Sci Academy sent home 58 percent, a 9-point increase from the year before.
Anyone with experience with K-12 education can tell you that mass suspension and expulsion may be the simplest and most effective way of improving test scores and making classroom management easier (a particularly pressing issue if you have high teacher turnover and rely heavily on programs like TFA). The problem with the technique is that it takes its greatest toll on the most vulnerable students. To fully grasp the brutality of these methods, you have to look at specific examples, such as this one from a parents' advocate in New Orleans:
The case that still breaks my heart involved a 14-year-old who kept getting demerits because his uniform shirt was too small and came untucked basically every time he moved. His mother was a veteran, well-educated, and had sold real estate but got divorced and ended up losing her job, and became homeless. They were living with friends and really struggling. The school expelled the child because he’d had three suspensions—the last one for selling candy to try to raise enough money to buy new shoes and a new uniform shirt. I felt that if the mother went and told her story that the school would understand and wouldn’t hold up the expulsion. She didn’t want the school to know how impoverished she was but I convinced her to do it, so she came and told all of these people what she was going through—about her struggles. I thought for sure the board would overturn the expulsion, not just because her story was so compelling, but because there wasn’t actually anything in the school’s discipline book about selling candy. But they upheld it and it broke my heart that this kid was being put out of school because he was poor.
I don't know if this student went to one of the specific schools discussed here, but I can tell you that this is all too often what the process looks like, which is why responsible administrators use it so reluctantly.

Tuesday, March 18, 2014

A rare RPG post

Go and read Greyhawk Grognard on rare spell components.  With creativity you can make powerful spells tough to cast without needing to simply make them cost cash.  Finding each of these components would be an adventure in and of itself.  It makes components fun and interesting instead of just a "pay as you go" system.

Has this person ever worked in a large corporation?

I ask in astonishment because I read things like this:
The newest bit of “wisdom” for public education comes to us from Netflix Chief Executive Officer Reed Hastings, who is a big charter school supporter and an investor in the Rocketship Education charter school network. At a meeting of the California Charter Schools Association on March 4, he said in a keynote speech that the problem with public schools is that they are governed by elected local school boards. Charter schools have boards that are not elected and, according to his logic, have “a stable governance” and that’s why “they constantly get better every year.”
See, in the private sector there was this phenomenon called "re-organization" (or "re-org" for short) that seemed to hit every couple of years.  Each time there was a massive shift in governance and lines of reporting.  If Netflix has managed to avoid these "re-orgs" then I see that as a very positive feature of the company, but it is hardly a guarantee that all private corporations will be able to do the same thing.

It also leads to other tough questions.  The reason that the private sector works well is "creative destruction" as better companies outcompete poorer companies.  Is the charter school movement going to be immune to competition as well? 

And if they are immune to market forces, what are they accountable to?  If we think the answer is a higher level of government, then why do we think it will be more stable and more accountable than the school boards? 

This is not so much a defense of school boards (which I have seriously mixed feelings about) as it is a question of what model we would replace them with.  I am not sure that the command-and-control style socialist model of the state-owned or state-supported corporation has been the most efficient alternative, has it?

EDIT: Mark Palko wanted me to mention that Valerie Strauss has been doing good work in this area for some time.  Also note the idea of California needing to "catch up" to New Orleans -- it is possible for a former backwater to become dynamic (think Macedonia at the end of the classical Greek era), but this is often not the best bet to make.

NOTE: Mark here. For a bit more context, check out reform movement gadfly Edushyster's take on the charter chain Hastings was promoting.

Monday, March 17, 2014

Texas versus California

I have been trying to decide if Scott Lemieux covered this too completely, but I decided that there were a couple of useful points in this article, especially as it relates to my California versus Texas discussion with Mark, where we discuss the relative merits of the two states.  For example:
And despite all the gloating by Texas boosters about how the state attracts huge numbers of Americans fleeing California socialism, the numbers don’t bear out this narrative either. In 2012, 62,702 people moved from California to Texas, but 43,005 moved from Texas to California, for a net migration of just 19,697.
This really points out how marginal the population shift is.  It isn't zero, but it is also not a mass population shift driven by hellish conditions in California.

Even more telling:

Oh yes, I know what you’ve heard. And it’s true, as the state’s boosters like to brag, that Texas does not have an income tax. But Texas has sales and property taxes that make its overall burden of taxation on low-wage families much heavier than the national average, while the state also taxes the middle class at rates as high or higher than in California. For instance, non-elderly Californians with family income in the middle 20 percent of the income distribution pay combined state and local taxes amounting to 8.2 percent of their income, according to the Institute on Taxation and Economic Policy; by contrast, their counterparts in Texas pay 8.6 percent.

And unlike in California, middle-class families in Texas don’t get the advantage of having rich people share equally in the cost of providing government services. The top 1 percent in Texas have an effective tax rate of just 3.2 percent. That’s roughly two-fifths the rate that’s borne by the middle class, and just a quarter the rate paid by all those low-wage “takers” at the bottom 20 percent of the family income distribution. This Robin-Hood-in-reverse system gives Texas the fifth-most-regressive tax structure in the nation.  
That leads to some really interesting questions about the relation of tax rates to prosperity.  If most people in Texas pay more in taxes than their counterparts in California, then maybe this is another data point in favor of more money for government leading to a stronger and more prosperous state.  But these points really don't make the case that Texas is clearly better than California.  Now both states have a strong streak of pro-business advocates, and so I think that both could end up as engines of American prosperity.  But I am pretty optimistic about the future for California once the actual facts are broadly considered.

Sunday, March 16, 2014

The second half is more remarkable

Paul Krugman:

But my guess is that in a week or two we will once again hear a supposed wise man saying that we need to raise the retirement age to 67 because of higher life expectancy, unaware that (a) life expectancy hasn’t risen much for half of workers (b) we’ve already raised the retirement age to 67.
I completely understand why part (a) makes the "raise the retirement age" argument misleading -- poorer workers get less from Social Security already, and making them work longer means fewer years of benefits, effectively shifting benefits toward wealthier contributors, who already get more per month.

But the second part is the piece that I find truly remarkable.  I mean, how hard could it be for the media to fact check that raising the retirement age to 67 is already current law?  Sure, you might want to protect the law, but that is a completely different argument from forgetting that the law changed in 1983.  Or you could want to phase it in faster, but that would also be a) a different argument and b) seemingly ill-timed given the changes in the 401(k) system.

So I could imagine some debate about the first part (based on potential short-term trends and the fact that we don't have the complete death curve for the population).  But the second is simply . . .  odd.

Friday, March 14, 2014

This was both entertaining and thought-provoking

It was also a very clear-headed explanation of some of the key mythologies of the modern cult of anarcho-capitalism.  I especially liked this passage (edited with *'s for questionable language choices):

But if none of that stuff existed, there would be nothing stopping Jay-Z from taking your farm. In other words, you don't "own" ****. The entire concept of owning anything, be it a hunk of land or a house or a ****ing sandwich, exists purely because other people pay other armed men to protect it. Without society, all of your brave, individual talents and efforts won't buy you a bucket of ****s. So when I say "We're all in this together," I'm not stating a philosophy. I'm stating a fact about the way human life works. No, you never asked for anything to be handed to you. You didn't have to, because billions of humans who lived and died before you had already created a lavish support system where the streets are all but paved with gold. Everyone reading this -- all of us living in a society advanced enough to have Internet access -- was born one inch away from the finish line, plopped here at birth, by other people.


But it is a very straightforward explanation of the concept of interdependence, and the way that we are all connected based on social convention. 

Sometimes the Cracked site is surprisingly thought provoking. 

Orthogonality and the SAT

[Note: 'SAT' refers to the SAT Reasoning Test]

If you spend any time following the SAT debate, you will frequently encounter some variation on the phrase:
All in all, the changes are intended to make SAT scores more accurately mirror the grades a student gets in school.

The thing is, though, there already is something that accurately mirrors the grades a student gets in school. Namely: the grades a student gets in school. A better way of revising the SAT, from what I can see, would be to do away with it once and for all.
Putting aside the questionable assumption that the purpose of a college's selection process is to find students who will get good grades at that college, there is a major statistical fallacy here, and it reflects a common but very dangerous type of oversimplification.

When people talk about something being the "best predictor" they generally are talking about linear correlation. The linearity itself is problematic here – we are generally not that concerned with distinguishing potential A students from B students while we are very concerned with distinguishing potential C students from potential D and F students – but there's a bigger concern: The very idea of a "best" predictor is inappropriate in this context.

In our intensely and increasingly multivariate world, this idea ("if you have one perfectly good predictor, why do you need another?") is rather bizarre and yet surprisingly common. It has been the basis of arguments that I and countless other corporate statisticians have had with executives over the years. The importance of looking at variables in context is surprisingly difficult to convey.

The explanation goes something like this. If we have a one-variable model, we want to find the predictor variable that gives us the most relevant information about the target variable. Normally this means finding the highest correlation between some transformation of the variable in question and some transformation of the target, where the transformation of the target is chosen to highlight the behavior of interest while the transformation of the predictor is chosen to optimize correlation. In our grading example, we might want to change the grading scale from A through F to three bins of A/B, C, and D/F. If we are limited to one predictor in our model, picking the one that optimizes correlation under these conditions makes perfect sense.
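
To make that concrete, here is a minimal sketch in Python (all of the data and variable names are simulated and purely illustrative -- nothing here reflects how the College Board actually builds anything): recode the target into three bins, then pick whichever single predictor correlates best with the binned target.

import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated latent ability plus two hypothetical candidate predictors
ability = rng.normal(size=n)
hs_gpa = ability + rng.normal(scale=0.8, size=n)       # hypothetical high school GPA
test_score = ability + rng.normal(scale=1.0, size=n)   # hypothetical test score
college_gpa = ability + rng.normal(scale=0.7, size=n)  # the target

# Transform the target: collapse the grade scale into three ordered bins (D/F=0, C=1, A/B=2)
binned = np.digitize(college_gpa, np.quantile(college_gpa, [0.2, 0.5]))

# With only one predictor allowed, pick the one with the highest correlation to the binned target
for name, x in [("hs_gpa", hs_gpa), ("test_score", test_score)]:
    print(name, round(np.corrcoef(x, binned)[0, 1], 3))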

Once we decide to add another variable, however, the situation becomes completely different. Now we are concerned with how much information our new variable adds to our existing model. If our new variable is highly correlated with the variable already in the model, it probably won't improve the model significantly. What we would like to see is a new variable that has some relationship with the target but which is, as much as possible, uncorrelated with the variable already in the model.

That's basically what we are talking about when we refer to orthogonality. There's a bit more to it -- we are actually interested in new variables that are uncorrelated with functions of the existing predictor variables -- but the bottom line is that when we add a variable to a model, we want it to add information that the variables currently in the model haven't already provided.
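
Here is a toy simulation of that point, again in Python with invented data (the names test_A and test_B are hypothetical). The redundant test has the higher raw correlation with the target, but the roughly orthogonal one adds far more to a model that already contains grades:

import numpy as np

rng = np.random.default_rng(1)
n = 5000

grades = rng.normal(size=n)                    # variable already in the model
reasoning = rng.normal(size=n)                 # something grades don't capture
college_gpa = grades + 0.6 * reasoning + rng.normal(size=n)

test_A = grades + 0.3 * rng.normal(size=n)     # redundant: mostly re-measures grades
test_B = reasoning + 0.8 * rng.normal(size=n)  # orthogonal: brings new information

def r_squared(predictors, y):
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

print("grades only:           ", round(r_squared([grades], college_gpa), 3))
print("plus redundant test_A: ", round(r_squared([grades, test_A], college_gpa), 3))
print("plus orthogonal test_B:", round(r_squared([grades, test_B], college_gpa), 3))

(A real analysis would look at partial correlations or incremental lift on holdout data, but the toy version is enough to show the idea.)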

Let's talk about this in the context of the SAT. Let's say I wanted to build a model predicting college GPA and, in that model, I have already decided to include high school courses taken and their corresponding grades. Assume that there's an academic achievement test that asks questions about trigonometric identities or who killed whom in Macbeth. The results of this test may have a high correlation with future GPA but they will almost certainly have a high correlation with variables already in the model, thus making this test a questionable candidate for the model. When statisticians talk about orthogonality this is the sort of thing they have in mind.

The SAT works around this problem by asking questions that are more focused on aptitude and reasoning and that rely on basic knowledge not associated with any courses beyond the junior high level. Taking calculus and AP English might help students' SAT scores indirectly by providing practice reading and solving problems, so we won't get perfect orthogonality, but the SAT will certainly do better in this regard than a traditional subject matter exam.

This is another of those posts that sits in the intersection of a couple of major threads. The first concerns the SAT and how we use it. The second concerns orthogonality, both in the specific sense described here and in the general sense of adding information to the system, whether through new data, journalism, analysis or arguments. If, as we are constantly told, we're living in an information-based economy, concepts like orthogonality should be a standard feature of the conversation, not just part of statistical esoterica. 

Thursday, March 13, 2014

Negotiation

This is a really interesting story about a failed academic negotiation.  It is pretty clear that nobody has covered themselves in glory here, although the response from the institution seems awfully harsh and a symptom of the sort of extremely tight labor markets that reduce employee choice.  One only hopes that the maternity leave condition was orthogonal to the decision to rescind the offer, although I suspect the request for a one-year delay in start date was the more likely culprit.

The comments below are quite interesting as well.

More on inequality

As a follow-up to the last post consider this point by Chris Dillow:
Of course, this calculation only makes sense if we assume such redistribution could occur without reducing aggregate incomes. But such an assumption is at least plausible. The idea that massive pay for the 1% has improved economic performance is - to say the least - dubious. For example, in the last 20 years - a time of a rising share for the top 1% - real GDP growth has averaged 2.3% a year. That's indistinguishable from the 2.2% seen in the previous 20 years - a period which encompassed two oil shocks, three recessions, poisonous industrial relations, high inflation and macroeconomic mismanagement - and less than we had in the more egalitarian 50s and 60s.
It is not that there are no adverse consequences to redistribution.  Nor does it mean that any policy, taken to an extreme, will be as effective as it is on the margin when applied to current conditions.  But it makes an even more compelling case that inequality is not, in and of itself, self-evidently a force for economic growth absent some additional evidence.

Tuesday, March 11, 2014

Data Intuition

Paul Krugman:
Even more strikingly, however, the level as opposed to the growth rate of French GDP per capita is substantially lower than that of the US.

This is my main concern about Ostry et al. Suppose we think that strong redistributionist policies reduce the level of output — but that it’s a one-time shift, not a permanent depression of growth. Then you could accept their result of a lack of impact on growth while still believing in serious output effects.
I might be able to accept the one-time shift theory of redistribution, where reducing inequality lowers the overall GDP of the economy.  But if these effects are dynamic (they change the rate of growth instead of shifting the absolute level) then they should show up in the historical record.  After all, there are a number of highly unequal societies -- have they outcompeted the more equal societies repeatedly?
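
A quick back-of-the-envelope simulation in Python (the 2.3% trend, the 5% level hit, and the 0.2-point growth drag are invented for illustration, not estimates of any real policy effect) shows why the distinction matters: a one-time level shift leaves a constant gap, while a lower growth rate compounds and should be visible in the long-run record.

# Illustrative numbers only -- not estimates of any actual redistribution effect.
years = 50
trend_growth = 0.023      # baseline growth rate
level_shift = 0.05        # scenario A: one-time 5% hit to the level of GDP
growth_drag = 0.002       # scenario B: growth permanently 0.2 points lower

baseline = [(1 + trend_growth) ** t for t in range(years + 1)]
one_time = [(1 - level_shift) * g for g in baseline]
dynamic = [(1 + trend_growth - growth_drag) ** t for t in range(years + 1)]

for t in (10, 25, 50):
    print(f"year {t}: level-shift gap {1 - one_time[t] / baseline[t]:.1%}, "
          f"growth-drag gap {1 - dynamic[t] / baseline[t]:.1%}")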

Did the French revolution greatly depress French output and dynamism? 

Now it could be that this is one element of a complex system.  That is totally plausible.  But then it should also be a candidate for trade-offs, and the countries that have engaged in high levels of redistribution (think the US versus Canada or Denmark) have not obviously done worse.

In general, simple explanations for complex phenomena are always suspect, especially if it is difficult to formulate a test that might falsify the hypothesis.

Sunday, March 9, 2014

Open Data

This is a pretty good argument for why there is resistance to completely open data:
When people don’t want to release their data, they don’t care about the data itself. They care about the papers that could result from these data. I don’t care if people have numbers that I collect. What I care about is the notion that these numbers are scientifically useful, and that I wish to get scientific credit for the usefulness of these numbers. Once the data are public, there is scant credit for that work.

It takes plenty of time and effort to generate data. In my case, lots of sweat, and occasionally some venom and blood, is required to generate data. I also spend several weeks per year away from my family, which any parent should relate with. Many of the students who work with me also have made tremendous personal investments into the work as well. Generating data in my lab often comes at great personal expense. Right now, if we publicly archived data that were used in the creation of a new paper, we would not get appropriate credit in a currency of value in the academic marketplace.
I think the key to this argument is that most of the effort in some fields lies in the collection of the data but all of the credit is based on papers.  So you would end up, rather quickly, with a form of tragedy of the commons where the people who create the data end up with little credit . . . meaning we would end up with less data.

Are there alternatives to this paradigm?  Of course.  The US Census is an excellent example of an alternative model -- where the data collection and cleaning is done by a government department on behalf of all sorts of researchers.  Splitting data collection and data analysis in this way is certainly a viable model.

But pretending that resistance to open data is a simple case of people being reluctant to share their information is really an unfair portrayal.  In my own career I have had lots of access to other people's data, and the owners have been extremely generous so long as I offer to give proper credit.  So I don't think the open data movement is all wrong, but it does suggest that there is a difficult conversation to be had to make this work well.

Wednesday, March 5, 2014

How did we miss this one?

Mike the Biologist links to a remarkable statistic:
There are numerous problems with using VAM scores for high-stakes decisions, but in this particular release of data, the most obvious and perhaps the most egregious one is this: Some 70 percent of the Florida teachers received VAM scores based on test results from students they didn’t teach and/or in subjects they don’t teach.
Even more remarkable, this was only revealed after the Florida Times-Union sued for access to the records and a court ordered their release.  The source also notes that this issue is live in Tennessee, which has similar problems.  Now we have a lot of moving parts in the area of education reform, and there are arguments about the use of value-added measures (VAM) in testing.

But nobody has a good argument for making employment decisions about teachers based on the performance of students they never taught.  When we talk about peer effects, it is the students in the classroom and not colleagues that we are thinking of.  It is also striking how much room there is to game statistics when you only collect real data on one third of teachers.  Can we really presume that this data collection is a proper random sample?

These are not necessarily small issues.  They have the potential to replace one set of problems in education with another.  Nor is it 100% clear that they address the issue of social mobility, as less job security for teachers does not appear to directly address the drivers of intergenerational social mobility.

I have respect for people trying to solve a tough problem, but this does not seem to be a great way to go.

Tuesday, March 4, 2014

Biomedical Patents

In a follow-up to this post, I thought it would be worth looking at a piece of the patent system where I don't have major concerns -- namely drug patents.  According to the FDA, a drug patent is good for 20 years after filing. 

This is very much the low end of the intellectual property discussion.  Mickey Mouse was created in 1928, so the current duration of protection has been more than 85 years.  On the other hand, a 20-year patent would have expired before the end of Walt Disney's life.  Or consider J.R.R. Tolkien, who wrote The Hobbit in 1937 and The Lord of the Rings in 1954/55.  He died in 1973 -- meaning The Hobbit would have exited protection during his lifetime and The Lord of the Rings would barely have made it.
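
For what it's worth, the arithmetic is easy to check with a throwaway Python sketch (the only fact added here is Walt Disney's death in 1966; the other dates come from the paragraph above, using 1955 for The Lord of the Rings):

# Quick check of the dates above (Disney died in 1966, Tolkien in 1973).
PATENT_TERM = 20

works = {
    "Mickey Mouse (1928)": (1928, 1966),
    "The Hobbit (1937)": (1937, 1973),
    "The Lord of the Rings (1955)": (1955, 1973),
}

for title, (published, creator_died) in works.items():
    expires = published + PATENT_TERM
    verdict = "outlasts" if expires > creator_died else "expires within"
    print(f"{title}: a {PATENT_TERM}-year term runs to {expires}, "
          f"which {verdict} the creator's lifetime (died {creator_died}).")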

Furthermore, the costs of biomedical drug development are huge.  You could imagine replacing this system with research grants, but there is no way to avoid the conclusion that this would immediately be one of the largest items in the Federal budget.  This is not to say that the process could not be improved or streamlined.  But given that we maintain the current cost structure for drug development, these patent lengths look either short or appropriate.

Or, in other words, different areas have different issues.