Friday, January 7, 2011

While we're doing "then and now"

Back in June, 2009, this is how Edward L. Glaeser felt about the bailouts:
Since the collapse of Lehman Brothers, the public sector has spent billions saving the banks. While these decisions are certainly debatable, they are understandable. The US financial industry misbehaved badly,... but it is still a sector with a future. ... After all, every other sector in the economy depends on banks for their financing.

But what about cars? ... Does anyone, other than GM's management, believe that this company can come back? The current treatment, cash infusion and a reduction in corporate liabilities, provides a solution for a company that is broke, not for one that is broken.
The future of the financial sector is looking pretty scary these days. How about the auto industry and GM in particular?
Although the transformation has been a long time coming, Ford and the rest of the domestic auto industry appear to be finally giving up their addiction to gas-guzzling trucks and sport utility vehicles. Prodded first by rising federal fuel economy standards, then shocked in 2008 by $145-a-barrel oil and a global credit crisis that forced General Motors and Chrysler to seek federal bailouts, Detroit is making a fundamental shift toward lighter, more fuel-conscious cars — and turning a profit doing so.

Lone star, bad portents

Then:



Now (from Paul Krugman):
These are tough times for state governments. Huge deficits loom almost everywhere, from California to New York, from New Jersey to Texas.

Wait — Texas? Wasn’t Texas supposed to be thriving even as the rest of America suffered? Didn’t its governor declare, during his re-election campaign, that “we have billions in surplus”? Yes, it was, and yes, he did. But reality has now intruded, in the form of a deficit expected to run as high as $25 billion over the next two years.

And that reality has implications for the nation as a whole. For Texas is where the modern conservative theory of budgeting — the belief that you should never raise taxes under any circumstances, that you can always balance the budget by cutting wasteful spending — has been implemented most completely. If the theory can’t make it there, it can’t make it anywhere.

How bad is the Texas deficit? Comparing budget crises among states is tricky, for technical reasons. Still, data from the Center on Budget and Policy Priorities suggest that the Texas budget gap is worse than New York’s, about as bad as California’s, but not quite up to New Jersey levels.

The point, however, is that just the other day Texas was being touted as a role model (and still is by commentators who haven’t been keeping up with the news). It was the state the recession supposedly passed by, thanks to its low taxes and business-friendly policies. Its governor boasted that its budget was in good shape thanks to his “tough conservative decisions.”

Oh, and at a time when there’s a full-court press on to demonize public-sector unions as the source of all our woes, Texas is nearly demon-free: less than 20 percent of public-sector workers there are covered by union contracts, compared with almost 75 percent in New York.

So what happened to the “Texas miracle” many people were talking about even a few months ago?

Part of the answer is that reports of a recession-proof state were greatly exaggerated. It’s true that Texas job losses haven’t been as severe as those in the nation as a whole since the recession began in 2007. But Texas has a rapidly growing population — largely, suggests Harvard’s Edward Glaeser, because its liberal land-use and zoning policies have kept housing cheap. There’s nothing wrong with that; but given that rising population, Texas needs to create jobs more rapidly than the rest of the country just to keep up with a growing work force.

And when you look at unemployment, Texas doesn’t seem particularly special: its unemployment rate is below the national average, thanks in part to high oil prices, but it’s about the same as the unemployment rate in New York or Massachusetts.

What about the budget? The truth is that the Texas state government has relied for years on smoke and mirrors to create the illusion of sound finances in the face of a serious “structural” budget deficit — that is, a deficit that persists even when the economy is doing well. When the recession struck, hitting revenue in Texas just as it did everywhere else, that illusion was bound to collapse.

The only thing that let Gov. Rick Perry get away, temporarily, with claims of a surplus was the fact that Texas enacts budgets only once every two years, and the last budget was put in place before the depth of the economic downturn was clear. Now the next budget must be passed — and Texas may have a $25 billion hole to fill.
As a native of the Lone Star state (now happily on the West Coast), I've got to go with General Sheridan on this one.

Defining Denominators

Via Mark Thoma, I discovered this very interesting article looking at growth of gross domestic product per working-age population (WAP):

When one looks at GDP/WAP (defined as population aged 20-60), one gets a surprising result: Japan has actually done better than the US or most European countries over the last decade. The reason is simple: Japan’s overall growth rates have been quite low, but growth was achieved despite a rapidly shrinking working-age population.

The difference between Japan and the US is instructive here: in terms of overall GDP growth, it was about one percentage point, but larger in terms of the annual WAP growth rates – more than 1.5 percentage points, given that the US working-age population grew by 0.8%, whereas Japan’s has been shrinking at about the same rate.

Another indication that Japan has fully used its potential is that the unemployment rate has been constant over the last decade. By contrast, the US unemployment rate has almost doubled, now approaching 10%. One might thus conclude that the US should take Japan as an example not of stagnation, but of how to squeeze maximum growth from limited potential.


This is a very good illustration of how important it can be to understand the structure of the population under study. I don't know whether the proposed metric is the most relevant one for the phenomenon in question, but it's striking how completely it changes the interpretation of the result. That could have profound policy implications when we consider questions like "would it be a bad thing to emulate Japan's industrial policy?"
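
Just to make the arithmetic concrete, here is a quick back-of-the-envelope sketch in Python. The growth rates are illustrative stand-ins consistent with the figures quoted above (roughly a one-point gap in overall GDP growth, a US working-age population growing 0.8% a year, and Japan's shrinking at about the same rate); they are not actual national accounts data.

```python
# Illustrative numbers only (roughly matching the figures quoted above),
# not actual national accounts data.

def per_wap_growth(gdp_growth, wap_growth):
    """Approximate growth of GDP per working-age person.

    For small rates, the growth of a ratio is roughly the difference of
    the growth rates: g(GDP/WAP) ~= g(GDP) - g(WAP).
    """
    return gdp_growth - wap_growth

countries = {
    "US":    {"gdp": 0.018, "wap": 0.008},   # about 1 point faster overall growth
    "Japan": {"gdp": 0.008, "wap": -0.008},  # shrinking working-age population
}

for name, c in countries.items():
    print(f"{name:6s} GDP/WAP growth: {per_wap_growth(c['gdp'], c['wap']):.1%} per year")

# US     GDP/WAP growth: 1.0% per year
# Japan  GDP/WAP growth: 1.6% per year
```

Same raw GDP numbers, opposite ranking once you change the denominator -- which is exactly the point.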

Rajiv Sethi was there first

While going through the comment section of this post by Tim Duy, I was reminded that Rajiv Sethi was talking about broadcast HDTV long before the rest of us and was doing a remarkably thorough job of it.

Thursday, January 6, 2011

Earnings of Post-Docs

Felix Salmon links to this post from the Economist and extracts this rather amazing statistic:

In Canada 80% of postdocs earn $38,600 or less per year—the average salary of a construction worker.


Mark has previously quoted from the same post. But I do think that post-doctoral training is a very interesting place to look; the PhD program is educational and credentialing in a way that the post-doctoral process is not. In a sense, a PhD student is being partially compensated by acquiring a degree, which in theory can open up many career options (only some of which are academic). But a post-doctoral fellow is focused entirely on academics. So long as the success rate among post-doctoral fellows is high, the post-doc can be considered an internship. But if the success rate gets too low, then the issue of exploitation comes up.

On the other hand, if the post-doctoral fellowship is a time of enrichment and job satisfaction is very high, then maybe it is okay. I'm sure I would rather be a post-doc than a construction worker, myself. It's a tough issue . . .

Wednesday, January 5, 2011

Research Plans

In the comments to this post, PhysioProf draws a very interesting distinction:

You know all the people at Drugmonkey’s blog and Writedit’s blog who are constantly ranting and raving about how their science is so totally awesome and there must be something NEFARIOUS AND UNFAIR going on in study sections that fail to fund their MAGNIFICENT AND BOLD GROUNDBREAKING RESEARCH? Those people fail to distinguish between appropriate design of their actual research programs and appropriate crafting of a fundable grant application, taking account of their career stage and prior accomplishments. If they keep willfully ignoring this distinction, they are going to keep failing to secure NIH funding.


I think that this is correct, even if I don't especially like the reality of it. Scientific writing is based on a very stylized approach and it makes sense to learn the rules of how to communicate effectively in this medium. It's also true that one needs to put forth a plan that is realistic in a grant proposal. I don't think anybody was ever admonished for accomplishing more than they expected in a research plan.

The hardest thing I am finding in year one of the tenure track position is rescaling the speed at which I can do things. I was a very fast post-doctoral fellow and I could do an amazing amount by just working insane hours. But training junior graduate students takes a lot of time and I find my net productivity is dropping as I focus on training (well, designing courses isn't helping either). So I am sympathetic to the NIH wanting to see realistic goals for a research grant.

But I really liked the clear distinction that was being made between the two processes . . .

The Scalar Fallacy

Sometimes the things that give us the most trouble are the most obvious. When something is completely self-evident, it can be difficult to wrap your mind around it and think through its implications. Important points can be mistaken for tautologies (and vice versa), and when you try to work through the questions in essays or conversations, you often find yourself feeling pretentious and, for lack of a better word, silly.

Here's an example: neither vectors, random variables nor vectors of random variables are scalars. This statement is obvious to anyone familiar with the basic terms. Equally obvious is the fact that when you try to represent one of these complex, multidimensional creatures as a point on a line, you will invariably lose some information.

The implications of these points, however, are often not obvious at all.

We have to assign scalars to things all the time because, among other reasons, scalars are the only things we can rank. Any time you want to decide what's the best _____ (car, job offer, candidate), you have to start by assigning _____ a scalar. You can do this by finding a proxy that's already a scalar (like the answer to a survey question) or by using a function of the vector. Simple examples include taking the sum or the sum of the squares or the average or the maximum value. (I'm going to limit this to vectors from here on but everything should generalize to random variables and vectors of random variables fairly easily.)
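
To see how much rides on that choice, here's a toy sketch in Python. The offers and scores are made up; the only point is that which option comes out "best" depends entirely on the scalarizing function you pick.

```python
# Made-up scores for three job offers, each rated 0-10 on
# (salary, commute, how interesting the work is).
offers = {
    "A": (9, 2, 5),
    "B": (6, 6, 6),
    "C": (4, 9, 8),
}

# Three perfectly reasonable ways of collapsing each vector to a scalar.
scalarizers = {
    "sum of components": lambda v: sum(v),
    "weakest component": lambda v: min(v),
    "salary alone":      lambda v: v[0],   # a single-component proxy
}

for name, score in scalarizers.items():
    best = max(offers, key=lambda k: score(offers[k]))
    print(f"{name:18s} -> best offer: {best}")

# sum of components  -> best offer: C   (totals: A=16, B=18, C=21)
# weakest component  -> best offer: B   (minimums: A=2, B=6, C=4)
# salary alone       -> best offer: A
```

Three sensible-looking functions, three different winners, one set of vectors.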

But, though we have to do it all the time, no one has ever found a perfect way of assigning scalars to vectors and no one ever will. This isn't pessimism; it's mathematics. You lose information when you go from a vector to a scalar. That loss means you have to be careful about contextual questions like range of data. Though there may be a few cases where we can derive the scalars from first principles, we generally have to arrive at the assignments through experimentation. We find methods that have produced useful metrics in previous situations. Unfortunately, when you move out of the range of data you encountered in those previous situations or when you otherwise find yourself in a new context, the information you could safely omit before becomes essential and the metric that has done such a good job up till now suddenly becomes worthless.

Here are a couple of examples:

A "rate your experience" question might do a good job comparing the impact of bad beverage service versus that of short delay in take-off but it will probably not do a satisfactory job comparing a forced landing and a seven hour stay on the tarmac on a hot summer day . These events fall outside the range of data the question was developed for.

A weighted average of nutrients might provide a good way of ranking most of the foods you find in the produce aisle. In the context of comparing different fruits and vegetables found in your neighborhood grocery store, you might be able to get by assuming a linear relationship between the amount of certain nutrients and healthiness. If, however, you move to the context of the dietary supplement aisle, making that linear assumption about certain nutrients can be dangerous, even deadly. Having a bottle of iron supplement pills for lunch is an extraordinarily bad idea.
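
Here is a rough sketch of that second example in Python. The weights and nutrient amounts are invented for illustration, but they show how a score that behaves sensibly inside the produce aisle goes badly wrong outside it.

```python
# Invented weights and nutrient amounts, for illustration only.
def healthiness(iron_mg, vitamin_c_mg):
    """A naive linear 'weighted average of nutrients': more is always better."""
    return 2.0 * iron_mg + 0.5 * vitamin_c_mg

foods = {
    "spinach-ish serving":  {"iron_mg": 2.7,  "vitamin_c_mg": 28},
    "iceberg-ish serving":  {"iron_mg": 0.1,  "vitamin_c_mg": 3},
    "bottle of iron pills": {"iron_mg": 3000, "vitamin_c_mg": 0},
}

for name, nutrients in foods.items():
    print(f"{name:22s} score: {healthiness(**nutrients):8.1f}")

# spinach-ish serving    score:     19.4   <- sensible within the produce aisle
# iceberg-ish serving    score:      1.7
# bottle of iron pills   score:   6000.0   <- the linear score says "best lunch ever";
#                                             the dose-response curve says otherwise
```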

These are relatively simple examples but think about all the unspeakably complicated things like happiness that people routinely discuss as if they were scalars -- "people in group A were 42% happier than people in group B." Worse yet, many researchers insist on pushing these scales to ludicrous extremes, using the same metrics to measure the impact of everything from trivial lifestyle changes to the birth of a first child. (How this affects theories like rational addiction is a subject for another post.)

Perhaps even more important than being context-specific, the scalars we assign to vectors are generally question-specific. Take the example of health. There's no meaningful way to boil this complex, multidimensional concept down to one number, but we can come up with scalars that are useful when answering certain questions. Let's say we have formulas for deriving two metrics, L and Q. L correlates very well with longevity; Q correlates very well with quality of life. For most questions about health policy, you will get similar answers with either metric, but there are cases where the two diverge sharply. Both L and Q are good measures of health, but their usefulness depends on the question you need answered.
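
Here's a minimal sketch of that last point, with hypothetical metrics; neither formula is a real health index, they're just stand-ins for the L and Q described above.

```python
# A "health vector" here is (expected_years_remaining, average_quality_of_life),
# with quality of life on a 0-1 scale. Both metrics below are purely illustrative.

def metric_L(health):
    """Tracks longevity only."""
    years, _quality = health
    return years

def metric_Q(health):
    """Weights years by quality of life (QALY-flavored, but made up)."""
    years, quality = health
    return years * quality

profiles = {
    "profile_1": (20, 0.9),   # fewer years, high quality of life
    "profile_2": (30, 0.5),   # more years, lower quality of life
}

for name, metric in [("L", metric_L), ("Q", metric_Q)]:
    best = max(profiles, key=lambda p: metric(profiles[p]))
    print(f"{name} prefers {best}")

# L prefers profile_2   (30 years beats 20)
# Q prefers profile_1   (20 * 0.9 = 18 beats 30 * 0.5 = 15)
```

Both are defensible measures of health; they just answer different questions.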

Part of the blame for the tendency to take scalars as ideal representations of vectors rests with the "magic of the market" faction of economists and their camp followers. Markets are basically in the scalarizing business and under the proper conditions they do a pretty good job. It's easy to see how researchers grew enamored with markets' ability to set prices in such a way that resources are effectively allocated. It is a remarkable process.

But as impressive as markets are, they still are not exempt from the laws of mathematics and the limitations listed above. Prices are scalars assigned to the value of things. They generally provide us with an excellent tool for prioritizing purchases and production, but when you start to think of the scalars as actually being the vectors they represent, your thinking becomes sloppy and you open yourself up to dangerous mistakes.

"Why doing a PhD is often a waste of time"

Via Felix Salmon, there's a bleak but informative article in the Economist on the PhD glut. Here's a sample:
For most of history even a first degree at a university was the privilege of a rich few, and many academic staff did not hold doctorates. But as higher education expanded after the second world war, so did the expectation that lecturers would hold advanced degrees. American universities geared up first: by 1970 America was producing just under a third of the world’s university students and half of its science and technology PhDs (at that time it had only 6% of the global population). Since then America’s annual output of PhDs has doubled, to 64,000.

Other countries are catching up. Between 1998 and 2006 the number of doctorates handed out in all OECD countries grew by 40%, compared with 22% for America. PhD production sped up most dramatically in Mexico, Portugal, Italy and Slovakia. Even Japan, where the number of young people is shrinking, churned out about 46% more PhDs. Part of that growth reflects the expansion of university education outside America. Richard Freeman, a labour economist at Harvard University, says that by 2006 America was enrolling just 12% of the world’s students.

But universities have discovered that PhD students are cheap, highly motivated and disposable labour. With more PhD students they can do more research, and in some countries more teaching, with less money. A graduate assistant at Yale might earn $20,000 a year for nine months of teaching. The average pay of full professors in America was $109,000 in 2009—higher than the average for judges and magistrates.

Indeed, the production of PhDs has far outstripped demand for university lecturers. In a recent book, Andrew Hacker and Claudia Dreifus, an academic and a journalist, report that America produced more than 100,000 doctoral degrees between 2005 and 2009. In the same period there were just 16,000 new professorships. Using PhD students to do much of the undergraduate teaching cuts the number of full-time jobs. Even in Canada, where the output of PhD graduates has grown relatively modestly, universities conferred 4,800 doctorate degrees in 2007 but hired just 2,616 new full-time professors. Only a few fast-developing countries, such as Brazil and China, now seem short of PhDs.

Tuesday, January 4, 2011

Correction

I was wrong when I said industry watchers had been incorrectly announcing the death of network television for over thirty years.

I should have said over forty years:
An awestruck audience at the 1970 CES loved the VCR's convenience -- but Hollywood battled back, warning that piracy would run rampant and kill network television.

Definition of the Day

I hear a lot about "rent-seeking" and have been looking for a very straightforward definition. This is one from Maxine Udall:

The excess of profits over what they would be in a competitive market is called economic rents.

Actually, search engines are like bananas

And Google is like either the Gros Michel or the Cavendish, but that's a picky complaint (I used to teach SAT prep classes) and it's the only problem I have with Felix Salmon's sharply-written and insightful discussion of the dangers of monoculture:

How Google is like bananas




Definitely worth a look.

Monday, January 3, 2011

Incentives, the TSA, and a question for Tyler Cowen

Tyler Cowen has a post up looking at a Washington Post article on airports considering private alternatives to the TSA. The underlying assumption here is that competition will improve service and satisfaction, but neither Cowen nor the Post writer addresses the big question:

Why should market forces act differently on security than they did on the rest of the industry?

From the moment you miss the shuttle to the long wait for your bags to come down the conveyor belt, air travel may provide the worst customer experience of any major industry. It's true that introducing market-based incentives some years ago helped give us cheap flights (though I don't know enough about the underlying economics to say whether they actually bent the curve), but it did nothing to improve a system that was inconvenient, poorly designed and indifferent to the needs (not to mention feelings) of passengers -- pretty much the same problems that market forces are supposed to fix in airline security.

In most industries, competition forces players to maintain reasonable customer service and to come up with customer-facing innovations, but only because almost all of the customers can easily choose between different products offered by different sellers. Cars are a good example.

Three years ago, I bought my first new car, a Nissan Altima hybrid. It had been about a decade since I had bought a car and I was amazed by the innovations that were available in mid-priced autos. Some of the innovations were impressive from a technological standpoint, like regenerative braking and a continuously variable transmission (the first automatic transmission I actually enjoyed driving); others were just well designed, like dual climate controls and cleverly arranged storage compartments; but all were indicative of tremendous effort and ingenuity focused on providing me with a car I would like to own.

Nissan invested all of this into my car because they knew that Toyota and Ford and Volkswagen and a number of other companies also made cars I would like to own, just as the dealers I bought the car from knew that other dealers also carried the make and model I wanted.

Market forces don't address every potential automotive concern. There are externalities and asymmetries of information to be taken into consideration but putting those aside, competition has done a great job. The auto industry has produced a stream of innovative, appealing products because the makers and the dealers both know that I have plenty of choices.

But what would happen if customers were frequently forced to buy one particular make and model and having a choice in dealers might mean going a hundred miles out of your way? Then the automotive industry would probably look a lot like the air travel industry.

There are major constraints on where you can build an airport. Even if you put aside safety and environmental concerns, there are limits to how many airports an area can support. At the risk of stating the obvious, every flight is associated with two of these airports and your flying options are based on the worse of the two. For example, I'm based in L.A. My co-blogger, Joseph, teaches in a college town in the Southeast. I have an unusually large selection of airports, including LAX, which, as far as I know, is serviced by all the major carriers, but if I want to do a face-to-face collaboration with Joseph I have to fly Delta because that's the only major airline that services his airport.

For the majority of itineraries, passengers have little choice as to airports and often as to airlines (market forces exert enough pressure on airlines to give a reputation for good customer service some value -- look at Southwest -- but not enough to make it standard -- look at almost everybody else). This lack of choice greatly limits the pressure market forces can exert on airport-based services. How many people would drive an extra hour or two (we're talking about a round trip) to save a few minutes in the security line and to have access to a better food court?

If anything, competition will do less to improve customer satisfaction with security than it will with the rest of the services airports provide. Whether done by the TSA or private firms, the basic procedures remain the same and it's the procedures that have people upset.

Of course you might get people driving out of their way to avoid things like full body scans, but that's an entirely different discussion.

Sunday, January 2, 2011

Maybe they're sneaking across the border to have their car accidents

Over at The Incidental Economist (with a h/t to Prof. DeLong), Aaron Carroll looked into a familiar health care meme:

Based on the comments I’ve seen over the last week, many of you are still going with that well used meme in the health care debate that people in other countries – frustrated by wait times and rationing – come to the United States for care. These are almost always anecdotal stories and you should know by now how much stock I put in anecdotes.

As always, when we can we should turn to evidence and research, and on this topic it does exist. The most comprehensive work I’ve seen on this topic was published in a manuscript in the peer-reviewed journal Health Affairs. That study looked at how Canadians cross the border for care. Most anecdotes involve Canadians, since it’s easy for those on the border to come here. And, the authors used a number of different methods to try and answer the question*:

1) First, they surveyed United States border facilities in Michigan, New York, and Washington. It makes sense that Canadians crossing the border for care would favor sites close by, right? It turns out that about 80% of such facilities saw fewer than one Canadian per month. About 40% saw none in the prior year. And when looking at the reasons for visits, more than 80% were emergencies or urgent visits (ie tourists who had to go to the ER). Only about 19% of those already few visits were for elective purposes.

2) Next, they surveyed “America’s Best Hospitals”, because if Canadians were going to travel for care, they would be more likely to go to the most well-known and highest quality facilities, right? Only one of the surveyed hospitals saw more than 60 Canadians in one year. And, again, that included both emergencies and elective care.

3) Finally, they examined data from the 18,000 Canadians who participated in the National Population Health Survey. In the previous year, only 90 of those 18,000 Canadians had received care in the United States; only 20 of them had done so electively.


Education thought for the New Year

Here is a point from Matt Yglesias:

I think the evidence suggests that one of the most important skills people learn (or don’t) in school is self-discipline rather than specific knowledge. I don’t think learning the chronology of ancient near eastern empires (Sumeria then Assyria then Babylonia then Persia then Greece then Rome) in elementary school has ever been useful to me, or even that the chronology I learned is especially accurate, but a lot of life involves semi-arbitrary tasks and it’s worth one’s while to get used to performing them.


Going down this path would suggest that a lot of the curriculum might be made optional and that teachers might be able to focus on what they love (and what they are best at using to inspire the students they teach). This could be another reason why approaches like the "Freedom Writers" could still get decent results despite completely ignoring the normal educational programme.

Saturday, January 1, 2011

Unemployment

A new post by Mark Thoma fits in, I think, with recent thoughts about the minimum wage. It is true that, as the duration of unemployment increases, some workers will find jobs that are vastly inferior (but way better than nothing). This suggests that the unemployment rate will become a less and less reliable marker of the strength of the economy.

Now, it might be true that some of this could be an unrealistic expectation of compensation on the part of workers who had unusually good jobs during the past expansion. But I note that the financial industry (apparently where the recession began) is not necessarily hurting:

"Wall Street earned $21.4 billion during the first three quarters of 2010," Comptroller Tom DiNapoli said.

"While much less than last year's record of $61.4 billion, which was fueled by federal assistance, the securities industry is on track in 2010 for the second-highest level of profitability on record," he said.


So I think we should be sceptical of narratives that include the need for "shared sacrifice" from all segments of society. And I find it ironic that last year's record profits were fuelled by government assistance, given the concerns over matters like pay for K-12 teachers. I don't have a good road map forward, except to note that simple solutions and metrics seem unlikely to be helpful under these conditions.