Wednesday, April 20, 2011

Cool yet useful -- typography edition

From John D. Cook, here's a site that will identify fonts in JPEGs.

Taxes and Growth

I think that this is the most insightful post I have read in ages:

Think about it this way: Grant to the tax skeptic all he wants about the idea that high taxes reduce the level of economic output. There’s an easy story to tell here. The quantity of economic output is, in part, a function of how much time and effort people want to put into doing market production. And the amount of time and effort any given person wants to put into market production is in part a feature of how much purchasing power extra time and effort put into market production will get him. Higher taxes—either on his labor or on his consumption of goods and services—reduce the purchasing power of extra time and effort on market production, and thus tend to reduce the amount of time and effort people put into it. You can tell a different, more leftwing story about this, but the point I want to make here is simply that this rightwing story about taxes and output is a story about levels not growth rates. If Americans started working the number of hours per year that South Koreans work, our per capita GDP would go way up.

But that’d be a one-time adjustment. Countries don’t grow over time by steadily increasing their number of hours worked. They grow, roughly speaking, because people think up better ways to do things and then businessmen either adopt those new better methods or else they get put out of business by those who did.


The claim that taxes suppress growth was a vicious argument because it suggested that high taxes could (via compounding effects) lead to relative impoverishment over time. While it is true that less work could suppress some innovation at the margins, the levels story (a one-time change in absolute wealth rather than a compounding change in the growth rate) actually seems more credible to me.

It focuses the argument on the trade-offs we are willing to make between things like personal security and affluence. It also removes the idea that low taxes might spur growth rates and let us shrink the debt (as a proportion of GDP) through economic growth alone. And it is notable that more socialist, high-tax countries (Sweden, Canada, the Netherlands) have not fallen into relative poverty compared with the lower-tax United States.

That is worth considering in these discussions.

"Great moments in marketing -- Charlie Sheen edition"

Even by the lax standards of contemporary journalism, it's hard to see how the meltdown of Charlie Sheen rises above the dog-bites-man standard. He's an actor with a history of drug abuse and manic-depressive (if not bipolar) tendencies -- not an unusual combination -- and his mix of denial and lashing out is absolutely typical for an addict.

I do, however, have to admit that the speed of development here is impressive. Back when I was doing marketing modelling, it took us forever to roll out a new product.

Monday, April 18, 2011

Cheap Beer, Paradoxical Dice, and the Unfounded Morality of Economists

Sometimes a concept can be so intuitively obvious that it actually becomes more difficult to teach and discuss. Take transitivity. We say that real numbers have the transitive property. That means that if you have three real numbers (A, B and C) and you know A > B and B > C then you also know that A > C.

Transitivity is just too obvious to get your head around. In order to think about a concept you really have to think about its opposite as well --

A > B, B > C and C > A. None too imaginatively, we call these opposite relationships intransitive or non-transitive. Non-transitive relationships are deeply counter-intuitive. We just don't expect the world to work like that. If you like butterscotch shakes better than chocolate shakes and chocolate shakes better than vanilla shakes, you expect to like butterscotch better than vanilla. If you can beat me at chess and I can beat Harry, you assume you can beat Harry. There is, of course, an element of variability here -- Harry might be having a good day or you might be in a vanilla kind of mood -- but on average we expect these relationships to hold.

The only example of a non-transitive relationship most people can think of is the game Rock, Paper, Scissors. Other games with non-transitive elements include the boomer classic Stratego, where the highest-ranked piece can only be captured by the lowest, and my own contribution, the game Kruzno, which was designed specifically to give students a chance to work with these concepts.

While these games give us a chance to play around with non-transitive relationships, they don't tell us anything about how these relationships might arise in the real world. To answer that question, it's useful to look at another game.

Here are the rules. We have three dice marked as follows:

Die A {2,2,4,4,9,9}

Die B {1,1,6,6,8,8}

Die C {3,3,5,5,7,7}

Because I'm a nice guy, I'm going to let you pick the die you want. I'll then take one of the two remaining dice. We'll roll and whoever gets the higher number wins. Which die should you pick?

The surprising answer is that no matter which one you pick I'll still have the advantage because these are non-transitive dice. A beats B five out of nine times. B beats C five out of nine times. C beats A five out of nine times. The player who chooses second can always have better odds.
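
If you don't want to take my word on the odds, they're easy to verify by brute force: enumerate all thirty-six face pairings for each matchup and count the wins. Here's a minimal sketch in Python (the face values come straight from the rules above; everything else is mine):

```python
from itertools import product

# Face values from the rules above; the duplicated faces matter for the odds.
dice = {
    "A": [2, 2, 4, 4, 9, 9],
    "B": [1, 1, 6, 6, 8, 8],
    "C": [3, 3, 5, 5, 7, 7],
}

def win_prob(x, y):
    """Probability that die x shows a strictly higher number than die y."""
    wins = sum(1 for a, b in product(dice[x], dice[y]) if a > b)
    return wins / 36  # 6 faces x 6 faces = 36 equally likely outcomes

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"P({x} beats {y}) = {win_prob(x, y):.3f}")  # 0.556 = 5/9 each time
```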

The dice example shows that it's possible for systems using random variables to produce non-transitive relationships. But can we get these relationships in something deterministic, like the rules of a control system or the algorithm a customer might use to decide on a product?

One way of dealing with multiple variables in a decision is to apply a threshold test to one variable while optimizing another. Here's how you might use this approach to decide between two six-packs of beer: if the price difference is a dollar or less, buy the better brand; otherwise pick the cheaper one.* For example, let's say that if beer were free you would rank beers in this order:

1. Sam Adams

2. Tecate

3. Budweiser

If these three beers cost $7.99, $6.99 and $5.99 respectively, you would pick Tecate over Bud, Sam Adams over Tecate and Bud over Sam Adams. In other words, a rock/paper/scissors relationship.
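
If it helps to see the rule spelled out explicitly, here's a small sketch of the algorithm (rankings and prices from the example above; I've put the prices in cents to keep the comparisons exact, and the function name is my own):

```python
# Rank order if beer were free: lower number = better beer.
rank = {"Sam Adams": 1, "Tecate": 2, "Budweiser": 3}
price = {"Sam Adams": 799, "Tecate": 699, "Budweiser": 599}  # cents

def choose(beer1, beer2, threshold=100):
    """If the price gap is a dollar or less, buy the better brand;
    otherwise buy the cheaper one."""
    if abs(price[beer1] - price[beer2]) <= threshold:
        return min((beer1, beer2), key=lambda b: rank[b])   # better brand
    return min((beer1, beer2), key=lambda b: price[b])      # cheaper beer

print(choose("Tecate", "Budweiser"))     # Tecate
print(choose("Sam Adams", "Tecate"))     # Sam Adams
print(choose("Budweiser", "Sam Adams"))  # Budweiser -- the cycle closes
```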

Admittedly, this example is a bit contrived but the idea of a customer having a threshold price is not outlandish, and there are precedents for the idea of a decision process where one variable is ignored as long as it stays within a certain range.

Of course, we haven't established the existence, let alone the prevalence, of these relationships in economics, but their very possibility raises some interesting questions and implications. Because transitivity is such an intuitively appealing concept, it often works its way unnoticed into the assumptions behind all sorts of arguments. If you've shown A is greater than B and B is greater than C, it's natural not to bother checking A against C.

What's worse, as Edward Glaeser has observed, economists tend to be reductionists, and non-transitivity tends to play hell with reductionism. This makes economics particularly dependent on assumptions of transitivity. Take Glaeser's widely-cited proposal for a "moral heart of economics":

Teachers of first-year graduate courses in economic theory, like me, often begin by discussing the assumption that individuals can rank their preferred outcomes. We then propose a measure — a ranking mechanism called a utility function — that follows people’s preferences.

If there were 1,000 outcomes, an equivalent utility function could be defined by giving the most favored outcome a value of 1,000, the second best outcome a value of 999 and so forth. This “utility function” has nothing to do with happiness or self-satisfaction; it’s just a mathematical convenience for ranking people’s choices.

But then we turn to welfare, and that’s where we make our great leap.

Improvements in welfare occur when there are improvements in utility, and those occur only when an individual gets an option that wasn’t previously available. We typically prove that someone’s welfare has increased when the person has an increased set of choices.

When we make that assumption (which is hotly contested by some people, especially psychologists), we essentially assume that the fundamental objective of public policy is to increase freedom of choice.


But if these rankings can be non-transitive, then you run into all sorts of problems with the very idea of a utility function. (It would also seem to raise some interesting questions about revealed preference.) Does that actually change the moral calculus? Perhaps not, but it certainly complicates things (what exactly does it mean to improve someone's choices when you don't have a well-ordered set?). More importantly, it raises questions about the other assumptions lurking in the shadows here. What if new options affect the value of previous ones in some other way? For example, what if the value of new options diminishes as options accumulate?

It's not difficult to argue for the assumption that additional choices bring diminishing returns. After all, the more choices you have, the less likely you are to use any new one. This would imply that any action that takes choices from someone who has many and gives them to someone who has significantly fewer represents a net gain, since the choice is more likely to be used by the recipient. If we weight the value of a choice by the likelihood of its being used, and if we further assume that giving someone money increases his or her choices, then taking money from a rich person and giving it to a poor person should produce a net gain in freedom.
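
Here's that back-of-the-envelope logic as a toy calculation. To be clear, the uniform-use assumption and every number below are mine, invented purely for illustration:

```python
# Toy model (not Glaeser's): assume a person with n options is equally
# likely to use any one of them, so the marginal option gets used with
# probability 1/n and is "worth" 1/n in use-weighted terms.
def marginal_choice_value(n_options):
    return 1 / n_options

rich, poor = 100, 5  # hypothetical option counts
loss_to_rich = marginal_choice_value(rich)       # 0.010
gain_to_poor = marginal_choice_value(poor + 1)   # ~0.167
print(gain_to_poor - loss_to_rich)  # positive: a net gain in weighted "freedom"
```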

Does this mean Glaeser's libertarian argument is secretly socialist? Of course not. The fact that he explicitly cites utility functions suggests that he is talking about a world where orderings are well defined, effects are additive, and you can understand the whole by looking at the parts. In that world his argument is perfectly valid.

But as we've just seen with our dice and our beer, we can't always trust even the most intuitively obvious assumptions to hold. What's more, our examples were incredibly simple. The distribution of each die just had three equally probable values. The purchasing algorithm only used two variables and two extremely straightforward rules.

The real world is far more complex. With more variables and more involved rules and relationships, the chances of an assumption catching us off guard only get greater.



*Some economists might object at this point that this algorithm is not rational in the strict economics sense of the word. That's true, but unless those economists are also prepared to argue that all consumers are always completely rational actors, the argument still stands.

Post-weekend Gaming -- five-penny backgammon

[We haven't forgotten about games here at OE. Quite the opposite. I've been working on a post about games of perfect and imperfect information for a while now and it should be going up soon. While I was thinking about the backgammon section of the post, I remembered a variant for math teachers I've been meaning to write up for a few years now.]

FIVE-PENNY BACKGAMMON

Played exactly like traditional backgammon except:

The dice are replaced with five coins;

instead of rolling the dice, each player tosses the five coins using the cup, adds one to the number of heads, then repeats the procedure a second time;

the two (h + 1) totals are treated like the results from rolling a pair of dice.

For example, tossing two heads then tossing three would be the same as rolling a three and a four.
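
For anyone who'd rather read the rules as code, here's a minimal simulation sketch (the function name is mine):

```python
import random

def five_penny_roll():
    """One 'die' result: toss five coins, count the heads, add one."""
    return sum(random.random() < 0.5 for _ in range(5)) + 1

# One full turn stands in for rolling a pair of dice:
print(five_penny_roll(), five_penny_roll())  # e.g. 3 4
```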



PLAYER'S CHOICE

In this variant, the player can choose dice or coins on a turn-by-turn basis.




FIVE-PENNY BACKGAMMON IN THE CLASSROOM

Though this is largely a matter of preference, I would introduce five-penny games well before any kind of formal or semi-formal introduction to the underlying probability theory. This gives the students a chance to become comfortable with these examples before they see them in lectures, and it also gives them the opportunity to discover on their own that there's a difference between having the same possible outcomes and having the same probabilities associated with those outcomes.
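
That difference is easy to make concrete: both a die and the five-coin toss produce the numbers 1 through 6, but the coin totals follow a shifted binomial distribution rather than a uniform one. A quick check:

```python
from math import comb

die = {k: 1 / 6 for k in range(1, 7)}               # fair die: uniform
coins = {h + 1: comb(5, h) / 32 for h in range(6)}  # heads + 1: binomial(5, 1/2)

for k in range(1, 7):
    print(f"{k}: die {die[k]:.3f}  coins {coins[k]:.3f}")
# Same outcomes, different probabilities: a 1 or a 6 turns up only
# 1/32 of the time with coins versus 1/6 with a die, while 3s and 4s
# each come up 10/32 of the time.
```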

Mark Thoma makes an important point

From Economist's View:
That was a mistake, but what is the lesson? One is that we should not necessarily ignore something just because it cannot be found in the data. Much of the empirical work prior to the crisis involved data from the early 1980s to the present (due to an assumption of structural change around that time), sometimes the data goes back to 1959 (when standard series on money end), and occasionally empirical work will use data starting in 1947. So important, infrequent events like the Great Depression are rarely even in the data we use to test our models. Things that help to explain this episode may not seem important in limited data sets, but we ignore these possibilities at our own peril.

But how do we know which things to pay attention to if the data isn't always the best guide? We can't just say anything is possible no matter what the data tell us; that's not much of a guide on where to focus our attention.

The data can certainly tell us which things we should take a closer look at. If something is empirically important in explaining business cycles (or other economic phenomena), that should draw our attention.

But things that do not appear important in data since, say, 1980 should not necessarily be ignored. This is where history plays a key role in directing our attention. If we believe that a collapse of financial intermediation was important in the Great Depression (or in other collapses in the 1800s), then we should ask how that might occur in our models and what might happen if it did. You may not find that the Bernanke, Gertler, Gilchrist model is important when tested against recent data, but does it seem to give us information that coincides with what we know about these earlier periods? We can't do formal tests in these cases, but there is information and guidance here. Had we followed it -- had we remembered to test our models not just against recent data but also against the lessons of history -- we might have been better prepared theoretically when the crisis hit.

Sunday, April 17, 2011

Removing Uncertainty

One of the arguments that has been made a lot is that it is important to remove sources of uncertainty to make business investments less susceptible to political changes. However, this need to make things less uncertain doesn't appear to apply in the public sector:

Every one of Detroit's public school teachers is receiving a layoff notice -- but that doesn't mean they will all be fired.

The layoff notices were sent to the 5,466 unionized teachers "in anticipation of a workforce reduction to match the district's declining student enrollment," according to a Detroit Public Schools statement. The layoff notices are required as part of the Detroit Teachers Federation collective-bargaining agreement. Non-Renewal notices have also been sent to 248 administrators, and the layoffs would go into effect by July 29.


Even though the risk of actually losing a job might be low, imagine having to plan how to pay a mortgage or a lease when your job might not be there. How can this possibly be a good way to organize an economy?

Because it's Saturday

Saturday, April 16, 2011

Then and now -- Paul Ryan edition

Jonathan Chait digs up an interesting Robert Novak column from 2001:
The most enthusiastic congressional supporters of President Bush's proposed tax cut consider it much too small, but that's not all. They have reason to believe that government estimators, in both the administration and Congress, are up to their old tricks and badly underestimating tax revenue.

Lawrence Hunter, chief economist of the Empower America think tank, has made calculations that lead him to believe that the Congressional Budget Office has lowballed its estimated 10-year surplus of $5.6 trillion. He figures the realistic number is at least $1 trillion higher and probably another $1 trillion above that. Those numbers not only would permit a considerably larger tax cut than Bush's, estimated to lose $1.6 trillion in revenue, but in fact would mandate it.

There are senior Bush policymakers who privately admit that Hunter and his allies in Congress have a point. But these officials claim they cannot change the rules in the middle of the game. Nor can they adjust unrealistic methods that bloat the revenue loss from Bush's cuts. Thus, Washington's high-tax establishment is able to use underestimated surplus projections and overestimated tax losses to claim the country cannot afford the president's program.

"It's too small," Rep. Paul Ryan of Wisconsin, the most junior member of the Ways and Means Committee but a leading House supply-sider, told me. "It's not big enough to fit all the policy we want."
It's possible, of course, to put too much weight on the historical record. Just because we went from deficit to surplus under Clinton and back to deficit under Bush does not mean that Ryan is wrong. It's possible that things are different now, that we should be taking the Bush tax cuts further rather than letting them expire.

But you can't just ignore history either. Ten years ago, Ryan was making basically the same recommendations he's making today based on the same economic and philosophical assumptions. The experience of the past dozen years seems to argue for doing the opposite. That puts the burden of proof squarely on his shoulders.

Friday, April 15, 2011

Weekend pop culture blogging -- comic strip edition

The birth of a medium invariably consists overwhelmingly of crap (think Sturgeon's Law raised to the next level), but new media also have a way of producing the occasional work of stunning originality.

An obvious example here is the work of Winsor McCay (arguably the first acknowledged genius of both comics and animation, though fans of Herriman and Feininger might dispute the comics part), but even McCay's astoundingly innovative work can't match the originality of the Upside-Downs by Gustave Verbeek.

Verbeek's weekly strip told a twelve-panel story in a six-panel space. He did this by... Oh, hell, you really need to see this for yourself.





How's that to start your weekend?

For more Verbeek weirdness check here.

Tale of two tax rates

The following came via either Mark Thoma or Felix Salmon. I don't know which but I don't see that it matters. Both write blogs that you should check on a daily basis and both hold David Cay Johnston in high esteem (Thoma cites him frequently and Salmon calls him "the dean of tax reporters").

Here's Johnston on corporate taxes:

Just as the individual income tax falls more heavily on the affluent than the super-rich, so too does the corporate income tax. The giants of American business pay at lower effective cash rates than much smaller corporations.

That regressivity is an important aspect of the general trend in U.S. tax policy, which at both the federal and state levels is focused on pushing burdens down the income ladder.

But the broader issue has gotten zero attention in the hubbub that began March 24 with The New York Times exposé on General Electric Co.'s income taxes.

GE itself has said it paid no federal income tax last year, but complains it was maligned -- although it has been unable to point to a single factual error in the Times. We'll get to the dispute over how GE was treated, its response, and its statement that it is "scrupulous" about its worldwide tax compliance. But first let's look at the distribution of corporate income taxes, starting with a comparison of two of the best-known brand names in the country: GE and the New York Times Co.

Warning: You may want to take a deep breath right now because the numbers that follow may leave you gasping for air.

GE made a nearly $194 billion profit over the last 10 years and paid nearly $23 billion in income taxes. That's a real tax rate of 11.8 percent, about one-third the statutory rate of 35 percent.

The New York Times Co. made less than $2 billion in profit over the same 10 years and paid almost $1.4 billion in income taxes. That's a real tax rate of 71 percent, paid in cold, hard cash.

So the newspaper company that exposed GE paid more than twice the posted U.S. corporate rate, and its real tax rate was more than six times GE's real tax rate.
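
Johnston's rates check out against the figures he quotes. A quick back-of-the-envelope verification (the profit inputs are approximations, since the article only gives rounded totals like "nearly" and "less than"):

```python
# Ten-year totals as quoted above; profits approximated from rounded figures.
ge_profit, ge_tax = 194e9, 23e9     # "nearly $194 billion", "nearly $23 billion"
nyt_profit, nyt_tax = 1.95e9, 1.4e9  # "less than $2 billion", "almost $1.4 billion"

print(f"GE effective rate:  {ge_tax / ge_profit:.1%}")    # about 11.9%
print(f"NYT effective rate: {nyt_tax / nyt_profit:.1%}")  # about 72%
print(f"Ratio: {(nyt_tax / nyt_profit) / (ge_tax / ge_profit):.1f}x")  # about 6x
```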

Thursday, April 14, 2011

Quote of the day

From Comrade Physioprof:

When you ask people overly simplistic and broad "gotcha" questions in a provocative and accusatory manner, you shouldn't be surprised to receive glib uninformative answers. If you develop genuine professional relationships with people within NIH and treat them like the fellow scientists they are, you will receive more thoughtful honest answers.


This concept has broad applicability in many areas of debate, and it is worth keeping in mind when one tries to bring up difficult questions. Sometimes you need to build bonds of trust to deal with delicate issues and, while that's not as "cool" as being the crusader, it may be the way toward real reform.

Prediction is hard

President George W. Bush in 2001:

Many of you have talked about the need to pay down our national debt. I listened, and I agree. We owe it to our children and our grandchildren to act now, and I hope you will join me to pay down $2 trillion in debt during the next 10 years. At the end of those 10 years, we will have paid down all the debt that is available to retire. That is more debt repaid more quickly than has ever been repaid by any nation at any time in history.


I think that the core issue here, presuming good faith on all sides, is that second-order effects are really hard to model. Tax cuts (both business and individual) are assumed to stimulate the economy. But accurately predicting that effect is very hard in a large, non-linear system like the United States economy. It's even possible that tax cuts could have the perverse effect of lowering growth (I am not saying that they do; it's just that complex, non-linear systems that are sensitive to initial values are very hard to predict).

So perhaps the real lesson here is to focus on first-order effects. Link tax cuts directly to program cuts, and vice versa: new programs should have taxes linked to them. In my world, that would include wars (notice how World Wars I and II led to large tax increases to finance them), which would make the debate about military intervention a lot more involved. I don't know if this would be a complete solution to deficit woes, but I worry that the current approach relies way too heavily on statistical models to predict the consequences of tax and budget policy (and, as we know from chaos theory, these kinds of models are notoriously difficult to use for prediction).
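
For a feel for why sensitive non-linear systems frustrate prediction, the classic classroom demonstration is the logistic map. This is purely illustrative (it is in no sense a model of the economy):

```python
def trajectory(x, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), which is chaotic at r = 4."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Two starting points that differ by one part in a billion:
print(trajectory(0.400000000))
print(trajectory(0.400000001))
# After 50 steps the two paths bear no resemblance to each other,
# which is why tiny errors in initial conditions swamp the forecast.
```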

Wednesday, April 13, 2011

Online gambling meets artificial intelligence, sort of...

Nice story from Marketplace. I particularly liked this quote:
Michael Bowling helped build the best bot. He runs the Computer Poker Research Group at the University of Alberta.
...

But Bowling doubts that the commercial poker bots that are usually sold for around $80 are any good. Otherwise, why would they be for sale?

Bowling: If you had created a program that could actually win money from other players, would you be selling it to somebody else or would you be using it to win money?

You have to admit it does simplify things quite a bit

From Jonathan Chait:
There are a great many questions that can be easily explained by letting go of the assumption that people in positions of wealth and power must have some intelligent reason for what they're doing. I hate to constantly bring everything back to the role of luck in market outcomes, but this is a fundamental philosophical divide with wide-ranging implications. My belief is that a capitalist economy will produce, through sheer luck, a great deal of rich dopes. Donald Trump is a great case in point.