Comments, observations and thoughts from two bloggers on applied statistics, higher education and epidemiology. Joseph is an associate professor. Mark is a professional statistician and former math teacher.
Tuesday, July 12, 2011
Modeling assumptions
I’ll note, however, that you might be a freshwater economist if you think it makes sense to reassure us that a deflationary spiral is impossible because your model says so even though deflationary spirals do, in fact, occur in human history. To me, a model that denies the possibility of something happening that does, in fact, happen indicates that you’re working with a flawed model.
I can't comment on whether or not this is a fair assessment of the work in question. But it is always a good idea to "reality check" model outputs and ensure that the distribution of events generated by the model looks something like real data. If important events occur in real data that your model dismisses as impossible, then model misspecification or missing confounding variables should be immediately suspected.
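To make that concrete, here is a minimal sketch of such a reality check in Python. The setup is entirely invented: we pretend the world generates heavy-tailed data, the analyst fits a thin-tailed normal model, and we compare how often each produces the extreme events in question.

```python
# A minimal sketch of a "reality check": fit a model to data,
# simulate from it, and ask whether extreme events the data
# actually contains are treated as essentially impossible by
# the model. All numbers here are made up.
import numpy as np

rng = np.random.default_rng(42)

# Pretend these are observed outcomes with heavy tails
# (e.g., price changes); the true process is Student-t.
observed = rng.standard_t(df=3, size=5000)

# Analyst's model: a normal distribution fit by moments.
mu, sigma = observed.mean(), observed.std()
simulated = rng.normal(mu, sigma, size=(1000, observed.size))

# How often does each world produce a "5-sigma" event?
threshold = mu + 5 * sigma
real_rate = (observed > threshold).mean()
model_rate = (simulated > threshold).mean()

print(f"observed rate of extreme events: {real_rate:.5f}")
print(f"model-implied rate:              {model_rate:.7f}")
# If the observed rate dwarfs the model-implied rate, suspect
# misspecification before trusting the model's "impossible".
```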
EDIT: Noah Smith also comments and it is well worth the read. He traces these conclusions to some rather strong assumptions . . .
Sunday, July 10, 2011
Challenges of causal inference in complex systems
“Spend less money, create more jobs” is the kind of world one normally finds only in Woody Allen movies, and it’s a profoundly unserious stance for any politician to take. Spending cuts, whether they’re implemented by the public sector or the private sector, are never going to create jobs. And there’s simply no magical ju-jitsu whereby government spending cuts get reversed and amplified, becoming larger private-sector spending increases.
I think that one of the difficulties in macroeconomics is that you are dealing with complex systems that are not subject to experimentation. So you are forced to use observational studies and analogies with microeconomics to try to determine the causal effects of policies. Even instrumental variables are questionable, as they rely on strong, unverifiable assumptions.
The inability to reach a consensus on the counterfactual is pernicious and causes no end of trouble. Consider the tax increases passed at the beginning of the Clinton administration. Are they responsible for the late-1990s boom, unrelated to it, or did they act to slow it down (making the current economy smaller than it could have been)? How would you know?
Cross-country comparisons are possible, but you have both confounding factors and effect measure modification. Changing the tax rate in Sweden might have different consequences than in the United States, due both to different cultures (confounding) and to differences in current tax rates (effect modification). So, by picking different analogies and different models for the observational data, we can end up with some really strange claims about how economies work.
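To illustrate the distinction, here is a toy simulation in which "culture" confounds a cross-country comparison while the baseline tax rate modifies the effect of a further increase. All functional forms and numbers are invented purely for the purpose.

```python
# Toy simulation: culture is a confounder (it shifts growth and is
# associated with tax policy); the baseline tax rate is an effect
# modifier (the same increase has different effects at different
# starting rates). Everything here is made up.
import numpy as np

rng = np.random.default_rng(0)

def growth(tax_rate, culture_effect, n=10_000):
    # Assumed response: raising taxes hurts more when rates are
    # already high, i.e. the effect is modified by the baseline rate.
    return culture_effect - 5.0 * tax_rate ** 2 + rng.normal(0, 0.5, n)

# Country A: low baseline tax, culture boosts growth (confounder).
# Country B: high baseline tax, culture drags growth (confounder).
a_before = growth(0.30, culture_effect=1.0)
a_after = growth(0.35, culture_effect=1.0)
b_before = growth(0.50, culture_effect=-1.0)
b_after = growth(0.55, culture_effect=-1.0)

print("effect of +5 points in A:", round((a_after - a_before).mean(), 3))
print("effect of +5 points in B:", round((b_after - b_before).mean(), 3))
# A naive cross-country contrast mixes the tax effect with culture:
print("naive B-vs-A gap:        ", round((b_before - a_before).mean(), 3))
```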
It is not an area with easy solutions. But I think I have to agree with Felix that the model he is critiquing makes heroic assumptions about the influence of tax levels on economic growth.
Saturday, July 9, 2011
Harnessing the power of tax evasion to address climate change
sub-divided said...
The thing I know about Greek tax evasion is that they know where you live when you hook into the electric grid. So a family I know, owning a small island, avoids paying any taxes by running everything on wind and solar power.
Friday, July 8, 2011
Case-crossover paper and time trends
"Purpose
Elevated levels of phosphorus (P) and calcium (Ca) have been shown in observational studies to be associated with an increased risk of adverse clinical outcomes including mortality. Vitamin D sterols have been shown to increase the risk of hypercalcemia and hyperphosphatemia in clinical trials. We sought to explore these risks in real-world clinical practice.
Methods
We employed a case–crossover design, which eliminates confounding by non-time-varying patient characteristics by comparing, within each patient, vitamin D doses before the event with those at an earlier period. Using this method, we estimated the risk of hypercalcemic (Ca ≥ 11 mg/dL) and hyperphosphatemic (P ≥ 8 mg/dL) events for patients at different dose quartiles of vitamin D relative to patients not on a vitamin D sterol.
Results
There was a dose-dependent association between vitamin D dose quartile and risk of hypercalcemia or hyperphosphatemia. In adjusted analyses, each increase in vitamin D quartile was associated with a multiple of hypercalcemia risk between 1.7 and 19 times compared with those not on vitamin D and a multiple of hyperphosphatemia risk between 1.8 and 4.
Conclusion
Use of vitamin D sterols is associated with an increased risk of hypercalcemic and hyperphosphatemic events in real-world clinical practice. Other potential predictors of these events, such as phosphate binder use and dialysate Ca levels, were not examined in this analysis."
It seems to be an interesting paper but I have one concern. If you look at the discussion section of the paper, the authors note that:
In our sensitivity analysis, we used 1-month periods to assess vitamin D exposure. In this analysis, estimates of the association between vitamin D dose and risk of events were smaller than those in the primary analysis, particularly for hypercalcemia. One possible explanation for this finding is that the average 2-month exposure measure is a superior indicator, compared with the 1-month assessment, of both the dose and duration of vitamin D exposure. As well, it could be that some dose changes in the month prior to the event had already occurred in response to increasing Ca levels and that, for this reason, the dose 2 months prior to the event is a more accurate reflection of the dose that gave rise to the hypercalcemic or hyperphosphatemic event.
Another explanation that I did not see addressed is the possibility that there is a time trend occurring. If the frequency of vitamin D administration (or the dose) increased with time, then you would expect to see smaller estimates in the sensitivity analysis as well. But it would be an artefact of changing exposure over time.
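A quick simulation shows how this artefact could arise. Everything here is invented: exposure prevalence simply drifts upward over calendar time and has no causal effect whatsoever, yet the case window (which is later) is systematically more exposed than the earlier control window. (Real case-crossover analyses also have within-person correlation in exposure, which this sketch ignores.)

```python
# Sketch of a time-trend artefact in a case-crossover contrast:
# dosing drifts upward over calendar time but has NO causal effect,
# yet comparing the event window with an earlier control window
# produces an elevated matched odds ratio. Invented numbers only.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

event_month = rng.integers(6, 24, size=n)   # calendar month of each event
control_month = event_month - 2             # earlier referent window

def on_vitamin_d(month):
    # Assumed secular trend: exposure prevalence rises over time.
    return rng.random(n) < 0.20 + 0.02 * month

case_exposed = on_vitamin_d(event_month)
ctrl_exposed = on_vitamin_d(control_month)

# Discordant pairs drive the matched (conditional) odds ratio.
b = np.sum(case_exposed & ~ctrl_exposed)
c = np.sum(~case_exposed & ctrl_exposed)
print("matched OR with no true effect:", round(b / c, 2))
# An OR above 1 here is pure time trend, not causation; exactly
# the possibility the sensitivity analysis cannot distinguish.
```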
That being said, it was a real pleasure to read a well-justified use of the case-crossover design in a medication paper. Hopefully this is a sign that there will be more use of within-person study designs in epidemiology in the future. The ability to handle time-invariant confounders is a serious advantage of this approach.
Thursday, July 7, 2011
Transformations
The post is worth reading and the comments are really interesting. In particular, Chris Auld makes, in several of the comments, a very good case for simplicity and interpretability as desirable properties of statistical models.
There is also a thought-provoking discussion of how to parameterize wealth that involves the sort of deep thinking about variables that we should do more of in epidemiology. In particular, in what sense is it reasonable to consider a person (especially in a country like Canada, with strong entitlement programs) to truly have zero wealth?
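For what it's worth, this is the same issue that makes the usual log transformation of wealth awkward. A common workaround, sketched below, is the inverse hyperbolic sine, which is defined at zero and for negative net worth; the numbers are invented for illustration.

```python
# Why zero (or negative) wealth breaks the log transform, and one
# common workaround: the inverse hyperbolic sine behaves like
# log for large values but is defined at and below zero.
import numpy as np

wealth = np.array([-5000.0, 0.0, 100.0, 10_000.0, 1_000_000.0])

with np.errstate(divide="ignore", invalid="ignore"):
    log_w = np.log(wealth)          # -inf at 0, nan below 0
asinh_w = np.arcsinh(wealth)        # defined everywhere

for w, lw, aw in zip(wealth, log_w, asinh_w):
    print(f"{w:>12,.0f}  log: {lw:>8.3f}  asinh: {aw:>8.3f}")
# For large w, arcsinh(w) is approximately log(2w), so the rough
# "percent change" interpretation carries over; near zero and for
# negative net worth it stays well-behaved.
```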
Definitely worth the read.
Tuesday, July 5, 2011
The Chameleon XLE
Even while dealing with protests and open riots, the new Greek government is trying to change things. It is rationalizing its tax-collection system. It has simplified taxes and done away with some of the loopholes. And it has stepped up its enforcement efforts in ways large and small—tax officials have, for instance, been sending helicopters over affluent neighborhoods looking for swimming pools, as evidence of underreported wealth. These efforts have made some difference: the self-employed seem to be reporting more of their income, and the evaders have had to step up their game. (There’s now a burgeoning market in camouflage swimming-pool covers.)

Perhaps there's a marketing opportunity here...
Simulation and foolish cost-cutting
According to a recent report in The Wall Street Journal, United/Continental Airlines wants to use less realistic (and less expensive) simulators for many of their pilot training tasks. Instead of simulating the motion and vibrations of the plane, these static training devices are fixed to the ground. What's notable here is that the cost of the technology behind these simulators has dropped over the past twenty-plus years and will almost certainly drop further over the next twenty, while the potential costs of this change remain high (imagine the legal and reputational consequences when one of these pilots is involved in a crash, even if the official cause is not pilot error). From the Journal's report:

The disagreement within United Continental revolves around using fixed-base simulators—which don’t mimic the movements of planes in flight—rather than full-motion devices to conduct certain types of mandatory, recurrent pilot training.
A decade ago, Continental received Federal Aviation Administration regulatory approval to use such devices, costing roughly one-third less than full-motion simulators, during the last phase of periodic proficiency checks for pilots flying its Boeing Co. 777 fleet. Continental was moving to expand the practice to its Boeing 737 pilots before last year’s merger agreement with United shifted the combined airlines’ focus to integrating all FAA paperwork.
Continental believes its novel approach is superior to traditional practice by stressing human factors and cockpit interaction and thereby enhancing safety. But the position, according to some safety experts, appears to run counter to at least some of the latest guidance coming from parts of the FAA and international standard-setting groups such as the International Civil Aviation Organization, an arm of the United Nations.
“We should be aiming for the greatest possible realism to teach crews how to use both mental skills and motor skills to most effectively deal with emergencies,” according to Mark Rosenker, a former member of the U.S. National Transportation Safety Board. The NTSB continues to champion full-motion simulators for recurrent training. Except for cost considerations, Mr. Rosenker said, “why would anyone opt for anything less?”
I can only hope United/Continental has some solid research to support this proposed change, because it strikes me as a bad idea. For starters, there’s no reason to mess with success, even if it saves a few dollars. And modern pilot training is an astonishing success story. Let’s begin with the dismal history: Despite a long list of aviation reforms, from mandatory pilot layovers to increased classroom training, the percentage of crashes due to pilot error refused to budge from 1940 to 1990, holding steady at around 65 percent. It didn’t matter what type of plane was being flown, or where the plane was going. The brute fact remained: Most aviation deaths were due to bad decisions in the cockpit.
According to the latest statistics, however, mistakes by the flight crew are responsible for less than 30 percent of all plane accidents, with a 71 percent reduction in the number of accidents caused by “poor decision-making.” The end result is that flying has become safer than ever. According to the National Transportation Safety Board, flying on a commercial plane has a fatality rate of 0.04 per 100 million passenger miles, making it, by far, the least dangerous form of travel. (Driving, by contrast, has a fatality rate of 0.86.)
What caused the dramatic reduction in pilot error? One widely cited factor is the introduction of highly realistic flight simulators, starting in the mid-1980s. (The first simulators were put into place during WWII, as the Air Force didn’t have enough real planes to train thousands of new pilots.) For the first time, pilots could practice making decisions in extremely stressful conditions. They could refine their reactions to a sudden downdraft in a thunderstorm, or practice landing with only one engine. They could learn what it’s like to fly without wing flaps, or land on a tarmac glazed with ice. And they could do all this without leaving the ground.
The benefit of a flight simulator is that it allows pilots to internalize their new knowledge. Instead of memorizing lessons on the blackboard, they were forced to exercise emotional regulation, learning how to stay calm and think clearly when bad stuff happens. (I’ve been in these realistic flight simulators and let me assure you – they can be terrifying. After I crashed my jetliner, I left the simulator drenched in sweat, all jangly with adrenaline.) The essential point here is that pilots were the first profession to realize that many of our most important decisions were inherently emotional and instinctive, which is why it was necessary to practice them in an emotional state. If we want those hours of practice to transfer to the real world – and isn’t that the point of practice? – then we have to simulate not just the exterior conditions of the cockpit but the internal mental state of the pilot as well. For far too long, we’ve assumed that expertise is about learning lots of facts, which is why we settled for the “chalk and talk” teaching method. But it’s not. True expertise occurs when we no longer need to reference facts, because we already know what to do.
Needless to say, the potential of simulators goes far beyond pilot training. In recent years, the medical profession has discovered the virtues of virtual reality. Recent studies have found big benefits when medical students use simulators to practice colonoscopies, laparoscopic surgery and even the varied techniques of general surgery. What’s not surprising is that this simulated training was even more effective when it was physically realistic, providing surgeons with “haptic feedback.”
The moral is that we need to make our simulators more realistic, not less. Pilots have pioneered a key concept in professional training. It would be a shame if airlines began to backtrack to save money.
Unless there's more to this story, shifting to less realistic simulators would seem to be a classic case of cost-cutting that defies cost-benefit analysis. This kind of thing is not hard to find in both the private and the public sector. It can usually be attributed partly to a lack of alignment (the people devising and championing these proposals get their bonuses and promotions almost immediately; by the time the true costs become apparent, attrition and short institutional memories will rule out any significant repercussions*) and partly to an understandable but dangerous tendency to believe in easy answers.
*Thank God the financial sector is immune to this sort of thing.
Dean Dad on Higher education
The whole enterprise just smells to me like the latest variation on “let’s privatize Social Security” or “let’s replace Medicare with vouchers.” It’s the wealthy and their worshippers sloughing off any social obligation, basically dropping the ladder behind them. If that weren’t the case, if they actually believed what they said, I’d expect to see the best and brightest from Choate and Phillips Exeter eschewing college and doing startups or joining the military instead. Um, no.
I had not made this connection, but it does seem to be a coherent interpretation of an otherwise puzzling argument. I must admit that I remain mystified by the current interest in the United States in disassembling the social infrastructure. Not only does this run in the opposite direction of most countries, but the ones that have tried it seem to end up being bad places to live. Think of Russia, for example.
The real issue, to me, is that the remedy for these types of escalating prices is the kind of high-quality public university system that countries like Canada and states like California have. The University of California is a high-quality set of institutions and much less expensive than the alternatives.
Why is this approach not the one that rising prices bring us back to? Okay, we'd have to go back to the brutal taxes of the Reagan or Clinton eras, but I am not convinced that this move would lead to an immediate dystopia.
Sunday, July 3, 2011
Freelance Writing
Don’t believe me? Amazon has killed Borders. Barnes & Nobles looks like it’s next. We’re not far from a time when the only vendor for books are virtual stores. And we’re not that far off from a time when print books are so expensive thanks to shorter print runs, folks will be forced to buy electronic media or not read at all.
It does make one wonder about shorter print runs and whether publishing on demand can keep up with the cost of mass print runs. I am not informed either way, but if it cannot, then it's going to be tough to beat eReaders.
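A back-of-the-envelope calculation, with numbers invented purely for illustration, shows the basic trade-off: offset printing pays a fixed setup cost to get a low unit cost, so it only wins on runs long enough to amortize that setup.

```python
# Break-even for offset printing vs. print on demand.
# Every number below is hypothetical, for illustration only.
setup_cost_offset = 4000.0   # plates, make-ready (hypothetical)
unit_cost_offset = 1.50      # per copy on a big run (hypothetical)
unit_cost_pod = 4.00         # per copy, no setup (hypothetical)

# Offset wins once the run amortizes the setup cost:
#   setup + n * unit_offset < n * unit_pod
break_even = setup_cost_offset / (unit_cost_pod - unit_cost_offset)
print(f"offset printing wins above ~{break_even:.0f} copies")
# As typical runs shrink, more titles fall below this threshold,
# which is where print on demand (or e-books) start to look cheaper.
```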
Too much silliness, too little time
I can't, however, let this pass without getting off a quick shot just to get things started.
From Chait's column:
I thought David Brooks' column yesterday on education reform was generally quite good. But he conceded a point to critics of education reform that should not be conceded:
If you orient the system exclusively around a series of multiple choice accountability assessments, you distort it. If you make tests all-important, you give schools an incentive to drop the subjects that don’t show up on the exams but that help students become fully rounded individuals — like history, poetry, art and sports.

The assumption that schools have had to make tests "all important" has deeply penetrated the debate, but it's not accurate. Different states have different ways of measuring teacher performance. But none of them use student test scores as more than 50% of the measure. Classroom evaluations and other methods account for half or more of the measures everywhere. I've also noticed, anecdotally, that many people assume test measures use a single, blunt scale so that poor children are measured against the same standard as wealthy ones. That's not true, either. Test measures account for socioeconomic status, and measure student improvement over the school year.
Now, this isn't to deny that some schools and teachers over-emphasize a narrow curriculum. But the non-test components of a teacher evaluation method can easily incorporate broader measures of student performance.
Just to sum up: we don't have to worry about schools dropping subjects that don't show up on certain tests, because these tests are only a part of the teacher evaluation. This is one of those, for lack of a better word, arguments that leaves you wondering if you missed something. When a superintendent and a principal try to decide whether or not to hire an art teacher, does the non-test component of teacher evaluations guarantee the hiring in some less-than-obvious way?
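For readers wondering what an adjusted, growth-based test measure of the sort Chait describes might look like, here is a minimal value-added-style sketch. To be clear, this is a generic illustration, not any state's actual formula, and every variable and coefficient is invented.

```python
# A minimal value-added-style sketch: adjust end-of-year scores
# for prior achievement and socioeconomic status, then attribute
# the leftover classroom difference to the teacher. Generic
# illustration only; all names and numbers are invented.
import numpy as np

rng = np.random.default_rng(7)
n = 1000

# Two classrooms: teacher B truly adds 3 points over teacher A.
teacher_b = rng.random(n) < 0.5
prior = rng.normal(50, 10, n)                 # last year's score
low_income = rng.random(n) < 0.4              # SES indicator
end = (5 + 0.9 * prior - 2.0 * low_income
       + 3.0 * teacher_b + rng.normal(0, 5, n))

# Regress end-of-year scores on prior score and SES, then compare
# the classrooms' average residuals.
X = np.column_stack([np.ones(n), prior, low_income])
beta, *_ = np.linalg.lstsq(X, end, rcond=None)
resid = end - X @ beta
print("teacher B's estimated value added:",
      round(resid[teacher_b].mean() - resid[~teacher_b].mean(), 2))
```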
Saturday, July 2, 2011
Online Retailers and Sales Tax
Whatever scheme we can come up with to collect lost sales tax is one more step towards a level playing field for brick and mortar retailers, and a re-funding of our cities, counties and states. It's not an enormous amount of money, but it's a step in the right direction. Online retailers got a free pass in the early days, a kind of nod for fledgling e-commerce. Unfortunately, monsters were created and none is bigger than Amazon, destroyer of worlds and loser of a billion dollars of their own capital. But hey, it's an Internet company, we're willing to look the other way for a decade or so while the new world order manifests. It's time to level the playing field and make everyone play by the same rules.
The article as a whole is worth reading.
I know a lot of people argue that the loss of physical bookstores is not a tragedy. But it is not clear to me that we should be subsidizing their competition, either. Nor is it a great idea to reduce government revenue at a time when margins are tight.
So maybe I am on California's side in this discussion.
Jonathan Chait -- now to the right of David Brooks on Education
Friday, July 1, 2011
It's like Groupon, only they don't even check with the merchants first
When it comes to MoviePass, a theater subscription service that would allow moviegoers to watch an unlimited number of films in theaters of their choice for $50 a month, most theater chains have taken a pass. So many, in fact, that MoviePass has canceled its test roll-out of the service that was supposed to take place in the San Francisco area this weekend. It’s not throwing in the towel just yet, however. In an interview with Wired magazine, MoviePass co-founder Stacy Spikes said that he’s confident that once exhibitors learn more about the service “they’ll be excited. We just haven’t had that opportunity yet.” But spokespersons for several chains maintained that MoviePass executives should have made a discussion of their service with them the first order of business. Ted Mundorff, the CEO of Landmark Theatres, told TheWrap.com that he wasn’t aware of the service prior to the announcement of this weekend’s test. “We are stunned that an announcement like this was made, and they ‘forgot’ to discuss it with their clients,” Mundorff said. “We are not interested in outside entities setting ticket prices for us.”
The Orphan Works Project
By the end of the Twentieth Century, we had reached the point where a company with enough lobbyists and lawyers could do anything it wanted with copyrights, whether it was getting a de facto permanent extension for Mickey Mouse or even removing a work from the public domain. As mentioned before, this reveals some tremendous hypocrisies from major players.

From Cory Doctorow via DeLong:

University of Michigan to stop worrying about lawsuits, start releasing orphan works: Bobbyg sez, "The University of Michigan Library will be sharing digital copies of their orphan works, that is, copyrighted works which have no identifiable owner, with the University community. Paul Courant, the University Librarian, says that the project is integral to the mission of the library, and that the sharing of the orphan works is a 'fair use' of the material, stating that 'sharing these orphan works does no economic harm to any person or organization, while not doing so harms scholarship and learning...'"
The Orphan Works Project is being led by the Copyright Office of the University of Michigan Library to identify orphan works. Orphan works are books that are subject to copyright but whose copyright holders cannot be identified or contacted. Our immediate focus is on digital books held by HathiTrust, a partnership of major research institutions and libraries working to ensure that the cultural record is preserved and accessible long into the future.
This effort is funded by the HathiTrust and is part of U-M Library's ongoing efforts to understand the true copyright status of works in its collection. As part of this effort, the Library will develop policies, processes, and procedures that can be used by other HathiTrust partners to replicate a task that will ultimately require the hand-checking of millions of volumes.
Bravo/a. I have no idea what will come of this, but moving the default position of libraries, archives, and other institutions from one of debilitating fear of lawsuits to one of bravely sharing is long overdue.