Wednesday, October 5, 2016

Inheritance Tax

This is Joseph.

An interesting comment from the Baseline Scenario (by commentator John Thacker, replying to James Kwak):
More interesting is to discuss the combination of eliminating the estate tax plus eliminating basis step-up. It would get rid of some of the biggest sob stories that people dislike about the estate tax (some closely held sole proprietorship or family farm is forced to sell out in order to pay the taxes; they’d owe nothing right now if they weren’t intending to sell, only owe capital gains when selling.) It would discourage selling and transfer of assets, to be sure, by comparison to the current situation, though I think it’s fair to consider the current situation as biased towards selling.
I think that this is actually intriguing for discussion for capital assets.  Inherited assets could be retained but, if they were ever sold, they would pay a great deal in capital gains tax.  Now, in practice, I suspect few family farms are ever really lost due to inheritance tax.  But it does seem to be a neat twist that gets at the emotional issue -- we don't necessarily want to preserve the right of people to make out like bandits when selling the beloved family farm.
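To make the mechanics concrete, here's a minimal sketch of the two regimes. All rates, exemptions, and dollar figures below are invented for illustration; they are not actual tax law.

```python
# Illustrative comparison: estate tax with basis step-up (current regime)
# vs. no estate tax with carryover basis (the proposed swap).
# Every number here is hypothetical, chosen only to show the mechanics.

def tax_current_regime(asset_value, exemption=5_000_000, estate_rate=0.40):
    """Estate tax due at death; basis steps up, so a later sale owes no capital gains."""
    return max(0.0, asset_value - exemption) * estate_rate

def tax_proposed_regime(asset_value, original_basis, cg_rate=0.20, sold=False):
    """No estate tax; heirs inherit the original basis and owe capital gains only on sale."""
    return (asset_value - original_basis) * cg_rate if sold else 0.0

# A family farm now worth $7M, bought generations ago for $500K:
farm_value, basis = 7_000_000, 500_000
print(tax_current_regime(farm_value))                      # tax due at death, even if the farm is never sold
print(tax_proposed_regime(farm_value, basis, sold=False))  # heirs who keep the farm owe nothing
print(tax_proposed_regime(farm_value, basis, sold=True))   # tax lands only on heirs who cash out
```

Under these made-up numbers, the sob-story case (heirs forced to sell just to pay the tax) disappears, while heirs who do sell the beloved family farm pay more than they would today.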

Tuesday, October 4, 2016

A very brief Robert Benchley film festival

The terrestrial superstations have recently started rerunning talk shows from the pre-cable era. Tribune Media started it with Johnny Carson on AntennaTV, GetTV followed with Merv Griffin, and the Weigel/CBS collaborative effort Decades has now started running the best of Dick Cavett. (I keep meaning to do a post on Decades, which is exploring some cool and interesting programming ideas, but that will need to wait for another day.) I have sampled all three and only Cavett is consistently watchable, due, no doubt, both to the quality of the original shows and the excellent job Weigel does in culling the highlights.
During a wonderful interview with Bob and Ray ("the two and only"), the subject turned to comic influences and specifically the humorist Robert Benchley. They observed that while his short subjects had been hugely popular and influential when they were young, the one-reelers were at the time of the interview almost impossible to find.

As the saying goes, that was then and this is now. These days, only the most tightly guarded or vanishingly obscure is more than a click away. Benchley is neither, so it only takes a few minutes to get up to speed on one of the 20th Century's most influential comic voices.

Like most work associated with the Algonquin Roundtable, Robert Benchley's humorous pieces have not aged all that well. They do, however, retain a certain light charm of their own and can be quite interesting as historical documents.

Like many of his peers, Robert Benchley was drawn to Hollywood by the promise of easy money, but unlike Dorothy Parker and Herman Mankiewicz, Benchley found his niche primarily as a performer, albeit one who wrote his own material. The first big success along these lines was a stage revue written by Benchley and his peers. The big hit of the show was a monologue called "The Treasurer's Report."


Robert Benchley was a prolific writer and cranked out numerous short subjects for MGM, including the Oscar-winning "How to Sleep."

I tossed in the last two for historical interest. The first, though satiric, provides a glimpse into attitudes toward diet and nutrition circa 1935. Note in particular the concern about maintaining an appetite. "The Causes of the Depression" speaks for itself.

The Treasurer's Report

How to Sleep

How to Eat

The Causes of the Depression

Monday, October 3, 2016

Why AP? -- another assault on conventional wisdom from your friends at West Coast Stat Views

I always got the feeling that others saw something in the Advanced Placement program that I didn't. It was never entirely clear to me why people who so often complained that our schools were doing a poor job teaching secondary-level courses were so damned happy about the same schools trying to teach college-level ones.

I did understand the argument for key prerequisite courses like calculus or statistics. Getting those out of the way in high school could be very helpful when trying to complete, say, an engineering degree in four years. Putting aside those exceptions, though, there didn't seem to be much point. We already had a program set up for self-study and testing out of courses. CLEP-based approaches are flexible, self-paced and cheap. They reward initiative and independence. They provide an excellent ready-made foundation when you're experimenting with new methods (If the people behind MOOCs were serious…). AP courses are, by comparison, expensive, tradition bound, cumbersome, difficult to schedule, and best serve students who are already well served by the conventional high school classroom approach.

From the moment they were introduced, AP courses tended to force out more varied and interesting elective courses for a standard slate of General Ed classes. In terms of quality of instruction, it was a Peter Principle anecdote waiting to happen. At best, you had teachers who were good at algebra and geometry being pushed out of their depth. At worst, you had faculty members who were good at sucking up to the administration being rewarded with plum positions.

Worse still was the inequality question. The schools that already had an unfair advantage in terms of financing and demographics were the very ones that could attract the highly qualified teachers with advanced degrees.

AP classes also play to one of the worst trends in education, the bury-the-kids-in-work approach, which brings us to this recent essay from the Washington Post.

From "Why I regret letting my teen sign up for an AP course" by Kate Haas


My misgivings started when the homework began to pile up. I knew my son would have a lot of material to cover — the syllabus had been explicit about the required reading. But most of his homework seemed to consist of filling in charts. Night after night, I watched him spend hours scanning the pages of his textbook for relevant facts about ancient civilizations. He was not reading to learn but simply to plug correct bits of information into appropriate boxes.

“But you talk about this stuff in class, right?” I asked him. “You discuss the Code of Hammurabi, and all that?”

No, he told me, they did not. They took notes from the teacher’s slideshow presentations.

This did not remind me of college.

I graduated from an academically rigorous liberal arts school. In my freshman humanities class, I read a book a week: philosophy, literature, biographies, social science. But my classmates and I did not spend our time charting the number of syllables in Emily Dickinson’s poems or listing all the noble houses in Ssu-ma Chien’s chronicle of Chinese history. We were asked to think critically, raise questions, cite relevant passages and discuss a work’s implications in the wider world.

Nothing like that appeared to be taking place in my son’s AP history class. But I kept my mouth shut.

“I would enjoy learning about this,” he told me one night, “if the whole point wasn’t to go through it as fast as possible and then take a kajillion quizzes.”

“I’m sure that’s not the whole point,” I said.

At back-to-school night, I looked forward to meeting the teacher, who would undoubtedly put all this in perspective. Instead, she talked for 15 minutes about tests and grading policies.

At the end, my husband raised his hand. “What’s the main thing you want students to get from this class?” he asked.

I leaned forward expectantly. Now, surely, the teacher would mention an appreciation for the sweep of human history or the importance of an informed perspective on world events.

“Test-taking strategies and study skills,” she said briskly. “That’s the main thing.”


Friday, September 30, 2016

A system with multiple veto points

This is Joseph.

From Kevin Drum:
As for third parties, I'll say only this: in 1980, when I was 22, I voted for John Anderson. That sure was stupid. Eight years of Ronald Reagan because Jimmy Carter didn't quite meet my idealistic standards of excellence for presidents. I've never made that mistake again.
This is the issue with first-past-the-post systems -- splitting up the vote from one coalition can lead to the other one being elected. Just ask Canada about majority Conservative governments won with a minority of the popular vote.
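The vote-splitting mechanic is simple enough to sketch in a few lines. The vote shares below are made up purely to illustrate the point:

```python
# First-past-the-post: the candidate with the most votes wins outright,
# even with well under half of the total. Shares are invented for illustration.

def fptp_winner(votes):
    """Return the plurality winner of a {candidate: vote_share} dict."""
    return max(votes, key=votes.get)

# One coalition united: it wins 55-45.
print(fptp_winner({"Left": 55, "Right": 45}))               # Left

# Same electorate, but that coalition splits across two candidates:
print(fptp_winner({"Left": 30, "Third": 25, "Right": 45}))  # Right, with 45% of the vote
```

The same 55 percent of voters, divided between two candidates, elects the side they were voting against.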

But it is worse in the United States of America. For a law to be passed, it needs to pass the House, pass the Senate, and then not be vetoed by the president (or have the veto overridden). With two partisan parties, this already leads to a lot of gridlock. Imagine if you needed to build multi-party coalitions in both the House and the Senate.

Now, some degree of gridlock might be a feature and not a bug if one is distrustful of government.  But there is probably a limit to how unresponsive we want government to be to actual problems, including those of bad government policy.

So a third-party vote mostly offers a chance to push the opposition past the post, and there isn't really a vision as to how a third party would work without one of the old parties collapsing. And I am not sure how that would reform the parties -- it just shuffles the coalitions and puts new labels on them.

Thursday, September 29, 2016

How the sugar lobby (effectively) killed RC Cola

Having recently broached the subject of the sugar industry's practice of subsidizing research that had a way of working out well for the people writing the checks, this might be a good time to revisit a characteristically fun and well written piece from our friends at Mental Floss.

I am assured that the Delaney mentioned here has no direct connection with anyone involved in this blog.

From "The Tragic History of RC Cola" by Jeff Wells

Slowly, steadily, RC muscled its way into soda fountains and onto grocery store shelves. To stay top-of-mind with consumers, it continued to innovate. In 1954, it became the first company to nationally distribute soda in aluminum cans. Shortly after, it began selling soda in 16-ounce bottles as an alternative size for thirsty fans. In 1959, Nehi changed its name to match its bestselling product, becoming the Royal Crown Cola Company.

But while Royal Crown had made significant progress, it would continue to trail Coke and Pepsi so long as it continued to sell a similar product. What it needed was something new. What it needed was a game changer.

In 1952, the founder of a sanitarium in Williamsburg, Brooklyn named Hyman Kirsch invented a sugar-free soda called No-Cal. Available in ginger ale and black cherry, No-Cal was made specifically for patients in Kirsch's sanitarium who were either diabetic or suffering from heart ailments. Kirsch quickly discovered that his drink had a much wider appeal, and along with his son began making other flavors, like chocolate, root beer, and cherry. The two sold No-Cal to local stores and quickly built up a distribution network that extended throughout New York and the northeast. Since Kirsch wasn’t a businessman, however, he struggled to expand beyond the regional market. He also continued marketing No-Cal mainly toward diabetic customers, further limiting his reach.

Kirsch’s success caught the eye of the Royal Crown Cola Company. In the mid '50s, it began secretly developing its own diet soft drink—one that would appeal not just to diabetics, but to an entire nation of increasingly calorie-conscious consumers. While other food and beverage companies continued to push everything sweet, salty, and delicious, RC recognized a budding demand for healthier choices.


After a few years RC came out with Diet Rite, a drink that the company believed would be the breakthrough it so desperately needed. Test markets had emphatically confirmed its appeal. One, in South Carolina, saw supermarket managers clamoring for the product. “In Greenville, S.C., where we had been running a poor third behind Coke and Pepsi, we actually had grocery store managers getting into their cars and chasing down RC trucks to get Diet Rite on their shelves,” one RC rep noted.

What could cause such a reaction? It wasn’t just that Diet Rite was nearly calorie-free—it’s that it was nearly calorie-free and tasted strikingly similar to the real thing. The key ingredient—the one Kirsch had first used in No-Cal—was an alternative sweetener called cyclamate that was 30 times sweeter than sugar. First developed by a student at the University of Illinois in 1937, it was initially sold as a tabletop sweetener. In 1958, the Food and Drug Administration gave full approval, paving the way for its use as a mass-market ingredient. The timing couldn’t have been better for Royal Crown.

In a particularly shrewd bit of marketing, the company made sure to sell Diet Rite just like real cola: In the same slender bottles for a nickel each, or as a six pack. It also made sure to put the word “cola” on its labels. Consumers wanted something different, RC executives figured, but not too different.

When Diet Rite hit shelves in 1962, it was a smashing success. Within a year and a half of its release, it had rocketed up to number four on the sales chart, behind Coke, Pepsi, and regular RC Cola. America, it turned out, was ready for what had for years seemed oxymoronic: a healthy soda. The rest of the industry was in something close to a state of shock. “So stunning was Diet-Rite Cola’s impact on the soft drink market in the early 1960s,” reported Georgia Trend, “that its acceptance could be compared to the beginnings of mighty Coca-Cola itself some 75 years earlier.”

Coke and Pepsi were caught completely off guard. Not only had they not anticipated the mainstream appeal of diet soda, they didn’t even have anything in the pipeline. Within a year, Coke would scramble to release TaB, which it also sweetened with cyclamate. Pepsi responded with Patio Cola, a diet soda aimed at women that also contained cyclamate, and which it would soon rebrand as Diet Pepsi. There were, predictably, numerous other fast followers to the market, including long-forgotten brands like LoLo, Coolo-Coolo, and Bubble-Up. In 1965, Coke came out with a citrus-flavored diet soda called Fresca.

None of them, however, could catch Diet Rite, which continued to build market share for Royal Crown Cola.

“RC had the dominant diet cola brand, and that was a very big deal,” Tristan Donovan tells mental_floss. “For RC, there was this sense of, ‘finally, we’ve broken through.’”

By the late '60s, Royal Crown owned 10 percent of the soda market. That was far from dominating, but it was still a very respectable figure, and the company was poised for further growth. By all accounts, the company that started in the basement of a small town grocery store was positioned to become a major player in the soda industry.

The rise of diet soda may have delighted soft drink manufacturers and American consumers, but it downright frightened the sugar industry. After decades of pumping its signature product into sodas, here was a comparable beverage that did away with sugar entirely. What if diet sodas continued to grow? What if all sodas became diet sodas? Ever resourceful, the industry searched for legal channels to undermine diet drinks.

In the mid-'60s, it began: the slow trickle of studies suggesting that cyclamate was hazardous. In 1964, a study linked cyclamate to cancer in animals, and raised the possibility that it could have adverse effects on humans. But the authors stopped short of linking the sweetener to specific conditions like cancer or birth defects. Royal Crown president W.H. Glenn dismissed the study as “nothing derogatory,” and other manufacturers echoed that sentiment. As the decade wore on, however, studies made more specific claims. In 1969, the decisive blow against cyclamate came in the form of two studies. One claimed that chicken eggs injected with cyclamate resulted in deformed chicks, while another found that rats given doses of cyclamate showed an increased risk of developing bladder tumors. The studies’ findings, splashed across newspapers and television screens nationwide, implicated cyclamate as a very dangerous ingredient.

“Everyone began saying, ‘Oh my god, diet soda’s going to give you cancer!’” Donovan says. “The market collapsed almost instantly.”

The FDA, meanwhile, had no choice but to remove its "generally recognized as safe" (GRAS) classification for cyclamate. The diet soda industry went into a tailspin, plummeting from 20 percent of the market to less than 3 percent. Manufacturers frantically reformulated their drinks and tried to reassure consumers, all to no avail. Overnight, the diet soda craze had come to a standstill.

The downturn hit Royal Crown particularly hard. Diet Rite had been its star performer, the one advantage it had over Coke and Pepsi. Without it, all the company had was the nation’s third favorite cola, which on its own wasn’t going to gain any ground on its rivals. After a few weeks, the company re-released Diet Rite, this time sweetened with saccharin. But the taste—saccharin has a notoriously metallic tinge to it—wasn’t the same, and many people weren’t ready to come back to diet drinks anyway. Eventually, Coke and Pepsi re-entered the market with better formulas and marketing, and once again, Royal Crown Cola had merely served as the guinea pig for its competitors.

According to Donovan, the cyclamate backlash was the direct result of the sugar industry’s meddling. That lobby, he said, provided $600,000 in funding for the studies that doomed cyclamate, both of which are now seen as controversial because they involved exposing animals to much higher levels of the ingredient than any Diet Rite or TaB drinker could ever possibly imbibe. To get the same amount of cyclamate as the rats in one of the studies, for instance, you’d have to drink more than 500 diet drinks a day. Today, cyclamate is widely used as a sweetener in countries like Australia, South Africa, and throughout the European Union. Scientists around the world say it's safe for consumption, yet the results of the 1969 studies still linger. The United States, Japan, and 45 other countries have upheld their ban on the additive.

How could such dubious results be admissible? Donovan pointed to a legal loophole called the Delaney Clause, an amendment to the Food, Drug and Cosmetic Act of 1938 established by a congressman named James Delaney, who investigated insecticides and carcinogens in the food industry in the late '50s. The clause required the FDA to ban any additive found to “induce cancer in man, or, after tests, found to induce cancer in animals.” As well-meaning as the Delaney Clause was, it didn’t outline restrictions on the amount of a certain ingredient that could be tested. No matter if it was a granule or a gallon, if it proved hazardous to human or animal health, the ingredient had to be pulled.

“The Delaney Clause was a very well-intentioned but poorly thought-out law,” Donovan says.

Wednesday, September 28, 2016

What almost everyone gets wrong about Uber and driverless cars

From Rick Newman
Self-driving cars are the company’s holy grail. Morgan Stanley estimates human drivers account for half the cost of a ride-sharing trip, which means Uber may one day be able to dispense with its biggest cost, plus hassles such as lawsuits over whether Uber drivers should be treated as full-time employees. It could also become a logistics company on par with FedEx (FDX) and UPS (UPS), offering package delivery and other transportation services. If you buy this vision, Uber is the next Amazon (AMZN)—a coming goliath so transformative that 10 years of deep losses would be well worth the global domination the company will one day wield. That’s why investors have plowed nearly $12 billion into Uber, valuing it at a whopping $68 billion. Facebook (FB) raised less than $3 billion before going public in 2012.

Coverage of Uber by both business and tech journalists has been weak. Coverage of autonomous vehicles by both business and tech journalists has been weak. When you put the two together, the results are unsurprisingly pretty awful.

The majority of the stories make two fundamental mistakes.

The first involves confusion between self-driving and driverless cars. Other than a probable improvement in safety, self-driving cars (assuming they still required a human present just in case) would have very little impact on Uber's business model. While remarkable progress is being made, achieving the level of autonomy required for reliable driverless vehicles (where you can simply tell your car an address and send it on its way with no humans on board) remains a daunting technical challenge.

An even more basic mistake is routinely made on the business side. The confusion springs from the difference between absolute and relative impact. If companies operated in a vacuum, any development that reduced costs would be good. In the real world, though, you also have to consider the impact of the development on your competitors.

Here's an example. Imagine you own one of two delivery services in a town. Both you and your competitor have roughly the same number of trucks, but you have invested a great deal of money upgrading to make sure that your vehicles are as energy-efficient as possible. So far, the cost of the upgrade has been balanced out by your savings on diesel, so you are able to charge roughly the same rate as your competitor. A drop in fuel prices will reduce your operating costs. Normally that would be a good thing, but your competitor's costs will drop by even more, so he will be able to undercut you on price.
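The delivery-service example can be worked through with rough numbers. Everything below (fixed costs, fuel burn, prices) is invented to show the mechanics, not real operating data:

```python
# Relative vs. absolute impact of a fuel-price drop on two delivery services.
# All figures are hypothetical, chosen only to illustrate the point.

def cost_per_route(fixed_cost, gallons_burned, fuel_price):
    """Total cost of running one delivery route."""
    return fixed_cost + gallons_burned * fuel_price

fuel_before, fuel_after = 4.00, 2.00

# You: efficient fleet (burns less fuel) but higher fixed costs from the upgrades.
you_before = cost_per_route(fixed_cost=120, gallons_burned=10, fuel_price=fuel_before)
you_after  = cost_per_route(fixed_cost=120, gallons_burned=10, fuel_price=fuel_after)

# Competitor: cheaper trucks, thirstier fleet.
rival_before = cost_per_route(fixed_cost=100, gallons_burned=15, fuel_price=fuel_before)
rival_after  = cost_per_route(fixed_cost=100, gallons_burned=15, fuel_price=fuel_after)

print(you_before, rival_before)  # 160.0 160.0 -- at $4/gal, you both charge the same rate
print(you_after, rival_after)    # 140.0 130.0 -- your costs fell, but the rival can now undercut you
```

Your absolute costs improved, yet your relative position got worse: the development helped your competitor more than it helped you.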

The Uber business model is based on the fact that there are a huge number of underemployed people who own underutilized cars (virtually all private vehicles are underutilized). Since car and driver are already more or less just sitting there most of the time, Uber is able to offer rides at a rate that would not otherwise be sufficient to cover all the assorted costs.

(Technically Uber doesn't offer the rides, but you get my drift.)

{And, yes, there are people who buy cars just to drive for Uber. There are also people who buy commemorative plates as a hedge against inflation.}

If you take drivers out of the equation, suddenly it becomes unclear what advantage Uber has over taxicab companies, car rental services, car dealerships or any business that maintains a large fleet of cars. Let's consider the Hertz example here in Southern California. Currently Hertz has locations spread around LA and Orange counties, with each lot having to maintain a minimum stock. With truly driverless cars, you can get awfully close to 100% utilization for much of the day. Just have your extra vehicles prowl for fares and make deliveries, then send them to whatever location needs them next. Add to that maintenance facilities, purchasing power, a late-model fleet and countless economies of scale.

You can imagine similar scenarios for any number of other businesses and in each of those scenarios, Uber and Lyft get screwed over by large, new, well-positioned competitors.

All of this leads us to the dirty little secret of the ride-sharing industry. Though it was made possible by technological innovation (specifically the smartphone), the stability of the business model depends not on sustained disruption and transformation but on things remaining basically the same.

Tuesday, September 27, 2016

The coolest spot in town

“There was a desert wind blowing that night. It was one of those hot dry Santa Anas that come down through the mountain passes and curl your hair and make your nerves jump and your skin itch. On nights like that every booze party ends in a fight. Meek little wives feel the edge of the carving knife and study their husbands' necks. Anything can happen. You can even get a full glass of beer at a cocktail lounge.”
— Raymond Chandler, "Red Wind"


 From the LA Times:
Southern California will feel more like summer than autumn Monday, thanks to triple-digit temperatures and powerful Santa Ana winds for most of the day.

The mercury could reach 103 degrees in Burbank, 101 in Long Beach, 104 in Riverside and 105 in Azusa, Ojai and Van Nuys, forecasters say.

“This time of year you get those wild swings,” said meteorologist Kathy Hoxsie of the National Weather Service in Oxnard.

By 11:08 a.m. in Oxnard, the old daily high temperature record of 98 degrees, which was set in 1978, was broken. Temperatures had reached 103 degrees Monday and were still climbing.

Camarillo reached 102 by noon, slashing the record high of 101 set in 1963, she said.

By noon, two other record highs were teetering on the edge of being broken or tied.

Santa Maria reached 99 degrees by noon, a degree away from tying the record of 100 that was set in 1921. In San Luis Obispo, temperatures reached 100 degrees, four degrees shy of the 2010 record high.

Santa Ana winds are largely responsible for oven-like conditions and will raise temperatures some 20 degrees higher than average. 



I've mentioned before that one of the strangest things about Southern California is the microclimates. A ten-mile change in position can often produce a thirty-degree drop in temperature when you head toward the beach. You can drive through a snowstorm in the mountains, then change into a T-shirt when you hit the valley.

Santa Anas are, in a sense, the opposite. It doesn't matter where you are -- you can go as far as Santa Monica and only buy yourself four or five degrees -- but it makes a huge difference when. It's not just that the hottest days of the year can come in the fall; it's that they often come after you're sure that summer is over. A few days ago we were in the low 70s. In a few days we'll be back again. It adds a surreal quality to the experience.


This is what I meant by "a very lonely position"

As we were saying...
It is almost as if Spayd thinks it's 2000, when the NYT could set the conventional wisdom, could decide which narratives would be followed and which public figures would be lauded or savaged. Spayd does understand that there is a battle going on for the soul of journalism, but she does not seem to understand that the alliances have changed, and the New York Times is about to find itself in a very lonely position.
It's been a few days and this is a fast-moving campaign, so is there evidence that the NYT's coverage is falling out of sync with the rest of the press?

Here's what the gray lady had to say about the debate. There are a couple of brief shots at Trump, but the overall tone is one of neutrality with a touch of bothsiderism and the inevitable lament for the lack of civility.

Based on a quick survey of other publications, pretty much everyone else followed a different narrative: Trump loses his temper/takes the bait and Clinton wins the debate.

Not surprisingly, one of the papers that deviated most sharply from the NYT narrative was the Washington Post (which just uncovered another Trump scandal, described by Josh Marshall as "real big," which seems to suggest major tax evasion).

Monday, September 26, 2016

“Transformative” remains a word better suited to Hogwarts than to the Harvard Business School

The thinking of business writers has become so muddled and, in places, so overtly mystical that the important fundamental drivers are completely lost in the discussion. Words like "disruptor" or "transformative" have such tremendous emotional resonance for the writers (and investors) that they blind them to the underlying business forces.

Case in point from Rick Newman (with the important caveat that Newman may be more describing than endorsing the common wisdom here):
If you buy this vision, Uber is the next Amazon (AMZN)—a coming goliath so transformative that 10 years of deep losses would be well worth the global domination the company will one day wield. That’s why investors have plowed nearly $12 billion into Uber, valuing it at a whopping $68 billion. Facebook (FB) raised less than $3 billion before going public in 2012.

Putting aside for the moment the question of whether Amazon or Facebook are themselves overvalued, there is a simple business reason why first mover advantage was so important for these companies.

Amazon took advantage of that early lead and all of the tremendous capital that went with it to establish a monumental distribution system. This presents a huge brick-and-mortar barrier to entry for any potential competitor. As for Facebook (or for that matter LinkedIn), the single most important factor when you are thinking about joining a social network is "are the people I want to connect with on this network?" Once again, unless the company screws up big time, early dominance can present an almost insurmountable obstacle for potential competitors.

Then there's the question of scale. Amazon and Facebook have business models that only make sense on a national or, better yet, international level. There is no plausible scenario where local or regional players can significantly eat away at their market share, let alone grow to the extent that they become real threats.

Barring a massive and truly unprecedented piece of regulatory capture, Uber will always be vulnerable to new players as long as we're talking about its current business model. All that is required is a big enough marketing and recruiting budget to establish brand and to line up a sufficient number of drivers for a given area. That's not a cheap or trivial task but it is most certainly doable. Perhaps more to the point, if this sector ever becomes profitable enough to justify the proposed value of Uber, then there will be enough money flowing to guarantee plenty of competitors.

As mentioned before, a locally based Amazon or Facebook does not make a great deal of sense. By comparison, you can make a very good case for a locally based ride-sharing service. The start-up costs are a small fraction of a national launch. You can prioritize highly lucrative markets. Business models can be tweaked. Specific demographics can be targeted. Special deals can be cut with local institutions and organizations. Regulatory issues can be dealt with much more easily.

Consider Austin, Texas. When Uber and Lyft pulled out in protest last May, a nonprofit created an alternative ride-sharing service and had it up and running in about a month. Keep in mind, we're talking about an independent with little capital starting from scratch. A well-financed company that's gone through this a couple of times could certainly match or even beat that turnaround time.

If I were one of the investors who had poured nearly $12 billion into the company, that would make me nervous, even if it is a transformative disruptor.


Friday, September 23, 2016

Damn, we have been hammering at this one for a long time.

This is a reprint of a post from 2012 discussing the lengths to which James Stewart of the New York Times went to portray Paul Ryan's budget as fiscally responsible. The bigger story here, which was made explicit in the piece, was that converging on convenient narratives and insisting that both parties were currently making comparably reasonable proposals was bad journalism, bad enough to undermine the democratic process.

Four years later, Paul Ryan has abandoned any pretense of being fiscally responsible. As for the bigger point about bad journalism threatening democracy, I'm sure we could think of an example if we tried really hard.

 ______________________________________

Following up on Joseph's latest, I actually think the problem here is more James Stewart than Paul Ryan. Ryan's budgets have been fairly obvious attempts to form a more Randian union. That's not surprising coming from an avowed follower of Ayn Rand. Ryan also comes from a Straussian tradition so I'm not exactly shocked that he would try to sell proposals that are likely to increase the deficit as a path to fiscal responsibility.

But that's OK. The Ryan plan is exactly the kind of bad idea that our national immune system ought to be able to handle. Liberals should savage its underlying values (Rand is always a hard sell); centrists and independents should spend their time pointing out the endless ways that the numbers don't add up and the evidence contradicts the basic arguments; respectable conservatives should damn it with faint praise or simply avoid the topic. The Republicans would then come back with a new budget, hopefully a proposal based on valid numbers and defensible assumptions, but at the very least one that obscures its flaws and makes a cosmetic effort at advancing its stated goals.

For Ryan's proposals to maintain their standing as serious and viable, the system has to have broken down in an extraordinary way. Specifically, the centrists such as James Stewart have had to go to amazing lengths to make the budget look reasonable, up to and including claiming that Ryan intends to take steps that Ryan explicitly rules out (from James Kwak):

Stewart is at least smart enough to realize that a 25 percent rate is only a tax increase if you eliminate preferences for investment income (capital gains and dividends, currently taxed at a maximum rate of 15 percent):
“Despite Mr. Ryan’s reluctance to specify which tax preferences might have to be curtailed or eliminated, there’s no mystery as to what they would have to be. Looking only at the returns of the top 400 taxpayers, the biggest loophole they exploit by far is the preferential tax rate on capital gains, carried interest and dividend income.”
So give Stewart credit for knowing the basics of tax policy. But he is basically assuming that Ryan must be proposing to eliminate those preferences: “there’s no mystery as to what they would have to be.”
Only they aren’t. Stewart quotes directly from the FY 2012 budget resolution authored by Ryan’s Budget Committee. But apparently he didn’t notice this passage:
“Raising taxes on capital is another idea that purports to affect the wealthy but actually hurts all participants in the economy. Mainstream economics, not to mention common sense, teaches that raising taxes on any activity generally results in less of it. Economics and common sense also teach that the size of a nation’s capital stock – the pool of saved money available for investment and job creation – has an effect on employment, productivity, and wages. Tax reform should promote savings and investment because more savings and more investment mean a larger stock of capital available for job creation.”
In other words, taxes on capital gains should not be increased, but if anything should be lowered.
These distortions aren't just journalistic laziness or rhetorical overkill on Stewart's part; they're essential to a narrative that writers like Stewart have built their careers on.

Here's Paul Krugman:
But the “centrists” who weigh in on policy debates are playing a different game. Their self-image, and to a large extent their professional selling point, depends on posing as high-minded types standing between the partisan extremes, bringing together reasonable people from both parties — even if these reasonable people don’t actually exist. And this leaves them unable either to admit how moderate Mr. Obama is or to acknowledge the more or less universal extremism of his opponents on the right.
The point about self-image and professional selling points is remarkably astute, and when you combine those with the decline in fact-checking, diminishing penalties for errors, and a growing trend toward group-think, you get a journalistic system that loses much of its ability to evaluate policy ideas.

And for a democracy that's a hell of a loss.

Thursday, September 22, 2016

"Never trust a superintendent. They'll lie to your face"

In case you missed it, this is definitely the week's best heroic lunch lady story.
PITTSBURGH — A cafeteria employee says she has quit her job over something being called “lunch shaming,” where some students are being denied hot lunches over a new policy.

Stacy Koltiska, a former employee at Canon McMillan School District, told CBS Pittsburgh the school is essentially shaming students in an effort to get parents to pay for overdrawn lunch accounts.

She was so upset by the new policy, she quit just a few weeks into the new school year.

Koltiska spent two years working in the cafeteria at Wylandville Elementary, but last Thursday, she resigned after she had to take away hot meals from two children.

“His eyes welled up with tears. I’ll never forget his name, the look on his face,” she said.

The new policy at Canon McMillan, which was passed over the summer for grades K-6, says the hot meal item will be replaced by a sandwich if $25 or more is owed to the district for lunches.



Superintendent Matthew Daniels says this is about collecting money owed, noting that parents are notified weekly of lunch balances.

He says, “There has never been the intent with the adoption of this policy to shame or embarrass a child.”


This is, if not a lie, then at least a nasty piece of equivocation on Daniels' part. While it is true that the final objective was to collect on more accounts receivable, the plan was to do so by using the pain and humiliation of small children to pressure their parents to pay up. What's more, we only know about this because the lunch ladies of the school district have more integrity and basic decency than the administrators.

I apologize to longtime readers who have heard the story before, but when I first decided to go into teaching, I sought out the advice of the father of a friend who was a highly respected veteran superintendent in Arkansas. He was friendly and glad to help, but he insisted that the only piece of advice that was really important was the warning I used as the title for this post.

As always I want to be clear on this next point. In my experience, a strong, perhaps even overwhelming majority of education administrators are dedicated professionals who are genuinely motivated by a desire to help kids. Unfortunately, it has also always been a position plagued with serious perverse incentives and the education reform movement has only served to make these much, much worse. The movement tends to favor proposals that give administrators more authority while it also (thanks to its management consultant roots) places great faith in the ability of big compensation packages to attract the best people. It is an easily gameable system and if you're good at working that system, it can mean hundreds of thousands of dollars and candy-sweet sinecures down the line.

I don't want to oversell the connection here. This sort of thing has been happening since long before the education reform movement took hold. What the movement has done, completely unintentionally, is to increase the rewards for being a mealy-mouthed weasel who puts his or her own interests ahead of those of the children.

Even by the sleazy standards of the debt collection industry, humiliating or even traumatizing small children because their parents can't pay your bill is beyond the pale. The fact that this even needs to be pointed out is unspeakably depressing.

Wednesday, September 21, 2016

"Why do you hate us for caring too much?" -- Dispatches from a besieged institution

Public Editor
From Wikipedia

The job of the public editor is to supervise the implementation of proper journalism ethics at a newspaper, and to identify and examine critical errors or omissions, and to act as a liaison to the public. They do this primarily through a regular feature on a newspaper's editorial page. Because public editors are generally employees of the very newspaper they're criticizing, it may appear as though there is a possibility for bias. However, a newspaper with a high standard of ethics would not fire a public editor for a criticism of the paper; the act would contradict the purpose of the position and would itself be a very likely cause for public concern.

I don't want to impose a template, but generally one expects public editors to serve as the internal representative of external critical voices, or at least to see to it that these voices get a fair hearing. A typical column might start with acknowledging complaints about something like the paper's lack of coverage of poor neighborhoods. The public editor would then discuss some possible lapses on the paper's part, get some comments from the editor in charge, and then, as a rule, either encourage the paper to improve its coverage in this area or, at the very least, take a neutral position acknowledging that both the critics and the paper have a point.

Here are some examples from two previous public editors of the New York Times.


Clark Hoyt
The short answer is that a television critic with a history of errors wrote hastily and failed to double-check her work, and editors who should have been vigilant were not. But a more nuanced answer is that even a newspaper like The Times, with layers of editing to ensure accuracy, can go off the rails when communication is poor, individuals do not bear down hard enough, and they make assumptions about what others have done. Five editors read the article at different times, but none subjected it to rigorous fact-checking, even after catching two other errors in it. And three editors combined to cause one of the errors themselves.

Margaret Sullivan

Mistakes are bound to happen in the news business, but some are worse than others.

What I’ll lay out here was a bad one. It involved a failure of sufficient skepticism at every level of the reporting and editing process — especially since the story in question relied on anonymous government sources, as too many Times articles do.



The Times needs to fix its overuse of unnamed government sources. And it needs to slow down the reporting and editing process, especially in the fever-pitch atmosphere surrounding a major news event. Those are procedural changes, and they are needed. But most of all, and more fundamental, the paper needs to show far more skepticism – a kind of prosecutorial scrutiny — at every level of the process.

Two front-page, anonymously sourced stories in a few months have required editors’ notes that corrected key elements – elements that were integral enough to form the basis of the headlines in both cases. That’s not acceptable for Times readers or for the paper’s credibility, which is its most precious asset.

If this isn’t a red alert, I don’t know what will be.

But these are strange days at the New York Times and the new public editor is writing columns that are not only a sharp break with those of her predecessors, but seem to violate the very spirit of the office.

In particular, Liz Spayd is catching a great deal of flak for a piece that almost manages to invert the typical public editor column. It starts by grossly misrepresenting widespread criticisms of the paper, goes on to openly attack the critics making the charges, then pleads with the paper's staff to toe the editorial line and ignore the very voices that a public editor would normally speak for.


[Emphasis added]

The Truth About ‘False Balance’
False balance, sometimes called “false equivalency,” refers disparagingly to the practice of journalists who, in their zeal to be fair, present each side of a debate as equally credible, even when the factual evidence is stacked heavily on one side.

There has been a great deal of speculation as to what drives false equivalency, with the leading contenders being a desire to maintain access to high-placed sources, long-standing personal biases against certain politicians, a fear of reprisal, a desire to avoid charges of liberal bias, and simple laziness (a cursory both-sides-do-it story is generally much easier to write than a well investigated piece). Caring too much about fairness hardly ever makes the list and it certainly has no place in the definition.

Spayd then accuses the people making these charges of being irrational, shortsighted, and partisan.

I can’t help wondering about the ideological motives of those crying false balance, given that they are using the argument mostly in support of liberal causes and candidates. CNN’s Brian Stelter focused his show, “Reliable Sources,” on this subject last weekend. He asked a guest, Jacob Weisberg of Slate magazine, to frame the idea of false balance. Weisberg used an analogy, saying journalists are accustomed to covering candidates who may be apples and oranges, but at least are still both fruits. In Trump, he said, we have not fruit but rancid meat. That sounds like a partisan’s explanation passed off as a factual judgment.

But, as Jonathan Chait points out, Weisberg has no record of being a Hillary Clinton booster. The charge here is completely circular. He is partisan because he made a highly critical comment about Donald Trump and he made a highly critical comment about Donald Trump because he is partisan.

But the most extraordinary part of the piece, and one which reminds us just how strange the final days of 2016 are becoming, is the conclusion.

I hope Times journalists won’t be intimidated by this argument. I hope they aren’t mindlessly tallying up their stories in a back room to ensure balance, but I also hope they won’t worry about critics who claim they are. What’s needed most is forceful, honest reporting — as The Times has produced about conflicts circling the foundation; and as The Washington Post did this past week in surfacing Trump’s violation of tax laws when he made a $25,000 political contribution to a campaign group connected to Florida’s attorney general as her office was investigating Trump University.

Fear of false balance is a creeping threat to the role of the media because it encourages journalists to pull back from their responsibility to hold power accountable. All power, not just certain individuals, however vile they might seem.

Putting aside the curious characterization of the Florida AG investigation as a tax evasion story (which is a lot like describing the Watergate scandal as a burglary story or Al Capone as a tax evader), equating her paper's pursuit of the Clinton foundation with the Washington Post's coverage of Trump is simply surreal on a number of levels.

For starters, none of the Clinton foundation stories have revealed significant wrongdoing. Even Spayd, who is almost comically desperate to portray her employer in the best possible light, had to concede that “some foundation stories revealed relatively little bad behavior, yet were written as if they did.” By comparison, the Washington Post investigation continues to uncover self-dealing, misrepresentation, tax evasion, misuse of funds, failure to honor obligations, ethical violations, general sleaziness and blatant quid pro quo bribery.

More importantly, the Washington Post has explicitly attacked and implicitly abandoned Spayd's position. Here's how the Post summed it up in an editorial that appeared two days before the NYT column.
Imagine how history would judge today’s Americans if, looking back at this election, the record showed that voters empowered a dangerous man because of . . . a minor email scandal. There is no equivalence between Ms. Clinton’s wrongs and Mr. Trump’s manifest unfitness for office.


Charles Pierce's characteristically pithy response to this editorial was "The Washington Post Just Declared War on The New York Times -- And with good reason, too."

It is almost as if Spayd thinks it's 2000, when the NYT could set the conventional wisdom, could decide which narratives would be followed and which public figures would be lauded or savaged. Spayd does understand that there is a battle going on for the soul of journalism, but she does not seem to understand that the alliances have changed, and the New York Times is about to find itself in a very lonely position.

Tuesday, September 20, 2016

“And now the piano stylings of Richard Nixon”

I know this bothers me more than it should, but I wish journalists would start coming up with some new and better examples. I was listening to an NPR talk show discussing the backlash to Fallon's softball interview of Trump (which also fits in with yesterday's topic). The host ran through the standard spiel on candidates and talk shows, starting with Bill Clinton.

While it is true that Clinton made excellent use of late-night talk shows, including Arsenio Hall's and, even more to the point, Johnny Carson's (though that's a tale for another time), a better example, and one far more applicable to Trump, happened quite a bit earlier.

When people think of Nixon and television, there is a tendency to focus on the debate with John F. Kennedy. That provides a horribly lopsided picture. Nixon used the medium brilliantly with the Checkers speech and did awfully well with Jack Paar.

Here was the great loser attack dog of the Republican Party, a figure best known for scandal, defeat, and perhaps the most petulant exit line of any American politician, sitting in America's living room, making relaxed and charming conversation, and not only playing piano, but performing his own composition.

Paar himself would later write:
He had been on the Tonight program with me, and against his own judgment and that of his many advisers, I got him to play the piano. It was an unusual moment, with Richard Nixon playing a ricky-ticky tune that he had composed. Marshall McLuhan, the media analyst, had written in his first book that if Nixon had played the piano on the Tonight program in the 1960 campaign, he would have won the election.





RICHARD NIXON ON "THE JACK PAAR PROGRAM" (MARCH 8, 1963)





Monday, September 19, 2016

Yes, things are better

A lot of my Democratic friends (especially those over 40) had a bad feeling about the nomination of Hillary Clinton, not because they had a problem with her, but because they felt the press corps did. Memories of Whitewater and particularly the 2000 election remain strong. The moment her candidacy was announced, these people started having flashbacks of Vince Foster rumors and Maureen Dowd columns explaining why Al Gore's choice of sweaters disqualified him from the presidency. Their concerns sound a great deal like this excerpt from a recent Paul Krugman post.
No, it’s something special about Clinton Rules. I don’t really understand it. But it has the feeling of a high school clique bullying a nerdy classmate because it’s the cool thing to do.

And as I feared, it looks as if people who cried wolf about non-scandals are now engaged in an all-out effort to dig up or invent dirt to justify their previous Clinton hostility.

Hard to believe that such pettiness could have horrifying consequences. But I am very scared.

My advice to my friends (and please feel free to pass this along to Professor Krugman) is not to worry because things are different. Don't get me wrong, they may still turn in a very ugly direction, but things won't turn ugly the same way they did sixteen years ago.

If you have time, you should take a break now and read this essential Vanity Fair piece by Evgenia Peretz from 2007. You should go through the whole thing but this will give you a taste:

Al Gore couldn't believe his eyes: as the 2000 election heated up, The New York Times, The Washington Post, and other top news outlets kept going after him, with misquotes ("I invented the Internet"), distortions (that he lied about being the inspiration for Love Story), and strangely off-the-mark needling, while pundits such as Maureen Dowd appeared to be charmed by his rival, George W. Bush. For the first time, Gore and his family talk about the effect of the press attacks on his campaign—and about his future plans—to the author, who finds that many in the media are re-assessing their 2000 coverage.



Eight years ago, in the bastions of the "liberal media" that were supposed to love Gore—The New York Times, The Washington Post, The Boston Globe, CNN—he was variously described as "repellent," "delusional," a vote-rigger, a man who "lies like a rug," "Pinocchio." Eric Pooley, who covered him for Time magazine, says, "He brought out the creative-writing student in so many reporters.… Everybody kind of let loose on the guy."

How did this happen? Was the right-wing attack machine so effective that it overwhelmed all competing messages? Was Gore's communications team outrageously inept? Were the liberal elite bending over backward to prove they weren't so liberal?

Eight years later, journalists, at the prompting of Vanity Fair, are engaging in some self-examination over how they treated Gore. As for Gore himself, for the first time, in this article, he talks about the 2000 campaign and the effect the press had on him and the election. (In the interest of full disclosure, I should say that my father, Martin Peretz, was his teacher at Harvard and is an ardent, vocal Gore backer. I contributed to his campaign in February 1999. Before reporting this article, however, I'd had maybe two passing exchanges with Gore in my life.) Gore wasn't eager to talk about this. He doesn't blame the media for his loss in 2000. Yet he does believe that his words were distorted and that certain major reporters and outlets were often unfair.



Building on the narrative established by the Love Story and Internet episodes, Seelye, her critics charge, repeatedly tinged what should have been straight reporting with attitude or hints at Gore's insincerity. Describing a stump speech in Tennessee, she wrote, "He also made an appeal based on what he described as his hard work for the state—as if a debt were owed in return for years of service." Writing how he encouraged an audience to get out and vote at the primary, she said, "Vice President Al Gore may have questioned the effects of the internal combustion engine, but not when it comes to transportation to the polls. Today he exhorted a union audience in Knoxville, Iowa, to pile into vans—not cars, but gas-guzzling vans—and haul friends to the Iowa caucuses on January 24." She would not just say that he was simply fund-raising. "Vice President Al Gore was back to business as usual today—trolling for money," she wrote. In another piece, he was "ever on the prowl for money."

The disparity between her reporting and Bruni's coverage of Bush for the Times was particularly galling to the Gore camp. "It's one thing if the coverage is equal—equally tough or equally soft," says Gore press secretary Chris Lehane. "In 2000, we would get stories where if Gore walked in and said the room was gray we'd be beaten up because in fact the room was an off-white. They would get stories about how George Bush's wing tips looked as he strode across the stage." Melinda Henneberger, then a political writer at the Times, says that such attitudes went all the way up to the top of the newspaper. "Some of it was a self-loathing liberal thing," she says, "disdaining the candidate who would have fit right into the newsroom, and giving all sorts of extra time on tests to the conservative from Texas. Al Gore was a laughline at the paper, while where Bush was concerned we seemed to suffer from the soft bigotry of low expectations." (Seelye's and Bruni's then editors declined to be interviewed for this article.)


I would argue that the worst decade in modern American journalism started sometime in the early 1990s (and yes, I am including yellow journalism and the red scare in the period we're discussing). It was largely defined by three major stories:

Whitewater;

The 2000 election;

The build up to the Iraq war.

A number of commentators such as Charles Pierce have noted that the New York Times and a handful of other players seem determined to follow exactly the same path with this election. Normally, that would be a profoundly frightening turn of events, but as I mentioned earlier, this time things are different.

We previously brought the idea of cognitive dissonance into the discussion. One of the implications of this framing is that challenging beliefs will tend to produce one of two more or less opposite outcomes. People will either back away from a discredited idea or double down on it.

As you work your way through the Vanity Fair piece, it becomes obvious that even as far back as 2007, many if not most of the journalists involved had come to question their previous beliefs and approaches, and those were the people deep in the bubble. The generation of journalists and satirists who emerged since then mostly see Bush v Gore and the Iraq War build-up as huge journalistic debacles. It is worth noting that Talking Points Memo started during the Florida recount.

This is a good time to take another page from the social psych book and talk about social norming. When you read the accounts of the more self-aware members of the group like Margaret Carlson, you will see that, even at the time, they knew what they were doing was, on some level, wrong but they still went along with the group. There were smart, independent voices pointing out the absurdity of the coverage but they were then-obscure outsiders like Josh Marshall who were easy to ignore.

The landscape has changed radically and what was the norm is now an unpopular, even besieged position. Consider the case of poor Matt Lauer. The pre-debate forum played out much like something from 2000. The Democrat was pestered about pseudo-scandals and discouraged (“be brief”) from discussing substantive issues; the Republican sailed unchallenged through a string of questionable statements, including some that were demonstrably false.

The aftermath, though, was an entirely different story. Social media's treatment of Lauer was brutal and by the next morning, instead of a disparaging narrative about Clinton's body language/facial expression/whatever, everyone had converged on this.





Along similar lines, there are still numerous players in mainstream journalism trying to play by 2000 rules only to find themselves in a very lonely place. The best example is, of course, the New York Times, which was clearly the leader of the pack sixteen years ago with the Washington Post and the rest of the press corps following in lockstep. The NYT's coverage today looks very much like it did then, but the response could hardly be more different. The same sort of work that once merited praise and respect now prompts derision and punchline status.

Josh Marshall has a good thumbnail of the situation:
We've had a number of looks recently at how The New York Times appears to be revisiting its 'whitewater' glory days with its increasingly parodic coverage of the "Clinton Foundation" - I'm adding scare quotes to match the dramatic effect, even though of course the Clinton Foundation is a named legal entity. Beyond the 'clouds' and 'shadows' TPM Reader AR flagged to our attention, as Paul Glastris explains here, the latest installment from the Times explains how Bill Clinton's request for diplomatic passports for aides accompanying him on a mission to secure the release of two US journalists held captive in North Korea constitutes the latest damning revelations about the corrupt ties between the Foundation and the Clinton State Department.

The Times uniquely, though only as a leading example for the rest of the national press, has a decades' long history of being lead around by rightwing opposition researchers into dead ends which amount to journalistic comedy - especially when it comes to the Clintons. But here, while all this is happening we have a real live specimen example of direct political and prosecutorial corruption, misuse of a 501c3 nonprofit and various efforts to conceal this corruption and the underlying corruption of Trump's 'Trump University' real estate seminar scam. It's all there - lightly reported here and there - but largely ignored.

The core information here isn't new and it's definitely not based on my reporting. Much of it stems form the on-going and seemingly indefatigable work of Washington Post reporter David A. Fahrenthold who's been chronicling Trump's long list of non-existent or promised but non-existent charitable contributions. In this case, it goes to a $25,000 contribution Trump made to the reelection campaign of Florida Attorney General Pam Bondi in 2013. The neglected story has only popped up again now because Trump was penalized by the IRS for a relatively technical part of the corrupt act.


It's that part about the Washington Post that has to sting the most. Throughout Whitewater and the 2000 election, the two papers functioned as a tag team. Now, rather than lending support to the besieged New York Times, the Washington Post is leading the assault both through declaration and example.

As the inimitable Mr. Pierce recently put it,

Washington Post declares war on New York Times

We've opened up a lot of questions here, certainly more than can be answered in a single post (this one is already running twice the length I'd intended). For now, though, let's leave it at this:

You have every reason to be concerned about a possible Trump victory. You have every reason to be angry about the role that bad journalism is playing in the process. But if you're having flashbacks of Bush v Gore, you should relax. Things really are different, and I mean that entirely in a good way.

Friday, September 16, 2016

Sugar versus Fat: which is worse for your heart?

This is Joseph.

Obviously this report in JAMA Internal Medicine is alarming.  The full abstract is:
Early warning signals of the coronary heart disease (CHD) risk of sugar (sucrose) emerged in the 1950s. We examined Sugar Research Foundation (SRF) internal documents, historical reports, and statements relevant to early debates about the dietary causes of CHD and assembled findings chronologically into a narrative case study. The SRF sponsored its first CHD research project in 1965, a literature review published in the New England Journal of Medicine, which singled out fat and cholesterol as the dietary causes of CHD and downplayed evidence that sucrose consumption was also a risk factor. The SRF set the review’s objective, contributed articles for inclusion, and received drafts. The SRF’s funding and role was not disclosed. Together with other recent analyses of sugar industry documents, our findings suggest the industry sponsored a research program in the 1960s and 1970s that successfully cast doubt about the hazards of sucrose while promoting fat as the dietary culprit in CHD. Policymaking committees should consider giving less weight to food industry–funded studies and include mechanistic and animal studies as well as studies appraising the effect of added sugars on multiple CHD biomarkers and disease development.
I think that this brings up two issues, related but different.  One is whether the role of sugar in heart disease is important, and I think that the evidence for this is pretty good.  Recent recommendations from the American Heart Association allow for 150 calories of added sugar for men and 100 calories of added sugar for women.  That's a pretty strong indictment of added sugar given that a 500 ml coke probably has > 50 grams of added sugar (that's 200 calories).  So it is good that attention is being paid to this issue.
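The arithmetic behind that comparison is simple enough to sketch. The 4-calories-per-gram figure is the standard conversion for carbohydrates; the variable and function names below are just illustrative:

```python
# Added sugar supplies roughly 4 calories per gram (the standard
# conversion factor for carbohydrates).
CALORIES_PER_GRAM_SUGAR = 4

# AHA daily added-sugar allowances cited above, in calories.
AHA_LIMIT = {"men": 150, "women": 100}

def sugar_calories(grams):
    """Calories contributed by a given mass of added sugar."""
    return grams * CALORIES_PER_GRAM_SUGAR

# A 500 ml soft drink with 50 g of added sugar:
calories = sugar_calories(50)
print(calories)  # 200

# One bottle already exceeds the daily allowance for either group.
print(calories > AHA_LIMIT["men"], calories > AHA_LIMIT["women"])  # True True
```

In other words, a single bottle blows past both daily allowances on its own, which is why the recommendation reads as such a strong indictment of sugary drinks.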

What I am less comfortable about is giving less weight to industry-funded studies.  It's not immediately clear to me that the median industry study isn't genuinely interested in learning about food, and there are a lot of studies for which you need industry partnership (absent a staggering amount of government inspection and interaction).  What I prefer to do is to focus on transparency and replication.  Nobody wants to have a mistake published, and the researchers I work with are fanatical about doing their best.  Do biases enter?  Absolutely.  But this is why we disclose interests, a trend that I have been reassured to see becoming more and more standard in the academic publishing world.

In physics, it has long been a good thing to partner with industry to create a new process or invention, or to learn more about a standard tool (I once worked with concrete).  I think there can be a lot of benefit from these interactions and they allow both sides to leverage strengths.

So instead of removing weight from these studies, I see this as a cautionary tale about how we can do better and should do better.