Friday, July 24, 2015

REPOST -- Maybe the [2012] Republican primary [was] going just as we should [have] expect[ed]

[This article by Sam Wang got me thinking about some posts I've been meaning to write about how most popular poll analyses could use more complex assumptions and about how everyone, including political scientists, might benefit from more orthogonal data. I hit some of these topics four years ago so I thought I'd do a repost. Other than the correction of one typo, I'm leaving everything the way it was despite having some misgivings about the post.

One thing I do want to emphasize is that this is not a serious proposal; I'm just playing around with the idea that the interaction between desirability and perceived electability might explain some of the weirdness we saw in the last Republican Presidential primary (and the batshit craziness that we are, no doubt, about to see).]

I don't mean that in a snarky way. This is a completely non-snide post. I was just thinking about how even a quick little model with a few fairly intuitive assumptions can fit seemingly chaotic data surprisingly well. This probably won't look much like the models political scientists use (they have expertise and real data and reputations to protect). I'm just playing around.

But it can be a useful thought experiment, trying to explain all of the major data points with one fairly simple theory. Compare that to this bit of analysis from Amity Shlaes:
The answer is that this election cycle is different. Voters want someone for president who is ready to sit down and rewrite Social Security in January 2013. And move on to Medicare repair the next month. A policy technician already familiar with the difference between defined benefits and premium supports before he gets to Washington. What voters remember about Newt was that some of his work laid the ground for balancing the budget. He was leaving the speaker's job by the time that happened, but that experience was key.
This theory might explain Gingrich's recent rise but it does a poor job with Bachmann and Perry and an absolutely terrible job with Cain. It's an explanation that covers a fraction of the data. Unfortunately, it's no worse than much of the analysis we've been seeing from professional political reporters and commentators.

Surely we can do better than that.

Let's say that voters assign their support based on which candidate gets the highest score on a formula that looks something like this (assume each term has a coefficient and that those coefficients vary from voter to voter):

Score = Desirability + (Electability × Desirability)

Where desirability is how much you would like to see that candidate as president and electability is roughly analogous to the candidate's perceived likelihood of making it through the primary and the general election.

Now let's make a few relatively defensible assumptions about electability:

electability is more or less a zero-sum game;

it is also something like Keynes' beauty contest, an iterative process with everyone trying to figure out who everyone else is going to pick and throwing their support to the leading acceptable candidate;

desirability tends to be more stable than electability.

I almost added a fourth assumption that electability has momentum, but I think that follows from the iterative aspect.

What can we expect given these assumptions?

For starters, there are two candidates who should post very stable poll numbers, though for very different reasons: Romney and Paul. Romney has consistently been seen as number one in general electability, so GOP voters who find him acceptable will tend strongly to list him as their first choice even if they may not consider him the most desirable. While Romney's support comes mostly from the second term of the formula, Paul's comes almost entirely from the first. Virtually no one sees Paul as the most electable candidate in the field, but his supporters really, really like him.

It's with the rest, though, that the properties of the model start to do some interesting things. Since the most electable candidate is not acceptable to a large segment of the party faithful, perhaps even a majority, a great deal of support is going to go to the number two slot. If there were a clear ranking with a strong second place, this would not be a big deal, but this is a weak field with a relatively small spread in general electability. The result is a primary that's unstable and susceptible to noise.

Think about it this way: let's say the top non-Romney has a twelve percent perceived chance of getting to the White House, the second has eleven and the third has ten. Any number of trivial things can cause a three point shift which can easily cause first and third to exchange places. Suddenly the candidate who was polling at seven is breaking thirty and the pundits are scrambling to come up with an explanation that doesn't sound quite so much like guessing.
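If you want to see these assumptions in action, here's a minimal simulation sketch. To be clear, everything in it is invented for illustration: the candidates, the voter blocs, the coefficients, and the update rule are my stand-ins, not estimates from any actual polling data. It just wires together the score formula, the zero-sum constraint, and the beauty-contest iteration described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical field (all numbers invented): a Romney-like front-runner,
# a Paul-like candidate with a small devoted base, and three
# interchangeable non-Romneys bunched together in electability.
names        = ["Romney", "Paul", "NonRom1", "NonRom2", "NonRom3"]
electability = np.array([0.30, 0.05, 0.12, 0.11, 0.10])  # perceived P(win)

n_voters = 10_000
desirability = rng.uniform(0, 1, size=(n_voters, 5))

# Paul is polarizing: maximal desirability for a small bloc, low otherwise.
paul_bloc = rng.random(n_voters) < 0.10
desirability[paul_bloc, 1] = 1.0
desirability[~paul_bloc, 1] = 0.1

# A majority of the party finds the most electable candidate unacceptable.
anti_romney = rng.random(n_voters) < 0.55
desirability[anti_romney, 0] = 0.0

def first_choice_shares(elect):
    """Score = D + E*D per voter; return the share naming each candidate first."""
    scores = desirability + elect * desirability  # broadcasts E across voters
    picks = scores.argmax(axis=1)
    return np.bincount(picks, minlength=5) / n_voters

for week in range(8):
    shares = first_choice_shares(electability)
    print(week, dict(zip(names, shares.round(3))))
    # Keynes' beauty contest: perceived electability drifts toward observed
    # support, picks up a little noise, and is renormalized (zero-sum).
    electability = 0.7 * electability + 0.3 * shares
    electability += rng.normal(0, 0.015, size=5)
    electability = np.clip(electability, 0.01, None)
    electability /= electability.sum()
```

Run it a few times with different seeds and you should see something like the pattern described above: Romney and Paul hold fairly steady while the interchangeable non-Romneys take turns surging and collapsing on nothing but noise.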

What the zero-sum property and convergence can't explain, momentum does a pretty good job with. Take Perry. He came in at the last minute, seemingly had the election sewn up, then dropped like a stone. Conventional wisdom usually ascribes this to bad debate performances and an unpopular stand on immigration, but primary voters are traditionally pretty forgiving toward bad debates (remember Bush's Dean Acheson moment?), and most of the people who strongly disagreed with Perry's immigration stand already knew about it.

How about this for another explanation? Like most late entries, Perry was a Rorschach candidate, and, as the blanks were filled in, his standing dropped. The result was a downward momentum which Perry accelerated with a series of small but badly timed missteps. Viewed in this context, the immigration statement takes on an entirely different significance. It didn't have to lower Perry's desirability in order to hurt him in the polls; instead, it could have hurt his perceived electability by reminding people who weren't following immigration that closely that Perry had taken positions other Republicans would object to.

Of course, showing how a model might possibly explain something doesn't prove anything, but it can make for an interesting thought experiment and it does, I hope, at least make a few points, like:

1. Sometimes a simple model can account for some complex and chaotic behavior;

2. Model structure matters. D + ED gives completely different results than D + E (see the sketch after this list);

3. Things like momentum, zero-sum constraints, convergence, and shifting to and from ordinal data can have some surprising implications, particularly when your data hits some new extreme.
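On point 2, a tiny numerical sketch (the numbers are mine, purely illustrative) shows why the interaction term matters: under D + E, a sufficiently electable candidate can win over a voter who doesn't want him at all, while under D + ED, electability counts for nothing once desirability hits zero, which is exactly what keeps the anti-Romney vote from consolidating behind the front-runner.

```python
import numpy as np

# One voter, two hypothetical candidates (numbers invented):
# candidate 0 is unacceptable to this voter (D = 0) but widely seen as
# electable; candidate 1 is liked but viewed as a long shot.
D = np.array([0.0, 0.6])  # desirability to this voter
E = np.array([0.9, 0.1])  # perceived electability

print("D + E  :", D + E)      # [0.9 0.7] -> the unacceptable front-runner
                              # wins this voter on electability alone
print("D + ED :", D + E * D)  # [0.   0.66] -> electability is worth nothing
                              # for a candidate the voter won't accept
```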

[For a look at a real analysis of what's driving the poll numbers, you know where to go.]

Thursday, July 23, 2015

The Apple Tax

From the Onion: Al Franken and the FTC are investigating the so-called “Apple Tax” for rival streaming services
In a sentence that would make frighteningly little sense to someone who fell into a coma in 1995 and just awakened today, [As a side note, if I were writing for that publication, I don't think I'd open with a "things were sure different twenty years ago" gag. As a friend of mine mentioned in a conversation recently, twenty years ago, the Onion was the place to go for smart, fresh humor writing while Cracked was a tired magazine your father used to read. -- MP] Saturday Night Live-writer-turned-senator Al Franken has called on the Federal Trade Commission and the Justice Department to investigate whether successful computer manufacturer and music provider Apple may have engaged in anti-competitive behavior against rival music streaming services like Spotify or Rdio.

...
The crux of the investigation comes down to the multi-faceted relationship between Apple and the streaming services it both supports and competes against. As the proprietor of iOS’s App Store, the company has a huge amount of control over those streamers’ access to their consumer base, many of whom use their iPhones to play music while on the go. But with the advent of the company’s own Apple Music service, Apple is now in direct competition with those same companies, who it assigns a 30 percent surcharge to operate in the Store.

...
The company was previously suggested to have manipulated music licensees into dropping out of Spotify’s free streaming service, a practice that also invited investigation from the FTC.
It's too late to go into a big discussion of anti-trust and vertical integration and monopsony and all that jazz (or, more accurately, too late for me to read through all of the Wikipedia pages on anti-trust and vertical integration and monopsony so I can sound knowledgeable about all that jazz), so I'll leave it to the readers to draw their own conclusions about the concentration of economic power in media and finish up with this clip from College Humor.

[I assume by this point everyone knows these aren't safe for work.]


Wednesday, July 22, 2015

Let's all take a moment to close our eyes and picture ourselves desecrating the grave of William Proxmire

It hit me the other day that, while I frequently go after Republicans for taking cheap shots at science and research for personal and political gain, I don't think I have ever mentioned that the politician who perfected the art was a Democrat. So the next time that John McCain and Maureen Dowd go all giggly over agricultural research, we should all take a moment and thank William Proxmire for getting things started.
Golden Fleece Award

Proxmire was noted for issuing his Golden Fleece Award,[5] which was presented monthly between 1975 and 1988, in order to focus media attention on projects Proxmire viewed as self-serving and wasteful of taxpayer dollars.[1] The first Golden Fleece Award was awarded in 1975 to the National Science Foundation, for funding an $84,000 study on why people fall in love.[1] Other Golden Fleece awards over the years were awarded to the Justice Department for conducting a study on why prisoners wanted to get out of jail, the National Institute of Mental Health to study a Peruvian brothel ("The researchers said they made repeated visits in the interests of accuracy," reported The New York Times), and the Federal Aviation Administration, for studying "the physical measurements of 432 airline stewardesses, paying special attention to the 'length of the buttocks.'"[1]

Proxmire's critics claimed that some of his awards went to basic science projects that led to important breakthroughs. In some circles his name has become a verb, meaning to unfairly obstruct scientific research for political gain, as in "the project has been proxmired". In 1987, Stewart Brand accused Proxmire of recklessly attacking legitimate research for the crass purpose of furthering his own political career, with gross indifference as to whether his assertions were true or false as well as the long-term effects on American science and technology policy.[13] Proxmire later apologized for several cancelled projects, including SETI.

One winner of the Golden Fleece Award, Ronald Hutchinson, sued Proxmire for defamation in 1976. Proxmire claimed that his statements about Hutchinson's research were protected by the Speech or Debate Clause of the U.S. Constitution. The U.S. Supreme Court ruled that that clause does not immunize members of Congress from liability for defamatory statements made outside of formal congressional proceedings (Hutchinson v. Proxmire, 443 U.S. 111 (1979)). The case was eventually settled out of court.[14]
If you read some of the descriptions of the awards, it becomes obvious that, like McCain and Dowd after him, Proxmire didn't care about the potential of the research or even the magnitude of the waste; all that mattered was whether or not he could frame the project in a way that made it sound silly. Here's my favorite example: "He gave the award to a study of the sex life of the screw-worm fly. The results were used to create sterile screw-worms that were released into the wild and eliminated this major cattle parasite from the US, reducing the cost of beef across the globe."

Proxmire wasn't even an ethical whore. When a project he'd mocked became too popular (such as SETI), his principled opposition suddenly vanished. He was relentlessly and transparently self-serving, but he was able to get away with it because there were plenty of reporters willing to print a good story even if it wasn't actually true.

In a sense, he is still getting away with it. The ongoing war on data owes a great deal to the late senator.

Tuesday, July 21, 2015

"The Fallen of World War II"

I like this video a lot, both as an example of visualizing data and for the way it tells its story. It also brings up a question I've wondered about over the years but lacked the historical background to answer:

How much of the post-war Russian national character, the nationalism and the dogmatism, can be traced back to the shared trauma that the Russian people went through in the Second World War?

Monday, July 20, 2015

Let's see how many people I can piss off with this one: Fox News is not all that conservative

Feel free to post angry comments but please make sure to read a few paragraphs first. What follows is by no stretch of the imagination a defense of Fox News; rather it is an appeal for more precise language when we discuss it.

In his recent paper on Fox News, Bruce Bartlett made an important distinction between ideological and partisan. These two concepts, while closely related, are quite different, yet people conflate them all the time; as a result, most discussions of press bias don't make a lot of sense.
Political scientist Jonathan Bernstein: “It’s a real mistake to call Fox a conservative channel. It’s not. It’s a partisan channel…. To begin with, bluntly, Fox is part of the Republican Party. American political parties are made up of both formal organizations (such as the RNC) and informal networks. Fox News Channel, then, is properly understood as part of the expanded Republican Party.”
Ideologues support positions that align most closely with their belief system. Partisans support positions that they see as furthering the interest of their party. I'd argue that when we talk about "liberals" in the media we are almost always referring to ideological positions while when we refer to "conservatives" in the media we are generally referring to partisan positions. The Tea Party muddies the question somewhat but we're going to put that aside for the moment.

I realize there is a lot of gray area here, but, just as a thought experiment, try thinking about Fox News stories in relation to three continuous variables:

Emphasis;

Ideology;

Partisanship.

If you tune in regularly to Fox News, you will see a lot of stories with significant partisan and ideological components, like marriage equality (which, though a losing issue nationwide, is still useful for energizing the base). You will also see a lot of stories, like Benghazi, with little apparent ideological component but a huge partisan one. What you will very seldom see is a story in heavy rotation without a partisan component.

This Ideology vs. Partisanship distinction is particularly notable when a relatively conservative idea is adopted by a Democratic president and suddenly becomes unacceptable. In 2008, you could see conservative pundits talking up Mitt Romney and listing his healthcare plan as a major selling point.

Coming from the Bible Belt (where Fox is enormously influential), I can point to a few other examples that strike me as particularly dramatic. Historically, there are few things that evangelicals hate more than Mormonism, Catholicism and the standard celebration of Christmas.

[Courtesy of Joe Bob Briggs]

From a partisan standpoint, there are huge advantages to building denominational unity and to using Santa and Rudolph to attack "political correctness," and that is consistently the approach Fox and conservative media in general have taken despite the ideological concerns of the audience. [There's another big story here about the way the center of power shifted in the conservative movement, but that's a tale for another campfire.]

It is easy to conflate ideology and partisanship -- they often overlap and there is a great deal of collinearity -- but confusing them can lead to bad analysis, particularly when discussing journalistic bias and balance.

Sunday, July 19, 2015

Euro-area thought of the day

This is Joseph.

When even Greg Mankiw has decided that austerity is probably not the best way forward (he suggests that it would be wise to show "mercy"), then the morality narrative has probably reached its logical limits. It is also worth noting that the more painful this experience ends up being for Greece, the more likely it is that the Euro group has maximized its size and can only decline from here.

Because who would want to risk ending up like Greece?

Saturday, July 18, 2015

Update on the Washington Post piece

Jill Diniz (Director of Eureka Math/Great Minds) has a response to the WP post. It's very much from the MBA damage-control playbook -- dismiss the problem as minimal, insist everything is OK now, ignore the remaining problems, shift the conversation. I'll get to the rest later.

Before I get to the full reply, though, I do want to take a look at her first paragraph [emphasis added]:
The missing parentheses noted by the blogger, when introducing the concept of raising a negative number to a positive integer, was caused by converting the online curriculum to PDFs. This has been corrected. A benefit of open educational resources, such as Eureka Math, is they are easier than traditional instructional resources to improve upon quickly.  

But I'm still seeing this when I download the PDF:

Here's a crop of that screen capture.

Apparently, my plans to retire the Eureka thread were premature.
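For anyone wondering why a missing pair of parentheses rates this much attention: when you raise a negative number to a power, the parentheses are what distinguish squaring the number from negating the square. A quick illustration (my numbers, not Eureka's):

```python
# Exponentiation binds tighter than unary minus, in standard mathematical
# notation (and in most programming languages), so dropping the
# parentheses silently changes the expression's meaning:
print((-2) ** 2)  #  4: the square of negative two
print(-2 ** 2)    # -4: parsed as -(2**2)
```

In a lesson that is introducing the convention of raising a negative number to a positive integer, that's precisely the distinction students are supposed to be learning.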

Friday, July 17, 2015

Why Eureka (and implementation in general) belongs in the Common Core debate

Clyde Schechter had an extended reply to a recent post.
Let's follow the lead of the mathematicians here and first be clear about our definitions. Common Core is a set of standards: it is a list of behaviors that students are supposed to achieve at each grade level. And it is the intention that those who attain those standards will, at the end of high school, be prepared for college or for certain non-college-degree-requiring careers.

That is quite a separate matter from issues of textbooks and teacher-training. These are key for successful implementation of the Common Core standards, and I do not deny the importance of these things. But unless you want to argue (and perhaps you do) that the Common Core standards are inherently impossible to implement, you cannot rationally attack the standards by criticizing specific textbooks, or even the present lack of any adequate textbooks.

I think it would be helpful to your readers if you would make it clear whether you disagree in any substantial way with the Common Core Math Standards. I personally have read them and they strike me as quite appropriate. Do you agree or not? If not, what are your concerns?

Then if you want to blog about the inadequacies of Eureka math or other textbooks, do so--but don't cast it as a problem with the Common Core standards. My own daughter is learning math under the new Common Core standards--and, in plain English, her textbook sucks! So I'm with you on this.

But let's be clear what we're talking about: the standards themselves, the implementation of the standards in the classroom, or the assessments of achievement of those standards, or the utilization of those assessments to evaluate students, teachers, and schools. These are all separate issues and nobody is truly served by conflating them.
I'd take the opposite position: the issues involving the standards and those involving implementation are so tightly intertwined that they can and should be discussed as a unit.

1. Virtually no one discusses Common Core in narrowly defined terms. Not Wu. Not Coleman. Nobody. This is largely because the standards have no direct impact on the students. Their effect is felt only through their influence on curriculum and assessment. Pretty much everything you've read about the impact of Common Core was defining the initiative broadly. (Add to this the fact that, to anyone but another math teacher, actual math standards are as boring as dirt.)

2. Nor does treating the standards and their implementation separately make sense from an institutional point of view. Many of the same people and processes are behind both, and all phases were presumably approached with an eye to what would come next. This is yet another reason for treating the standards, the lessons and the tests as an integrated unit.

3. If we are going to consider implementation when discussing Common Core, we will have to talk about Eureka Math. Not only is it held up as the gold standard by supporters; its success and wide acceptance make it the default template for other publishers. Barring big changes, this is the form Common Core is likely to take in the classroom.

4. All of this leaves open the hypothetical question: how much of the Eureka debacle could've been avoided had someone else handled this stage of the implementation of the Core math standards? The big problem with that question is that the education reform establishment still sees Eureka as a great success. That indicates a systemic failure. Unless you could find someone with sufficient distance from the establishment, I don't see any potential for a better outcome.

5. Finally, speed kills. The backers of Common Core have pushed a narrative of urgency and dire consequences so hard for so long that I am sure they now believe it themselves. The result is a hurried and unrealistic timeline that is certain to be massively expensive and to generate tons of avoidable errors, particularly when combined with processes that lack adequate mechanisms for self-correction and a culture that tends to dismiss external criticism. On the whole, my impression is that the Common Core standards are generally slightly worse than the system of state standards and de facto national standards which they are replacing, but the difference, frankly, is not that great. However, even if the standards represented a big step forward, that would not justify implementing them at a breakneck speed that all but guarantees shoddy work.

And as a footnote, the phrase "And it is the intention that those who attain those standards will, at the end of high school, be prepared for college or for certain non-college-degree-requiring careers" is deeply problematic on at least two levels:

First, we already have standards in place with basically these same objectives and which aren't all that different from CCS (the fact that we still have an unacceptable number of unprepared students is just another reminder of the limited impact of standards). If we were just interested in improving college and career readiness, it would be far easier and cheaper to simply tweak what we have (cover this earlier, spend more time on this, raise the test cut-off for this);

Second, we don't see this because these reforms are about more. In a classic case of not letting a crisis go to waste, Coleman et al. are looking to make sweeping administrative and pedagogical changes to the educational system, and while I'm sure that they believe those changes will improve readiness, that's not the focus. If this were just a get-kids-through-college conversation, we would not be talking about mathematical formalism and close reading.

Thursday, July 16, 2015

Godzilla vs. Rodan -- digital media edition

When giant, hideous monsters clash, it's difficult deciding who to root for.

Questions of team loyalty aside, this Slate article by Will Oremus raises interesting questions about attitudes toward and incentives for copyright infringement.
Last year on his podcast Hello Internet, the Australian filmmaker Brady Haran coined the term freebooting to describe the act of taking someone’s YouTube video and re-uploading it on a different platform for your own benefit....

Unlike sea pirates, Facebook freebooters don’t directly profit from their plundering. That’s because, unlike YouTube, Facebook doesn’t run commercials before its native videos—not yet, at least. That’s part of why they spread like wildfire. What the freebooter gains is attention, whether in the form of likes, shares, or new followers for its Facebook page. That can be valuable, sure, especially for brands and media outlets. But it might seem like a relatively small booty compared with the legal risk involved. Sandlin’s lawyer, Stephen Heninger, told me he believes Facebook freebooting amounts to copyright infringement, though he also said the phenomenon is new enough that the legal precedent is limited.
...
Freebooting, to be clear, is not the same as simply sharing a link to someone’s YouTube video on Facebook. When you do that, Facebook embeds the YouTube video, and all the views—and advertising revenues—are properly credited to its original publisher. No one has a problem with that, including Sandlin. It’s how the system is supposed to work.

But it doesn’t work that way anymore—not well, anyway. That’s because, over the past year, Facebook has decided it’s no longer content to be a venue for sharing links to articles and videos found elsewhere on the Internet. Facebook now wants to host the content itself—and, in so doing, control the advertising revenue that flows from it....

To that end, Facebook has built its own video platform and given it a decisive home-field advantage in the News Feed. Share a YouTube video on Facebook, and it will appear in your friends’ feeds as a small, static preview image with a “play” button on it—that is, if it appears in your friends’ News Feeds at all. Those who do see it will be hesitant to click on it, because they know it’s likely to be preceded by an ad. But take that same video and upload it directly to Facebook, and it will appear in your friends’ feeds as a full-size video that starts playing automatically as they scroll past it. (That’s less annoying than it sounds.) Oh, and it will appear in a lot of your friends’ feeds. Anecdotal evidence—and guidance from Facebook itself—suggests native videos perform orders of magnitude better on Facebook than those shared from other platforms.

Facebook’s video push has produced stunning results. In September, the company announced that its users were watching 1 billion videos a day on the social network. By April, that number had quadrupled to 4 billion. An in-depth Fortune story in June on “Facebook’s Video-Traffic Explosion” reported that publishers such as BuzzFeed have seen their Facebook video views grow tenfold in the past year. One caveat is that a view of a Facebook video might not mean quite the same thing as a view of a YouTube video, because Facebook videos play in your feed whether you click on them or not.
That caveat might be worth a post of its own on apples-to-oranges data comparisons. Maybe next time.

Wednesday, July 15, 2015

On the plus side, a holographic instant replay machine would be really cool

Not to put too fine a point on it, but this is a story of profitable businesses, operating under a monopoly and owned by the fantastically rich, taking billions of dollars of taxpayer money. This ties in with all sorts of our ongoing threads.

(Not to mention the fact that some of that money eventually goes to this guy.)

Over at the Monkey Cage...

I wrap up the Eureka Math thread.

Tuesday, July 14, 2015

Sentence of the day: constructive criticism

This is Joseph.

Mark Evanier:

He strikes a chord with me when he writes, "In life, what matters most isn't how a decision compares to your ideal outcome. It's how it compares to the alternative at hand."

I'm a big believer in that. Increasingly as I get older, I get annoyed by harsh criticisms that are unaccompanied by alternatives. It's fine to say, "I don't think this will work but I don't have anything better to offer at the moment." It's not fine, at least with me, to say, "This idea stinks and it will be an utter and total disaster and whoever thought of it is a moron…" and then to not have at least some of a better plan to offer in its stead. Or to offer an impossible, impractical alternative. Anyone can say, "That sucks."
I rather like this point, because it really does run through a lot of themes on this blog. When I am actively blogging, I often find that many of my topics don't consider what the alternative to the current policy would be. So they note that something is inefficient. But if you can't come up with a good alternative (one that is scalable), then it isn't all that exciting to point out that there are a lot of limitations in life and much that is not perfect.

P.S. Anyone have any idea if Evanier is Evan-yah (French) or Evan-yer (English)? 

From the ashes of New Math

[Previously posted at the teaching blog]

One of my big concerns with the education reform debate, particularly as it regards mathematics, is that a great deal of the debate consists of words being thrown around that have a positive emotional connotation but which are either vague or, worse yet, mean different things to different participants in the discussion.

As a result, you have a large number of "supporters" of Common Core who are, in fact, promoting entirely different agendas and probably not realizing it (you might be able to say the same about Common Core opponents but, by nature, opposition is better able to handle a lack of coherence). I strongly suspect this is one of the causes behind the many problems we've seen in Eureka Math and related programs. The various contributors were working from different and incompatible blueprints.

There's been a great deal of talk about improving mathematics education, raising standards, teaching problem-solving, and being more rigorous. All of this certainly sounds wonderful, but it is also undeniably vague. When you drill down, you learn that different supporters are using the same words in radically different senses.

For David Coleman and most of the non-content specialists, these words mean that all kids graduating high school should be college and career-ready, especially when it comes to the STEM fields, which are seen as being essential to future economic growth.

(We should probably stop here and make a distinction between STEM and STEAM – science, technology, engineering, and applied mathematics. Coleman and company are definitely talking about STEAM.)

Professor Wu (and, I suspect, many of the other mathematicians who have joined the initiative) is defining rigor much more rigorously. For him, the objective is to teach mathematics in a pure form, an axiomatic system where theorems build upon theorems using rules of formal logic. This is not the kind of math class that most engineers advocate; rather, it is the kind of math class that most engineers complain about. (Professor Wu is definitely not a STEAM guy.)

In the following list, taken from this essay by Professor Wu, you can get a feel for just how different his philosophy is from David Coleman's. The real tip-off is part 3. The suggestion that every formula or algorithm be logically derived before it can be used has huge implications, particularly as we move into more applied topics. (Who here remembers calculus? Okay, and who here remembers how to prove the fundamental theorem of calculus?)

All of Professor Wu's arguments are familiar to anyone who has studied the history of New Math in the 60s. There is no noticeable daylight between the two approaches.

I don't necessarily mean this as a pejorative. Lots of smart people thought that New Math was a good idea in the late 50s and early 60s; I'm sure that quite a few smart people still think so today. I personally think it's a very bad idea, but that's a topic for another post. For now, though, the more immediate priority is just to understand exactly what we're arguing about.
The Fundamental Principles of Mathematics

I believe there are five interrelated, fundamental principles of mathematics. They are routinely violated in school textbooks and in the math education literature, so teachers have to be aware of them to teach well.

1.  Every concept is precisely defined, and definitions furnish the basis for logical deductions. At the moment, the neglect of definitions in school mathematics has reached the point at which many teachers no longer know the difference between a definition and a theorem. The general perception among the hundreds of teachers I have worked with is that a definition is “one more thing to memorize.” Many bread-and-butter concepts of K–12 mathematics are not correctly defined or, if defined, are not put to use as integral parts of reasoning. These include number, rational number (in middle school), decimal (as a fraction in upper elementary school), ordering of fractions, product of fractions, division of fractions, length-area-volume (for different grade levels), slope of a line, half-plane of a line, equation, graph of an equation, inequality between functions, rational exponents of a positive number, polygon, congruence, similarity, parabola, inverse function, and polynomial.

2.  Mathematical statements are precise. At any moment, it is clear what is known and what is not known. There are too many places in school mathematics in which textbooks and other education materials fudge the boundary between what is true and what is not. Often a heuristic argument is conflated with correct logical reasoning. For example, the identity √a√b = √ab for positive numbers a and b is often explained by assigning a few specific values to a and b and then checking for these values with a calculator. Such an approach is a poor substitute for mathematics because it leaves open the possibility that there are other values for a and b for which the identity is not true.

3.  Every assertion can be backed by logical reasoning. Reasoning is the lifeblood of mathematics and the platform that launches problem solving. For example, the rules of place value are logical consequences of the way we choose to count. By choosing to use 10 symbols (i.e., 0 to 9), we are forced to use no more than one position (place) to be able to count to large numbers. Given the too frequent absence of reasoning in school mathematics, how can we ask students to solve problems if teachers have not been prepared to engage students in logical reasoning on a consistent basis?

4.  Mathematics is coherent; it is a tapestry in which all the concepts and skills are logically interwoven to form a single piece. The professional development of math teachers usually emphasizes either procedures (in days of yore) or intuition (in modern times), but not the coherent structure of mathematics. This may be the one aspect of mathematics that most teachers (and, dare I say, also math education professors) find most elusive. For instance, the lack of awareness of the coherence of the number systems in K–12 (whole numbers, integers, fractions, rational numbers, real numbers, and complex numbers) may account for teaching fractions as “different from” whole numbers such that the learning of fractions becomes almost divorced from the learning of whole numbers. Likewise, the resistance that some math educators (and therefore teachers) have to explicitly teaching children the standard algorithms may arise from not knowing the coherent structure that underlies these algorithms: the essence of all four standard algorithms is the reduction of any whole number computation to the computation of single-digit numbers.

5.  Mathematics is goal oriented, and every concept or skill has a purpose. Teachers who recognize the purposefulness of mathematics gain an extra tool to make their lessons more compelling. For example, when students see the technique of completing the square merely as a trick to get the quadratic formula, rather than as the central idea underlying the study of quadratic functions, their understanding of the technique is superficial. Mathematics is a collection of interconnecting chains in which each concept or skill appears as a link in a chain, so that each concept or skill serves the purpose of supporting another one down the line. Students should get to see for themselves that the mathematics curriculum moves forward with a purpose.
At the risk of putting too fine a point on it, this approach tends to produce extremely formal and dense prose, such as the following (from a company Professor Wu was involved with):
Dilation: A transformation of the plane with center O and scale factor r (r > 0). If D(O) = O and if P ≠ O, then the point D(P), to be denoted by Q, is the point on the ray OP so that |OQ| = r|OP|. If the scale factor r ≠ 1, then a dilation in the coordinate plane is a transformation that shrinks or magnifies a figure by multiplying each coordinate of the figure by the scale factor.

Congruence: A finite composition of basic rigid motions—reflections, rotations, translations—of the plane. Two figures in a plane are congruent if there is a congruence that maps one figure onto the other figure.

Similar: Two figures in the plane are similar if a similarity transformation exists, taking one figure to the other.

Similarity Transformation: A similarity transformation, or similarity, is a composition of a finite number of basic rigid motions or dilations. The scale factor of a similarity transformation is the product of the scale factors of the dilations in the composition; if there are no dilations in the composition, the scale factor is defined to be 1.

Similarity: A similarity is an example of a transformation.

Sentence of the day: Greece edition

This is Joseph.

The recent Eurozone stuff requires a bit more blogging than I am prepared for.  But I think that this comment from Ezra Klein puts in perspective just how wrong it all went:
Syriza's strategy, insofar as there was one, uncovered a method of failing that was much more complete and all-encompassing than anyone had thought possible at the start of the process.
The reason that this is bad news is that the European Union has been sold as a partnership. In a partnership, it is actually bad for one side to lose very, very badly in negotiations. Not because the side that won will not be objectively better off, but because a partnership requires mutual benefit, and so a bad deal undermines the strength of the partnership.

Monday, July 13, 2015

Opposite day at the Common Core debate

[Previously posted at the teaching blog]

I recently came across this defense of Common Core by two Berkeley mathematicians, Edward Frenkel and Hung-Hsi Wu. Both are sharp and highly respected and when you hear about serious mathematicians supporting the initiative, there's a good chance these two names will be on the list that follows.

Except they don't support it. They support something they call Common Core, but what they describe is radically different from what the people behind the program are talking about. The disconnect is truly amazing. Wu and Frenkel's description of Common Core doesn't just disagree with the one used by David Coleman and pretty much everyone else involved with the enterprise; it openly contradicts it.

The case that Coleman made to Bill Gates, and has stuck with since then, is that "academic standards varied so wildly between states that high school diplomas had lost all meaning". Furthermore, Coleman argued that having a uniform set of national standards would allow us to use a powerful set of administrative tools. We could create metrics, track progress, set up incentive systems, and generally tackle the problem like management consultants.

Compare that to this excerpt from Wu and Frenkel's essay [emphasis added]:
Before the CCSSM were adopted, we already had a de facto national curriculum in math because the same collection of textbooks was (and still is) widely used across the country. The deficiencies of this de facto national curriculum of "Textbook School Mathematics" are staggering. The CCSSM were developed precisely to eliminate those deficiencies, but for CCSSM to come to life we must have new textbooks written in accordance with CCSSM. So far, this has not happened and, unfortunately, the system is set up in such a way that the private companies writing textbooks have more incentive to preserve the existing status quo maximizing their market share than to get their math right. The big elephant in the room is that as of today, less than a year before the CCSSM are to be fully implemented, we still have no viable textbooks to use for teaching mathematics according to CCSSM!

The situation is further aggravated by the rush to implement CCSSM in student assessment. A case in point is the recent fiasco in New York State, which does not yet have a solid program for teaching CCSSM, but decided to test students according to CCSSM anyway. The result: students failed miserably. One of the teachers wrote to us about her regrets that "the kids were not taught Common Core" and that it was "tragic" how low their scores were. How could it be otherwise? Why are we testing students on material they haven't been taught? Of course, it is much easier and more fun, in lieu of writing good CCSSM textbooks, to make up CCSSM tests and then pat each other on the back and wave a big banner: "We have implemented Common Core -- Mission accomplished." But no one benefits from this. Are we competing to create a Potemkin village, or do we actually care about the welfare of the next generation? What happened in New York State will happen next year across the country if we don't get our act together.

[As a side remark, we note that even in the best of circumstances, it's a big question how to effectively test students in math on a large scale. Developing such tests is an art form still waiting to be perfected, and in any case, it's not clear how accurately students' scores on these tests can reflect students' learning. Unfortunately, our national obsession with the test scores has forced teachers to teach to the test rather than teach the material for learning. While we consider some form of standardized assessment to be necessary (just as driver's license tests are necessary), we deplore this obsession. It is time to put the emphasis back on student learning inside the classroom.]

These misguided practices give a bad name to CCSSM, which is being exploited by the standards' opponents. They misinform the public by equating CCSSM with ill-fated assessments, such as the one in New York State, when in fact the problem is caused mostly by the disconnect between the current Textbook School Mathematics and CCSSM. It is for this reason that having the CCSSM is crucial, because this is what will ensure that students are taught correct mathematics rather than the deficient and obsolete Textbook School Mathematics.

It is possible and necessary to create mathematics textbooks that do better than Textbook School Mathematics. One such effort by commoncore.org holds promise: its Eureka Math series will make online courses in K-12 math available at a modest cost. The series will be completed sometime in 2014. [Full disclosure: one of us is an author of the 8th grade textbook in that series.]
The authors have contradicted both major components of Coleman's argument. They insist that we already have a relatively consistent national system of mathematics standards and furthermore they question the reliability of the metrics which Coleman's entire system is based upon.

How can proponents of Common Core hold such mutually exclusive views and yet be largely unaware of the contradictions?

I suspect it is some combination of poor communication and wishful thinking on both sides. As spelled out in this essay by Wu, the authors desperately want to see mathematics education returned to some kind of Euclidean ideal: a rigorous axiomatic approach where all lessons start with precise definitions and proceed through a series of logical deductions. They have convinced themselves that the rest of the Common Core establishment is in sympathy with them, just as they have convinced themselves that the lessons being produced by Eureka Math are rigorous and accurate.