Thursday, December 13, 2018

At last, a political scientist protagonist

From This American Life:

Ben Calhoun

This election cycle, it wasn't strange for voters to have to wait for races to be called. Seems like there were so many squeakers. Among the squeakiest, still unresolved a month after the election, North Carolina's 9th congressional district. The district is this long stretch of eight counties along the state's southern border. It's so gerrymandered, it looks like a hockey stick.
In that district, a Republican former Baptist pastor named Mark Harris narrowly beat his Democratic opponent. The Democrat was this Boy Scouty, former Marine named Dan McCready. The margin of victory in that race-- 905 votes-- crazy close, but a win.
Until the North Carolina State Board of Elections had a meeting-- the board is four Democrats, four Republicans, one unaffiliated member-- and the board decided in a bipartisan unanimous vote not to approve the results in the ninth congressional district.

Michael Bitzer

That late Tuesday afternoon decision by the board not to certify the ninth really kind of sent shockwaves through the state.

Ben Calhoun

This is Michael Bitzer, poli-sci professor at Catawba College in North Carolina.

Michael Bitzer

To say, this is something that looks pretty serious.

Ben Calhoun

Trouble in River City.

Michael Bitzer

Yes.

Ben Calhoun

Bitzer says he can't remember this ever happening before. It turns out, behind this bipartisan emergency brake-pulling-- voter fraud allegations, specifically funny business with mail-in absentee ballots. So Bitzer did what poli-sci professors do in a crisis like this. He dove into the data, downloaded it from the state. And in it, he saw one thing that didn't look like the others.
One county, Bladen County: only 19% of the people voting by mail were registered Republicans. But among the mail-in ballots, the Republican candidate got 61% of the vote. Mathematically, this just seems super unlikely. He'd have to win all the Republicans, and all the independents, and some Democrats.
Normally, professors quantify how unusual something is in statistics-- standard deviation and that kind of thing. But I have trouble following that.
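Calhoun's narration is really a small arithmetic argument, and it can be sketched in a few lines of Python. Only the 19% registration figure and the 61% vote share come from the story; the split of the remaining mail-in voters between unaffiliated and Democratic registrants is a hypothetical illustration, not Bladen County's actual breakdown.

```python
# Back-of-the-envelope check of the Bladen County mail-in numbers.
rep_share = 0.19       # registered Republicans among mail-in voters (from the story)
una_share = 0.39       # registered unaffiliated -- assumed for illustration
dem_share = 1 - rep_share - una_share   # remainder registered Democrats

harris_vote = 0.61     # Republican share of the mail-in vote (from the story)

# Best case for the Republican without any Democratic crossover:
# win literally every Republican and every unaffiliated voter.
ceiling_without_dems = rep_share + una_share

# Even then he falls short, so some Democrats must have crossed over.
dem_votes_needed = harris_vote - ceiling_without_dems   # share of all ballots
crossover_rate = dem_votes_needed / dem_share           # share of Democrats needed

print(f"ceiling without Democrats: {ceiling_without_dems:.0%}")
print(f"required Democratic crossover: {crossover_rate:.0%}")
```

Under these assumed numbers, winning every Republican and every unaffiliated voter still only gets him to 58%, so a chunk of registered Democrats would have had to cross over on top of a perfect sweep of everyone else.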

Ben Calhoun

If you were Luke Skywalker in this situation, how big was the disturbance in the force?

Michael Bitzer

Alderaan.

Ben Calhoun

For those slightly less nerdy than Professor Bitzer and myself, that's the planet that gets destroyed by the Death Star.

Ben Calhoun

The destruction of a planet?

Michael Bitzer

Yes. And just eyeballing it, this is not normal.

Ben Calhoun

So Bitzer writes a blog post explaining what he was reading in the data that most people had not. Then it spreads rapidly through the internet. And then around the same time, news starts to trickle in.
There's stories of voters who say there were people coming and telling them to give them their mail-in absentee ballots before they filled them in. And they handed them over, and then they don't know what happened to their ballot.

Wednesday, December 12, 2018

For the name alone...

From Wikipedia:
We Were Promised Jetpacks are a Scottish indie rock band from Edinburgh, formed in 2003. The band consists of Adam Thompson (vocals, guitar), Michael Palmer (guitar), Sean Smith (bass), and Darren Lackie (drums). Stuart McGachan (keyboards, guitar) was a member of the band from 2013 to 2015.








Tuesday, December 11, 2018

"Global, U.S. Smartphone Growth Starts to Decline"

Excellent article on the state of the smartphone market by Jake Swearingen of New York magazine. There's nothing particularly surprising about the story it tells. When first introduced, the iPhone integrated and made fully portable some of the most popular and important technologies of the late 20th and early 21st centuries: PCs and the Internet, GPS, digital cameras, and cell phones. The result was an incredibly appealing product with an abundance of low-hanging fruit for improvements that would keep people rushing out to buy each new model for a while. Inevitably, though, these improvements started approaching the asymptote of discernibility. As the author points out, you often have to hold your old phone and the new, considerably more expensive upgrade side by side to tell the difference. The S-curve levels off, and improvements in functionality start taking second place to reductions in price.
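The S-curve dynamic described above can be sketched with a logistic function. All of the numbers here are illustrative, not market data.

```python
# A logistic curve: the standard model for an adoption S-curve.
import math

def adoption(t, ceiling=1.0, midpoint=5.0, rate=1.0):
    """Cumulative adoption at time t (fraction of the eventual ceiling)."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Year-over-year growth: small at first, peaking near the midpoint,
# then shrinking as the curve approaches its ceiling -- the point
# where each new model's improvements become hard to discern.
for year in range(1, 10):
    growth = adoption(year) - adoption(year - 1)
    print(f"year {year}: +{growth:.2f}")
```

The growth column rises, peaks, and then falls off even though cumulative adoption never declines, which is exactly the pattern the article describes: a maturing market where unit sales growth stalls long before the product stops selling.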

What is surprising about this story is that so many people found it surprising.

But just like every Scorsese movie, the party ends. Smartphone growth began to slow starting in 2013 or 2014. In 2016, it was suddenly in the single digits, and in 2017 global smartphone shipments, for the first time, actually declined — fewer smartphones were sold in 2017 than in 2016.

Every smartphone manufacturer is now facing a world where, at best, they can hope for single-digit growth in smartphone sales — and many seem to be preparing for a world where they face declines.

...

If you’ve bought a flagship phone this year, you likely won’t need to buy a replacement until the next decade. “Most people have more phone than they can handle, or need,” says Gartner senior principal analyst Tuong Nguyen. “It’s similar to what you saw in the PC market for a while — people had really powerful PCs but they barely used them for anything. It’s the same with phones.”

Your smartphone camera is good to great, and you mainly share those photos on social media, where photo quality doesn’t matter much anyway. Barring a few high-end 3-D games or technologies like augmented reality, your processor can handle everything you throw at it, and will for a while. Your screen is bright and sharp, and while there may be slightly better screens out there, you’d only be able to tell by holding the two phones side-by-side. Durability has vastly improved; waterproofing is now standard on smartphones, so a brief dip in the sink or toilet doesn’t mean you need a new phone, and the weakest links in smartphone hardware — batteries, which tend to lose their ability to hold a charge over time, and screens that crack and shatter — have improved.
...



As the market reaches maturity, smartphones are verging on becoming a commodity — a fate the major smartphone manufacturers like Samsung and Apple desperately want to avoid.

“Commoditization is the normal cycle for most products,” says Willy Shih, a professor of management practice at Harvard Business School. “When the first Xerox plain-paper copier came out, they were really cool and Xerox became a fabulously successful company. But then their patents expired, and other companies like Canon came in and introduced low-cost office copiers. Now, copier machines are a dirt-cheap commodity.” Think very hard about your own office — can you name the brand of your office copier?

Or take televisions, another commodity where consumers show little brand loyalty, allowing for upstarts like Vizio, TLC, and Hisense to strip market share away from established players like Sony or Panasonic — which, of course, had displaced established players in television like Magnavox or RCA.

“Once you get driven into the commodity space, you start to think, ‘Oh, I’ve just gotta come up with the next great feature that will cause people to buy my product over the others,’” says Shih. “But at some point, you way exceed what consumers need or are willing to pay for. And then you become a commodity.”

Monday, December 10, 2018

Our annual Toys-for-Tots post

A good Christmas can do a lot to take the edge off of a bad year both for children and their parents (and a lot of families are having a bad year). It's the season to pick up a few toys, drop them by the fire station and make some people feel good about themselves during what can be one of the toughest times of the year.

If you're new to the Toys-for-Tots concept, here are the rules I normally use when shopping:

The gifts should be nice enough to sit alone under a tree. The child who gets nothing else should still feel that he or she had a special Christmas. A large stuffed animal, a big metal truck, a large can of Legos with enough pieces to keep up with an active imagination. You can get any of these for around twenty or thirty bucks at Wal-Mart or Costco;*

Shop smart. The better the deals the more toys can go in your cart;

No batteries. (I'm a strong believer in kid power);**

Speaking of kid power, it's impossible to be sedentary while playing with a basketball;

No toys that need lots of accessories;

For games, you're generally better off going with a classic;

No movie or TV show tie-ins. (This one's kind of a personal quirk and I will make some exceptions like Sesame Street);

Look for something durable. These will have to last;

For smaller children, you really can't beat Fisher-Price and Playskool. Both companies have mastered the art of coming up with cleverly designed toys that children love and that will stand up to generations of energetic and creative play.

*I previously used Target here, but their selection has been dropping over the past few years and it's gotten more difficult to find toys that meet my criteria.

** I'd like to soften this position just a bit. It's okay for a toy to use batteries, just not to need them. Fisher-Price and Playskool have both gotten into the habit of adding lights and sounds to classic toys, but when the batteries die, the toys live on, still powered by the energy of children at play.

Friday, December 7, 2018

Thursday, December 6, 2018

Oklahoma, school funding, and the meta-perceptions pre-thread (proto thread?)

One of the questions I would love to see some social science researchers dig into is the apparent increase in people not only espousing extreme and even offensive beliefs (particularly on the right), but assuming that these positions are acceptable and in some cases widely held. I don't have enough background to intelligently discuss the topic, but I do (as always) have some theories, some involving social media and social norming, others focused on the conservative movement's media strategy and its sometimes unintended consequences.

Coming out against the very concept of publicly funded education is certainly an extreme position. Oklahoma Republicans basically looked at the Kansas experiment and said "hold my beer" and are now facing the same backlash as the other states that recently tried this combination of supply-side economics and Randian social policy. Opposing increased funding for schools under these conditions is politically risky; opposing funding period would seem to be suicidal, but Lopez and presumably the rest of the county party leadership appear to consider this a mainstream Republican position.

OKLAHOMA CITY (AP) — Republican leadership in one of Oklahoma's most populous counties has sent a letter to the state's lawmakers calling for an end to government-run public schools, or if that is too much, to at least find alternative funding sources for the system besides tax revenue.

Other GOP leaders have rebuked the letter, saying its views are outside the state party's mainstream, while looking toward next year's legislative session, when classroom funding is likely to again be a major focus.

Andrew Lopez, Republican Party chair for suburban Oklahoma City's Canadian County, signed the letter sent last week. It requested that the state no longer manage the public school system, or at least consider consolidating school districts. Public schools should seek operational money from sponsorships, advertising, endowments and tuition fees instead of taxes, the letter says.

The letter itself can't force policy changes, but the swift criticism from fellow Republicans shows continued grappling for power in the state's dominant political party. Education funding played a big role in this year's legislative elections following a spring teacher walkout that closed public schools throughout Oklahoma for two weeks. Several Republican lawmakers who opposed tax increases for teacher salaries were ousted, including some targeted by a key GOP House leader and an out-of-state super PAC.
...

Oklahoma Republican Party Chair Pam Pollard said Lopez's letter doesn't reflect the party's position.

But Lopez said the GOP lawmakers are betraying party principles, including through increasing the size of government. His letter also called for abolishing abortion and eliminating unnecessary business-licensing agencies.

"In government we have a system that says we believe it's a good idea to take (money) from you by force to educate other people's children," Lopez said. "That doesn't appear to be a fair deal to me."

Wednesday, December 5, 2018

Some mid-week retro-future

From the Internet Archive's Galaxy Magazine collection.







Tuesday, December 4, 2018

After the swans and tipping points -- a few quick and half-assed thoughts on post-relevancy.

I think I've made this point before: one of the advantages of a blog like this is that -- due to the flexibility of the form, the ability to respond to events in real time, the small and generally supportive audience who often provide useful feedback (either through the comment section, off-line exchanges, or online multi-blog debates), and both the freedom and the pressure that come with having to fill all the space -- it can be an ideal place to collect your thoughts and try things out.

Post-relevancy is a topic we might want to come back to. It's interesting on at least a couple of levels. First, there are individuals' responses to the realization that they are no longer an important part of the discussion. Some simply keep rolling out their greatest hits, at some point descending into self-parody and essentially becoming their own tribute band. Others (though there is considerable overlap here) become bitter and desperately seek out validation for their new, or often "new," material.

[I'm mainly thinking of public intellectuals in this post, but there are certainly comparable examples in other fields of entertainment. Dennis Miller is probably the first name to come to mind but certainly not the last.]

I saw a couple of things online recently that got me thinking about this subject. One was a nasty Twitter exchange between Nate Silver and Nassim Nicholas Taleb.

Here's a taste.

I might be a bit more measured in my tone (Silver has a way of going off on Twitter), but, if anything, I'm inclined toward even harsher criticism. Maybe what we should take away from this is not that Taleb has gotten less interesting, but that perhaps he was never all that interesting to begin with. Maybe the ideas that made him famous were never especially original or profound, merely repackaged in a facile way to impress gullible and shallow journalists.

Speaking of Malcolm Gladwell. This article from Giri Nathan of Deadspin pulls no punches (given the context, I'm obligated to use at least one sports metaphor) when describing just how tired Gladwell's shtick has become.
Again, these are just a few selected highlights. The conversation went on for a very long time, and any person who spent any of the last decade gassing up Gladwell’s pseudo-intellectual yammering should be forced to listen to it. Tune in next time to hear the phrenology takes of a hopped-up thinkovator barely suppressing his self-satisfied laughter.

A couple of songs came to mind while I was writing this. The first came while I was dictating the title: the first few words suggested a vaguely remembered tune, but the rest of the line doesn't work with the rhythm. I could have tweaked it to make it scan (after the swans have flown past, after the points have tipped), but that would've been too obscure even for me.

The second, from the great and still relevant Joe Jackson, obviously came to mind when talking about greatest hits.




Monday, December 3, 2018

The politics of that pile of old comics -- repost



The response to the death of Stan Lee has been truly remarkable, particularly when compared to the reaction to the deaths of his collaborators Steve Ditko and, even more notably, Jack Kirby, who had a longer and more influential career as a creator. Though we should not entirely discount Lee's carefully crafted persona as the longstanding face of Marvel, his notable work as a writer and editor was largely limited to the Silver Age of comics.

Lee's cultural and commercial impact has been immense, but many of the tributes have still managed to praise him for things he didn't actually do. Part of this has to do with our increasingly dysfunctional attitudes toward fame and success, which, among other things, tend to concentrate all of the credit for a collaborative accomplishment onto whoever has the best name recognition.

The bigger part, however, is probably due to the extremely personal relationship that many of the eulogizers have with the medium. Given the impact that comics, particularly the superheroes of Marvel and DC, have had, the genre would seem to be an ideal starting point for a discussion of politics and culture, but it is extraordinarily difficult to maintain critical detachment when discussing a work that means a great deal to you. It requires serious and sustained effort to keep yourself from seeing significance and profundity that aren't really there. This by no means is limited to comics; it may well be worse with popular music.

A lot of this comes down to the tendency to confuse what Pauline Kael might call good trash with great art. This is not to say that comic books and other media and genres such as audience-pleasing comedies, spy novels, TV shows, top 40 songs, etc. can't aspire to something higher. Not being a self-loathing middlebrow, I have never bought into the mid-cult bullshit, and I will go to the mat for the artistic quality of popular creators such as Buster Keaton, Will Eisner, John le Carré, Bob Dylan, and Duke Ellington, not to mention TV shows like The Twilight Zone, NYPD Blue, Doctor Who, and The Americans, but (once again channeling Kael) we shouldn't try to convince ourselves that everything we like is a work of artistic importance.

Along similar lines, when a work means a great deal to us, there is a natural desire to see its creators as kindred spirits, sharing our worldview and championing our deeply held beliefs. While Stan Lee is in many ways a tremendously admirable figure, the attempt to reinvent him as a progressive icon has always been an embarrassing retcon, even by comic book standards.

The politics of that pile of old comics

As mentioned before, writer and historian Mark Evanier is arguably the go-to guy for pop culture when it comes to both comics and television. One of his areas of particular expertise is the career of his friend, Jack Kirby.

The following excerpt confirms some assumptions I've had about the politics of Silver Age Marvel.
So when someone asks what Captain America would have felt about some topic, the first question is, "Which Captain America?" If the character's been written by fifty writers, that makes fifty Captain Americas, more or less…some closely in sync with some others, some not. And even a given run of issues by one creator or team is not without its conflicts. When Jack was plotting and pencilling the comic and Stan Lee was scripting it, Stan would sometimes write dialogue that did not reflect what Jack had in mind. The two men occasionally had arguments so vehement that Jack's wife made him promise to refrain. As she told me, "For a long time, whenever he was about to take the train into town and go to Marvel, I told him, 'Remember…don't talk politics with Stan.' Neither one was about to change the other's mind, and Jack would just come home exasperated." (One of Stan's associates made the comment that he was stuck in the middle, vis-a-vis his two main collaborators. He was too liberal for Steve Ditko and too conservative for Kirby.)

Jack's own politics were, like most Jewish men of his age who didn't own a big company, pretty much Liberal Democrat. He didn't like Richard Nixon and he really didn't like the rumblings in the early seventies of what would later be called "The Religious Right." At the same time, he thought Captain America represented a greater good than the advancement of Jack Kirby's worldview.

During the 1987 Iran-Contra hearings, Jack was outraged when Ollie North appeared before Congress and it wasn't just because North lied repeatedly or tried to justify illegal actions. Jack thought it was disgraceful that North wore his military uniform while testifying. The uniform, Jack said, belonged to every man and woman who had ever worn it (including former Private First Class Jack Kirby) and North had no right to exploit it the way he did. I always thought that comment explained something about the way Kirby saw Captain America. Cap, obviously, should stand for the flag and the republic for which it stands but — like the flag — for all Americans, not merely those who wish to take the nation in some exclusionary direction.
We've already been over Ditko's Randian views.

I also knew that Lee, who is a bit of a revisionist, had overstated some of the progressive positions he had taken on issues like racism while downplaying the red-baiting and sexism. Marvel apologists have also tried to explain away the more reactionary aspects of these stories but they are pretty difficult to ignore and it appears that most of them can be credited to Lee. (Kirby never had Lee's gift for self-promotion or reinvention and he preferred to let his work speak for itself -- always a risky approach in a collaborative medium.)

For more thoughts on the subject, check out this piece by one of my favorite critics/pop historians, Bob Chipman (more from Chipman later).





You should note that the red-baiting version of the character was done by Lee with no involvement from Kirby.

Friday, November 30, 2018

Having a wonderful time...

For years, whenever you saw a list of the top-rated original cable shows, there would be a footnote saying something like "excluding sports and children's programming." The reason was that an accurate top-ten list would have been nothing but sports and kids' shows, and about half of those would have been different airings of SpongeBob SquarePants.

SpongeBob was, for a while, arguably the biggest thing on cable:

Within its first month on air, SpongeBob SquarePants overtook Pokémon as the highest rated Saturday-morning children's series on television. It held an average national Nielsen rating of 4.9 among children aged two through eleven, denoting 1.9 million viewers. Two years later, the series had firmly established itself as Nickelodeon's second highest rated children's program, after Rugrats. That year, 2001, SpongeBob SquarePants was credited with helping Nickelodeon take the "Saturday-morning ratings crown" for the fourth straight season. The series had gained a significant adult audience by that point – nearly 40 percent of its 2.2 million viewers were aged 18 to 34. In response to this newfound weekend success, Nickelodeon gave SpongeBob SquarePants time slots at 6 PM and 8 PM, Monday through Thursday, to increase exposure of the series. By the end of that year SpongeBob SquarePants boasted the highest ratings for any children's series, on all of television. Weekly viewership of the series had reached around fifteen million, at least five million of whom were adults.

Seldom has television success been more richly deserved. SpongeBob, particularly in its early seasons, was a wickedly funny show: playfully surreal and subtly subversive, with an aesthetic that recalled the great Max Fleischer cartoons of the '30s. Holding it all together was the wonderfully off-kilter sensibility of creator and initial showrunner Stephen Hillenburg. There was always an unexpected rightness about his choices, like staging the climax of the pilot to the wonderfully obscure "Living in the Sunlight," covered by Tiny Tim.




Here's the original, by Maurice Chevalier.


Thursday, November 29, 2018

Some cool old airplane pictures for a Thursday

Steam-powered airplanes were the very definition of a technological dead end, but you have to admit they had style.












Wednesday, November 28, 2018

More perspective on the atomic age mindset.



In an earlier post, we discussed Willy Ley's observation that, from a 1930s standpoint, a successful moon landing seemed far more of a reach than an atomic bomb, suggesting that the modern usage of "moonshot" -- committing yourself to an ambitious, bordering-on-impossible objective -- would actually apply better to the Manhattan Project.

It's useful at this point to consider just how rapidly this field was advancing.

From Wikipedia (pay close attention to the dates):
In 1932 physicist Ernest Rutherford discovered that when lithium atoms were "split" by protons from a proton accelerator, immense amounts of energy were released in accordance with the principle of mass–energy equivalence. However, he and other nuclear physics pioneers Niels Bohr and Albert Einstein believed harnessing the power of the atom for practical purposes anytime in the near future was unlikely, with Rutherford labeling such expectations "moonshine."

The same year, his doctoral student James Chadwick discovered the neutron, which was immediately recognized as a potential tool for nuclear experimentation because of its lack of an electric charge. Experimentation with bombardment of materials with neutrons led Frédéric and Irène Joliot-Curie to discover induced radioactivity in 1934, which allowed the creation of radium-like elements at much less than the price of natural radium. Further work by Enrico Fermi in the 1930s focused on using slow neutrons to increase the effectiveness of induced radioactivity. Experiments bombarding uranium with neutrons led Fermi to believe he had created a new, transuranic element, which was dubbed hesperium.

In 1938, German chemists Otto Hahn and Fritz Strassmann, along with Austrian physicist Lise Meitner and Meitner's nephew, Otto Robert Frisch, conducted experiments with the products of neutron-bombarded uranium, as a means of further investigating Fermi's claims. They determined that the relatively tiny neutron split the nucleus of the massive uranium atoms into two roughly equal pieces, contradicting Fermi. This was an extremely surprising result: all other forms of nuclear decay involved only small changes to the mass of the nucleus, whereas this process—dubbed "fission" as a reference to biology—involved a complete rupture of the nucleus. Numerous scientists, including Leó Szilárd, who was one of the first, recognized that if fission reactions released additional neutrons, a self-sustaining nuclear chain reaction could result. Once this was experimentally confirmed and announced by Frédéric Joliot-Curie in 1939, scientists in many countries (including the United States, the United Kingdom, France, Germany, and the Soviet Union) petitioned their governments for support of nuclear fission research, just on the cusp of World War II, for the development of a nuclear weapon.

First nuclear reactor

In the United States, where Fermi and Szilárd had both emigrated, the discovery of the nuclear chain reaction led to the creation of the first man-made reactor, known as Chicago Pile-1, which achieved criticality on December 2, 1942. This work became part of the Manhattan Project, a massive secret U.S. government military project to make enriched uranium and to build large production reactors to produce (breed) plutonium for use in the first nuclear weapons. The United States would test an atom bomb in July 1945 with the Trinity test, and eventually two such weapons were used in the atomic bombings of Hiroshima and Nagasaki.


From the perspective of well over a half-century later, the advances in nuclear energy obviously represent a very sharp S-curve. At the time, though, there was an entirely natural impulse to extrapolate along a linear or even exponential path.

In August 1945, the first widely distributed account of nuclear energy, in the form of the pocketbook The Atomic Age, discussed the peaceful future uses of nuclear energy and depicted a future where fossil fuels would go unused. Nobel laureate Glenn Seaborg, who later chaired the Atomic Energy Commission, is quoted as saying "there will be nuclear powered earth-to-moon shuttles, nuclear powered artificial hearts, plutonium heated swimming pools for SCUBA divers, and much more".

Tuesday, November 27, 2018

Space exploration is hard.

Yes, I realize that's probably not the most controversial claim I'll make this week, but in this age of hype and bullshit, it's important to occasionally remind ourselves of these basic facts. This is what we go through to put an unoccupied payload roughly the size of a minivan on Mars.


NASA's Mars probe lands Monday after 'seven minutes of terror'


For the eighth time ever, humanity has achieved one of the toughest tasks in the solar system: landing a spacecraft on Mars.

The InSight lander, operated by NASA and built by scientists in the United States, France and Germany, touched down in the vast, red expanse of Mars’ Elysium Planitia just before 3 p.m. Eastern on Monday.

...

The interminable stretch from the moment a spacecraft hits the Martian atmosphere to the second it touches down on the Red Planet’s rusty surface is what scientists call “the seven minutes of terror."

More than half of all missions don’t make it safely to the surface. Because it takes more than eight minutes for light signals to travel 100 million miles to Earth, scientists have no control over the process. All they can do is program the spacecraft with their best technology and wait.

“Every milestone is something that happened 8 minutes ago,” Bridenstine said. “It’s already history.”

The tension was palpable Monday morning in the control room at JPL, where InSight was built and will be operated. At watch parties around the globe — NASA’s headquarters in Washington, the Nasdaq tower in Times Square, the grand hall of the Museum of Sciences and Industry in Paris, a public library in Haines, Alaska — legs jiggled and fingers were crossed as minutes ticked toward the beginning of entry, descent and landing.

At about 11:47 a.m., engineers received a signal indicating that InSight had entered the Martian atmosphere. The spacecraft plummeted to the planet’s surface at a pace of 12,300 mph. Within two minutes, the friction roasted InSight’s heat shield to a blistering 2,700 degrees.

Grover released a deep breath: “That’s hot.”

In another two minutes, a supersonic parachute deployed to help slow down the spacecraft. Radar was powered on.

From there, the most critical descent checklist unfolded at a rapid clip: 15 seconds to separate the heat shield. Ten seconds to deploy the legs. Activate the radar. Jettison the back shell. Fire the retrorockets. Orient for landing.

One of the engineers leaned toward her computer, hands clasped in front of her face, elbows on her desk.

“400 meters,” came a voice over the radio at mission control. “300 meters. 80 meters. 30 meters. Constant velocity."

Engineer Kris Bruvold’s eyes widened. His mouth opened in an “o.” He bounced in his seat.

“Touchdown confirmed.”




Saturday, November 24, 2018

Kevin Drum makes a good point

This is Joseph.

There has been a lot of concern about recent comments by Hillary Clinton on Europe curbing refugee admissions. Kevin Drum looked at just how many refugees Europe is actually taking in and compared it to a reader survey about how many refugees the US should take in:

I don’t want anyone to take my survey too seriously. It’s obviously just a casual thing. However, I think it’s fair to say that the responses are almost entirely from a left-leaning readership, and even at that a solid majority thought the US shouldn’t take in more than half a million refugees in a single year. Adjusted for population, Germany took in nearly ten times that many.
This is a growing problem with mass population displacement. It strains any system to take in a lot of refugees. Wanting to be compassionate is very important, and we should not allow xenophobia to interfere with saving people who need to be saved. But it opens up a very important conversation about how one deals with extremely large population displacement; in a democracy, there may be a limit to the rate of integration the populace is comfortable with at any one time. If climate change drives a longer-term issue here, then we need to think about ways to smooth out the process.
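The population adjustment behind Drum's comparison is simple arithmetic. Here's a quick sketch: the half-million ceiling comes from the survey he describes, while the population and intake figures are round-number approximations of the 2015-16 period, used only to illustrate the calculation.

```python
# Rough population adjustment behind Drum's "nearly ten times" remark.
us_pop = 327e6            # approximate U.S. population
germany_pop = 83e6        # approximate German population
germany_refugees = 1.1e6  # approximate German asylum intake, 2015-16 (assumed)
survey_ceiling = 0.5e6    # "no more than half a million" from the reader survey

# What Germany's intake would look like scaled to the U.S. population.
us_equivalent = germany_refugees * (us_pop / germany_pop)

ratio = us_equivalent / survey_ceiling
print(f"U.S.-equivalent intake: {us_equivalent / 1e6:.1f} million")
print(f"that's {ratio:.1f}x the survey ceiling")
```

Under these assumed figures, Germany's intake scales to roughly four million-plus in U.S. terms, which is how a half-million ceiling ends up nearly an order of magnitude below what Germany actually absorbed.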