Thursday, December 6, 2018

Oklahoma, school funding, and the meta-perceptions pre-thread (proto thread?)

One of the questions I would love to see some social science researchers dig into is the apparent increase in people not only espousing extreme and even offensive beliefs (particularly on the right), but assuming that these positions are acceptable and in some cases widely held. I don't have enough background to intelligently discuss the topic, but I do (as always) have some theories, some involving social media and social norming, others focused on the conservative movement's media strategy and its sometimes unintended consequences.

Coming out against the very concept of publicly funded education is certainly an extreme position. Oklahoma Republicans basically looked at the Kansas experiment and said "hold my beer" and are now facing the same backlash as the other states that recently tried this combination of supply-side economics and Randian social policy. Opposing increased funding for schools under these conditions is politically risky; opposing funding period would seem to be suicidal, but Lopez and presumably the rest of the county party leadership appear to consider this a mainstream Republican position.

OKLAHOMA CITY (AP) — Republican leadership in one of Oklahoma's most populous counties has sent a letter to the state's lawmakers calling for an end to government-run public schools, or if that is too much, to at least find alternative funding sources for the system besides tax revenue.

Other GOP leaders have rebuked the letter, saying its views are outside the state party's mainstream, while looking toward next year's legislative session, when classroom funding is likely to again be a major focus.

Andrew Lopez, Republican Party chair for suburban Oklahoma City's Canadian County, signed the letter sent last week. It requested that the state no longer manage the public school system, or at least consider consolidating school districts. Public schools should seek operational money from sponsorships, advertising, endowments and tuition fees instead of taxes, the letter says.

The letter itself can't force policy changes, but the swift criticism from fellow Republicans shows continued grappling for power in the state's dominant political party. Education funding played a big role in this year's legislative elections following a spring teacher walkout that closed public schools throughout Oklahoma for two weeks. Several Republican lawmakers who opposed tax increases for teacher salaries were ousted, including some targeted by a key GOP House leader and an out-of-state super PAC.
...

Oklahoma Republican Party Chair Pam Pollard said Lopez's letter doesn't reflect the party's position.

But Lopez said the GOP lawmakers are betraying party principles, including through increasing the size of government. His letter also called for abolishing abortion and eliminating unnecessary business-licensing agencies.

"In government we have a system that says we believe it's a good idea to take (money) from you by force to educate other people's children," Lopez said. "That doesn't appear to be a fair deal to me."

Wednesday, December 5, 2018

Some mid-week retro-future

From the Internet Archive's Galaxy Magazine collection.

Tuesday, December 4, 2018

After the swans and tipping points -- a few quick and half-assed thoughts on post-relevancy.

I think I've made this point before: one of the advantages of a blog like this is that -- due to the flexibility of the form, the ability to respond to events in real time, the small and generally supportive audience who often provide useful feedback (through the comment section, off-line exchanges, or online multi-blog debates), and both the freedom and the pressure that come with having to fill all the space -- it can be an ideal place to collect your thoughts and try things out.

Post-relevancy is a topic we might want to come back to. It's interesting on at least a couple of levels. First, there are the individual responses to the realization that they are no longer an important part of the discussion. Some simply keep rolling out their greatest hits, at some point descending into self-parody and essentially becoming their own tribute band. Others (though there is considerable overlap here) become bitter and desperately seek out validation for their new or often "new" material.

[I'm mainly thinking of public intellectuals in this post, but there are certainly comparable examples in other fields of entertainment. Dennis Miller is probably the first name to come to mind but certainly not the last.]

I saw a couple of things online recently that got me thinking about this subject. One was a nasty Twitter exchange between Nate Silver and Nassim Nicholas Taleb.

Here's a taste.

I might be a bit more measured in my tone (Silver has a way of going off on Twitter), but, if anything, I'm inclined toward even harsher criticism. Maybe what we should take away from this is not that Taleb has gotten less interesting, but that perhaps he was never all that interesting to begin with. Maybe the ideas that made him famous were never especially original or profound, merely repackaged in a facile way to impress gullible and shallow journalists.

Speaking of Malcolm Gladwell: this article from Giri Nathan of Deadspin pulls no punches (given the context, I'm obligated to use at least one sports metaphor) when describing just how tired Gladwell's shtick has become.
Again, these are just a few selected highlights. The conversation went on for a very long time, and any person who spent any of the last decade gassing up Gladwell’s pseudo-intellectual yammering should be forced to listen to it. Tune in next time to hear the phrenology takes of a hopped-up thinkovator barely suppressing his self-satisfied laughter.

A couple of songs came to mind while I was writing this. The first came while I was dictating the title: the first few words suggested a vaguely remembered tune, but the rest of the line doesn't work with the rhythm. I could have tweaked it to make it scan (after the swans have flown past, after the points have tipped), but that would've been too obscure even for me.

The second, from the great and still relevant Joe Jackson, obviously came to mind when talking about greatest hits.

Monday, December 3, 2018

The politics of that pile of old comics -- repost



The response to the death of Stan Lee has been truly remarkable, particularly when compared with the reactions to the deaths of his collaborators Steve Ditko and, even more notably, Jack Kirby, who had a longer and more influential career as a creator. Though we should not entirely discount Lee's carefully crafted persona as the longstanding face of Marvel, his notable work as a writer and editor was largely limited to the Silver Age of comics.

Lee's cultural and commercial impact has been immense, but many of the tributes have still managed to praise him for things he didn't actually do. Part of this has to do with our increasingly dysfunctional attitudes toward fame and success, which, among other things, tend to concentrate all of the credit for a collaborative accomplishment onto whoever has the best name recognition.

The bigger part, however, is probably due to the extremely personal relationship that many of the eulogizers have with the medium. Given the impact that comics, particularly the superheroes of Marvel and DC, have had, the genre would seem to be an ideal starting point for a discussion of politics and culture, but it is extraordinarily difficult to maintain critical detachment when discussing a work that means a great deal to you. It requires serious and sustained effort to keep yourself from seeing significance and profundity that aren't really there. This is by no means limited to comics; it may well be worse with popular music.

A lot of this comes down to the tendency to confuse what Pauline Kael might call good trash with great art. This is not to say that comic books and other media and genres -- audience-pleasing comedies, spy novels, TV shows, top 40 songs, etc. -- can't aspire to something higher. Not being a self-loathing middlebrow, I have never bought into the mid-cult bullshit, and I will go to the mat for the artistic quality of popular creators such as Buster Keaton, Will Eisner, John le Carré, Bob Dylan, and Duke Ellington, not to mention TV shows like The Twilight Zone, NYPD Blue, Doctor Who, and The Americans, but (once again channeling Kael) we shouldn't try to convince ourselves that everything we like is a work of artistic importance.

Along similar lines, when a work means a great deal to us, there is a natural desire to see its creators as kindred spirits, sharing our worldview and championing our deeply held beliefs. While Stan Lee is in many ways a tremendously admirable figure, the attempt to reinvent him as a progressive icon has always been an embarrassing retcon, even by comic book standards.

The politics of that pile of old comics

As mentioned before, writer and historian Mark Evanier is arguably the go-to guy for pop culture when it comes to both comics and television. One of his areas of particular expertise is the career of his friend, Jack Kirby.

The following excerpt confirms some assumptions I've had about the politics of Silver Age Marvel.
So when someone asks what Captain America would have felt about some topic, the first question is, "Which Captain America?" If the character's been written by fifty writers, that makes fifty Captain Americas, more or less…some closely in sync with some others, some not. And even a given run of issues by one creator or team is not without its conflicts. When Jack was plotting and pencilling the comic and Stan Lee was scripting it, Stan would sometimes write dialogue that did not reflect what Jack had in mind. The two men occasionally had arguments so vehement that Jack's wife made him promise to refrain. As she told me, "For a long time, whenever he was about to take the train into town and go to Marvel, I told him, 'Remember…don't talk politics with Stan.' Neither one was about to change the other's mind, and Jack would just come home exasperated." (One of Stan's associates made the comment that he was stuck in the middle, vis-a-vis his two main collaborators. He was too liberal for Steve Ditko and too conservative for Kirby.)

Jack's own politics were, like most Jewish men of his age who didn't own a big company, pretty much Liberal Democrat. He didn't like Richard Nixon and he really didn't like the rumblings in the early seventies of what would later be called "The Religious Right." At the same time, he thought Captain America represented a greater good than the advancement of Jack Kirby's worldview.

During the 1987 Iran-Contra hearings, Jack was outraged when Ollie North appeared before Congress and it wasn't just because North lied repeatedly or tried to justify illegal actions. Jack thought it was disgraceful that North wore his military uniform while testifying. The uniform, Jack said, belonged to every man and woman who had ever worn it (including former Private First Class Jack Kirby) and North had no right to exploit it the way he did. I always thought that comment explained something about the way Kirby saw Captain America. Cap, obviously, should stand for the flag and the republic for which it stands but — like the flag — for all Americans, not merely those who wish to take the nation in some exclusionary direction.

We've already been over Ditko's Randian views.

I also knew that Lee, who is a bit of a revisionist, had overstated some of the progressive positions he had taken on issues like racism while downplaying the red-baiting and sexism. Marvel apologists have also tried to explain away the more reactionary aspects of these stories, but they are pretty difficult to ignore, and it appears that most of them can be credited to Lee. (Kirby never had Lee's gift for self-promotion or reinvention and he preferred to let his work speak for itself -- always a risky approach in a collaborative medium.)

For more thoughts on the subject, check out this piece by one of my favorite critics/pop historians, Bob Chipman (more from Chipman later).

You should note that the red-baiting version of the character was done by Lee with no involvement from Kirby.

Friday, November 30, 2018

Having a wonderful time...

For years, whenever you saw a list of the top-rated original cable shows, there would be a footnote saying something like "excluding sports and children's programming." The reason was that an accurate list would have been nothing but sports and kids' shows, and about half of the top ten would have been different airings of SpongeBob SquarePants.

SpongeBob was, for a while, arguably the biggest thing on cable:

Within its first month on air, SpongeBob SquarePants overtook Pokémon as the highest rated Saturday-morning children's series on television. It held an average national Nielsen rating of 4.9 among children aged two through eleven, denoting 1.9 million viewers. Two years later, the series had firmly established itself as Nickelodeon's second highest rated children's program, after Rugrats. That year, 2001, SpongeBob SquarePants was credited with helping Nickelodeon take the "Saturday-morning ratings crown" for the fourth straight season. The series had gained a significant adult audience by that point – nearly 40 percent of its 2.2 million viewers were aged 18 to 34. In response to this weekend-found success, Nickelodeon gave SpongeBob SquarePants time slots at 6 PM and 8 PM, Monday through Thursday, to increase exposure of the series. By the end of that year SpongeBob SquarePants boasted the highest ratings for any children's series, on all of television. Weekly viewership of the series had reached around fifteen million, at least five million of whom were adults.

Seldom has television success been more richly deserved. SpongeBob, particularly in its early seasons, was a wickedly funny show: playfully surreal and subtly subversive, with an aesthetic that recalled the great Max Fleischer cartoons of the 30s. Holding it all together was the wonderfully off-kilter sensibility of creator and initial showrunner Stephen Hillenburg. There was always an unexpected rightness about his choices, like staging the climax of the pilot to the wonderfully obscure "Living in the Sunlight" as covered by Tiny Tim.

Here's the original by Maurice Chevalier.


Thursday, November 29, 2018

Some cool old airplane pictures for a Thursday

Steam-powered airplanes were the very definition of a technological dead end, but you have to admit they had style.

Wednesday, November 28, 2018

More perspective on the atomic age mindset.



In an earlier post, we discussed Willy Ley's observation that, from a 1930s standpoint, a successful moon landing seemed far more of a reach than an atomic bomb, suggesting that the modern usage of "moonshot" -- committing yourself to an ambitious, bordering-on-impossible objective -- would actually apply better to the Manhattan Project.

It's useful at this point to consider just how rapidly this field was advancing.

From Wikipedia (pay close attention to the dates):
In 1932 physicist Ernest Rutherford discovered that when lithium atoms were "split" by protons from a proton accelerator, immense amounts of energy were released in accordance with the principle of mass–energy equivalence. However, he and other nuclear physics pioneers Niels Bohr and Albert Einstein believed harnessing the power of the atom for practical purposes anytime in the near future was unlikely, with Rutherford labeling such expectations "moonshine."

The same year, his doctoral student James Chadwick discovered the neutron, which was immediately recognized as a potential tool for nuclear experimentation because of its lack of an electric charge. Experimentation with bombardment of materials with neutrons led Frédéric and Irène Joliot-Curie to discover induced radioactivity in 1934, which allowed the creation of radium-like elements at much less the price of natural radium. Further work by Enrico Fermi in the 1930s focused on using slow neutrons to increase the effectiveness of induced radioactivity. Experiments bombarding uranium with neutrons led Fermi to believe he had created a new, transuranic element, which was dubbed hesperium.

In 1938, German chemists Otto Hahn and Fritz Strassmann, along with Austrian physicist Lise Meitner and Meitner's nephew, Otto Robert Frisch, conducted experiments with the products of neutron-bombarded uranium, as a means of further investigating Fermi's claims. They determined that the relatively tiny neutron split the nucleus of the massive uranium atoms into two roughly equal pieces, contradicting Fermi. This was an extremely surprising result: all other forms of nuclear decay involved only small changes to the mass of the nucleus, whereas this process—dubbed "fission" as a reference to biology—involved a complete rupture of the nucleus. Numerous scientists, including Leó Szilárd, who was one of the first, recognized that if fission reactions released additional neutrons, a self-sustaining nuclear chain reaction could result. Once this was experimentally confirmed and announced by Frédéric Joliot-Curie in 1939, scientists in many countries (including the United States, the United Kingdom, France, Germany, and the Soviet Union) petitioned their governments for support of nuclear fission research, just on the cusp of World War II, for the development of a nuclear weapon.

First nuclear reactor

In the United States, where Fermi and Szilárd had both emigrated, the discovery of the nuclear chain reaction led to the creation of the first man-made reactor, known as Chicago Pile-1, which achieved criticality on December 2, 1942. This work became part of the Manhattan Project, a massive secret U.S. government military project to make enriched uranium and, by building large production reactors, to produce (breed) plutonium for use in the first nuclear weapons. The United States would test an atom bomb in July 1945 with the Trinity test, and eventually two such weapons were used in the atomic bombings of Hiroshima and Nagasaki.
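
To put a number on "immense amounts of energy": here's a rough back-of-the-envelope sketch of the lithium-splitting reaction described above, using standard atomic masses (my illustration, not part of the Wikipedia excerpt).

```python
# Back-of-the-envelope Q-value for the 1932 lithium-splitting reaction:
# p + Li-7 -> He-4 + He-4. Atomic masses in unified atomic mass units (u).
m_hydrogen_1 = 1.007825
m_lithium_7 = 7.016003
m_helium_4 = 4.002602

U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV (E = mc^2)

mass_defect = (m_hydrogen_1 + m_lithium_7) - 2 * m_helium_4
q_value = mass_defect * U_TO_MEV

print(f"mass defect: {mass_defect:.6f} u")
print(f"energy released: {q_value:.1f} MeV per reaction")
# ~17 MeV per reaction, versus the ~1 eV scale of chemical reactions --
# roughly a factor of ten million, which is what "immense" means here.
```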


From the perspective of well over a half-century later, the advances in nuclear energy obviously represent a very sharp S curve. At the time, though, there was an entirely natural impulse to extrapolate along a linear or even exponential path.

In August 1945, the first widely distributed account of nuclear energy, in the form of the pocketbook The Atomic Age, discussed the peaceful future uses of nuclear energy and depicted a future where fossil fuels would go unused. Nobel laureate Glenn Seaborg, who later chaired the Atomic Energy Commission, is quoted as saying "there will be nuclear powered earth-to-moon shuttles, nuclear powered artificial hearts, plutonium heated swimming pools for SCUBA divers, and much more".
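
To make the extrapolation trap concrete, here's a minimal illustrative sketch (hypothetical numbers, nothing fitted to real data) of how an exponential forecast and an S-shaped logistic curve can agree early on and then diverge wildly.

```python
import numpy as np

# Hypothetical illustration: a logistic (S-shaped) growth curve and an
# exponential chosen to match its early, pre-inflection behavior.

def logistic(t, ceiling=100.0, rate=0.35, midpoint=15.0):
    return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

def exponential(t, rate=0.35, midpoint=15.0, ceiling=100.0):
    # Coincides with the logistic's early growth, then runs away.
    return ceiling * np.exp(rate * (t - midpoint))

for t in np.arange(0, 45, 5):
    print(f"t={t:4.0f}  logistic={logistic(t):8.1f}  exponential={exponential(t):14.1f}")

# For small t the two are nearly indistinguishable; by t=40 the exponential
# is off by orders of magnitude. An observer standing on the steep part of
# the S has no local evidence that a ceiling exists.
```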

Tuesday, November 27, 2018

Space exploration is hard.

Yes, I realize that's probably not the most controversial claim I'll make this week, but in this age of hype and bullshit, it's important to occasionally remind ourselves of these basic facts. This is what we go through to put an unoccupied payload roughly the size of a minivan on Mars.


NASA's Mars probe lands Monday after 'seven minutes of terror'


For the eighth time ever, humanity has achieved one of the toughest tasks in the solar system: landing a spacecraft on Mars.

The InSight lander, operated by NASA and built by scientists in the United States, France and Germany, touched down in the vast, red expanse of Mars’ Elysium Planitia just before 3 p.m. Eastern on Monday.

...

The interminable stretch from the moment a spacecraft hits the Martian atmosphere to the second it touches down on the Red Planet’s rusty surface is what scientists call “the seven minutes of terror."

More than half of all missions don’t make it safely to the surface. Because it takes more than eight minutes for light signals to travel 100 million miles to Earth, scientists have no control over the process. All they can do is program the spacecraft with their best technology and wait.

“Every milestone is something that happened 8 minutes ago,” Bridenstine said. “It’s already history.”

The tension was palpable Monday morning in the control room at JPL, where InSight was built and will be operated. At watch parties around the globe — NASA’s headquarters in Washington, the Nasdaq tower in Times Square, the grand hall of the Museum of Sciences and Industry in Paris, a public library in Haines, Alaska — legs jiggled and fingers were crossed as minutes ticked toward the beginning of entry, descent and landing.

At about 11:47 a.m., engineers received a signal indicating that InSight had entered the Martian atmosphere. The spacecraft plummeted to the planet’s surface at a pace of 12,300 mph. Within two minutes, the friction roasted InSight’s heat shield to a blistering 2,700 degrees.

Grover released a deep breath: “That’s hot.”

In another two minutes, a supersonic parachute deployed to help slow down the spacecraft. Radar was powered on.

From there, the most critical descent checklist unfolded at a rapid clip: 15 seconds to separate the heat shield. Ten seconds to deploy the legs. Activate the radar. Jettison the back shell. Fire the retrorockets. Orient for landing.

One of the engineers leaned toward her computer, hands clasped in front of her face, elbows on her desk.

“400 meters,” came a voice over the radio at mission control. “300 meters. 80 meters. 30 meters. Constant velocity."

Engineer Kris Bruvold’s eyes widened. His mouth opened in an “o.” He bounced in his seat.

“Touchdown confirmed.”
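
The "more than eight minutes" figure quoted above is easy to sanity-check; a minimal sketch:

```python
# One-way light delay over the ~100 million miles cited in the article.
MILES_TO_METERS = 1609.344
C = 299_792_458.0  # speed of light, m/s

distance_m = 100e6 * MILES_TO_METERS
delay_s = distance_m / C

print(f"one-way delay: {delay_s:.0f} s (~{delay_s / 60:.1f} minutes)")
# ~537 s, just under nine minutes. By the time JPL saw any milestone, the
# landing had already succeeded or failed -- hence entry, descent, and
# landing had to run entirely on the spacecraft's preprogrammed logic.
```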

Saturday, November 24, 2018

Kevin Drum makes a good point

This is Joseph.

There has been a lot of concern about Hillary Clinton's recent comments on Europe curbing refugee admissions. Kevin Drum looked at just how many refugees Europe is actually taking in and compared that to a reader survey about how many refugees the US should take in:

I don’t want anyone to take my survey too seriously. It’s obviously just a casual thing. However, I think it’s fair to say that the responses are almost entirely from a left-leaning readership, and even at that a solid majority thought the US shouldn’t take in more than half a million refugees in a single year. Adjusted for population, Germany took in nearly ten times that many.

This is a growing problem with mass population displacement. It strains any system to take in a lot of refugees. Wanting to be compassionate is very important, and we should not allow xenophobia to interfere with saving people who need to be saved. But it opens up a very important conversation about how one deals with extremely large population displacement; in a democracy, there may be a limit to the rate of integration the populace is comfortable with. If climate change drives a longer-term issue here, then we need to think about ways to smooth out the process.
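
A rough version of Drum's population adjustment (the intake and population figures below are round, approximate numbers of my own, purely for illustration):

```python
# Back-of-the-envelope version of Drum's per-capita comparison.
# All figures are rounded approximations, for illustration only.
germany_intake = 900_000   # ~2015 asylum registrations in Germany (approx.)
germany_pop = 82e6
us_survey_cap = 500_000    # the survey majority's ceiling for the US
us_pop = 325e6

ratio = (germany_intake / germany_pop) / (us_survey_cap / us_pop)
print(f"Germany's per-capita intake: ~{ratio:.1f}x the surveyed US ceiling")
# ~7x with these round numbers; counting 2015-16 arrivals together pushes
# the ratio toward the "nearly ten times" Drum cites.
```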

Friday, November 23, 2018

Roy Clark and friend

Seems like an appropriate way to kick off the weekend.

Thursday, November 22, 2018

"As God as my witness..." is my second favorite Thanksgiving episode line [Repost]

If you watch this and could swear you remember Johnny and Mr. Carlson discussing Pink Floyd, you're not imagining things. Hulu uses the DVD edit, which cuts out almost all of the copyrighted music. [The original link has gone dead, but I was able to find the relevant clip.]

As for my favorite line, it comes from the Buffy episode "Pangs" and it requires a bit of a setup (which is a pain because it makes it next to impossible to work into a conversation).

Buffy's luckless friend Xander had accidentally violated a Native American graveyard and, in addition to freeing a vengeful spirit, had been cursed with all of the diseases Europeans brought to the Americas.

Spike: I just can't take all this namby-pamby boo-hooing about the bloody Indians.
Willow: Uh, the preferred term is...
Spike: You won. All right? You came in and you killed them and you took their land. That's what conquering nations do. It's what Caesar did, and he's not goin' around saying, "I came, I conquered, I felt really bad about it." The history of the world is not people making friends. You had better weapons, and you massacred them. End of story.
Buffy: Well, I think the Spaniards actually did a lot of - Not that I don't like Spaniards.
Spike: Listen to you. How you gonna fight anyone with that attitude?
Willow: We don't wanna fight anyone.
Buffy: I just wanna have Thanksgiving.
Spike: Heh heh. Yeah... Good luck.
Willow: Well, if we could talk to him...
Spike: You exterminated his race. What could you possibly say that would make him feel better? It's kill or be killed here. Take your bloody pick.
Xander: Maybe it's the syphilis talking, but, some of that made sense.



Wednesday, November 21, 2018

Fifty years late and the Russians are the ones doing it, but otherwise...

That was fortuitous timing. We just ran a post on Willy Ley (circa 1959) discussing the possibility of using nuclear-powered rockets for space exploration. Now we get the following announcement.
Speaking with reporters, Vladimir Koshlakov explained that Elon Musk and SpaceX pose no real threat to the group’s plans. Musk, Koshlakov says, is relying on technology that will soon be antiquated, while Russia is looking towards shaping the future of spaceflight.

The Russian researchers say that their nuclear-powered rocket platform will be able to make it to Mars seven months after launch, and that its reusable rocket stages can be put back into service after just 48 hours.

“Reusability is the priority,” Koshlakov reportedly said. “We must develop engines that do not need to be fine-tuned or repaired more than once every ten flights. Also, 48 hours after the rocket returns from space, it must be ready to go again. This is what the market demands.”

...

“Elon Musk is using the existing tech, developed a long time ago,” he noted. “He is a businessman: he took a solution that was already there, and applied it successfully. Notably, he is also doing his work with help from the government.”

That last paragraph is a bit of Musk-trolling, but it's consistent with a point I've heard repeatedly from engineers in the field. While SpaceX has made some serious advances, the underlying tech is decades old, dating back at least to the lunar lander.

Tuesday, November 20, 2018

The point that no one wants to make about the bad news from Facebook.



This was the year that a lot of people woke up to what Josh Marshall, among others, had been pointing out for a long time: that while all of the tech giants have accumulated and sometimes abused an extraordinary amount of power, Facebook stood alone as a genuinely bad actor doing a great deal of damage to a wide array of stakeholders.

What's notably absent from all of these analyses is an acknowledgment of the role that the press played in building and maintaining the myths of Zuckerberg and others as tech messiahs. Major news outlets and venerable publications, particularly the New York Times, willingly helped spread the bullshit. We should never forget that when Silicon Valley billionaires went after their toughest (and, in retrospect, most clear-eyed) critic, Gawker, the NYT not only failed to stand up for journalism, they actually gave an op-ed spot to Peter Thiel so he could better spin his side of the story.

As you can see, we've been on this beat for a long time.

Wednesday, June 15, 2011

"How To Party Your Way Into a Multi-Million Dollar Facebook Job" -- the sad state of business journalism

Andrew Gelman (before his virtual sabbatical) linked to this fascinating Gawker article by Ryan Tate:

If you want Facebook to spend millions of dollars hiring you, it helps to be a talented engineer, as the New York Times today [18 May 2011] suggests. But it also helps to carouse with Facebook honchos, invite them to your dad's Mediterranean party palace, and get them introduced to your father's venture capital pals, like Sam Lessin did. Lessin is the poster boy for today's Times story on Facebook "talent acquisitions." Facebook spent several million dollars to buy Lessin's drop.io, only to shut it down and put Lessin to work on internal projects. To the Times, Lessin is an example of how "the best talent" fetches tons of money these days. "Engineers are worth half a million to one million," a Facebook executive told the paper.
We'll let you in on a few things the Times left out: Lessin is not an engineer, but a Harvard social studies major and a former Bain consultant. His file-sharing startup drop.io was an also-ran competitor to the much more popular Dropbox, and was funded by a chum from Lessin's very rich childhood. Lessin's wealthy investment banker dad provided Facebook founder Mark Zuckerberg crucial access to venture capitalists in Facebook's early days. And Lessin had made a habit of wining and dining with Facebook executives for years before he finally scored a deal, including at a famous party he threw at his father's vacation home in Cyprus with girlfriend and Wall Street Journal tech reporter Jessica Vascellaro. (Lessin is well connected in media, too.) . . .

To get the full impact, you have to read the original New York Times piece by Miguel Helft. It's an almost perfect example of modern business reporting: gushing and wide-eyed, eager to repeat conventional narratives about the next big thing, and showing no interest in digging for the truth.
It is not just that Helft failed to do even the most rudimentary fact-checking (twenty minutes on Google would have uncovered a number of major holes); it is that he failed to check an unconvincing story that blatantly served the interests of the people telling it.

Let's start with the credibility of the story. While computer science may well be the top deck of the Titanic in this economy, has the industry really been driven to cannibalization by the dearth of talented people? There are certainly plenty of people in related fields with overlapping skill sets who are looking for work, and there's no sign that companies like Facebook are making a big push to mine these rich pools of labor. Nor have I seen any extraordinary efforts to go beyond the standard recruiting practices in comp sci departments.

How about self-interest? From a PR standpoint, this is the kind of story these companies want told. It depicts the people behind these companies as strong and decisive, the kind of leaders you'd want when you expect to encounter a large number of Gordian knots. When the NYT quotes Zuckerberg saying “Someone who is exceptional in their role is not just a little better than someone who is pretty good. They are 100 times better,” they are helping him build a do-what-it-takes-to-be-the-best image.

The dude-throws-awesome-parties criterion for hiring tends to undermine that image, as does the quid pro quo aspect of Facebook's deals with Lessin's father.

Of course, there's more at stake here than corporate vanity. Tech companies have spent a great deal of time and money trying to persuade Congress that the country must increase the number of H-1Bs we issue in order to have a viable tech industry. Without getting into the merits of the case (for that you can check out my reply to Noah Smith on the subject), this article proves once again that one easily impressed NYT reporter is worth any number of highly paid K Street lobbyists.

The New York Times is still, for many people, the paper. I've argued before that I didn't feel the paper deserved its reputation, that you can find better journalism and better newspapers out there, but there's no denying that the paper does have a tremendous brand. People believe things they read in the New York Times. It would be nice if the paper looked at this as an obligation to live up to rather than laurels to rest on.

Monday, November 19, 2018

"The Case Against Quantum Computing"


I am approaching this one cautiously both out of concern for confirmation bias and because I know so little about the subject, but this pessimistic take by Mikhail Dyakonov on the short-term prospects of quantum computing raises troubling questions about the coverage of this field and about the way hype undermines the allocation of resources.

The pattern here is disturbingly familiar. We've seen it with AI, fusion reactors, maglev vactrains, and subliminal framing, to name just a few. Credulous reporters seek out optimistic sources. Theoretical possibilities are treated as just-around-the-corner developments. Decades of slow progress, false starts, and sometimes outright failure are ignored.

Those who can claim some association with the next big thing are richly rewarded. Entrepreneurs get enormous piles of venture capital. Business lines and academic departments get generous funding. Researchers who can pull off a slick TED Talk get six-figure book deals and fawning celebrity treatment.

Just to be clear, Dyakonov's is not the consensus opinion. Lots of his colleagues are very optimistic, but these concerns do seem to be valid. The fact that almost all of the coverage glosses over that part of the picture tells us something about the state of science journalism.

From The Case Against Quantum Computing [emphasis added]
Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it.

We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex manmade systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.

Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.

It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.

In light of all this, it’s natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view. It’s based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.



In the early 2000s, at the request of the Advanced Research and Development Activity (a funding agency of the U.S. intelligence community that is now part of Intelligence Advanced Research Projects Activity), a team of distinguished experts in quantum information established a road map for quantum computing. It had a goal for 2012 that “requires on the order of 50 physical qubits” and “exercises multiple logical qubits through the full range of operations required for fault-tolerant [quantum computation] in order to perform a simple instance of a relevant quantum algorithm….” It’s now the end of 2018, and that ability has still not been demonstrated.
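
For a sense of the scale behind Dyakonov's pessimism, consider that the joint state of N qubits is described by 2^N continuous complex amplitudes. A back-of-the-envelope illustration (mine, not a calculation from the article):

```python
import math

# The joint state of N qubits is a vector of 2**N complex amplitudes,
# all of which must be controlled and protected from error.
for n_qubits in (50, 300, 1000):
    digits = n_qubits * math.log10(2)
    print(f"{n_qubits:5d} qubits -> ~10^{digits:.0f} amplitudes")

# 50 qubits (the 2012 road-map goal above) already means ~10^15
# amplitudes; 1,000 qubits means ~10^301 -- vastly more continuous
# parameters than there are atoms in the observable universe (~10^80).
```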