Tuesday, November 20, 2018

The point that no one wants to make about the bad news from Facebook.



This was the year that a lot of people woke up to what Josh Marshall, among others, had been pointing out for a long time: that while all of the tech giants have accumulated, and sometimes abused, an extraordinary amount of power, Facebook stood alone as a genuinely bad actor doing a great deal of damage to a wide array of stakeholders.

What's notably absent from all of these analyses is an acknowledgment of the role that the press played in building and maintaining the myths of Zuckerberg and others as tech messiahs. Major news outlets and venerable publications, particularly the New York Times, willingly helped spread the bullshit. We should never forget that when Silicon Valley billionaires went after their toughest (and, in retrospect, most clear eyed) critic, Gawker, the NYT not only failed to stand up for journalism, they actually gave an op-ed spot to Peter Thiel so he could better spin his side of the story.

As you can see, we've been on this beat for a long time.

Wednesday, June 15, 2011

"How To Party Your Way Into a Multi-Million Dollar Facebook Job" -- the sad state of business journalism

Andrew Gelman (before his virtual sabbatical) linked to this fascinating Gawker article by Ryan Tate:

If you want Facebook to spend millions of dollars hiring you, it helps to be a talented engineer, as the New York Times today [18 May 2011] suggests. But it also helps to carouse with Facebook honchos, invite them to your dad's Mediterranean party palace, and get them introduced to your father's venture capital pals, like Sam Lessin did. Lessin is the poster boy for today's Times story on Facebook "talent acquisitions." Facebook spent several million dollars to buy Lessin's drop.io, only to shut it down and put Lessin to work on internal projects. To the Times, Lessin is an example of how "the best talent" fetches tons of money these days. "Engineers are worth half a million to one million," a Facebook executive told the paper.
We'll let you in on a few things the Times left out: Lessin is not an engineer, but a Harvard social studies major and a former Bain consultant. His file-sharing startup drop.io was an also-ran competitor to the much more popular Dropbox, and was funded by a chum from Lessin's very rich childhood. Lessin's wealthy investment banker dad provided Facebook founder Mark Zuckerberg crucial access to venture capitalists in Facebook's early days. And Lessin had made a habit of wining and dining with Facebook executives for years before he finally scored a deal, including at a famous party he threw at his father's vacation home in Cyprus with girlfriend and Wall Street Journal tech reporter Jessica Vascellaro. (Lessin is well connected in media, too.) . . .
To get the full impact, you have to read the original New York Times piece by Miguel Helft. It's an almost perfect example of modern business reporting: gushing and wide-eyed, eager to repeat conventional narratives about the next big thing, and showing no interest in digging for the truth.
It is not just that Helft failed to do even the most rudimentary of fact-checking (twenty minutes on Google would have uncovered a number of major holes); it is that he failed to check an unconvincing story that blatantly served the interests of the people telling it.

Let's start with the credibility of the story. While computer science may well be the top deck of the Titanic in this economy, has the industry really been driven to cannibalization by the dearth of talented people? There are certainly plenty of people in related fields with overlapping skill sets who are looking for work, and there's no sign that companies like Facebook are making a big push to mine these rich pools of labor. Nor have I seen any extraordinary efforts to go beyond the standard recruiting practices in comp sci departments.

How about self-interest? From a PR standpoint, this is the kind of story these companies want told. It depicts the people behind these companies as strong and decisive, the kind of leaders you'd want when you expect to encounter a large number of Gordian Knots. When the NYT quotes Zuckerberg saying “Someone who is exceptional in their role is not just a little better than someone who is pretty good. They are 100 times better,” they are helping him build a do-what-it-takes-to-be-the-best image.

The dude-throws-awesome-parties criterion for hiring tends to undermine that image, as does the quid pro quo aspect of Facebook's deals with Lessin's father.

Of course, there's more at stake here than corporate vanity. Tech companies have spent a great deal of time and money trying to persuade Congress that the country must increase the number of H-1Bs we issue in order to have a viable Tech industry. Without getting into the merits of the case (for that you can check out my reply to Noah Smith on the subject), this article proves once again that one easily impressed NYT reporter is worth any number of highly paid K Street lobbyists.

The New York Times is still, for many people, the paper. I've argued before that I didn't feel the paper deserved its reputation, that you can find better journalism and better newspapers out there, but there's no denying that the paper does have a tremendous brand. People believe things they read in the New York Times. It would be nice if the paper looked at this as an obligation to live up to rather than laurels to rest on.

Monday, November 19, 2018

"The Case Against Quantum Computing"


I am approaching this one cautiously both out of concern for confirmation bias and because I know so little about the subject, but this pessimistic take by Mikhail Dyakonov on the short-term prospects of quantum computing raises troubling questions about the coverage of this field and about the way hype undermines the allocation of resources.

The pattern here is disturbingly familiar. We've seen it with AI, fusion reactors, maglev vactrains, subliminal framing, just to name a few. Credulous reporters seek out optimistic sources. Theoretical possibilities are treated as just-around-the-corner developments. Decades of slow progress, false starts, and sometimes outright failure are ignored.

Those who can claim some association with the next big thing are richly rewarded. Entrepreneurs get enormous piles of venture capital. Business lines and academic departments get generous funding. Researchers who can pull off a slick TED Talk get six-figure book deals and fawning celebrity treatment.

Just to be clear, Dyakonov's is not the consensus opinion. Lots of his colleagues are very optimistic, but these concerns do seem to be valid. The fact that almost all of the coverage glosses over that part of the picture tells us something about the state of science journalism.

From The Case Against Quantum Computing [emphasis added]
Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it.

We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex manmade systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.

Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.

It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.

In light of all this, it’s natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view. It’s based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.



In the early 2000s, at the request of the Advanced Research and Development Activity (a funding agency of the U.S. intelligence community that is now part of Intelligence Advanced Research Projects Activity), a team of distinguished experts in quantum information established a road map for quantum computing. It had a goal for 2012 that “requires on the order of 50 physical qubits” and “exercises multiple logical qubits through the full range of operations required for fault-tolerant [quantum computation] in order to perform a simple instance of a relevant quantum algorithm….” It’s now the end of 2018, and that ability has still not been demonstrated.

Friday, November 16, 2018

This John Oliver segment has a way of popping back into the relevant category

In case you haven't been following the news.

Thursday, November 15, 2018

Launching the USS Holland -- my big regret is that I couldn't work in a reference to the dynamite gun


Perhaps even more than the airplane, submarines are the perfect example of how a wave of enabling technologies at the end of the 19th century suddenly made a long-standing dream both possible and practical. Experiments in the field went back literally hundreds of years.

But it wasn't until the last third of the 19th century that a set of four advances -- one revolutionary in the field of naval warfare, the other three revolutionary, period -- would make submarines a major military factor. Whitehead torpedoes, Bessemer steel, electric batteries and motors, and internal combustion made the modern version of the craft possible.

The models being developed by most of the major powers around 1900 were, in broad strokes, the same basic configuration as those that would patrol the oceans for more than 50 years until the launch of the Nautilus. There would, of course, be great progress. The subs of World War I would be far more sophisticated than those of 15 years earlier, just as the subs of World War II would surpass those of the previous generation, but the underlying approach would remain fundamentally the same.

The following article, complete with very cool illustrations, comes from Scientific American (December 28, 1901). Just to give you an idea how quickly things were moving at the time, the same issue has two news items on major advances in wireless telegraphy including Marconi's announcement of the first successful transatlantic radio transmission, accepted as authentic by "Mr. Edison" and prompting a cable of congratulations from "Prof. Bell" who graciously offered his house on the coast of Nova Scotia as a site for future experiments.

Wednesday, November 14, 2018

There's still nothing there (and other lessons journalists refuse to learn about Elon Musk)


[See comments]

From Ars Technica

Similarly, Musk told mayors on Thursday that he wants The Boring Company to dig sewers, water transport, and electrical tunnels under cities, in addition to the transportation-focused tunnels he hopes to dig to house electric skate systems.

Musk mentioned this alternate use for his boring machines at the National League of Cities' City Summit, during a "fireside chat" with Los Angeles mayor Eric Garcetti. According to Forbes, Musk told the audience, "The Boring Company is also going to do tunneling for, like, water transport, sewage, electrical. We're not going to turn our noses up at sewage tunnels. We're happy to do that too."

The Boring Company is built on the premise that tunneling technology has not been adequately developed. Musk claims that his boring machines will tunnel faster than the industry's best machines.

Elon Musk is good copy. Perhaps more than anything else, that is the one thing you need to keep reminding yourself of when trying to make sense of why reporters remain so hopelessly credulous on this story. As Upton Sinclair might have told you, the press is remarkably willing to accept dubious claims when they drive traffic and reinforce rather than challenge the standard narrative.

In this case, "reinforce" is far too weak a term. Elon Musk has fashioned himself to personify the cherished tech messiahs narrative. Like Abraham Lincoln in the old Bob Newhart monologue, if he hadn't existed, they would've had to invent him.

Musk is, to his credit, an exceptionally gifted promoter, particularly adept at the art of misdirection. No one is better at distraction, dramatically changing the focus of attention just long enough for goalposts to be moved, promises to be forgotten, and "I'll address that later" to become "we've already covered that."

These distractions are nested. Less like a "real life Tony Stark" and more like a modern day Scheherazade, Elon Musk tells stories within stories, constantly shifting back and forth so that all but the most careful and critical listener will lose the thread and get swept up in the fantasy. When it becomes increasingly obvious that Tesla is unlikely to ever justify its stock price, he announces that construction will soon begin on a long-distance maglev vactrain running along the East Coast. When the buzz fades from that, he very publicly launches a company that claims to be able to increase tunneling speed and decrease costs by an order of magnitude. When the lack of actual breakthroughs starts to become noticeable, he releases cool CGI videos of giant slot cars racing underneath Los Angeles.

A key part of this magic show is the ability to make the ordinary seem wondrous. People have been digging tunnels for thousands of years, and there is no reason at this point for us to believe that the excavation which is about to be announced with such fanfare employs methods in any way more sophisticated than those used on construction projects around the world.

The press has become so docile on this point that Musk doesn't even have to lie about having made some major advance in the technology. He can just pretend that the enormous superiority of his system is a proven fact, confident in the assumption that no reporter will point out the truth. As far as I can tell (and I've read all that I had time and stomach for), the few specifics he has provided have been either meaninglessly vague (blah blah blah automation blah blah blah) or have displayed a fundamental lack of understanding about engineering and infrastructure (making projects cheaper by making tunnels smaller in situations where the reduction in capacity would actually drive up costs).

Tuesday, November 13, 2018

Another reason to have mixed feelings about the franchise model

From Bloomberg:
All’s fair in the bitter, protracted war between 7-Eleven and its franchisees. The tensions have built steadily in the years since DePinto, a West Point-educated veteran, took charge and began demanding more of franchisees—more inventory, more money, more adherence in matters large and small. Some franchisees have responded by organizing and complaining and sometimes suing.

As detailed in a series of lawsuits and court cases, the company has plotted for much of DePinto’s tenure to purge certain underperformers and troublemakers. It’s targeted store owners and spent millions on an investigative force to go after them. The corporate investigators have used tactics including tailing franchisees in unmarked vehicles, planting hidden cameras and listening devices, and deploying a surveillance van disguised as a plumber’s truck. The company has also given the names of franchisees to the government, which in some cases has led immigration authorities to inspect their stores, according to three officials with Homeland Security Investigations, which like ICE is under the jurisdiction of the Department of Homeland Security.

Monday, November 12, 2018

Some cool old pictures to start the week


From Scientific American 1867/03/30

Friday, November 9, 2018

If Elon Musk had a radium drill, he could really go to town

Not a good movie, but a sometimes interesting look at attitudes toward the future in the first part of the 20th Century, based on a popular 1913 book. It's worth noting that advances in radiation and (more importantly) metallurgy -- particularly the development of Bessemer steel -- had been major parts of the late 19th Century spike of progress.

From Wikipedia:

A group of wealthy industrialists gather in the home of Mr. Lloyd, a millionaire who introduces them to Richard "Mack" McAllan, the engineer who successfully spearheaded the construction of the Channel Tunnel (the story takes place in the unspecified near future, though it is noted in the film that the Channel Tunnel is built "in 1940"). McAllan informs the group that the "Allanite steel" he developed, along with a "radium drill" developed by his friend Frederick "Robbie" Robbins, makes it possible to construct an undersea tunnel linking England with the United States. Though the group is initially sceptical, the backing of Lloyd and his associate Mostyn convinces the group to buy shares in the project.


Thursday, November 8, 2018

A few points on Willy Ley and "the Conquest of Space"

To understand the 21st century narrative around technology and progress, you need to go back to two eras of extraordinary advances, the late 19th/early 20th centuries and the postwar era. Virtually all of the frameworks, assumptions, imagery, language, and iconography we use to discuss and think about the future can be traced back to these two periods.

The essential popularizer of science in the latter era was Willy Ley. In terms of influence and popularity, it is difficult to think of a comparable figure. Carl Sagan and Neil deGrasse Tyson hold somewhat analogous positions, but neither can claim anywhere near the impact. When you add in Ley's close association with Wernher von Braun, it is entirely reasonable to use his books as indicators of what serious people in the field of aerospace were thinking at the time. The excerpt below comes with a 1949 copyright and gives us an excellent idea of what seemed feasible 70 years ago.

There is a lot to digest here, but I want to highlight two points in particular.

First is the widespread assumption at the time that atomic energy would play a comparable role in the remainder of the 20th century to that of hydrocarbons in the previous century and a half, certainly for power generation and large-scale transportation. Keep in mind that it took a mere decade to go from Hiroshima to the launch of the Nautilus and there was serious research (including limited prototypes) into nuclear powered aircraft. Even if fusion reactors remained out of reach, a world where all large vehicles were powered by the atom seemed, if anything, likely.

Second, check out Ley's description of the less sophisticated, non-atomic option and compare it to the actual approach taken by the Apollo program 20 years later.

I think we have reversed the symbolic meanings of a Manhattan Project and a moonshot. The former has come to mean a large, focused, and dedicated commitment to rapidly addressing a challenging but solvable problem. The latter has come to mean trying to do something so fantastic it seems impossible. The reality was largely the opposite. Building an atomic bomb was an incredible goal that required significant advances in our understanding of the underlying scientific principles. Getting to the moon was mainly a question of committing ourselves to spending a nontrivial chunk of our GDP on an undertaking that was hugely ambitious in terms of scale but which relied on technology that was already well-established by the beginning of the Sixties.

________________________________________________

The conquest of space by Willy Ley 1949
Page 48.

In general, however, the moon messenger [an unmanned test rocket designed to crash-land on the moon -- MP] is close enough to present technological accomplishments so that its design and construction are possible without any major inventions. Its realization is essentially a question of hard work and money.

The manned moonship is a different story. The performance expected of it is, naturally, that it take off from the earth, go to the moon, land, take off from the moon, and return to earth. And that, considering known chemical fuels and customary design and construction methods, is beyond our present ability. But while a moon ship which can make a round trip is unattainable with chemical fuels, a moon ship which can land on the moon with a fuel supply insufficient for the return is a remote possibility. The point here is that one more application of the step principle is possible: three ships which landed might have enough fuel left among them for one to make the return trip.

This, of course, involves great risk, since the failure of one ship would doom them all. Probably the manned moon ship will have to be postponed until there is an orbital station. Takeoff from the station, instead of from the ground, would require only an additional 2 mi./s, so that the total works out to about 7 mi./s, instead of the 12 mi./s mentioned on page 44.

Then, of course, there is the possibility of using atomic energy. If some 15 years ago, a skeptical audience had been polled as to which of the two "impossibilities" -- moon ship and large-scale controlled release of atomic energy -- they considered less fantastic, the poll would probably have been 100% in favor of the moon ship. As history turned out, atomic energy came first, and it is now permissible to speculate whether the one may not be the key to the other.

So far, unfortunately, we only know that elements like uranium, plutonium, etc., contain enough energy for the job. We also know that this energy is not completely inaccessible, that it can be released. It can even be released in two ways, either fast in the form of a superexplosion, or slowly in a so-called "pile" where the energy appears mainly as heat. But we don't know how to apply these phenomena to rocket propulsion. Obviously the fissionable matter should not form the exhaust; there should be an additional reactant, a substance which is thrown out: plain water, perhaps, which would appear as steam, possibly even split up into its component atoms of hydrogen and oxygen, or perhaps peroxide.

The "how" is still to be discovered, but it will probably be based on the principle of using a fissionable element's energy for the ejection of a relatively inert reactant. It may be that, when that problem has been solved, we will find a parallel to the problem of pumps in an ordinary liquid fuel rocket. When liquid fuel rockets were still small -- that was only about 17 years ago and I remember it vividly -- the fuels were forced into the rocket motor by pressurizing the whole fuel tank. But everybody knew then that this would not do for all time to come. The tank that had to stand the feeding pressure had to have strong walls. Consequently it was heavy. Consequently the mass ratio could not be high. The idea then was that the tank be only strong enough to hold the fuels, in the manner of the gasoline tank of a car or truck or an airplane, and that the feeding pressure should be furnished by a pump. Of course the pump had to weigh less than the saving in tank wall weight which it brought about. Obviously there was a minimum size and weight for a good pump, and if that minimum weight was rather large, a rocket with pumps would have to be a big rocket.

It happened just that way. Efficient pumps were large and heavy, and the rocket with pumps was the 46-foot V-2. The "atomic motor" for rockets may also turn out to be large; the smallest really reliable and efficient model may be a compact little 7-ton unit. This would make for a large rocket -- but the size of a vehicle is no obstacle if you have the power to move it. Whatever the exhaust velocity, it will be high -- an expectation of 5 mi./s may be conservative. With such an exhaust velocity the mass ratio of the moon ship would be 11:1; with an exhaust velocity of 10 mi./s the mass ratio would drop to 3.3:1!

The moon ship shown in the paintings of the second illustration section is based on the assumption of a mass ratio of this order of magnitude, which in turn is based on the assumption of an atomic rocket motor.

Naturally there would be some trouble with radioactivity in an atomic propelled rocket. But that is not quite as hard to handle as the radioactivity which would accompany atomic energy propulsion under different circumstances. A seagoing vessel propelled by atomic energy could probably be built right now. It would operate by means of an atomic pile running at a temperature high enough to turn water into steam. The steam would drive a turbine, which would be coupled to the ship's propeller. While all this mechanism would be reasonably small and light as ship engines go, it would have to be encased in many tons of concrete to shield the ship's company against the radiation that would escape from the pile and from the water and the steam of the coolant. For a spaceship, no all-around shielding is needed, only a single layer, separating the pilot's or crew's cabin in the nose from the rest of the ship. On the ground a ship which had grown "hot" through service would be placed inside a shielding structure, something like massive concrete walls, open at the top. That would provide complete shielding for the public, but a shielding that the ship would not have to carry.
The problem that may be more difficult to handle is that of the radioactivity of the exhaust. A moon ship taking off would leave behind a radioactive patch, caused by the exhaust striking the ground. Most likely that radioactivity would not last very long, but it would be a temporary danger spot. Obviously moon ships for some time to come will begin their journeys from desolate places. Of course they might take off by means of booster units producing nothing more dangerous in their exhaust than water vapor, carbon dioxide, and maybe a sulfurous smell.
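A quick aside on Ley's arithmetic: his mass-ratio figures follow from the Tsiolkovsky rocket equation, mass ratio = exp(Δv / v_e). Here is a minimal check in Python, assuming the roughly 12 mi./s total velocity requirement he cites from his page 44:

```python
import math

def mass_ratio(delta_v, exhaust_v):
    # Tsiolkovsky rocket equation: m0/m1 = exp(delta_v / v_e)
    return math.exp(delta_v / exhaust_v)

# Ley's scenario: ~12 mi./s total velocity change for a ground-launched
# round trip, with assumed atomic exhaust velocities of 5 and 10 mi./s.
print(round(mass_ratio(12, 5), 1))   # 11.0 -> Ley's "11:1"
print(round(mass_ratio(12, 10), 1))  # 3.3
```

The same equation shows why taking off from an orbital station helps so much: cutting the velocity budget from 12 to 7 mi./s shrinks the required mass ratio exponentially, not linearly.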

Wednesday, November 7, 2018

Transportation cost overruns are nothing new

From Scientific American 1896

Tuesday, November 6, 2018

Monday, November 5, 2018

This nearly century-and-a-quarter-old discussion about rapid transit has a remarkably contemporary feel to it, starting with the phrase "rapid transit."

I always assumed it was a 20th Century term, but...



It's this paragraph, however, that struck me as particularly modern:

Friday, November 2, 2018

You should be concerned about the quality of the polls, but it's likely voter models that should worry you the most.

I've been meaning to do a good, substantial, well reasoned piece on fundamental misunderstandings about political polling. This is not that post. Things have been, let us say, busy of late and I don't have time to get this right, but I do need to get it written. I really want to get this one out in the first five days of November.

So here's the short version.

When the vast majority of journalists (even most data journalists) talk about polls being wrong, they tend to screw up the discussion on at least two levels. First because they do not grasp the distinction between data and model and second because they don't understand how either is likely to go kerplooie (okay, how would you spell it?).

The term "polls of registered voters" describes more or less raw data. A complete and detailed discussion would at this point mention weighting, stratification, and other topics but -- as previously mentioned -- this is not one of those discussions. For now, we will treat those numbers you see in the paper as summary statistics of the data.

Of course, lots of things can go wrong in the collecting. Sadly, most journalists are only aware of the least worrisome issue, sampling error. Far more troubling are inaccurate/dishonest responses and, even more importantly, nonrepresentative samples (a topic we have looked into at some depth earlier). For most reporters, "inside the margin of error" translates to "revealed word of God" and when this misunderstanding leads to disaster, they conclude that "the polls were wrong."
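To make this concrete, here is a toy simulation (all numbers hypothetical) of a poll with a huge sample, a tiny margin of error, and a badly wrong estimate, because one candidate's supporters are slightly less likely to respond:

```python
import random

random.seed(0)

# Hypothetical electorate: 52% support candidate A, but A's supporters
# are slightly less likely to respond to the poll (8% vs. 10%).
electorate = [1] * 520_000 + [0] * 480_000

def responds(supports_a):
    return random.random() < (0.08 if supports_a else 0.10)

sample = [s for s in electorate if responds(s)]
est = sum(sample) / len(sample)
moe = 1.96 * (est * (1 - est) / len(sample)) ** 0.5

# The margin of error is a fraction of a point; the miss is several points.
print(f"estimate {est:.3f} +/- {moe:.3f}, truth 0.520")
```

With these made-up response rates, the sampling error is well under half a point, but the nonresponse bias pushes the estimate down to roughly 46 percent. "Inside the margin of error" says nothing about that second kind of miss.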

The term "likely voter" brings in an entirely different concept, one which is generally even less well understood by the people covering it because now we are talking not just about data, but about models. [Quick caveat: all of my experience with survey data and response models has been on the corporate side. I'm working under the assumption that the same basic approaches are being used here, but you should always consult your physician or political scientist before embarking on prognostications of your own.]

First off, it's worth noting that the very designation of "likely" is arbitrary. A model has been produced that attempts to predict the likelihood that a given individual will vote in an upcoming election, but the cutoff between likely and unlikely is simply a number that the people in the field decided was reasonable. There's nothing scientific, let alone magical, about it.
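A toy illustration (scores and preferences invented for the example) of how much the topline can depend on that arbitrary cutoff:

```python
# Hypothetical turnout scores from a likely-voter model (probability of
# voting, 0 to 1), paired with each respondent's candidate preference.
voters = [
    (0.95, "D"), (0.90, "R"), (0.80, "R"), (0.70, "D"), (0.65, "R"),
    (0.55, "D"), (0.50, "D"), (0.40, "D"), (0.30, "R"), (0.20, "D"),
]

def topline(cutoff):
    # Keep only respondents the model calls "likely," then compute D share.
    likely = [pref for score, pref in voters if score >= cutoff]
    return sum(p == "D" for p in likely) / len(likely)

for cutoff in (0.3, 0.5, 0.7):
    print(f"cutoff {cutoff}: D share {topline(cutoff):.0%}")
```

Same data, same model; move the line and the headline number moves with it.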

Far more important, particularly in the upcoming election, is the idea of range of data. Certain concepts somehow manage to be both painfully obvious and frequently forgotten. Perhaps the best example in statistics is that a model only describes the relationships found in the sample. When we try to extrapolate beyond the range of data, we can only hope that the relationships will continue to hold.

By their very nature, this is always a problem with predictive modeling, but it becomes a reason for skepticism bordering on panic when the variables you included in, or, perhaps more to the point, left out of, your model start taking on values far in excess of anything you saw in the sample. 2018 appears to be a perfect example.
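The range-of-data problem is easy to sketch (toy data, hypothetical curve): a line fit to observations from a narrow range can look fine everywhere inside that range and fail badly outside it.

```python
def true_curve(x):
    # Hypothetical true relationship (nonlinear).
    return x / (1 + x)

xs = [i / 10 for i in range(11)]     # observed range: 0.0 .. 1.0
ys = [true_curve(x) for x in xs]

# Ordinary least-squares line fit to the observed points.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

for x in (0.5, 3.0):                 # inside vs. far outside the data
    print(f"x={x}: line predicts {a + b * x:.3f}, truth is {true_curve(x):.3f}")
```

Inside the observed range the line is off by a few hundredths; at three times the largest observed value its prediction is roughly double the truth. The model didn't change; the inputs walked out from under it.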

Will the relationships we've seen in the past hold? If not, will the shift favor the Democrats? The Republicans? Or will the relationships break down in such a way that they cancel each other out? I have no intention of speculating. What I am saying is that we are currently so far out of the range of data on so many factors that I'm not sure it makes sense to talk about likely voters at all.

Thursday, November 1, 2018

Our regular repost on drinking from the wrong pipe

From Josh Marshall:

I managed to involve myself this weekend in a tiny eddy in the storm around the Pittsburgh synagogue massacre. As you can see below, early yesterday evening I happened upon this interview on Lou Dobbs’ Fox Business News show in which a guest, Chris Farrell, claimed the migrant caravan in southern Mexico was being funded and directed by the “Soros-occupied State Department.” This is, as I explained, straight out of The Protocols of the Elders of Zion, the foundational anti-Semitic tract, first circulated and perhaps authored by the Czarist secret police in the first years of the 20th century.

If you’re not familiar with this world, “ZOG” is a staple of white supremacist and neo-Nazi literature and websites. It stands for “Zionist Occupied Government” and is a shorthand for the belief that Jews secretly control the US government. Chris Farrell’s phrasing was no accident. All of this is straight out of the most rancid anti-Semitic propaganda. Rob Bowers, the shooter in the Pittsburgh massacre, appears to have been specifically inspired by this conspiracy theory. Indeed, Bowers had also reposted references to “ZOG” on his social media accounts.


All of the conspiracy theories around the caravan, particularly those involving George Soros and voter fraud, have a weird underwear gnomes quality to them. They make emotional sense for those deep in the conservative media bubble, but there's no way to make any kind of plausible argument for any of them.

It can be useful for the Republican Party if certain segments of the population believe these fantasies, even disseminate them, as long as the discussion remains far enough on the recognized fringe to allow party leaders plausible deniability. It is not useful to have ranking politicians and influential conservative voices saying these things out loud on what are supposed to be respectable outlets.

Or as we said exactly two years ago...

Tuesday, November 1, 2016

In retrospect, it's surprising we don't use more sewage metaphors

A few stray thoughts on the proper flow of information (and misinformation) in a functional organization.

I know we've been through all of this stuff about Leo Strauss and the conservative movement before, so I'm not going to drag this out into great detail except to reiterate that if you want to have a functional institution that makes extensive use of internal misinformation, you have to make sure things move in the right direction.

With misinformation systems as with plumbing, when the flow starts going the wrong way, the results are seldom pretty. This has been a problem for the GOP for at least a few years now. A number of people in positions of authority (particularly in the tea party wing) have bought into notions that were probably intended simply to keep the cannon fodder happy. This may also partly explain the internal polling fiasco at the Romney campaign.

As always, though, it is Trump who takes things to a new level. We now have a Republican nominee who uses the fringier parts of the Twitterverse as briefings.

From Josh Marshall:


Here's what he said ...
Wikileaks also shows how John Podesta rigged the polls by oversampling democrats, a voter suppression technique. That's happening to me all the time. When the polls are even, when they leave them alone and do them properly, I'm leading. But you see these polls where they're polling democrats. How is Trump doing? Oh, he's down. They're polling democrats. The system is corrupt, rigged and broken. And we're going to change it. [ Cheers and applause ]
Thank you, thank you. In an e-mail podesta says he wants oversamples for our polling in order to maximize what we get out of our media polling. It's called voter suppression because people will say, oh, gee, Trump's down. Folks, we're winning. We're winning. We're winning. These thieves and crook, the immediate, yeah not all of it, not all of it, but much of it -- they're the most crooked -- they're almost as crooked as Hillary. They may even be more crooked than Hillary because without the media, she would be nothing.
Now this immediately grabbed my attention because over the weekend I was flabbergasted to see this tweet being shared around the Trumposphere on Twitter.
I don't know who Taylor Egly is. But he has 250,000 followers - so he has a big megaphone on Twitter. This tweet and this new meme is a bracing example of just how many of the "scoops" from the Podesta emails are based on people simply not knowing what words mean.
Trump had already mentioned 'over-sampling' earlier. But here he's tying it specifically to the Podesta emails released by Wikileaks. This tweet above is unquestionably what he's referring to.
There are several levels of nonsense here. Let me try to run through them.
...

More importantly, what Tom Matzzie is talking about is the campaign/DNC's own polls. Campaigns do extensive, very high quality polling to understand the state of the race and devise strategies for winning. These are not public polls. So they can't affect media polls and they can't have anything to do with voter suppression.

Now you may be asking, why would the Democrats skew their own internal polls? Well, they're not.
The biggest thing here is what the word 'oversampling' means. Both public and private pollsters will often over-sample a particular demographic group to get statistically significant data on that group.
...  You need to get an 'over-sample' to get solid numbers.

Whether it's public or private pollsters, the 'over-sample' is never included in the 'topline' number. So if you get 4 times the number of African-American voters as you got in a regular sample, those numbers don't all go into the mix for the total poll. They're segmented out. The whole thing basically amounts to zooming in on one group to find out more about them. To do so, to zoom in, you need to 'over-sample' their group as what amounts to a break-out portion of the poll.

What it all comes down to is that you're talking about a polling concept the Trumpers don't seem to understand (or are relying on supporters not understanding), about polls that are by definition secret (campaign polls aren't shared) and about an election eight years ago.
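A hedged sketch of the "oversample" concept Marshall describes (all numbers invented): extra interviews of a subgroup are collected to shrink that subgroup's margin of error, then weighted so the subgroup still counts for its true share of the electorate. The oversample sharpens the break-out numbers; it never skews the topline.

```python
# Invented numbers throughout. Oversampling a subgroup does not
# change the topline, because groups are weighted by population
# share, not by share of interviews.

def weighted_topline(groups):
    """groups: list of (population_share, n_interviews, support).
    Each group's support estimate is weighted by its population
    share, regardless of how many interviews it got."""
    return sum(share * support for share, n, support in groups)

# Group A is 10% of the electorate but oversampled 4x
# (400 interviews instead of ~100); group B gets the usual 900.
groups = [(0.10, 400, 0.80), (0.90, 900, 0.50)]

# Wrong: pooling raw interviews lets the oversample skew the result.
naive = (400 * 0.80 + 900 * 0.50) / 1300

# Right: group A still counts for 10% of the topline.
weighted = weighted_topline(groups)

print(round(naive, 3), round(weighted, 3))
```

The naive pooled number runs well above the weighted one, which is exactly why no competent pollster, public or private, folds an oversample into the topline unweighted.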

Wednesday, October 31, 2018

Halloween Greetings from College Humor

"The Internet Goes Trick-or-Treating"