Comments, observations and thoughts from two bloggers on applied statistics, higher education and epidemiology. Joseph is an associate professor. Mark is a professional statistician and former math teacher.
Friday, October 26, 2018
More spooky stuff (In no way chosen as filler because I had a busy week)
Who better than Goldsmith and Herrmann to send us off.
Thursday, October 25, 2018
A Mercury Theatre Halloween
[repost]
The debut production of the Mercury Theatre of the Air, Dracula.
And, of course, the Mercury production of War of the Worlds.
While we're at it, here's a tour de force from Welles' favorite, Agnes Moorehead (don't let the corny intro turn you off) Sorry, Wrong Number.
Wednesday, October 24, 2018
We'll largely skip over the author's fixation on casual nudity.
That said, it's a mistake to ignore science fiction writers entirely, both because of the close relationship between the scientific and science-fiction communities, and because of the influence SF has had on the way we think about technology today, whether directly through books, film, and television or indirectly through writers who alternated between science fact and science fiction (Asimov, Clarke, and to a degree, Willy Ley and Carl Sagan).
This mid-century essay by Robert Heinlein on his predictions for the year 2000 is worth a look for a number of reasons. First, people did tend to take the man seriously in the postwar era. Though his standing has arguably declined somewhat, at least relative to contemporaries like Asimov, at his peak he was the best known and best respected hard science fiction writer among mainstream audiences. (Bradbury also had a significant mainstream following, but even when writing about spaceships and aliens, his work tended to fall more in the category of fantasy.)
Second, this essay is of particular value because the author not only makes a great number of detailed predictions (including a notable amount of time spent on the appeal of socially acceptable nudity), he also explicitly spells out the assumptions that underlie much of the period's attitudes toward the future. He even states his axioms and provides a handy graph of human advancement.
As a serious attempt at describing the rate of progress, this picture is fatally flawed. The year 1900 came at the end of a huge technological and scientific spike; extending the graph back a couple of decades would have completely thrown off the curve. (Interestingly, you actually can justify an exponential curve describing progress in the 19th century.) Furthermore, it is difficult to argue for a steady acceleration from the aughts to the teens, the teens to the 20s, and the 20s to the 30s.
This graph, however, is tremendously revealing when it comes to the ways people in the 1950s thought about progress. Like the end of the 19th century, the postwar era was a period when conditions lined up to cause a number of very steep S curves to cluster together. The result was a time of explosive, ubiquitous change. There was also, as mentioned before, a tendency to look at the two world wars and the interval between (particularly the Great Depression) as anomalous. It was natural for people in the postwar era to see themselves as living on an exponential slope that was on the verge of shooting past the comprehensible.
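The clustered-S-curves point lends itself to a quick numerical sketch. This is our own toy illustration, with made-up midpoints and growth rates, not anything from Heinlein's essay: sum a handful of bounded logistic ("S") curves whose take-off points bunch together, and from inside the cluster the total looks like accelerating, exponential-style growth, even though every component saturates.

```python
import math

def logistic(t, midpoint, rate=0.3):
    """A single bounded S-curve rising from 0 to 1 around `midpoint`."""
    return 1.0 / (1.0 + math.exp(-rate * (t - midpoint)))

def total_progress(t, midpoints=(1900, 1910, 1920, 1930, 1940)):
    """'Progress' in this toy model: a sum of clustered S-curves."""
    return sum(logistic(t, m) for m in midpoints)

# Gains over two successive 20-year windows on the rising flank:
gain_1880_1900 = total_progress(1900) - total_progress(1880)
gain_1900_1920 = total_progress(1920) - total_progress(1900)

# Seen from inside the cluster, growth appears to be accelerating,
# just as an exponential would ...
assert gain_1900_1920 > 2 * gain_1880_1900

# ... yet each component is bounded, so the total flattens out rather
# than shooting past the comprehensible:
assert total_progress(2000) < 5.0
```

An observer standing at 1920 in this toy model sees each twenty-year window deliver more progress than the last, which is exactly the vantage point the 1950s graph encodes; only with the later, flat part of the curve in view does the exponential reading collapse.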
Tuesday, October 23, 2018
Cult of the CEO
This is Joseph
This is a revealing symptom of the cult of the CEO:
If you have a CEO this dead to rights on securities fraud, why let him continue as CEO? According to the SEC, Musk was indispensable. In a statement, SEC Chair Jay Clayton said “holding individuals accountable is important and an effective means of deterrence,” but that he must take the interests of investors into account, and “the skills and support of certain individuals may be important to the future success of a company.”

One of the most pernicious myths is that of the irreplaceable man. We know that nobody is really irreplaceable, because in the end we are all replaced by the natural force of mortality. But it is a terrible sign for a society when it treats a person's importance to a business enterprise as an excuse for leniency toward poor conduct. The pressure to cheat to reach the top has to be high because the stakes are so incredibly meaningful in terms of wealth and status. That suggests more scrutiny, not less.
Irreplaceable men often beget disasters. Great leaders, like Napoleon, often get cocky and make grave mistakes that end up costing a great deal despite the attributes that made them successful for a long time.
I think that this line of thinking isn't ideal.
Monday, October 22, 2018
It is always useful to go back and read the contemporary accounts.
One of the nails I've pounded flush to the board recently (apologies to long-suffering regular readers) is that much of the standard 21st-century narrative of technology consists of things that were at best sometimes true in the past and are almost entirely false now. The best example is probably the idea that the advances of old invariably came as a thief in the night with almost no one imagining the magnitude of their impact and what now seem obvious applications going undiscovered for years.
There are, of course, technological developments that caught people off guard or that moved in unexpected directions, but in most cases, if you go back and read early speculations about the potential of breakthrough technologies in the late 19th/early 20th centuries or the postwar era, you'll generally find that people had a pretty good sense of what was likely to come.
The same can be said for the dawn of the personal computing era.
Friday, October 19, 2018
In case the aerospace allusions are getting a bit obscure, here's a week-in-video recommendation.
The Mouse on the Moon is an easy film to overlook. Between Peter Sellers' spectacular turn in The Mouse That Roared and the general tendency of the time to look at sequels as second-class cinematic citizens (particularly when none of the original stars made a return appearance), it is easy to think of the 1963 film as "the other one."
That's too bad, because the second film can easily hold its own. It's sharp and funny, and like its predecessor it gets 3 1/2 stars in the Leonard Maltin guide. What's more, it features the direction of a young Richard Lester just before he broke through with A Hard Day's Night.
From Wikipedia:
The Mouse on the Moon is a 1963 British comedy film, the sequel to The Mouse That Roared. It is an adaptation of the 1962 novel The Mouse on the Moon by Irish author Leonard Wibberley, and was directed by Richard Lester. In it, the people of the Duchy of Grand Fenwick, a microstate in Europe, attempt space flight using wine as a propellant. It satirises the space race, Cold War and politics.
Thursday, October 18, 2018
Continuing the visionary aerospace thread, there's almost a "mouse on the moon" quality to India's Avatar.
No disrespect meant for India here. Quite the opposite. I think there's long been a tendency to underestimate the country and its extraordinary intellectual capital. Here's one of the projects I would definitely keep an eye on.
From Wikipedia:
The idea is to develop a spaceplane vehicle that can take off from conventional airfields. Its liquid air cycle engine would collect air in the atmosphere on the way up, liquefy it, separate the oxygen, and store it on board for subsequent flight beyond the atmosphere. The Avatar, a reusable launch vehicle, was first announced in May 1998 at the Aero India 98 exhibition held at Bangalore.

Avatar seems to have, if you'll pardon the metaphor, stalled out (recent tests don't seem to involve any of the really cutting-edge stuff). It could be that the technology actually has hit a wall. That would hardly be surprising for something this ambitious. There's another possibility, however, that is both more encouraging and more depressing at the same time: namely, that it simply hasn't gotten the funding it needs. Depressing because that would mean we have unnecessarily delayed important advances. Encouraging because it suggests that we still might get this plane flying.
There's a lot of money floating around out there in the vanity aerospace industry, and it would be nice to see it go to something ambitious and important. With all due respect to the recently departed, if Paul Allen had taken the money spent on 60-year-old visions of space travel and poured it into something forward-thinking, his greater legacy might've been what he did after Microsoft.
Wednesday, October 17, 2018
It also suggests that I chose the wrong major in college.
This article, long but well worth your time, approaches some familiar problems from a new and disturbing direction. We've talked a great deal about the manipulation of data and research, the corrupting influence of big money on supposedly objective processes, and the tendency to equate overly complicated math with insight and profundity, but probably not nearly enough about how these things can undermine public policy decision-making.
Here are a few excerpts, but you should really read the whole thing.
These Professors Make More Than a Thousand Bucks an Hour… — ProPublica
Jesse Eisinger, Justin Elliott
If the government ends up approving the $85 billion AT&T-Time Warner merger, credit won’t necessarily belong to the executives, bankers, lawyers, and lobbyists pushing for the deal. More likely, it will be due to the professors.
A serial acquirer, AT&T must persuade the government to allow every major deal. Again and again, the company has relied on economists from America’s top universities to make its case before the Justice Department or the Federal Trade Commission. Moonlighting for a consulting firm named Compass Lexecon, they represented AT&T when it bought Centennial, DirecTV, and Leap Wireless; and when it tried unsuccessfully to absorb T-Mobile. And now AT&T and Time Warner have hired three top Compass Lexecon economists to counter criticism that the giant deal would harm consumers and concentrate too much media power in one company.
Today, “in front of the government, in many cases the most important advocate is the economist and lawyers come second,” said James Denvir, an antitrust lawyer at Boies, Schiller.
Economists who specialize in antitrust — affiliated with Chicago, Harvard, Princeton, the University of California, Berkeley, and other prestigious universities — reshaped their field through scholarly work showing that mergers create efficiencies of scale that benefit consumers. But they reap their most lucrative paydays by lending their academic authority to mergers their corporate clients propose. Corporate lawyers hire them from Compass Lexecon and half a dozen other firms to sway the government by documenting that a merger won’t be “anti-competitive”: in other words, that it won’t raise retail prices, stifle innovation, or restrict product offerings. Their optimistic forecasts, though, often turn out to be wrong, and the mergers they champion may be hurting the economy.
Some of the professors earn more than top partners at major law firms. Dennis Carlton, a self-effacing economist at the University of Chicago’s Booth School of Business and one of Compass Lexecon’s experts on the AT&T-Time Warner merger, charges at least $1,350 an hour. In his career, he has made about $100 million, including equity stakes and non-compete payments, ProPublica estimates. Carlton has written reports or testified in favor of dozens of mergers, including those between AT&T-SBC Communications and Comcast-Time Warner, and three airline deals: United-Continental, Southwest-Airtran, and American-US Airways.
American industry is more highly concentrated than at any time since the Gilded Age. Need a pharmacy? Americans have two main choices. A plane ticket? Four major airlines. They have four choices to buy cell phone service. Soon one company will sell more than a quarter of the quaffs of beer around the world.
Mergers peaked last year at $2 trillion in the U.S. The top 50 companies in a majority of American industries gained share between 1997 and 2012, and “competition may be decreasing in many economic sectors,” President Obama’s Council of Economic Advisers warned in April.
While the impact of this wave of mergers is much debated, prominent economists such as Lawrence Summers and Joseph Stiglitz suggest that it is one important reason why, even as corporate profits hit records, economic growth is slow, wages are stagnant, business formation is halting, and productivity is lagging. “Only the monopoly-power story can convincingly account” for high business profits and low corporate investment, Summers wrote earlier this year.
...
These complex mathematical formulations carry weight with the government because they purport to be objective. But a ProPublica examination of several marquee deals found that economists sometimes salt away inconvenient data in footnotes and suppress negative findings, stretching the standards of intellectual honesty to promote their clients’ interests.
…
Recent research supports the classic view that large mergers, by reducing competition, hurt consumers. The 2008 merger between Miller and Coors spurred “an abrupt increase” in beer prices, an academic analysis found this year. In the most comprehensive review of the academic literature, Northeastern economist John Kwoka studied the effects of thousands of mergers. Prices on average increased by more than 4 percent. Prices rose on more than 60 percent of the products and those increases averaged almost 9 percent. “Enforcers clear too many harmful mergers,” American University’s Jonathan Baker, a Compass economist who has consulted for both corporations and the government, wrote in 2015.
Once a merger is approved, nobody studies whether the consultants’ predictions were on the mark. The Department of Justice and the Federal Trade Commission do not make available the reports that justify mergers, and those documents cannot be obtained through public records requests. Sometimes the companies file the expert reports with the courts, but judges usually agree to companies’ requests to seal the documents. After a merger is cleared, the government no longer has access to the companies’ proprietary data on their pricing.
The expert reports “are not public so only the government can check,” said Ashenfelter, the Princeton economist who has consulted for both government and private industry. “And the government no longer has the data so they can’t check.” How accurate are the experts? “The answer is no one knows and no one wants to find out.”
Tuesday, October 16, 2018
There's a lot of bold, visionary thinking coming out of aerospace, just not from the people who are supposed to be the bold visionaries.
Recently, we've been making the point that most of the "futuristic" and "revolutionary" proposals coming from the billionaire Messiah class (dominated by but not limited to the Silicon Valley variety) are usually postwar vintage and, even when you put aside those that probably won't work at all (see the Hyperloop), represent at best incremental advances. This is especially true in the vanity aerospace industry.
In making that point, I may have given the wrong impression about the aerospace industry as a whole. I'll try to post a couple more examples later. For now, though, here is one technology that, if it proves viable (and it is looking very promising), holds tremendous potential to revolutionize spaceflight.
From Wikipedia:
Like the RB545, the SABRE design is neither a conventional rocket engine nor jet engine, but a hybrid that uses air from the environment at low speeds/altitudes, and stored liquid oxygen (LOX) at higher altitude. The SABRE engine "relies on a heat exchanger capable of cooling incoming air to −150 °C (−238 °F), to provide oxygen for mixing with hydrogen and provide jet thrust during atmospheric flight before switching to tanked liquid oxygen when in space."
And from the Guardian:
Spaceplanes are what engineers call single-stage-to-orbit (if you really want to geek out, just use the abbreviation: SSTO). They have long been a dream because they would be fully reusable, taking off and landing from a traditional runway.
By building reusable spaceplanes, the cost of reaching orbit could be reduced to a twentieth of current levels. That makes spaceplanes a game changer both for taking astronauts into space and for deploying satellites and space probes.
If all goes to plan, the first test flights could happen in 2019, and Skylon – Reaction Engines' spaceplane – could be visiting the International Space Station by 2022. It will carry 15 tonnes of cargo on each trip. That's almost twice the amount of cargo that the European Space Agency's ATV vehicle can carry.
...
Rockets are cumbersome because not only must they carry fuel, they also need an oxidising agent to make it burn. This is usually oxygen, which is stored as a liquid in separate tanks. Spaceplanes do away with the need for carrying most of the oxidiser by using air from the atmosphere during the initial stages of their flight.
This is how a traditional jet engine works, and making a super-efficient version has been engineer Alan Bond's goal for decades. In 1989, he founded Reaction Engines and has painstakingly developed the Sabre engine, which stands for Synergetic Air-Breathing Rocket Engines.
In late 2012, tests managed by the European Space Agency showed that the key pieces of technology needed for Sabre worked. No one else has managed to successfully develop such a technology.
...
Spaceplanes should not be confused with space tourism vehicles such as Virgin Galactic's SpaceShipTwo. The highest altitude this vehicle will reach is about 110km, giving passengers about six minutes of weightlessness as the craft plummets back to Earth before the controlled landing.
Although there is no fully agreed definition, space starts at around 100km in altitude. To have any hope of staying in orbit, you would have to reach twice that altitude. The International Space Station orbits at 340 kilometres, whereas the Hubble Space Telescope sits at 595 kilometres.
Monday, October 15, 2018
The technology does not exist. The viability of the market is questionable. Obviously it's the regulators' fault.
I haven't decided if I'm going to discuss this Wired magazine article (Inside the Secret Conference Plotting to Launch Flying Cars by Eric Adams) in greater depth (frankly, probably not worth our time), but I did want to take a moment to hit this one point.
The next morning, the group, which featured more than 100 leaders from well-known high-tech companies, research entities, and investment firms, went to work on a single, seemingly impossible challenge: bringing the nascent air taxi industry to life.

First off, the air taxi industry is not nascent; it is moribund. We've had helicopter-based shuttle services for decades. They just never, if you'll pardon the expression, took off the way a lot of people had expected. To be fair, no one was predicting the kind of door-to-door service you can get from an automobile, but in a high-density area they did provide a fast and flexible way of getting reasonably close to your destination.
That vision—most prominently laid out by Uber—requires an entirely new class of vehicle powered by batteries far better than the ones we have now. It will depend on approvals from sluggish regulators and safe integration into crowded airspace. Autonomous flight systems that are nowhere near ready for human passengers will be essential. The field will need seemingly unlimited funding and a strategy for convincing the public to put their lives in the hands of this new tech. Oh, and the whole effort already risks being upstaged by well-financed Chinese innovators who are plowing forward absent many of these constraints.
Of course, it is entirely possible that we will see some aeronautics innovation that will make this model not only viable but popular. Unfortunately, there are huge engineering challenges involved in developing what we're talking about here, an incredibly compact, autonomous, electric VTOL aircraft that can operate reliably, safely and quietly in dense urban areas with crowded air spaces (that last one alone is enough to make one skeptical about the prospects of the industry. Reducing aircraft noise is a problem that has stumped the world's best engineering minds for longer than any of us have been alive). And we can't begin to talk about the implications for urban planning, infrastructure, and regulation until we see what these things are going to look like.
Given the stunning difficulties on the development side, the questionable business models, and over a century of highly touted proposals for personal aircraft that never made it past the prototype stage, why the hell is the author talking about sluggish regulators?
One of the most cherished tenets of the standard tech narrative is that we would all be living in a wondrous futuristic land, half sci-fi movie, half amusement park, if not for those darned regulations. It's a perfect, multipurpose excuse. It teases us with the promise of great things just around the corner. It creates a handy set of villains to boo and hiss. It neatly explains away the failures of tech messiahs to come up with appealing and functional technology or viable business plans.
It is also bullshit. There are certainly cases where onerous regulations hold up big infrastructure projects, and you can make the case that the IRB process is delaying certain medical advances, but in the vast majority of cases where a new technology fails to catch on, it is because of incompetent execution, bad engineering, or non-feasible business models. Those explanations, however, are difficult to write up, run counter to the standard narrative, and tend to make the journalists look like idiots for having bought the hype in the first place.
Friday, October 12, 2018
Brainwashing and subliminal manipulation: another entry in our failed postwar technology series, and an excuse to play a great movie clip.
[now with links]
We've had a thread going for a while now on technologies and lines of research that looked promising (if "promising" is the right word in this case) in the postwar era but which have failed to produce substantial advances in the 50 years that followed. We came up with a pretty good list (see here and here, and make sure to check out the comments), but I don't believe that we included one field that seemed to be making major advances in the 1950s and was a major component of postwar pop culture.
The brainwashed secret agent was a standard fixture of the spy fiction of the time. Fleming even had a mind-controlled James Bond try to kill M in The Man with the Golden Gun. The first and best remembered example is the novel The Manchurian Candidate, which came out in 1959. The film adaptation (we will ignore the remake) was remarkably faithful to the book, but it considerably toned down both the sex and the politics. The James Gregory character in the movie suggests Joseph McCarthy. In the novel, there is no question that Raymond's stepfather is meant to be Tail-Gunner Joe.
As Pauline Kael pointed out at the time, Richard Condon took a popular contemporary joke about the senator actually helping the Soviets so much through his crude attacks that he might as well be an agent and made it funnier and more subversive by playing it straight. He did so by positing a connection between two Cold War topics that were on everyone's mind, McCarthyism and the breaking of American POWs by the Chinese in the Korean War.
Robert Cialdini has an excellent discussion of the topic in his seminal book on the psychology of persuasion, Influence. In his account, the large majority of American prisoners who were found to have collaborated with the enemy after the war were persuaded by entirely mundane but highly effective techniques, applied incrementally and based on intuitive, well-established concepts like reciprocation, social norming, and commitment/consistency.
In the popular imagination, however, the process was seen as a mysterious and incredibly advanced application of cutting-edge psychology and neuroscience, usually involving drugs, hypnosis, high-tech torture, and Freudian psychology. Once you stripped away the science and the pseudoscience (mainly the latter), you were left with a magical trope that predated even Mesmer. The idea that there were secret techniques that would allow you to impose your will upon others was further boosted by books like The Hidden Persuaders, which substituted subliminal images for spells and incantations.
As we've said before, it is essential to view these postwar concepts in the context of their times. From the vantage point of the mid-1950s it seemed reasonable, perhaps even self-evident, that virtually every field of science and technology was about to break open with a flood of new advances and discoveries. Any attempt at a list will almost invariably be incomplete, but it was an age when people were mastering the atom, reaching out into space, making machines that could think, and seemingly on the verge of curing all diseases. In light of all that, the notion that we had finally unlocked the secrets of the brain was believable, perhaps even to be expected.
Thursday, October 11, 2018
Sometimes I'm tempted just to rerun all of our 2016 posts
From TPM:
Brian Kemp Is Blocking 53K Applicants From Registering To Vote, Most Of Them Black
By Cameron Joseph
Georgia Secretary of State Brian Kemp’s (R) office is blocking 53,000 people from registering to vote, according to records obtained by the Associated Press, a huge number that could sway his gubernatorial race against Democrat Stacey Abrams.
As TPM laid out this morning, Kemp has used a controversial “exact match” program to approve or block voter registrations that disproportionately impacts minority voters.
Now we know exactly how many people that might affect this election. According to the AP, fully 70 percent of the voter applications that are being held up by Kemp’s office are from black people.
Tuesday, May 3, 2016
Context only counts if it shows up in the first two dozen paragraphs
The New York Times has a good piece on the impact of voter ID laws, but I do have a problem with a few parts (or at least with the way they're arranged). In the third paragraph, we have two conflicting claims that go to the foundation of the whole debate. If election fraud is a significant problem, you can make a case for voter ID laws. If not, it's difficult to see this as anything other than voter suppression. This paragraph pretty much demands some additional information to help the reader weigh the claims and the article provides it...
Stricter Rules for Voter IDs Reshape Races
By MICHAEL WINES and MANNY FERNANDEZ MAY 1, 2016
SAN ANTONIO — In a state where everything is big, the 23rd Congressional District that hugs the border with Mexico is a monster: eight and a half hours by car across a stretch of land bigger than any state east of the Mississippi. In 2014, Representative Pete Gallego logged more than 70,000 miles there in his white Chevy Tahoe, campaigning for re-election to the House — and lost by a bare 2,422 votes.
So in his bid this year to retake the seat, Mr. Gallego, a Democrat, has made a crucial adjustment to his strategy. “We’re asking people if they have a driver’s license,” he said. “We’re having those basic conversations about IDs at the front end, right at our first meeting with voters.”
Since their inception a decade ago, voter identification laws have been the focus of fierce political and social debate. Proponents, largely Republican, argue that the regulations are essential tools to combat election fraud, while critics contend that they are mainly intended to suppress turnout of Democratic-leaning constituencies like minorities and students.
More than twenty paragraphs later.
Mr. Abbott, perhaps the law’s most ardent backer, has said that voter fraud “abounds” in Texas. A review of some 120 fraud charges in Texas between 2000 and 2015, about eight cases a year, turned up instances of buying votes and setting up fake residences to vote. Critics of the law note that no more than three or four infractions would have been prevented by the voter ID law.
Nationally, fraud that could be stopped by IDs is almost nonexistent, said Lorraine C. Minnite, author of the 2010 book “The Myth of Voter Fraud.” To sway an election, she said, it would require persuading perhaps thousands of people to commit felonies by misrepresenting themselves — and do it undetected.
“It’s ludicrous,” she said. “It’s not an effective way to try to corrupt an election.”
I shouldn't have to say this but, if a story contains claims that the reporter has reason to believe are false or misleading, he or she has an obligation to address the issue promptly. Putting the relevant information above the fold is likely to anger the people who made the false statements, but doing anything else is a disservice to the readers.
Wednesday, October 10, 2018
Immigration and talent
This is Joseph
This says so much in so little space:
In 2017, 608,000 students went abroad and 480,900 returned. China is proud of a return rate of 79 per cent; in 1987, the return rate was about 5 per cent, and in 2007 only 30.6 per cent. Some of this is undoubtedly driven by better conditions in China. But there is clearly a change in appeal for staying in the United States, as you do not see these kinds of massive shifts without something quite important changing.
At the very least, it undermines the US as a magnet for international talent. In some ways this might be good, by creating other clusters of talent across the globe and diversifying the intellectual frontier. But it does suggest that it is timely to consider why this change has been quite so dramatic and whether it might be below the optimal rate for driving American innovation.
Tuesday, October 9, 2018
Monday, October 8, 2018
The tyranny of the mean or why most of the analysis you're about to hear on Kavanaugh's impact on the election may be wrong.
[Corrected the title. Everything else is the same.]
This is not a prediction.
This is not a counter analysis.
I don't want to get sucked up into predictions and subjective probabilities or whether my priors can beat up your priors. Instead, I want to make a point about the framework and the assumptions generally used in these conversations and the way they often overlook the obvious.
I perhaps should have called this "the tyranny of central tendency" or possibly even thrown in something about expected value. The argument here is not limited to statements about means but they are the most familiar and by far the most often abused example.
One of our long-standing concerns here at the blog is that while journalists and commentators are far more likely to talk about data these days, they have not gotten any more sophisticated in how they think about statistics. We've already discussed naïve reductionism, the implicit and often completely inappropriate assumptions of linearity, non-interaction, and stability.
This is a good time to add a couple more to the list. When people talk about the impact of some treatment or event, there is a strong tendency to assume that the mean (or in some cases the median) has changed but everything else has remained the same. You still have a nice, symmetric Gaussian with the same variance. You've just shifted it from here to there.
These assumptions are particularly likely to bite you in the ass when the point of interest involves a quantile other than the median. For example, when rent-controlled housing in an expensive neighborhood is torn down and replaced with high density market priced apartment buildings, it is entirely possible for both the average price and the amount of low-cost housing available to go down at the same time.
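To make the rent example concrete, here is a toy calculation (every count and price below is invented for illustration): replacing a small stock of cheap rent-controlled units with a much larger stock of mid-priced market units can pull the neighborhood's average rent down while wiping out its low-cost housing entirely.

```python
# Toy illustration of the rent example; every number here is invented.
before = [900] * 50 + [4000] * 200      # 50 rent-controlled units + 200 luxury units
after = [2500] * 400 + [4000] * 200     # controlled units replaced by 400 market-rate units

avg_before = sum(before) / len(before)  # 3380.0
avg_after = sum(after) / len(after)     # 3000.0

low_cost_before = sum(r < 1000 for r in before)  # 50 units under $1,000
low_cost_after = sum(r < 1000 for r in after)    # 0 units under $1,000

print(avg_before, avg_after, low_cost_before, low_cost_after)
```

The mean falls, yet so does the count below the low-cost threshold. That is exactly the case where reasoning only about the average misleads.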
This brings us to the impact of the Kavanaugh confirmation. Over the next couple of weeks, when you hear people discussing the chances of the Democrats retaking the House, you will notice that many, probably most, of the answers will be to an entirely different question: what are the projected totals for the election?
Once again at the risk of stating the obvious, if you keep the mean the same and increase the variance, the probability of passing some cutoff that lies between the max and min possible values will approach 50%. Even if you shift the mean away from the cutoff, it is still possible to increase the probability of passing that point by increasing the variance.
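A quick sketch of that point, assuming (purely for illustration) a normal model for Republican seat totals and a 218-seat cutoff for a House majority. The means and standard deviations below are invented, not estimates:

```python
from statistics import NormalDist

cutoff = 218  # seats needed for a House majority; the threshold of interest

# Baseline: expected 210 seats with low variability -> crossing 218 is unlikely.
p_narrow = 1 - NormalDist(mu=210, sigma=5).cdf(cutoff)

# Shift the mean *away* from the cutoff but triple the spread:
# the probability of crossing the cutoff still goes up.
p_wide = 1 - NormalDist(mu=208, sigma=15).cdf(cutoff)

# And as the spread grows with the mean held fixed, the probability
# of crossing any fixed cutoff tends toward 50%.
p_huge = 1 - NormalDist(mu=210, sigma=500).cdf(cutoff)

print(round(p_narrow, 3), round(p_wide, 3), round(p_huge, 3))
```

Under these made-up numbers, the expected seat total gets worse while the chance of a majority rises, which is precisely the combination discussed below.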
At the moment (and that's an important qualifier), it is entirely possible that we are seeing this situation with Kavanaugh and the midterms. While we can argue about what the polls are telling us, I think it is reasonable to claim that Kavanaugh has been, through most of this process, an increasingly unpopular choice, and that most news outlets outside of conservative media have raised serious questions about his fitness. At the same time, his confirmation has unquestionably enraged and energized voters on both sides. This possibly unsustainable level of enthusiasm almost inevitably increases variability. As a result, it is entirely possible for the expected number of Republican House seats to drop while the chances of the Republicans holding the House increase.
This is likely to be a short-lived phenomenon. A week from now, I expect we will have both a clearer picture and a more stable situation. For now, though, this is a good time to remind ourselves to be more careful about how we frame analytic questions.
Subscribe to:
Posts (Atom)