All three companies contend that because of energy cost advantages over other forms of transportation, a system will be able to break even in a decade after full-scale operations begin.
No one knows how much it would cost to build and operate a hyperloop over a great enough distance to make the speed worthwhile. Using existing methods, a project requiring this level of precision on this scale would be obscenely expensive. In order to make a serious attempt, a company would have to come up with radically new approaches and technology.
If Virgin and the rest were each pursuing possible breakthroughs independently, we should be seeing big differences in their cost estimates, even if those estimates were wildly unrealistic. The fact that all three give the same break-even timeline is curious.
I'm afraid the most likely explanation is that they haven't really done anything substantive on the construction side. They're still using the orifice-derived numbers from the original whitepaper, which are even more nonsensical since all of them have abandoned Musk's original air-caster proposal.
All three companies contend that because of energy cost advantages over other forms of transportation, a system will be able to break even in a decade after full-scale operations begin. Not only will commuters be able to get from place to place faster, but doing so will allow people to comfortably live far from their work, giving access to educational, cultural and health services normally out of reach.
“Hyperloop Alpha” emphasizes that the hyperloop technology will be completely solar powered. However, maglev and HSR are also electric and could in theory also be solar powered. Focusing on the amount of energy required, HT found that for most routes hyperloop would be 2 to 3 times more energy efficient than air on a passenger mile basis; however, maglev and HSR also use 1/3 the energy of air on a passenger mile basis. The emphasis on solar power tends to obscure the fact that no technology is entirely clean because there is energy consumed in manufacture and construction of the technology.
Picking up where we left off on the painfully credulous New York Times Hyperloop story, here are a few passages I want to single out.
“From the point of view of physics, hyperloop is doable,” said Garrett Reisman, professor of astronautical engineering at the University of Southern California and a former astronaut on the International Space Station.
The experience will be no different from riding in an airplane with the shades drawn, and technical issues around maintaining the vacuum within the tube will be solved, he believes.
Instead, hyperloop projects will face more mundane challenges.
“Getting innovative things through the regulatory and certification environments is very difficult,” Mr. Reisman said. “This could face an uphill battle in the U.S.”
First off, the does-not-violate-the-laws-of-physics standard is an incredibly low bar for an engineering proposal, particularly one that has been floating around in more or less its current form for about a century, but nonetheless it is frequently invoked in these articles.
The question is cost (both in terms of construction and maintenance), followed by speed and reliability. The problem Reisman cites is nontrivial (we’re talking millions of cubic feet of near vacuum), but it’s minor compared to the issue of stability, which is itself minor compared to that of manufacturing and assembling a massive structure with this level of precision.
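For a sense of scale on that "millions of cubic feet of near vacuum," here's a quick back-of-envelope calculation. The route length, tube diameter, and tube count below are my assumptions (roughly the LA-to-SF figures floated in the original whitepaper), not anything the companies have committed to:

```python
# Rough volume of evacuated tube for an LA-to-SF style route.
# Route length, tube diameter, and tube count are assumptions for illustration.
import math

route_miles = 350            # assumed route length
tube_inner_diameter_m = 2.2  # assumed inner diameter of a passenger tube
tube_count = 2               # one tube per direction

route_m = route_miles * 1609.34
cross_section_m2 = math.pi * (tube_inner_diameter_m / 2) ** 2
volume_m3 = cross_section_m2 * route_m * tube_count
volume_ft3 = volume_m3 * 35.315

print(f"~{volume_m3:,.0f} cubic meters, or ~{volume_ft3:,.0f} cubic feet of near vacuum")
# on the order of 4 million cubic meters, i.e. well over 100 million cubic feet
```

If anything, "millions of cubic feet" undersells the problem.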
Worrying about regulation at this point in the process is like debating what color you’ll paint your mansion when you win the lottery.
But Reisman is a model of critical thinking next to the article's other “skeptic.”
Rick Geddes, professor in the department of policy analysis and management at Cornell University, sees a different challenge. “The biggest problems for hyperloop will be securing rights of way and permitting,” he said.
Still, Professor Geddes believes that hyperloop systems will become a reality, as the time is ripe.
“There’s a sense that things are stale; we’re just adding to existing modes of transport,” he said. “Time is more and more a valuable commodity. The transportation industry is ready for a new way of thinking.”
This is perhaps the most unintentionally informative passage in the entire piece. The hyperloop is an example of a major genre of 21st Century tech writing: stories about some long-promised technology that is suddenly just around the corner. Fusion reactors, Martian colonies, the end of aging, yes, even flying cars.
When you scrape away the hype from these announcements, you never find the kind of transformative advances that would be needed to make these things viable. Instead you get a desire to believe and a vague sense that “the time is ripe.” It's like the gambler's fallacy for futurists. We've waited so long. Surely we're due.
For some reason, it has become obligatory to cite pneumatic trains as precursors of the hyperloop despite the fact that the technology had little connection to Musk's hyperloops and almost none to the "hyperloops" being proposed today (which are actually just maglev vactrains).
This late 19th Century system is a much more direct ancestor.
If there's an engineer in the audience, I'd very much like to know what the relationship is between this very cool 1890 system and the history of linear induction trains.
This article by Eric Taub is the kind of multilayered awful that requires multiple passes to address.
Just to catch up those who are coming in late, there has never been any question as to whether or not it is possible to build a high-speed maglev vactrain. There are still some nontrivial points to be worked out about reliability and stability, but those pale next to the central challenge of cheaply and quickly constructing then maintaining hundreds of miles of tubes consistently sustaining a near vacuum.
Each segment has to be airtight and absolutely uniform (a small irregularity can make a big difference at 600 miles an hour), and each has to be joined to the next with perfect seals. Add to that the cost of the magnetic levitation track and linear induction system and you have a fantastically expensive and time-consuming project.
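To make "a small irregularity can make a big difference" a little more concrete, here's a crude sketch: treat a misaligned joint as a gentle sinusoidal dip and work out the vertical acceleration a capsule would feel crossing it at full speed. The amplitude and wavelength are my own illustrative assumptions, and a real ride-quality analysis would be far more involved:

```python
# Peak vertical acceleration from riding over a sinusoidal misalignment
# of amplitude A and wavelength L at speed v (no suspension assumed):
# a_peak = A * (2*pi*v/L)**2. The A and L values below are illustrative guesses.
import math

v = 600 * 0.44704   # 600 mph in meters per second
L = 30.0            # assumed wavelength of the misalignment, meters
A = 0.001           # assumed amplitude: a 1 mm sag or bump

a_peak = A * (2 * math.pi * v / L) ** 2
print(f"peak vertical acceleration ~ {a_peak:.1f} m/s^2, or {a_peak / 9.81:.2f} g")
# roughly 3 m/s^2, about a third of a g, from a single millimeter of misalignment
```

Even a millimeter of sag over a thirty-meter span works out to something like a third of a g at full speed, which is why the construction and maintenance standards are so punishing.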
It has become the norm for hyperloop puff pieces to ignore these main challenges in order to breathlessly announce major advances in what invariably amount to trivial side issues, but this piece manages to break new ground.
Anyone who has seriously followed the climate change debate over the past 15 or so years will be familiar with the first level of false balance, where a minority, even fringe, position is given equal standing with the scientific consensus. If you followed the coverage of Mars One, you've seen this taken to the next level, where the majority of the article is spent credulously recounting the fringe position, with the mainstream skeptical view addressed briefly somewhere past the halfway point.
Now, the New York Times takes things even further. No one represents the mainstream consensus. The experts who are presented as "skeptics" are actually true believers brought in to introduce that incredibly tired Silicon Valley line about regulations being the only things holding us back from a technological utopia.
We've been through this before and I'm certain we will cover it again, but if you hear someone going on about evil regulators holding back the development of a new technology (with the partial exception of medical fields), you can be fairly certain it's an attempt to distract from nonviable tech.
The highly estimable FT Alphaville has long had a series: "This is nuts. When's the crash?" That is my reaction to learning that Hoover Institution senior fellows are now crypto...
It is not at all clear to me whether they are grifters or griftees here...
I had known about John Taylor, but had thought that was a strange one-off. And now Niall Ferguson. Is anybody even pretending to have a business model other than pump-and-dump?
The New York Times' Eric Taub really goes for the gold here. For depth of buried lede alone, he may have set a record.
Way down in the 31st graf: " before such musings turn into reality, hyperloop proponents must prove that their systems work, that they’re safe for people and cargo and that they’re affordable" https://t.co/hGSLsIcqcE
"[A] cross between a Concorde and a railgun and an air hockey table"
or more prosaically
“[R]educed-pressure tubes in which pressurized capsules ride on air bearings driven by linear induction motors and air compressors."
The idea of air bearings has been around for a long time and has proven useful for a number of applications, but, after a great deal of effort, researchers concluded sometime around the 1970s that it was not workable for high-speed rail. When companies started trying to build even small, very limited working models of the Hyperloop, the first thing that most, possibly all, did was to scrap the one aspect that set Musk's concept apart from more conventional maglev vactrains. This is a small detail but it is enormously telling. They dropped much of the actual idea, but they kept the name and the associated buzz.
2. Neither the Hyperloop nor the “Hyperloop” offers much new.
At least in the broad strokes, there is little new in any of the recent proposals. Musk's original presentation relied mainly on Disco-era technology. I believe most of the current efforts have updated that with passive levitation systems developed in the late 90s. Either way, the systems that are now promised as just around the corner are not that different from proposals from twenty years ago, which raises an obvious question: why weren't these trains built a long time ago? The answer is…
3. You didn't see supersonic trains twenty years ago for the same reason you aren't likely to see them in the near future.
Money.
Whenever people looked seriously at these projects, they concluded that the cost was prohibitive. And no, this didn't have anything to do with land rights or onerous regulations.
4. A question of tolerance and other things
Even under the best of circumstances, big projects cost a great deal of money, and with maglev vactrains, the conditions are about the worst imaginable. This is supposed to be a brief overview, so I'm not going to do a deep dive here, but I will mention three factors: reliability, safety, and most of all tolerance.
You've got people traveling hundreds of miles an hour in a near vacuum. Just to get the damn thing to work, every part has to be manufactured to the tightest possible tolerances, every piece of work has to be done perfectly. But just working is much too low a bar here. With a Hyperloop, even a fairly minor failure can turn catastrophic, causing tens of billions of dollars of infrastructure damage, not to mention loss of life. Those standards of construction and maintenance are tremendously expensive, particularly for a piece of infrastructure that will stretch hundreds of miles.
5. Beware science-fair level demonstrations
When trying to follow the Hyperloop discussion, it is absolutely essential to distinguish between the easy parts and the hard parts. Many elements of the proposed system are well understood and in some cases widely used already. If you went through the Birmingham Airport in the late 80s or early 90s, you've probably already traveled on a maglev train propelled by linear induction.
Other elements are extraordinarily difficult to pull off. For instance, radical new construction techniques will need to be developed to make the system commercially viable. As mentioned before, the combination of extremely high speeds with the need to maintain a near vacuum over hundreds of miles requires a stunning degree of reliability and adherence to incredibly tight tolerances. Every seam has to be literally airtight.
You will notice that the "test runs" we have seen from various Hyperloop companies have focused almost entirely on the aspects that don't need testing.
[Ran across this shortly after posting.]
6. So what would a real Hyperloop test look like?
We will know that the Hyperloop is actually getting closer when we start seeing demonstrations that address the concerns of civil engineers and transportation researchers (specifically those not in the employ of Musk or companies like Hyperloop One). For example, a process for cheaply manufacturing tube segments of sufficient quality, or a system for joining these segments quickly while requiring few if any skilled workers.
7. And no, this is not just like SpaceX and Tesla.
The long-popular "we should take Musk seriously because he has done impossible things" genre has recently spawned the subgenre "we should take Musk seriously because he's doing the same thing with [Hyperloops/brain chips/giant subterranean slot car tracks] that he did with SpaceX and Tesla." This is simply not true. The approach is almost exactly the opposite. With the latter, Musk proposed plans carefully grounded in sophisticated but entirely conventional technology. With the former, he made vague, underdeveloped suggestions that left experts in the respective fields pulling out their hair.
To be clear, Tesla and particularly SpaceX certainly had their doubters, but the skepticism was focused on the business and finance side. Elon Musk unquestionably accomplished some extraordinary things, but he did so by deviating from conventional wisdom in terms of how you set up companies while staying safely in the mainstream when it came to technology.
So I'm just going to come out and say that Reveal is doing excellent work and you should definitely check it out, particularly these stories on the return of redlining. My first job as a statistician was working in the finance industry and, though I have tried to keep up with the field ever since, I learned a great deal from this report including some disturbing aspects of the way credit scores are calculated.
It is also an example of damn good radio storytelling, more effective heard than read, but since I can't include an audio excerpt here...
For Faroul, things suddenly took a turn for the better after her partner, Hanako Franz, agreed to sign on to her loan application. At the time, Franz – who is half white, half Japanese – was working part time for a grocery store. Her most recent pay stub showed she was making $144.65 every two weeks. Faroul was paying for her health insurance.
The loan officer had “completely stopped answering Rachelle’s phone calls, just ignored all of them,” said Franz, 32. “And then I called, and he answered almost immediately. And is so friendly.”
A few weeks later, the couple got the loan from Santander and bought a three-bedroom fixer-upper. But Faroul remains bitter.
“It was humiliating,” she said. “I was made to feel like nothing that I was contributing was of value, like I didn’t matter.”
Contacted by Reveal, the lenders defended their records. Tobin, who turned down Faroul on her first application, said race played no role in the rejection.
“That’s not what happened,” she said and abruptly hung up. A statement followed from Philadelphia Mortgage Advisors’ chief operating officer, Jill Quinn.
“We treat every applicant equally,” the statement said, “and promote homeownership throughout our entire lending area.”
Faroul’s loan officer at Santander, Dennis McNichol, referred Reveal to the company’s public affairs wing, which issued a statement: “While we are sympathetic with her situation, … we are confident that the loan application was managed fairly.”
Reveal’s analysis of lending data shows that nationally, Santander turned away African American homebuyers at nearly three times the rate of white ones. The company did not address that disparity in its statement but said it was more likely to grant a loan application from an African American borrower than five of its competitors.
Since early in the first shutdown (Joseph can back me up on this one), I've been arguing that a declaration of a state of emergency was highly likely. Here was my reasoning.
[Let's not quibble about the definition of a voting paradox for now. I'm on a roll.] It has often been noted that if the stock in a company is held by three people, with the first holding 1000 shares, the second holding 1000 shares, and the third holding one share, all three have equal voting power. Any two can get together to form a majority coalition.
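Here's that arithmetic spelled out in a minimal sketch (the share counts are just the ones from the example above):

```python
# With holdings of 1000, 1000, and 1 shares, every two-person coalition
# clears a majority and no single holder does, so the three shareholders
# have identical voting power despite wildly different share counts.
from itertools import combinations

shares = {"A": 1000, "B": 1000, "C": 1}
total = sum(shares.values())

for coalition in combinations(shares, 2):
    votes = sum(shares[member] for member in coalition)
    print(coalition, votes, "majority" if votes > total / 2 else "minority")
# ('A', 'B') 2000 majority
# ('A', 'C') 1001 majority
# ('B', 'C') 1001 majority
```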
We can say something similar about a multiplayer stag hunt. Assuming it takes a dozen hunters to successfully bring down the stag, anyone who controls a big enough group to bring the party under that threshold has the power to end the hunt.
This brings up what should always have been obvious problems with the conservative movement's strategy. Just to recap, conservative leaders, especially in the 60s and 70s, came to the conclusion that their policies would never be as popular in the long run as those being advanced by liberals like LBJ. In order to maintain power under these conditions, they came up with a plan that allowed a strong and disciplined minority to hold on to most of the power in the government. The plan always operated on thin margins. In order for this to work, the GOP had to deliver a higher turnout, especially for elections of high strategic importance.
Arguably the central underlying flaw in the plan was static thinking, the failure to account for the consequences of their own success. For starters, there was the inevitable tendency to shave away margins of safety. We all do this to some degree. Whether it be with time or money, after a close call, we make sure to give ourselves a generous cushion only to find it has somehow been chipped away. In this case, the Republican Party has become entirely dependent on these strategic advantages in order to remain viable. (It is worth noting that, except for the patriotic fervor of 2004, no presidential candidate of the party has won the popular vote since 1988.)
As the margins were growing thinner, the conservative movement was also starting to lose control of the social engineering experiment designed to create a motivated and reliable base. The flaw has always been there in plain sight. A Straussian scheme to use the tools of totalitarian state media (propaganda and disinformation) in a subculture of a free and open society will always prove unstable. The wrong messages will start to go to the wrong people, and at some point the misinformed cannon fodder will end up holding positions of power in the party.
It was the worst possible time for things to fall apart. The threat of diminishing margins is twofold. First, there is an absolute lower bound of viable popularity. When enough of the country opposes you, even the most strategic allocation of resources will not save you. To make matters worse, as you approach this bound, smaller and smaller blocs within the movement gain veto power.
Which brings us to today.
Politically speaking, declaring a state of emergency to fund an unpopular project that your administration didn't bother with for the two years you held every branch of government is a bad move, but it might just be the best one open to the Republican Party.
What appears to be an increasing and increasingly motivated majority of the country opposes Trump and the GOP agenda. The Republicans' chances of holding anything more than an entrenched court and a few statehouses are very small and dependent on doing two things: slowing these trends and keeping their coalition completely intact.
Unfortunately, it is now next to impossible to do both of these things at the same time. Just to have some numbers to play around with, let's say that the Ann Coulter/Rush Limbaugh wing represents 10% of the country and is willing to punish a GOP politician in the primary. Furthermore, let's say that half of them might be persuaded to support a third-party candidate or simply sit out one or more future elections. Given the margins we are talking about, even that 5% would be a devastating loss.
We have a similar situation with the cult-of-personality followers of the president. If the Republicans try to use Trump as wolfmeat, the resulting rebellion of some of those followers will almost certainly be enough to push the party deeply into the danger zone.
Under these circumstances, the worst thing that could happen would be to force every Republican member of Congress to cast a vote on the record either for opening the government or for standing firm until the wall was funded. As unpopular as the state of emergency is, it passes the bomb to the courts, where the electoral consequences are less immediate and there is a chance it can be defused.
Of course, there's always the possibility that the declaration will further anger the majority of the country while failing to placate the Coulter/Limbaugh faction. That would be ugly.
Last week, a staffer at Need to Impeach, an organization that advocates for the impeachment of Donald Trump, received an outrageous proposal via email. Jerry Media, the viral marketing agency famous for promoting the ill-fated Fyre Festival, was now working in an unofficial capacity with the anonymous creator of the World Record Egg, and the company was hoping to broker a deal between the nonprofit and the egg.
Over the past few weeks, the egg has become an internet phenomenon. On January 13, the account’s first post became the most-liked Instagram photo of all time; by the time Jerry Media approached Need to Impeach, the account had more than 9.4 million followers. Since then, the account has posted a series of photos of the same egg with a progressively larger crack, suggesting something inside. In a slide deck, Jerry Media proposed that the egg crack to reveal the words Impeach Trump as Trump popped out and did the chicken dance. The agency even created a short animated video demonstrating the stunt.
Need to Impeach ultimately passed on the opportunity. “It was interesting,” said Aleigha Cavalier, a spokesperson for Tom Steyer, the founder of Need to Impeach. “But I probably get 20 to 25 crazy ideas a week, [and] this didn’t move further than that.” Mick Purzycki, the CEO of Jerry Media, confirmed the details of the proposal, but stressed that the goal of the stunt wasn’t monetary. “We liked it for noncommercial reasons,” he said.
Following up on our 2016 thread, Timothy B. Lee has a great piece up at Ars Technica on how the National Highway Traffic Safety Administration screwed up its analysis of the safety record of Tesla's badly named Autopilot. The mistakes are really embarrassing but perhaps the most disturbing part is the way that the NHTSA kowtowed to the very company it was supposed to be investigating.
To compute a crash rate, you take the number of crashes and divide it by the number of miles traveled. NHTSA did this calculation twice—once for miles traveled before the Autosteer upgrade, and again for miles traveled afterward. NHTSA found that crashes were more common before Autosteer, and the rate dropped by 40 percent once the technology was activated.
In a calculation like this, it's important for the numerator and denominator to be drawn from the same set of data points. If the miles from a particular car aren't in the denominator, then crashes for that same car can't be in the numerator—otherwise the results are meaningless.
Yet according to QCS, that's exactly what NHTSA did. Tesla provided NHTSA with data on 43,781 vehicles, but 29,051 of these vehicles were missing data fields necessary to calculate how many miles these vehicles drove prior to the activation of Autosteer. NHTSA handled this by counting these cars as driving zero pre-Autosteer miles. Yet NHTSA counted these same vehicles as having 18 pre-Autosteer crashes—more than 20 percent of the 86 total pre-Autosteer crashes in the data set. The result was to significantly overstate Tesla's pre-Autosteer crash rate.
...
It's only possible to compute accurate crash rates for vehicles that have complete data and no gap between the pre-Autosteer and post-Autosteer odometer readings. Tesla's data set only included 5,714 vehicles like that. When QCS director Randy Whitfield ran the numbers for these vehicles, he found that the rate of crashes per mile increased by 59 percent after Tesla enabled the Autosteer technology.
So does that mean that Autosteer actually makes crashes 59 percent more likely? Probably not. Those 5,714 vehicles represent only a small portion of Tesla's fleet, and there's no way to know if they're representative. And that's the point: it's reckless to try to draw conclusions from such flawed data. NHTSA should have either asked Tesla for more data or left that calculation out of its report entirely.
NHTSA kept its data from the public at Tesla's behest
The misinformation in NHTSA's report could have been corrected much more quickly if NHTSA had chosen to be transparent about its data and methodology. QCS filed a Freedom of Information Act request for the data and methodology underlying NHTSA's conclusions in February 2017, about a month after the report was published. If NHTSA had supplied the information promptly, the problems with NHTSA's calculations would likely have been identified quickly. Tesla would not have been able to continue citing them more than a year after they were published.
Instead, NHTSA fought QCS' FOIA request after Tesla indicated that the data was confidential and would cause Tesla competitive harm if it was released. QCS sued the agency in July 2017. In September 2018, a federal judge rejected most of NHTSA's arguments, clearing the way for NHTSA to release the information to QCS late last year.
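To see why the numerator/denominator mismatch matters, here's a minimal sketch with made-up numbers (nothing below comes from Tesla's actual data):

```python
# Toy illustration of the mismatch QCS flagged: cars with unknown
# pre-Autosteer mileage were treated as zero miles in the denominator
# while their crashes stayed in the numerator. All numbers are invented.

known_miles = 50_000_000   # miles from cars with complete records
known_crashes = 40         # crashes among those same cars
orphan_crashes = 15        # crashes from cars whose mileage is unknown

rate_clean = known_crashes / known_miles                      # drop both crashes and miles
rate_flawed = (known_crashes + orphan_crashes) / known_miles  # keep crashes, zero out miles

print(f"complete-records rate: {rate_clean * 1e6:.2f} crashes per million miles")
print(f"mismatched rate:       {rate_flawed * 1e6:.2f} crashes per million miles")
# The mismatch inflates the pre-Autosteer crash rate, which in turn
# exaggerates the apparent improvement after Autosteer was enabled.
```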
The best line to date on the Bezos/National Enquirer story.
Hard to think of a better demonstration of just how definitively our universe is The Bad One when the news story is "Bald Tech Gazillionaire With Drone Army Utterly Crushes Metropolitan Print News Journalist's Expose" and you know 'Lex' is the good guy in that scenario. https://t.co/195WaEWNSq