The big difference this time is people seem to be paying attention.

I’d just like to point out that right now, right this very moment, @elonmusk is going through my Facebook history and screen grabbing things to share with his friends on the internet... if you’re investing in Tesla or in a Tesla you need to sit with that. $TSLA
— Linette Lopez (@lopezlinette) July 5, 2018
.@kirstenkorosec reports in today's @TechCrunch newsletter that Dr. Missy Cummings has received death threats, something I've heard from numerous sources over the last few days. pic.twitter.com/5tu0eBNkS4
— David Zipper (@DavidZipper) October 24, 2021
1/ This is the kind of toxic scum with whom @elonmusk associates: Omar Qazi.
— Motörhead (@BradMunchen) October 23, 2021
Here, in a since deleted tweet, Qazi sprouts an idea of how to get @NHTSAgov to stop going after Tesla's dangerous Autopilot/FSD product: sending a gay Tesla rep to convince @SecretaryPete. $TSLAQ pic.twitter.com/SFvyrF3JlM
To be clear - this is really well written and I’m very glad @DavidZipper is doing this analysis with fresh eyes on something most Tesla reporters & researchers have been dealing with for a long time.
— Lora Kolodny (@lorakolodny) October 23, 2021
From David Zipper's Slate article.
Many technologists and automotive experts are cheering a Cummings appointment to NHTSA. But an extremely online community of Tesla fans is furious. A couple of hours after the news broke, Omar Qazi, a Tesla booster with a large online following, tweeted, “If they try and take Autopilot away from us we will riot so hard January 6 will look like a day at Disneyland,” concluding with a laughing emoji. Qazi later deleted the tweet, issuing an apology and claiming it was a joke.
That may be true, but much of the online Tesla community seemed to be having a meltdown (including more than a few people who employed disturbing and misogynistic language). Within hours, a petition on Change.org called on the Biden administration to reconsider Cummings’ appointment, collecting more than 18,000 signatures in two days. Elon Musk himself tweeted, “Objectively, her track record is extremely biased against Tesla,” and then jokingly responded to a fake account created in Cummings’ name. On Thursday evening, after enduring two days of online harassment, Cummings seemingly deleted her Twitter account.
The hyperventilating reaction shouldn’t come as a surprise, given the cultlike loyalty that Tesla has inculcated with its fans, especially those active on social media (who, to be fair, do not reflect all Tesla supporters). In reality, any senior adviser’s ability to set policy is constrained by the rigidities of the Department of Transportation’s org chart as well as the byzantine federal regulatory process. No one should expect a recall of Autopilot anytime soon, even if such steps appear warranted on safety grounds, as I’ve argued previously. (In a nutshell: Autopilot should have stronger driver-monitoring systems, be given a less misleading name, and only be accessible in safe highway environments.)
But could the Biden administration ultimately force Tesla to pull Autopilot or place constraints on its use? That seems increasingly plausible. Five-year-old guidance from NHTSA articulates the agency’s authority to intervene if autonomous driving systems show evidence of “predictable abuse,” a reasonable charge to levy at Tesla given the array of YouTube videos of drivers asleep or playing games in the driver’s seat, despite warnings in Tesla’s manual. Over the summer NHTSA launched an investigation into a pattern of Teslas striking stationary emergency vehicles, and the agency has challenged the automaker to explain why it didn’t issue a recall for a recent software update. Meanwhile, a growing number of fatalities has been tied to Autopilot, including one in California in which a Tesla Model 3 traveling at 60 mph crashed into a pickup truck and killed one of its occupants (the victim’s family has sued the company). Tesla’s defenders often point to the nearly 40,000 annual traffic fatalities in the United States, suggesting that Autopilot is safer than human drivers, but evidence for that claim is lacking.