This just in...
“Consumers should never attempt to create their own test scenarios or use real people, and especially children, to test the performance of vehicle technology,” @NHTSAgov says after Tesla owners post videos putting FSD to the test with child pedestrians. https://t.co/z054vjWdqi
— Craig Trudell (@crtrud) August 17, 2022
___________________________________________________
I was just going to throw this in with the rest in a Tuesday Tweets post, but I decided it needed a post of its own.
Rob Stumpf, writing for The Drive:
Video of the test was released earlier this week and shows a Tesla Model 3 repeatedly striking a small, stationary dummy directly in front of the car while supposedly operating on Tesla's controversially named Full Self-Driving Beta software. However, clips from the video led some online publications to instead call the test a "smear campaign" under the notion that FSD was not actually engaged, and after further evidence emerged that FSD was engaged during the test, Tesla fans and FSD users began filming their own experiments to see what happened—with mixed results. One noted Tesla devotee even staged a public call for people to volunteer their own children to stand in front of his Tesla and prove it'll stop in time.
So will a self-driving Tesla run over a child? Amid the noise, the answer seems to be a resounding "maybe," which is just as bad as "yes" in this case. Here's where things stand.
To understand why this test is so controversial, it's important to start with the funding behind it. The test, video of which is embedded above, was paid for and performed by The Dawn Project, an organization campaigning to promote "unhackable" software and systems. The Dawn Project is backed by Dan O'Dowd, who is the CEO, president, and founder of Green Hills Software, a competing company in the AV space. In January, O'Dowd took out a full-page ad in the New York Times campaigning to have Tesla's FSD Beta banned from operating on U.S. roads. O'Dowd also ran in the Democratic U.S. Senate primary to represent California earlier this year, though he only garnered 1.1% of the vote. [O'Dowd's "campaign" was strictly about publicizing the flaws in FSD and never pretended to be about anything else. 1.1% was probably more than he expected. -- MP] Tesla CEO Elon Musk publicly slammed Green Hills following the full-page ad, calling it "a pile of trash."
O'Dowd's test clearly shows the Model 3 slamming into the dummy three times, but its stitched-together footage showing the infotainment screen doesn't match up with how the car would actually look when FSD Beta is engaged. There is no Autopilot icon, the prediction line remains gray, and the speed doesn't match up with the reproducibility steps published by The Dawn Project. Electrek took this as evidence that FSD Beta was not engaged and published an article condemning the test.
Tesla fans began to reference this article as proof that the test was flawed. Even Elon Musk joined in on sharing the article, tweeting it at The Guardian while calling the test a "scam video."
Later, raw footage was published that showed the view from inside the cabin, and it appeared to show a UI on the central screen that would indicate FSD was in fact engaged. Furthermore, Art Haynie—the driver who conducted the test on behalf of The Dawn Project—signed an affidavit claiming that FSD Beta was active at the time of the test.
Regardless, this discrepancy then caused some people to go into full-on defense mode and re-enact tests themselves in an attempt to disprove the findings published by The Dawn Project.
Some people began setting up their own stationary mannequins on residential streets. They attempted to recreate the test results and published videos showing the vehicle avoiding the mannequin without hitting it. Others were able to replicate the findings as their own vehicles slammed into their homemade dummies. And yes, there's that guy on Twitter who asked for volunteers to have their children run in front of his own Tesla to prove it'll stop in time.
We've known for a while that FSD will sometimes try to steer into pedestrians, cyclists, pylons, and especially stationary vehicles. There are endless videos on YouTube and Twitter (at least until Musk buys it) of test drivers having to disengage the system to prevent disaster, but the footage of a Tesla mowing down statues in the shape of small children very much struck a nerve, particularly after the more honest attempts at debunking started producing videos in the same vein.
The most popular defense has been that Tesla's AI is smart enough to tell the difference between a small child and a mannequin without any real chance of error. This argument has been made most visibly by the Whole Mars Catalog. More than just "that guy on Twitter," the site has an unofficial but very close relationship with Musk and Tesla, which makes these latest threads all the more bizarre.
Just so we all have the full picture here: it's not just that these people want to endanger children for social media stunts posing as "self-driving testing," they will initiate an insurrection against the government if their right to do so is infringed https://t.co/i2LxcsrKxG
— E.W. Niedermeyer (@Tweetermeyer) August 10, 2022
HAHA JUST KIDDING THERE IS NO WAY A CAMERA-ONLY SYSTEM EVER PASSES THIS TEST IN THE DARK
— E.W. Niedermeyer (@Tweetermeyer) August 11, 2022
somewhere a san francisco divorce lawyer is about to have a very exciting day pic.twitter.com/8A8eHdApN0
— Sam Biddle (@samfbiddle) August 10, 2022