Monday, March 27, 2017

Self-driving cars

This is Joseph

Over at Vox, Timothy B. Lee makes an argument about self-driving cars:
We don’t have enough data yet to say how today’s self-driving cars compare with today’s human drivers. What does seem likely, however, is that self-driving software will steadily get better over time, while human drivers won’t. And the sooner we reach the point where computers are safer than human drivers, the sooner we can start saving lives.
He makes this argument partly in the context of Uber, which has a culture that is very tolerant of risk-taking.  However, I think the idealism in this viewpoint may be too strong.

If we were talking about computer-driven trains, where one of the acceptable outcomes is "stop the train and wait for a human," the argument would be easier to accept.  But I think there are several issues that should be considered before concluding that self-driving cars are necessarily safer.

1) I would expect there to be a long transition period in which human drivers and self-driving cars share the road.  The cars will need to handle unpredictable human drivers, a setting in which tricks like inter-vehicle communication are not going to work.

2) Infrastructure is quite varied and tends to break.  How do self-driving cars handle a broken streetlight system, roads without markings, or badly parked cars?  Working 99% of the time is not enough if the occasional disruption leaves the car unable to cope.  Right now most adults are very proficient at driving -- what happens if that skill stops being common among passengers?

3) Is Silicon Valley reliable enough?  Remember, systems that run on software are vulnerable to problems like bad updates.  It seems like special pleading to claim that a bad patch should be treated as an exception.  People aren't supposed to drive drunk, but they do, and the resulting mayhem is laid at the feet of human drivers (and rightly so).

4) How things work at scale is a whole different issue.  To make much of this work, we are going to need strong regulation.  Look at the Bitcoin problems -- fine for a secondary currency, but imagine the hard-fork issue hitting the US dollar.  And what do you do if there are multiple protocols for self-driving cars that don't necessarily play nicely with each other?  (See the sketch after this list.)

5) What happens when the car simply doesn't work?  Your iPhone can become a brick.  Do we really want to be on hold with support, waiting to find out why the software in the car isn't responding?
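On point 4, here is a minimal sketch of the kind of interoperability failure I have in mind.  Everything in it is hypothetical -- the two message formats, the field names, and the idea that vendors would broadcast JSON alerts at all are invented for illustration -- but it shows how a receiver written against one vehicle-to-vehicle protocol can silently drop a safety-critical message sent in another.

    import json
    from typing import Optional

    # Hypothetical "protocol A": a hard-brake alert as JSON with a "msg_type" field.
    def encode_protocol_a_brake_alert(vehicle_id: str) -> bytes:
        return json.dumps({"msg_type": "HARD_BRAKE", "vehicle": vehicle_id}).encode()

    # Hypothetical "protocol B": the same idea, but the field is called "event".
    def encode_protocol_b_brake_alert(vehicle_id: str) -> bytes:
        return json.dumps({"event": "emergency_brake", "car": vehicle_id}).encode()

    # A receiver written only against protocol A.
    def parse_alert_protocol_a(payload: bytes) -> Optional[str]:
        try:
            msg = json.loads(payload)
        except json.JSONDecodeError:
            return None
        if msg.get("msg_type") == "HARD_BRAKE":
            return "brake alert from " + msg["vehicle"]
        return None  # anything it doesn't recognize is silently dropped

    print(parse_alert_protocol_a(encode_protocol_a_brake_alert("car-1")))  # brake alert from car-1
    print(parse_alert_protocol_a(encode_protocol_b_brake_alert("car-2")))  # None -- the warning is lost

The point isn't that translators can't be written; it's that without regulation, or a genuinely dominant standard, forcing convergence, every pair of vendors is another place for messages to be lost.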

None of this is to say that I don't like self-driving cars.  Insofar as they can make public transit possible for older and sick members of society, they will be a positive good.  It's not impossible to imagine ways that this technology could really help (automating parallel parking plays to all of the computer's strengths, and many drivers already dislike the task).  But regulation seems to be the way to go, and it really would make me feel better if the people who spent decades making cars safe (the traditional car companies) were more involved.

I would be surprised if Mark didn't have thoughts too.

2 comments:

  1. Tes: And it's not just Timothy B. Lee. The publicity around self-driving cars is remarkably uncritical.

    Let's look at another aspect of safety. Even if the incidence of problems falls, it is possible the problems themselves will be more catastrophic. One scenario that is often cited as a real benefit of self-driving cars is that, once (nearly) all cars are self-driving, it will be possible for bumper-to-bumper traffic to move at high speed on highways: the capacity of existing roads will be greatly increased. True, perhaps. But what if one car in this packed-like-sardines caravan suddenly has a mechanical problem and swerves out of line? Even if all the other cars' software recognizes it and responds in milliseconds, the laws of physics are not repealed. It will still take time to bring the other cars to a stop. The pileup could be enormous. Dozens, even hundreds of vehicles could be involved in a single incident. (A back-of-the-envelope version of this arithmetic is sketched at the end of the thread.)

    Replies
    1. But even by the low standards of the genre, Lee is pretty bad. Between his lack of knowledge and his eagerness to make this a regulation and Silicon Valley story, he manages to screw everything up. In particular, ignoring the relationship between levels of autonomy, safety and regulation makes his analysis worthless.


      https://en.wikipedia.org/wiki/Autonomous_car#Classification
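
    To put some numbers on the pileup scenario in the first comment: the figures below -- 120 km/h, a 100-millisecond software reaction time, 8 m/s^2 of hard braking -- are illustrative assumptions, not measurements of any real system, but they show why millisecond reactions don't repeal the physics.  The sketch computes the speed at which a trailing car would hit a stationary obstruction that appears a given distance ahead.

        import math

        # Illustrative assumptions (not measurements of any real system).
        SPEED = 120 / 3.6          # m/s (about 33.3 m/s at 120 km/h)
        REACTION_TIME = 0.1        # seconds before braking begins
        DECELERATION = 8.0         # m/s^2, roughly the limit of good tires on dry pavement

        def impact_speed(available_distance_m: float) -> float:
            """Speed (m/s) at which the car hits a stationary obstruction
            that appears available_distance_m ahead, under the assumptions above."""
            reaction_distance = SPEED * REACTION_TIME
            if available_distance_m <= reaction_distance:
                return SPEED  # the gap is gone before braking even starts
            braking_distance = available_distance_m - reaction_distance
            v_sq = SPEED ** 2 - 2 * DECELERATION * braking_distance
            return math.sqrt(v_sq) if v_sq > 0 else 0.0

        for gap in (2, 10, 30, 73):
            v = impact_speed(gap)
            print(f"obstruction {gap:>2} m ahead -> impact at {v * 3.6:5.1f} km/h")

    With a two-metre gap the trailing car hits at essentially full speed, and under these assumptions it takes on the order of seventy metres of clear road to stop at all.  Faster-than-human reaction times shrink the reaction distance, but they do nothing about the braking distance.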
