Half-baked self-driving cars will create aviation-style accidents?

From a recent New York Times Tesla test-drive:

so successful was Autopilot that I was tempted to let down my guard by not bothering to look in the rearview mirror.

For all its vision capabilities (including in darkness), Autopilot became confused when lanes weren’t clearly marked or split in two or at exit ramps. You can’t simply program the destination and let the car find its way. It’s reassuringly cautious about changing lanes, but in heavy traffic, I would have missed an exit while waiting for it to find a suitable opening, and had to assert manual control.

While heading south on the New Jersey Turnpike, I could see in the rearview mirror a BMW bearing down at high speed. I pushed the turn signal for a lane change, and despite its ultrasonic sensors, the Tesla seemed oblivious to the onrushing car. It started to move into its lane; the driver laid on his horn, and I had to grab control to avoid an accident.

Quite a few aviation accidents and incidents have occurred due to pilots’ confusion regarding what the autopilot was responsible for doing. One of the most famous is Asiana 214 at San Francisco. From Wikipedia:

In response, the captain selected an inappropriate autopilot mode, which, without the captain’s awareness, resulted in the autothrottle no longer controlling airspeed. … Over-reliance on automation and lack of systems understanding by the pilots were cited as major factors contributing to the accident. The NTSB further determined that the pilot’s faulty mental model of the airplane’s automation logic led to his inadvertent deactivation of automatic airspeed control.

It is extremely unlikely that the crew would have crashed the B777 if they’d simply been hand-flying and had known that they were responsible for both yoke and thrust levers.

I’m wondering if we will quickly conclude that anything more advanced than cruise control in a car is a bad idea, unless the car can drive itself under all conditions.

21 thoughts on “Half-baked self-driving cars will create aviation-style accidents?”

  1. This post brings up a really good point. There may be a period in which semi-automated vehicles lead to more accidents because of increased reliance on the automation. This is clearly negligent behavior, but it’s negligence that would be easy to slip into, especially with phones constantly calling for our attention.

  2. “I’m wondering if we will quickly conclude that anything more advanced than cruise control in a car is a bad idea, unless the car can drive itself under all conditions.”

    I thought this was basically what Waymo concluded.

    One key difference between aviation and auto is that in aviation you are applying automation to what is already an extremely safe system, so rare edge-case failures have a relatively large impact on overall safety. Much less so for cars.
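
    To make that concrete, here is a back-of-the-envelope sketch. Every rate below is a made-up, purely illustrative number, not a real statistic:

    ```python
    # Illustrative only: hypothetical rates chosen to show why the same
    # automation edge-case failure rate looms much larger against a safer
    # baseline system.

    BASELINES = {
        "aviation": 1e-7,  # assumed fatal accidents per hour (made up)
        "driving": 1e-5,   # assumed fatal accidents per hour (made up)
    }
    AUTOMATION_EDGE_CASES = 5e-8  # assumed accidents per hour added by automation (made up)

    for name, baseline in BASELINES.items():
        relative = AUTOMATION_EDGE_CASES / baseline
        print(f"{name}: edge-case failures add {relative:.1%} to the baseline rate")

    # aviation: edge-case failures add 50.0% to the baseline rate
    # driving: edge-case failures add 0.5% to the baseline rate
    ```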

  3. Does the autopilot automatically roll up the windows and lock the doors while driving through the ‘wrong’ neighborhood?

  4. Autopilot lanes (like carpool lanes) on interstates, for example between LA and SF, would be a good compromise.

  5. Commanding the car to change lanes after seeing a BMW heading towards the lane at high speed sounds like something a 90 year old would do. Was the NYTimes photo of the author taken 50 years ago?

  6. bjdubbs:

    I have been wondering how much the infant automated-car-driving effort has borrowed from the mature autopilot technology for trains.

    So is the answer, pretty much everything except respect for the vehicle operator?

    I dislike automation, but anti-lock brakes and traction control have helped me avoid accidents. We forget that those are technologies meant to enhance the driver’s control, not eliminate his participation.

  7. autopilot technology for PLANES.

    (though now that I think about it, train traffic has been more successfully automated than any other transport.)

    And now that I think about it, before the internal combustion engine, all our vehicles had plant-fueled, self-healing, self-replicating engines with advanced protein-based sensory arrays and self-programming navigation systems.

    Why not put a modified horse’s brain in a Tesla?

  8. @Mememe I think a horse-like intelligence is probably possible, if it isn’t already what something like a Tesla Autopilot can deliver. The problem is that lazy/inattentive “drivers” might regard it less like a horse (probably won’t run into stuff, even helpful as a second set of eyes, but requiring some vigilance) and more like a human driver (getting us there is someone else’s problem).

  9. We are thinning the herd of people gullible enough to turn a 4000 pound missile completely over to automation. I’m reminded of an old (maybe Bob Newhart) skit about designing a modern transportation system: “Just pave over the carriage paths, paint a line down the middle, and launch heavy self-propelled vehicles head-on at each other, but be careful.”

    Drivers will learn pretty fast to stay hands-on if an intersection (at-grade or limited access) is in sight. He/she can let the car try the transit, but must be prepared to override it. This does not bode well for truly autonomous vehicles – if they are bound by “envelope protections” the roads will be littered with parked unmanned vehicles needing a reset.

    I like the idea of carving out standardized routes for automation, and incrementally modifying existing routes to comply until most origins and destinations are connected. Trucking and automotive interests should be willing to support this financially.

    Avigation is orders of magnitude more standardized and constrained than terrestrial navigation, despite having more degrees of freedom.

  10. It’s a good point, but isn’t this valid for any driver assistance? If you start relying on sensors to aid driving (even short of autopilot), you risk people trusting the system uncritically, becoming inattentive, and failing to overrule its mistakes.

    However, it goes back to risk-benefit: yes, some accidents may happen because people relied too heavily on occasionally flawed driving-assistance/semi-autopilot systems. But how many times did that half-baked system prevent a distracted human from crashing?

    I’d argue aviation autopilot is different, but still, one can ask: is there any data from the aviation industry? For every Asiana 214, how many accidents did the same autopilot prevent when pilots were tired, distracted, or incompetent?

  11. Why is it relevant that he was driving south?

    The problem is not self-driving cars – it is the other humans. Why was the BMW bearing down at high (presumably illegal) speed? Why did the semi-truck that killed the guy in the Tesla make an illegal left turn and put his truck crosswise in the face of oncoming traffic (because he figured that he was bigger and that the other cars would brake even if they did not have the right of way)? Why did the California highway authorities not fix the crash attenuator weeks after an earlier accident? Why do people keep hitting that spot?

    What we really need is for ALL the cars to be automated and communicating with each other and with the highway itself. Every self-driving car will then come with a dog. The job of the dog will be to bite you if you try to touch the controls.

    Even then you will have issues like homeless people who decide to cross the median strip at night in the middle of the block.

  12. The initially very high fatal-accident rate of Cirrus aircraft could well have resulted from pilots emboldened by the BRS parachute; education of the small group of pilots has been the key to the improved safety record; trying to educate the general public (drivers) would be less fruitful.

  13. “There may be a period in which semi-automated vehicles lead to more accidents because of increased reliance on the automation.”

    I think there will be fewer accidents but DIFFERENT types of accidents, just as with autopilot in planes. Overall, I am sure that autopilot has increased aviation safety, but in rare cases it causes accidents rather than prevents them. This is like the people who won’t wear seat belts because they are afraid of becoming trapped if the car skids off the road into deep water. That may actually have happened on rare occasions, but for every one person killed by wearing a seat belt, maybe 1,000 lives are saved, so seat belts are strongly net positive.

    Tesla is already claiming (and their numbers may be hokey) that Teslas in Autopilot mode have fewer fatalities per mile than hand-driven cars (hokey because they are comparing apples and oranges – the human numbers include driving in all conditions, while Autopilot is only supposed to be engaged under certain favorable conditions).
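
    A quick sketch of that apples-and-oranges problem, with made-up numbers (every rate below is hypothetical, chosen only to illustrate the selection effect):

    ```python
    # Illustrative only: hypothetical fatality rates per million miles,
    # showing how comparing "Autopilot miles" against "all human miles"
    # can flatter the automation even if it is no better than a human
    # under the same conditions.

    HUMAN_EASY = 0.5   # humans on clear, well-marked highways (made up)
    HUMAN_HARD = 3.0   # humans in rain, darkness, city streets (made up)
    EASY_SHARE = 0.6   # fraction of human miles in easy conditions (made up)

    # Suppose Autopilot exactly matches the human rate on easy roads and
    # is only ever engaged there:
    autopilot_rate = HUMAN_EASY

    human_overall = EASY_SHARE * HUMAN_EASY + (1 - EASY_SHARE) * HUMAN_HARD

    print(f"humans, all conditions:     {human_overall} per million miles")
    print(f"Autopilot, easy roads only: {autopilot_rate} per million miles")
    # Autopilot looks 3x safer (0.5 vs. 1.5), yet by construction it is
    # exactly as good as a human on the roads where it actually operates.
    ```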

  14. Imagine a world where driving is 99% or more automated, but when it gets really hairy, the autopilot bails out with a “your car,” like it happened to the crew led by Captain Sully.

    In such a world, young people who aren’t really devoted to driving a car themselves will never get real experience, which requires many, many hours of driving and learning from errors.

    Imagine somebody like this, driving through

  15. I think we are massively overestimating how much attention a human will pay when the car is driving autonomously. Psychologically, in the age of the very addictive cell phone, this is just unrealistic. Partial autonomy is not safe, and I don’t think it can be made safe.
    I am all for self-driving cars, but partially self-driving cars are a dangerous idea.

  16. @Finn brings up an excellent point. I think I read somewhere that anti-lock brakes haven’t reduced accidents, because drivers just tailgate and drive more aggressively, negating the advantage of the ABS. I’ve heard this more than once.

    The other thing that emboldens drivers is auto insurance. The logic is as follows: “Why should I be careful if somebody else will pay?” Sure, as a third-order effect your insurance goes up a bit, but most people discount this, I think.

  17. GC – therefore we should ban anti-lock brakes, auto insurance AND self-driving cars. Also seat belts – if you know you are going through the windshield, this makes you a much more cautious driver.

  18. I don’t think we will see a problem with accidents caused by driver/passenger confusion about the technology in self-driving vehicles.

    The knowledge a 777 pilot needs to correctly use the FMC, autopilot, and autothrottle systems in all phases of flight goes well beyond anything the occupant of a future self-driving vehicle will need to know. And I’m not sure we will have pilots making mode selections in aircraft cockpits in the near future.

    While the edge cases encountered by self-driving vehicles are still being discovered, these are not insurmountable, and it is only a matter of time before vehicle technology and infrastructure improvements make self-driving vehicles orders of magnitude safer than the current system.

  19. As others have said, I suspect more self-driving cars will mean different accidents – probably fewer, but we’ll see.

    I can’t find the article I read recently on semi trucks. While it was mostly favorable toward eventually adopting self-driving technology, it included a paragraph about a recent semi with automatic braking. An SUV cut in front of the rig on a freeway, the truck applied its own brakes, and the trailer fishtailed around and jackknifed. As I recall it was the (experienced) semi driver’s first accident, even though it wasn’t her fault.

    While self-driving cars are controversial, and many of us would not want a computer to drive us, I bet on most long trips we each see at least one other driver where we’d all be better off if a computer were driving their car.
