Half-baked self-driving cars will create aviation-style accidents?
From a recent New York Times Tesla test-drive:
so successful was Autopilot that I was tempted to let down my guard by not bothering to look in the rearview mirror.
For all its vision capabilities (including in darkness), Autopilot became confused when lanes weren’t clearly marked or split in two or at exit ramps. You can’t simply program the destination and let the car find its way. It’s reassuringly cautious about changing lanes, but in heavy traffic, I would have missed an exit while waiting for it to find a suitable opening, and had to assert manual control.
While heading south on the New Jersey Turnpike, I could see in the rearview mirror a BMW bearing down at high speed. I pushed the turn signal for a lane change, and despite its ultrasonic sensors, the Tesla seemed oblivious to the onrushing car. It started to move into its lane; the driver laid on his horn, and I had to grab control to avoid an accident.
Quite a few aviation accidents and incidents have occurred because pilots were confused about what the autopilot was responsible for doing. One of the most famous is Asiana 214 at San Francisco. From Wikipedia:
In response, the captain selected an inappropriate autopilot mode, which, without the captain’s awareness, resulted in the autothrottle no longer controlling airspeed. … Over-reliance on automation and lack of systems understanding by the pilots were cited as major factors contributing to the accident. The NTSB further determined that the pilot’s faulty mental model of the airplane’s automation logic led to his inadvertent deactivation of automatic airspeed control.
It is extremely unlikely that the crew would have crashed the B777 if they’d simply been hand-flying and knew that they were responsible for both yoke and thrust levers.
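The hidden mode coupling described in that accident report can be sketched as a toy state machine. This is purely illustrative (the class names, mode names, and transition rule are my invention, not the 777's actual automation logic): selecting a new pitch mode quietly parks the autothrottle, so the system's state and the pilot's mental model diverge with no alert.

```python
# Toy model of autopilot mode coupling -- illustrative only, not the
# actual B777 logic. The point: a pitch-mode selection can silently
# change what the autothrottle is doing.

class Autothrottle:
    def __init__(self):
        self.mode = "SPEED"   # actively holding target airspeed

    def set_hold(self):
        self.mode = "HOLD"    # thrust frozen; airspeed no longer protected

class Autopilot:
    def __init__(self, autothrottle):
        self.pitch_mode = "VNAV"
        self.at = autothrottle

    def select_pitch_mode(self, mode):
        self.pitch_mode = mode
        # Hidden coupling: this pitch mode quietly parks the autothrottle,
        # and nothing in this toy model alerts the crew.
        if mode == "FLCH" and self.at.mode == "SPEED":
            self.at.set_hold()

at = Autothrottle()
ap = Autopilot(at)
ap.select_pitch_mode("FLCH")
print(at.mode)  # the pilot may still believe airspeed is protected
```

A pilot hand-flying has no such hidden state to track, which is the contrast drawn above.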
I’m wondering if we will quickly conclude that anything more advanced than cruise control in a car is a bad idea, unless the car can drive itself under all conditions.
Full post, including comments