People seem to be excited about government regulations around self-driving cars. I’m wondering why this couldn’t be reduced to “Your self-driving car needs a $20 million liability insurance policy from an A-rated company.” The insurers aren’t going to underwrite high risks, presumably.
On a boat tour of Ft. Lauderdale’s rich waterfront houses, the captain/guide told us of his experience working on big private yachts. I asked him “If I am a rich douchebag can I buy a crazy long yacht tomorrow and, knowing nothing about boats, start driving it around?” He explained that Coast Guard regulations are essentially irrelevant because long before a regulation would require training or certification the boat’s insurer would insist on a licensed captain and perhaps additional crew members.
This is also true in aviation. A woman who got her pilot certificate yesterday can get into a $5 million Pilatus PC-12 and push the start button from an FAA point of view. But no insurance company would allow that.
Why not assume that underwriters are prudent and can evaluate the risks of self-driving vehicles at least as well as government workers and politicians?
[Maybe tweak this with a requirement also for comprehensive data and video logging in any self-driving car so that it is easier to determine if the self-driving vehicle caused an accident. But that would still be only a couple of paragraphs of regulation.]
Just because the free market frequently gives the right outcome doesn’t mean we should give up our responsibility to ensure the right thing happens. Ask AIG if insurance companies can be trusted not to underwrite stupid risks.
@ND I would worry less about insurance companies underwriting stupid risks; the actual insurance business (as opposed to AIG’s Financial Products arm) is heavily regulated (by the states).
I guess the worry may be uninsured/under-insured motorists. You could argue that only wealthier people, who typically wouldn’t skip insurance, would buy nice cars made by the most beloved corporations like Google and Volkswagen, who would never think of making an unsafe car. But even for a regular car there are some base-level requirements, and all the bells and whistles are bonuses that the insurance company might also reward you for.
A sufficiently rich douchebag might just decide to skip paying the premiums and “insure” the boat/plane himself. Even if the government insists on insurance (true for autos in most states) you can usually get around that by posting a bond as self-insurance.
We might also save a lot of money by staffing fire departments the same way. Since we have way fewer fires these days, the cost per fire is outrageous. A lot of money could be saved by turning many fire houses into rescue stations, maybe with a few motorcycles and an SUV to get through traffic super fast, and keeping fewer full fire departments that eventually get there and put out the fire; at least the people would be rescued quickly.
Because lawsuits in general aviation hobbled the industry for a decade until GARA was passed. Without legislation, makers of self-driving cars will be driven out of the market by lawsuits.
Lawsuits in the aviation industry have had a horrible effect on innovation. I would hate to see that happen to self-driving cars while 40,000 people die on the roads every year. Self-driving vehicles should not be immune from liability, but some reasonable boundaries, as well as ultimate driver responsibility, should be taken into account. Letting insurance companies decide this would lead to huge premiums, stifled innovation, and greater overall deaths. Since when has insurance made any other industry more efficient and innovative? Medicine? Life insurance? Flood insurance?
Self-driving-tech companies are looking for two handouts:
1. reduced expected liability for people choosing to operate a car with self-driving features
2. road features or lane reservations or other traffic law changes that make automating driving ‘correctly’ easier
We don’t want to offer any concessions prematurely (as with solar subsidies); the tech needs to improve more than we need to lower the bar.
A truly rich man could not only afford a huge boat and to drive it around like a douchebag, but also not to insure it. It sinks, so what. Buy a new one.
If the car is a car that can be human-operated but also has a feature where Google or some AI can operate it, then it’s a normal car that happens to have a self-driving feature. There is no reason to treat such a vehicle differently from any other car.
Now the interesting question is with a self-driving car where the human operator can’t take control, or where the AI is designed to relinquish control only in certain emergencies. In this case the situation is more like a bus or taxi, and everyone in the vehicle is a passenger. Any liability really should fall on the company that produced the AI or that operates the car remotely.
In fact, in the second scenario, there would probably be no individual ownership of these cars. You call for a car and it comes and picks you up and takes you to your destination. You might pay an annual subscription fee for this service. It will work more like a car service. The cheaper subscriptions will probably involve vans that will have picked up a few other passengers. More expensive subscriptions will come with vehicles that stay on the subscriber’s property and are dedicated to just servicing them, but title ownership and insurance will still be with the leasing company.
In terms of the car culture, I think what is more important than the automated technology is that we are moving, culturally and legally, from a situation where everyone is expected to own and operate their own vehicle, to one where many people drive professionally and are good at it, and they get paid by the crappier drivers to drive the crappier drivers around. Automation at most provides a nudge into this scenario, by lowering the cost of “professional drivers” (AI vs. humans) and by creating a situation where, as a society, you really have to want bad drivers on the road to keep expecting people to operate individual vehicles.
But I think insurance is the least of the problems with automated drivers. These will either be normal individually owned and operated vehicles with the AI added as an optional feature, or essentially car services/taxis.
Ed-
Wouldn’t it be cheaper to own your own car instead of leasing it, assuming you used it at least fairly frequently? Why would people want to pay a subscription when they could own their car?
Sam
“Wouldn’t it be cheaper to own your own car instead of leasing it, assuming you used it at least fairly frequently? Why would people want to pay a subscription when they could own their car?”
The insurance/liability issues raised earlier.
Again, this is taking the case of an automated car with normally no option of disabling the automation. In this situation, holding legal title to the vehicle would presumably mean being responsible for maintenance, registration, liability for accidents, etc., but you can’t operate it. I don’t know why any sane person would want to get into that situation. Maybe the ownership culture is so strong that people will willingly get into it just so they can “own” something.
Philip makes a good point.
In order for anyone to drive a car on public roads, that person must:
1) Pass a road exam to get a driver’s license,
2) Buy insurance to get a plate for the car, and
3) Have the license and plate with the car at all times while the driver and the car are on public roads.
Now, if you own an autonomous vehicle that you are NOT driving, I don’t see why you need a driver’s license to prove you are capable of driving. You are in effect a passenger of that vehicle, as if you were on a public bus, train, boat, etc.
Great point PG. Law professor Bryant Walker Smith makes a similar one here: http://cyberlaw.stanford.edu/publications/regulation-and-risk-inaction
and carries the argument a little further.