A self-driving car is seen during a demonstration at the Google campus in Mountain View, California, in 2015.

Who will be responsible for a self-driving car accident?

  • By Brian Fung, The Washington Post
  • Wednesday, February 17, 2016, 2:55 p.m.

Wow. A lot of readers had some very passionate responses to last week’s news that the federal government had recognized Google’s software, not the human passenger, as the “driver” in its self-driving cars. There was one big theme running through many of your comments.

See if you can identify it:

So does the software have to get a driver’s license and insurance? — ikeaboy

So if I get drunk, get into my Googlemobile and crash into someone the software is going to jail? Seems awkward to put flash memory in with the other prisoners. — InAVanByTheRiver

Who is charged if there is a fatal accident and there is an occupant in the driverless car? What happens if there is a lawsuit? Who pays the fine or serves time if the driverless car is found guilty? — scoon42

All of these questions target the issue of liability, which is about to get very interesting. As computerized, self-driving cars come closer to fruition, car accidents are likely to become vastly more complex. What will happen when you get into a crash, and who will be to blame?

People who study robot cars have actually been aware of these questions for years. But they haven’t been able to do much about it, because, well, no actual cases or policies have appeared in the real world. With the National Highway Traffic Safety Administration’s decision on Google’s cars last week, however, the ball has begun to roll.

This is the future of the car accident, as policymakers and analysts see it.

Will carmakers be to blame for driverless crashes?

In general, experts have several big ideas about what could happen.

As many of you guessed, making the car the legal “driver” means the auto manufacturer may assume greater responsibility for crashes. This is largely a matter of product liability, several auto and insurance analysts said, not personal insurance — though as a 2014 study from the Brookings Institution suggests, determining where one type of coverage ends and the other begins will be tricky. We’ll come back to this in a minute. But basically, victims of a collision could (directly or indirectly through their own insurers) try to seek damages from a driverless-car maker for manufacturing a vehicle that didn’t operate as it was supposed to.

“If you have a catastrophic failure of a product, you can sue the bejeezus out of a company,” said John Townsend, a spokesman for AAA Mid-Atlantic, “if the product causes the crash.”

Google has said that, yes, it understands that dynamic and the possibility it could be held accountable for crashes where its cars are at fault. Driverless cars could therefore expand the range of Google’s future legal responsibilities in a very real way. (The company declined to comment for this story.)

But more broadly, product liability is a well-understood part of commercial law, and insurers routinely pay out to consumers and then recover the money from parts manufacturers whose products malfunctioned. So, at least that much is relatively straightforward.

Will my car insurance change?

Self-driving cars could cause insurance companies to rewrite your policies or change your rates, but at this point it’s not entirely clear how, or what those changes might look like. Many insurers are still working out the implications of NHTSA’s recent decision; most of the firms I spoke to offered noncommittal reactions, and some didn’t respond at all.

“This is certainly an area we [are] not only watching, but engaging in on a variety of levels,” said Anna Bryant, a spokesperson for State Farm.

Liberty Mutual didn’t have much more to say, either.

“Liberty Mutual Insurance has a dedicated Innovation team that constantly evaluates a portfolio of key trends and technologies that could have future impact on our customers,” said company spokesperson Karen Pevenstein. “Autonomous vehicle technology is a fast growing and exciting trend with a potential benefit to help reduce accidents and injuries for our customers.”

If these answers feel unsatisfying, there’s a good reason for that, said Wayne McOwen, executive director of the District of Columbia Insurance Federation. While NHTSA’s letter shed light on the federal government’s position on driverless cars, only a handful of states have weighed in on the matter — and it’s their laws that really matter to insurance companies. That’s because it’s the states that ultimately issue the drivers’ licenses that insurers use to evaluate risk.

“An insurance company asks, ‘Does the driver have a valid license? Are there any violations against that license? Are there DUIs or other kinds of violations that would be critical in terms of evaluating the kind of operator they are and the experience of that operator?’” said McOwen.

Here’s the problem for insurers: These questions are going to have to be retooled if 49 states all come up with different definitions of the word “driver.” Computerized drivers, of course, don’t get DUIs. This is why insurance companies are in a tough position.

Different degrees of automation pose a real headache.

Even though driverless cars may limit the ability of humans to make mistakes, in many cases they won’t absolve humans of the responsibility to avert an imminent crash if they can. And so the future of car accidents may ironically create more scrutiny for humans in cars, not less.

Suppose you’re in a driverless car, and you see that you’re about to rear-end another car. Whether you bear some responsibility for the crash may ultimately turn on the degree of control you had over the car. Could you have reasonably prevented the accident, or not?

The answer will depend a lot on your car’s technical capabilities. Google’s driverless car isn’t designed to let humans take control. That’s because in Google’s view, letting humans take over actually makes driverless cars less safe, because a passenger could try to assert herself in ways that lead to a crash. Most accidents on the road today occur because of human error, not system malfunction. This is one of the reasons why Google, as we’ve discussed, could end up being liable for any accidents with its cars.

But in a different car, it might be a different story. Other carmakers, such as Audi, intend to keep steering wheels in their cars indefinitely, even as they increase the amount of automation in their vehicles. And when Tesla announced its autopilot feature in the Model S, it was upfront with owners that they remained responsible for staying in control of the car.

In a vehicle like that, a human might be able to step in before the rear-ender happens. And if the human fails to anticipate the crash, then he or she might be considered liable.

But what about the ability of the human driver?

Of course, humans are complicated. We try to mitigate that complexity today with standardized driving tests meant to ensure everyone can drive competently. These tests make sure you can execute a three-point turn, parallel park and come to a full stop at a stop sign.

But we’re moving into a future where some human drivers may be physically incapable of doing those things without the help of a driverless car. How should insurance companies, car makers and the law treat those folks?

You see, vehicle automation won’t just turn computers into drivers. It also stands to change how we think about human drivers. Driverless cars could allow those who have difficulty driving — such as the elderly or the blind — to get around in more meaningful, seamless ways that significantly improve their quality of life. It’s one of the big selling points of the technology.

The problem comes when insurance companies have to decide, again, whether the human inside the driverless car could have prevented an accident. Setting aside the question of whether the car even supports human intervention, a blind person would have a lot more trouble preventing that rear-ender than a sighted person would. Should both people be held accountable to the same extent?

Some states have addressed this by proposing that all self-driving cars must have a licensed driver behind a physical steering wheel at all times, effectively ensuring that only those who can pass today’s driving tests will be allowed to operate a driverless car.

You can quickly see how messy this could get for your insurance.

If this is so complicated, why does Google want anything to do with it?

Let’s go back to Google assuming more responsibility for car crashes. Why would Google be on board with taking on such a big risk? If there are millions of Google cars on the road, that’s a big legal and insurance headache waiting to happen if something goes wrong.

For Google, however, the likely benefits outweigh the costs.

“I suspect Google is okay with this thinking because it considers the likelihood of a Google car causing an accident very, very low,” said Karl Brauer, an analyst at Kelley Blue Book. Indeed, Google’s reports suggest that its accident record is very good, though much of its testing has taken place in sunny, dry climates.

Part of the reason Google may be unconcerned is its trust in technology. Every second, its cars collect detailed information about their location, their position relative to other objects, places and people, and the local environmental conditions. Not only might this data come in handy for boosting Google’s core business, but it would also be useful in reviewing the critical moments before a crash.

“These Google cars will track everything going on in and around them, with cameras and full vehicle diagnostics,” said Brauer, “which would make blaming them very difficult — unless they really are at fault.”
