Who Is Liable for a Self-Driving Car Accident?

Self-driving cars, often referred to as autonomous vehicles, are quickly becoming a common sight on roadways. It’s no surprise that auto accidents involving self-driving vehicles have already occurred and have been the subject of legal debate and lawsuits.

When a self-driving car causes an accident that hurts or kills another driver, who is liable for its actions? The passenger being transported in the autonomous vehicle likely won’t be held liable for injuries the self-driving car causes unless they were somehow responsible.

If the auto accident was due to a lack of appropriate maintenance, for example, and that maintenance was the responsibility of the owner, then the owner could be liable.

If the passenger in the self-driving vehicle was failing to follow proper operating instructions, as was the case in the first fatal Tesla Autopilot accident in May 2016, it may be difficult to hold the manufacturer liable for the accident and the resulting injuries or deaths.

At the time of that first accident, Tesla was adamant that drivers needed to remain alert and be prepared to take the wheel at a moment’s notice. From a liability perspective, Tesla wanted customers to see the self-driving features as more of a lane-assist system than a fully autonomous vehicle.

In its final report, the National Transportation Safety Board (NTSB) concluded the truck driver was at fault but also assigned some of the blame to the Tesla driver and to Tesla for deploying a system that allowed drivers to take their eyes and focus off the road for a prolonged period of time.

Current Cases That May Affect Future Laws and Personal Injury Precedents

In March 2019, a Tesla Model 3 driver in Florida was killed when he crashed into a truck that unexpectedly cut across his lane. This accident shared similar circumstances with three other fatal accidents involving Tesla’s semi-autonomous driving feature, including that first one in May 2016.

Tesla uses a combination of cameras and radar to guide its self-driving vehicles. The system has documented difficulty recognizing stationary objects and objects moving perpendicular to the vehicle’s path, as in the Florida crash, where a semi-truck was slowly turning left in front of an oncoming Tesla traveling 68 mph. In 2018, there were also three reported accidents in which Teslas drove into stopped fire trucks.

Cadillac and Audi use specialized eye-tracking sensors in the vehicle’s cabin in their semi-autonomous models. These sensors can tell when a driver isn’t paying attention and use indicator lights and sounds to bring the driver’s focus back to the road. Tesla relies on steering wheel torque to tell if the driver is paying attention. As the NTSB noted in its investigation of the first fatal Tesla accident, “monitoring steering wheel torque provides a poor surrogate means of determining the automated vehicle driver’s degree of engagement with the driving task.”

It’s fairly obvious there is a flaw in Tesla’s software, hardware or the way the two work together. Otherwise, the vehicles would reliably brake in time to avoid stationary objects or slow-moving vehicles turning in front of them.

The other important liability-determining factor is the expectations set by the company. Does Tesla advertise and sell its product in a way that gives drivers the expectation that they can put the car on Autopilot and safely doze off or zone out?

If Tesla has told its customers, overtly or through implication, that its vehicles are safe to use on Autopilot, and it turns out they’re not, then Tesla can be held liable for the injuries or deaths that result.

In the March 2019 accident, the driver of the Tesla was going 13 mph over the speed limit, but even at that speed, accident reconstruction experts determined a driver would have needed only three seconds to stop. According to Tesla’s data, the driver hadn’t touched the steering wheel for eight seconds prior to the accident.

Even if the truck made an illegal turn in front of the Tesla, the driver of the Tesla certainly shares some of the liability. His family could potentially still have a case against Tesla due to the vehicle’s documented inability to detect a slow-moving truck in the road ahead of the vehicle and the ease with which the driver was able to disregard the road and trust the autopilot.

Current Thinking About Tesla’s Legal Liability for Autonomous Driver Injuries and Deaths

Tesla is not promising perfect Autopilot performance, at least not currently. Elon Musk has said that Teslas could be fully autonomous by mid-2020. If full autonomy becomes the company’s official position and Tesla sets the expectation with customers that drivers can ignore the road and their Tesla will drive them safely, the company could end up facing even more liability for self-driving accident injuries or deaths.

Since the self-driving features began rolling out in 2016, Tesla has been the target of criticism from safety experts and other concerned motorists who don’t believe Tesla does enough to ensure its drivers remain attentive. This, combined with documented deficiencies in Tesla’s autonomous software and hardware, likely indicates the system isn’t ready for fully autonomous deployment.

As a vehicle manufacturer, Tesla has a duty to the people who purchase and share the road with its product. An auto accident attorney could argue that knowingly selling a faulty and potentially dangerous product is a failure to uphold that duty. If that failure results in injuries and deaths, it would not be a surprise if judges and juries began ruling against Tesla.

There’s currently an ongoing wrongful death lawsuit brought by the family of Walter Huang, an Apple engineer who died when his Tesla, operating on Autopilot, veered off the highway and into a concrete divider. Their suit alleges that the Model X Huang was driving had a defect that caused it to leave lanes and strike stationary objects, and that the company should have recalled the vehicles or warned drivers.

That case is ongoing, but it could potentially be the first of many.
