Is It Still Wrongful Death If The Car Is Driving Itself?

When Carl Benz built the first practical gasoline-powered automobile in the late 19th century, he could never have imagined the challenges of the 21st century, where cars that drive themselves are fast becoming a commonplace reality.

With big names like Tesla, Google, Uber, and Waymo betting their money on self-driving cars, it’s clear that this form of transportation will be part of the future. However, the utopia associated with self-driving cars is starting to give way to reality as engineers realize that driverless vehicles present new challenges that need novel solutions.

One of the biggest challenges associated with self-driving cars comes about when the vehicle causes a fatality. This results in a vital question: Is it still wrongful death if the car is driving itself?

To answer the question above, we will need to start by defining the idea of a self-driving car and wrongful death in general. We will then look at specific car accident cases where self-driving cars have killed people, leading to lawsuits. This should provide an idea of the arguments made by experts in this area regarding the question above.

As you read this article, it’s vital to remember that any information provided here is not intended as and does not constitute legal advice.

Self Driving Car Accident Statistics

Here are some numbers summarizing the situation with regard to self-driving cars and the fatalities they have caused.

  • You may be surprised to learn that self-driving cars are involved in more accidents (9.1 accidents per million miles driven) than human-driven vehicles (4.1 accidents per million miles driven).
  • To date, the number of deaths linked to Teslas where the Autopilot was engaged is six.
  • The 2019 traffic fatality rate was 1.1 fatalities per 100 million miles driven (including both human-driven cars and vehicles with self-driving capabilities).
  • Estimates indicate that Tesla Autopilot has driven about 3.3 billion miles. This implies that driving a Tesla on Autopilot (about one fatality in 550 million miles driven) is safer than human driving (about 1.1 fatalities per 100 million miles driven).
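The rate comparison in the last bullet can be checked with simple arithmetic. As a rough sketch using only the estimates quoted above (not precise or verified figures):

```python
# Rough fatality-rate comparison using the article's estimates.
autopilot_miles = 3.3e9   # estimated total miles driven on Tesla Autopilot
autopilot_deaths = 6      # deaths linked to Autopilot to date

# Miles driven per fatality on Autopilot.
miles_per_death_autopilot = autopilot_miles / autopilot_deaths
print(f"Autopilot: one fatality per {miles_per_death_autopilot / 1e6:.0f} million miles")

# Overall 2019 rate: 1.1 fatalities per 100 million miles driven.
miles_per_death_overall = 100e6 / 1.1
print(f"Overall: one fatality per {miles_per_death_overall / 1e6:.0f} million miles")
```

By these figures, Autopilot works out to roughly one fatality per 550 million miles, versus roughly one per 91 million miles overall, which is where the "safer than human driving" claim comes from. As Templeton cautions below, the underlying numbers are contested.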

Brad Templeton, writing for Forbes.com, suggests that it’s vital to treat the numbers above with caution. He points out several problems with the math. For instance, when we count the fatality rate of Teslas with Autopilot, do we include deaths where it’s claimed that Autopilot was engaged even though that claim has not been verified? And can human-driver fatality rates fairly be estimated from highway data alone?

Understanding Self-Driving Cars

A self-driving car is a vehicle equipped with numerous sensors that allow it to sense its environment and navigate safely, whether it is driving itself or a human driver is in control. Other terms used interchangeably with self-driving car include driverless car, autonomous vehicle, and robo-car.

Dr. Lance Eliot, an expert on Artificial Intelligence (AI) and Machine Learning (ML), explains the various categories of self-driving cars. He starts by noting that “True self-driving cars are ones that the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.”

According to Dr. Eliot, when a car entirely relies on AI to accomplish the entire driving task, it belongs in Level 4 or Level 5. On the other hand, cars that need coordination between the human driver and AI are usually placed in Level 2 or 3. He adds that “cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).”

The move toward self-driving cars, in general, is based on the assumption that most accidents on roadways can be attributed to human error. For instance, a paper published by the Accident Analysis & Prevention journal identifies inexperience, lack of skill, and risk-taking behaviors as the main factors contributing to road accidents.

When self-driving cars designed to reduce road carnage start to cause fatal accidents, concerns are raised regarding their safety and the issues of liability. Determining liability can be a sizable challenge, especially when you consider that many individuals and entities could play a role in an accident: the driver, car maker, technology provider, government, and local authorities.

Defining Wrongful Death

Wrongful death occurs when a person dies because of the intentional or accidental wrongful act of another party. When there is a possibility that the car accident that led to an individual’s death constitutes wrongful death, members of the deceased person’s family can file a lawsuit seeking compensation for the losses and suffering associated with that death.

Therefore, when we ask whether it is still wrongful death if the car is driving itself, we are trying to understand whom the wrongful death claim should be made against. Some people argue that the manufacturer of the car and the technology in it should be held liable.

On the other hand, there is a lobby that argues that there is always someone responsible for the car at any time. Therefore, that person has full responsibility for the car and should be held legally answerable for controlling the vehicle.

Who Can Sue For Wrongful Death?

When claiming wrongful death in car accidents involving self-driving cars, laws can differ depending on the jurisdiction. However, in most cases, the immediate family, life partners, spouses, and others who were dependent on the deceased individual can sue for wrongful death. Some jurisdictions also allow anyone who can show proof that they suffered financially as a result of the death to sue.

Who Can Be Sued?

In the case of road accidents involving self-driving cars, several individuals and entities can be sued:

Drivers

All drivers are allowed to use public roads on the assumption that they will observe a duty of care toward other road users such as cyclists, other drivers, and pedestrians. Where there is proof that a driver’s negligence caused a fatality, a wrongful death lawsuit can be brought to court.

Common examples of driver negligence include drinking and driving, speeding, and texting while driving.

Government

The government plays a profound role in ensuring that drivers are safe. For instance, the government is responsible for making sure that manufacturers of cars and technologies used by such cars meet established safety standards.

The government is also responsible for ensuring that the roads on which self-driving cars travel are free of hazards that could cause fatalities. Self-driving cars also rely on properly functioning road signage and traffic signals. In most cases, local governments are responsible for maintaining such infrastructure and ensuring that drivers observe the law.

Technology Providers

The companies that provide technologies to manufacturers of self-driving cars have a significant role in ensuring that their technology functions as it is supposed to. Therefore, if a car relying on such technology is involved in an accident that results in a fatality, it’s vital to consider whether there was negligence on the part of the technology supplier.

Vehicle Manufacturers

As has been seen in several lawsuits involving Tesla, the vehicle manufacturer is often the first port of call for families alleging wrongful death from self-driving cars.

In 2019, Tesla was sued by the family of a driver (Jeremy Beren Banner) who died in a car crash while using the Autopilot advanced driver assistance system.

In a report about the 2019 case published by TheVerge.com, Sean O’Kane writes, “Banner is the fourth known person to die while using Autopilot, and his family is the second to sue Tesla over a fatal crash involving the technology.” O’Kane adds that “In May [2019], Tesla was sued by the family of 38-year-old Wei Huang, who died in 2018 after his Model X crashed into an off-ramp divider with Autopilot engaged.”

Vehicle manufacturers also face a higher level of scrutiny than other entities that may contribute to wrongful death in self-driving cars. For instance, Yahoo Finance senior columnist Rick Newman produced an article entitled “It’s time to notice Tesla’s Autopilot death toll.” In that article, he seems to lay most of the blame for Tesla self-driving vehicle fatalities on the car maker’s doorstep.

Newman illustrates the challenge of determining responsibility when he writes, “Tesla is proving something other automakers dare not attempt: New technology + foolish drivers = death.” He continues, “Tesla promotes Autopilot as if it’s the world’s most advanced self-driving system. It’s not.” It’s his view that the specific use of the term Autopilot could confuse drivers into a false sense of security that could result in risky behavior.

Who Is Liable For The Fatality?

In an attempt to work out who is responsible when a self-driving car causes a fatality, Ron Schmelzer, Managing Partner & Principal Analyst at the AI-focused research and advisory firm Cognilytica and a contributor at Forbes.com, turns to the 2018 Uber autonomous vehicle fatality. The Uber vehicle struck pedestrian Elaine Herzberg as she crossed a road in Arizona. She died from the injuries sustained in the crash.

In his attempt to determine who could have been liable for the fatality, Schmelzer analyzes the role of everyone involved.

Schmelzer starts by looking at the two humans involved in the accident: the person supervising the car and the deceased. The deceased is reported to have been walking a bike and not using a crosswalk. If that means the pedestrian was at fault, the next question is why the person supervising the vehicle did not act to stop the car or swerve.

Schmelzer then considers the technology and the vehicle as the potential main culprits in the fatal accident. He asks several questions related to the AI technologies’ ability to handle all the possible road scenarios in the real world and whether the sensors were confused because the pedestrian was walking a bike.

Schmelzer concludes that “Maybe it’s not such a good idea to pin the fault on just one person or one factor.” This is a view based on the assumption that “The totality of the circumstances may add up to the reality that fault lies with many parties.”

The BBC later reported that the Uber backup driver, Rafaela Vasquez, had been streaming an episode of the television show The Voice at the time of the accident. Vasquez was later charged with negligent homicide.

Uber itself did not face any criminal charges, as prosecutors determined that the company was not criminally liable for Herzberg’s death.

A Complex Web

The views expressed by Schmelzer above are vindicated by the National Transportation Safety Board’s (NTSB) findings following the 2018 Uber crash. Columbia University’s engineering school reports that the NTSB “split the blame among Uber, the company’s autonomous vehicle (AV), the safety driver in the vehicle, the victim, and the state of Arizona.”

It’s clear from the NTSB decision that determining liability is complicated, because an accident is often not the result of the wrongful action of a single person or factor.

Is It Still Wrongful Death?

From the views expressed above, we can conclude that whether a car is driving itself or is being controlled by a human driver, a wrongful death claim can still result if the fatality is caused by any intentional or accidental wrongful act of another party.

While it’s clear that a fatality caused by a wrongful act should be considered a wrongful death, the challenge comes about when we attempt to determine who is liable. For those who want precedents regarding liability, it’s unfortunate that big companies like Uber have resorted to out-of-court settlements. Consequently, there is limited guidance from court rulings regarding the issue.

There are still only a small number of cases that have gone to court and only limited court rulings that could provide an idea of how the courts look at issues of liability in varying cases. As such, it’s safe to say that there is no one-size-fits-all when it comes to the issues of self-driving cars and liability for accidents. Circumstances will always differ. Laws in different jurisdictions may also differ.

Contact a Wrongful Death Car Accident Attorney

As you can see, wrongful death car accident claims can be complex and difficult to navigate, especially when there are elements like self-driving cars involved. If you or a loved one were injured or affected by a wrongful death, you may need the services of an experienced attorney who can sort out liability, damages, and other legal issues.

At Pittman, Dutton, Hellums, Bradley & Mann, our attorneys have the track record to handle multi-faceted litigation. We can help you obtain the legal remedy needed to get your life back on track. Contact us today at (205) 322-8880 to schedule a free, no-obligation consultation regarding your claim.  
