Posts Tagged: fatal

Tesla’s Autopilot was not to blame for fatal 2019 Model 3 crash, jury finds

A California jury has found that Tesla was not at fault for a fatal 2019 crash that allegedly involved its Autopilot system, in the first US trial of a case claiming its software directly caused a death. The lawsuit alleged Tesla knowingly shipped out cars with a defective Autopilot system, leading to a crash that killed a Model 3 owner and severely injured two passengers, Reuters reports.

Per the lawsuit, 37-year-old Micah Lee was driving his Tesla Model 3 on a highway outside of Los Angeles at 65 miles per hour when it turned sharply off the road and slammed into a palm tree before catching fire. Lee died in the crash. The company was sued for $400 million plus punitive damages by Lee’s estate and the two surviving victims, including a boy who was 8 years old at the time and was disemboweled in the accident, according to an earlier report from Reuters.

Lawyers for the plaintiffs argued that Tesla sold Lee defective, “experimental” software when he bought a Model 3 in 2019 that was billed as having full self-driving capability. The FSD system was and still is in beta. In his opening statement, their attorney Jonathan Michaels also said that the “excessive steering command is a known issue at Tesla.”

Tesla’s defense argued that there was no such defect, and that the analysis cited by the plaintiffs’ lawyers as identifying a steering issue was actually an effort to surface problems that were only theoretically possible. A fix to prevent the issue from ever happening was engineered as a result of that analysis, according to the company. Tesla blamed human error for the crash, pointing to tests showing Lee had consumed alcohol before getting in the car, and argued that there’s no certainty Autopilot was in use at the time.

The jury ultimately found there was no defect, and Tesla was cleared on Tuesday. Tesla has faced lawsuits over its Autopilot system in the past, but this is the first involving a fatality. It’s scheduled to go on trial for several others in the coming months, and today's ruling is likely to set the tone for those ahead.

This article originally appeared on Engadget at https://www.engadget.com/teslas-autopilot-was-not-to-blame-for-fatal-2019-model-3-crash-jury-finds-210643301.html?src=rss


Uber safety driver involved in fatal self-driving car crash pleads guilty

The Uber safety driver at the wheel during the first known fatal self-driving car crash involving a pedestrian has pleaded guilty to and been sentenced for an endangerment charge. Rafaela Vasquez will serve three years of probation for her role in the 2018 Tempe, Arizona collision that killed Elaine Herzberg while she was jaywalking at night. The sentence matches what prosecutors sought and is stiffer than the six months the defense team requested.

The prosecution maintained that Vasquez was ultimately responsible. While an autonomous car was involved, Vasquez was supposed to concentrate on the road and take over if necessary. The modified Volvo XC90 in the crash was operating at Level 3 autonomy and could be hands-free in limited conditions, but required the driver to take over at a moment’s notice. The system detected Herzberg but didn’t respond to her presence.

The defense case hinged on partly blaming Uber. Executives at the company thought it was just a matter of time before a crash occurred, according to supposedly leaked conversations. The National Transportation Safety Board’s (NTSB) collision findings also noted that Uber had disabled the emergency braking system on the XC90, so the vehicle couldn’t come to an abrupt stop.

Tempe police maintained that Vasquez had been watching a show on Hulu and wasn’t paying attention during the crash. Defense attorneys have insisted that Vasquez was paying attention and had only been momentarily distracted.

The plea and sentencing could influence how other courts handle similar cases. There’s long been a question of liability surrounding mostly driverless cars: is the human responsible for a crash, or is the manufacturer at fault? This suggests humans will still face penalties if they can take control, even if the punishment isn’t as stiff as it would be in conventional situations.

Fatal crashes with autonomy involved aren’t new. Tesla has been at least partly blamed for collisions while Full Self-Driving was active. The pedestrian case is unique, though, and looms in the background of more recent Level 4 (fully driverless in limited situations) offerings and tests from Waymo and GM’s Cruise. While the technology has evolved since 2018, there are still calls to freeze robotaxi rollouts over fears the machines could pose safety risks.

This article originally appeared on Engadget at https://www.engadget.com/uber-safety-driver-involved-in-fatal-self-driving-car-crash-pleads-guilty-212616187.html?src=rss

Google needs to fix this fatal flaw before I consider a Pixel 7

The Google Pixel 7 is shaping up to be a great phone. But only if Tensor 2 addresses a major issue.

Virtual reality could help elderly people avoid potentially fatal falls

Researchers have been investigating whether VR tech could be used to help prevent falls among the elderly and people with neurodegenerative conditions. Here’s what they’re busy planning.

The post Virtual reality could help elderly people avoid potentially fatal falls appeared first on Digital Trends.
