SAN FRANCISCO, Sept 28 (Reuters) - Opening statements are set to begin on Thursday in the first U.S. trial over allegations that Tesla’s (TSLA.O) Autopilot driver assistant feature led to a death, and its results could help shape similar cases across the country.

The trial, in a California state court, stems from a civil lawsuit alleging the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour (105 kph), strike a palm tree and burst into flames, all in the span of seconds.

The 2019 crash killed Lee and seriously injured his two passengers, including a then-8-year-old boy who was disemboweled, according to court documents. The lawsuit, filed against Tesla by the passengers and Lee’s estate, accuses Tesla of knowing that Autopilot and other safety systems were defective when it sold the car.

Tesla has denied liability, saying Lee consumed alcohol before getting behind the wheel. The electric-vehicle maker also claims it was not clear whether Autopilot was engaged at the time of the crash.

Tesla has been testing and rolling out its Autopilot and more advanced Full Self-Driving (FSD) system, which Chief Executive Elon Musk has touted as crucial to his company’s future but which has drawn regulatory and legal scrutiny.

Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the “Autopilot” name. A Model S swerved into a curb in 2019 and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and that driver distraction was to blame.

The stakes are higher in the trial this week, and in other cases, because people died. Tesla and the plaintiffs' attorneys jousted in the run-up over what evidence and arguments each side could present.

Tesla, for instance, won a bid to exclude some of Musk’s public statements about Autopilot. However, attorneys for the crash victims can argue that Lee’s blood alcohol content was below the legal limit, according to court filings.

The trial, in Riverside County Superior Court, is expected to last a few weeks.

  • Haui@discuss.tchncs.de · 1 year ago

    Exactly. Most importantly, "it is unclear if the autopilot was active" is an admission of guilt in my book. If a highly critical system (one that steers, accelerates, and decelerates a 2+ ton vehicle with multiple people on board) can't even report whether it was active, then the company is completely clueless about its own product's status and should be fully liable. You cannot sell something like this at all. I feel like the people in charge should go to jail for negligent manslaughter.

    • cerevant@lemm.ee · 1 year ago

      Yes, so much. It drove me crazy when a car company argued “but our logs say it was the driver’s fault”. We’re arguing that your most critical software failed, and you want us to trust the logging subsystem?

      The NHTSA needs to be certifying car software the same way the FAA certifies aircraft software. We need to stop trusting the manufacturers to self-police.

        • cerevant@lemm.ee · 1 year ago

          It is a nice break from my day job, where I am certifying software for critical systems.

          sigh

          • Haui@discuss.tchncs.de · 1 year ago

            Oof. Seeing this must be frustrating for you. It bothers me too, seeing sleazebags get their way while honest people get fucked.