NHTSA investigated the accident and confirmed that the vehicle was using Autopilot at the time of the crash. However, it blamed the driver, who phone data showed was playing a video game on his phone, as well as the missing crash attenuator, which worsened the severity of the crash.

When using Autopilot or FSD Beta, Tesla tells drivers that they need to pay attention and be ready to take control at all times. Drivers who are not doing that are misusing the system.

The family has sued Tesla for wrongful death. It is likely to be an uphill battle for them, because it looks like Huang was using his phone while driving, which is both a traffic violation and against Tesla’s guidance on how to use Autopilot.

That said, the family’s lawyers have learned from previous, similar trials and are taking a different approach. They are not denying Huang’s misuse of Autopilot; instead, they are focusing on Tesla’s communications, which they claim led him to misuse the system.

  • Franklin

The driver is absolutely at fault; however, Tesla does deserve to have to rename this, and in my opinion self-driving needs to be disabled entirely, as their current tech is just not safe for its intended purpose.

    • @[email protected]

I agree on the tech, but the name Autopilot is far older than Tesla and describes a large variety of ADAS systems. What they should do is stop advertising it as “full self driving capable,” because holy shit is it bad (I own a Model X that included FSD when I bought it used). It tried to kill me like 6 times in the first week of owning it by doing random swerves/turns/braking (once it tried to plant itself into a dumpster). This was on both 11 stable and the latest beta, as it released the week I was testing.