The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • snooggums@lemmy.world · 2 months ago

    That is the minimal outcome for an automated safety feature to be an improvement over human drivers.

    But if everyone else is using something you refused to use that would likely have avoided someone’s death, while misnaming your feature to mislead customers, then you are in legal trouble.

    When it comes to automation, you need to be far better than humans because there will be a higher level of scrutiny. Kind of like how planes are massively safer than driving on average, but any incident where someone could have died gets a massive amount of attention.