• Akasazh@lemmy.ml
      1 year ago

      I think it’s quite hard, as you’d have to compare it on ‘time spent behind the wheel’, which is not easy to calculate precisely.

  • Nousfeed
    1 year ago

    The article is very misleading, and kinda disproves its own title.

    This is from the article:

    It is unclear whether the data captures every crash involving Tesla’s driver-assistance systems. NHTSA’s data includes some incidents where it is “unknown” whether Autopilot or Full Self-Driving was in use. Those include three fatalities, including one last year.

    Then there is this:

    NHTSA said a report of a crash involving driver-assistance does not itself imply that the technology was the cause. “NHTSA has an active investigation into Tesla Autopilot, including Full-Self Driving,” spokeswoman Veronica Morales said, noting the agency doesn’t comment on open investigations. “NHTSA reminds the public that all advanced driver assistance systems require the human driver to be in control and fully engaged in the driving task at all times. Accordingly, all state laws hold the human driver responsible for the operation of their vehicles.”

    And then there’s the story about the kid who was hit by the Tesla after getting off a school bus, which we only find out about at the end:

    The Tesla driver, Howard G. Yee, was charged with multiple offenses in the crash, including reckless driving, passing a stopped school bus and striking a person, a class I felony, according to North Carolina State Highway Patrol Sgt. Marcus Bethea. Authorities said Yee had fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands: Autopilot disables the functions if steering pressure is not applied after an extended amount of time. Yee did not respond to a request for comment.

    To use this as an anecdotal example is just straight-up BS.