• CaptObvious@literature.cafe · 1 year ago

    Who the hell thinks beta software is appropriate for real-world applications in something as dangerous as vehicle control at highway speeds?

    I’ve come to believe that all Teslas should be recalled until they get their act together. They’re getting people hurt and killed by field testing their experiments on roadways that we paid for.

    • SkyezOpen@lemmy.world · 1 year ago

      More importantly, hold Musk responsible for the mayhem. They call it “Full Self-Driving” when it hasn’t qualified to be called that.

    • athos77@kbin.social · 1 year ago

      While I agree, let’s not pretend this is limited to Tesla. My feed lately has had numerous stories of self-driving taxis behaving erratically as well.

      I also have to say that one of my concerns with FSD is the deterioration of people’s driving skills and of their awareness of their car’s abilities (especially as those change over time). Leaving aside all the wisecracks about people’s normal abilities or inattention, take a snowstorm. FSD can’t drive in it, so you’re left with regular human drivers going manual in their cars. But they haven’t actually driven themselves in a while, so they’ve forgotten lessons like how to brake differently on ice and snow, they don’t know where the corners of their car are, they’re driving entirely too fast, and, because their FSD car had been compensating for mechanical issues, they’re not aware that their tires are near-bald and the brakes are iffy.

      Thing is, I know this is something that’s going to happen. I just don’t know how we can mitigate the risks.

    • CeeBee@lemmy.world · 1 year ago

      Who the hell thinks beta software is appropriate for real-world applications in something as dangerous as vehicle control at highway speeds?

      I honestly think it’s a mixture of public perception and liability. The company can try to spin negative incidents as “well, we said it’s still in beta, there are bugs we’re still ironing out”. And legally I think the stance is “we said it’s a beta version; if you used it in a dangerous situation, that’s on you”.

      I know it doesn’t exactly work that way, but I genuinely think they’re positioning it that way so that if (read: when) a legal case pops up, they can use the “beta” moniker as part of a defence.

      • CaptObvious@literature.cafe · 1 year ago

        Agreed. I’m just not sure how regulators justify allowing software that claims to be beta to operate a vehicle autonomously.

    • UlyssesT [he/him]@hexbear.net · 1 year ago

      Who the hell thinks beta software is appropriate for real-world applications in something as dangerous as vehicle control at highway speeds?

      Ruling class investors, techbro grifters, and the credulous rubes that believe the Singularity™ is right around the corner and every person killed in a janky car is only making the car smarter.

    • max@feddit.nl · 1 year ago

      Users do. See all the people running iOS betas on their daily drivers, then complaining about a forced reset or data loss.