• davewritescode@lemm.ee · 1 year ago

    Attacking AI-based systems with malicious input is like shooting fish in a barrel. Combine high complexity with low comprehension of how the system's internals actually work, and you have a field day for security researchers.

    It’s only a matter of time before someone hurts or robs a Tesla driver by forcing the car to do something that’s not in the best interest of the operator.
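    One classic example of the kind of malicious input the comment alludes to is the gradient-sign ("FGSM") perturbation. The sketch below is a toy illustration only: the linear classifier, its weights, and the epsilon are all made up, and real attacks target deep networks, not a hand-written linear model.

```python
# Toy linear classifier: predicts class 1 when the score w·x + b is positive.
# All numbers here are invented for illustration.
w = [1.0, -2.0, 0.5]
b = 0.1

def score(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def predict(x):
    return 1 if score(x) > 0 else 0

def sign(v):
    return 1.0 if v > 0 else -1.0 if v < 0 else 0.0

def fgsm(x, eps):
    # For a linear model the gradient of the score w.r.t. x is just w,
    # so push each feature by eps against the current prediction.
    direction = -1.0 if predict(x) == 1 else 1.0
    return [xi + direction * eps * sign(wi) for xi, wi in zip(x, w)]

x = [0.4, 0.1, 0.2]          # scores 0.4, so class 1
x_adv = fgsm(x, eps=0.3)     # each feature moved by at most 0.3, flips the label
```

    The point of the example is the asymmetry the comment describes: the attacker needs only the sign of the gradient, while the defender has to reason about the whole input space.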

  • Aldehyde@kbin.social · 1 year ago

    If self driving cars become disabled by putting a cone on the hood, maybe they aren’t ready to be on the streets.

    • HughJanus@lemmy.ml · 1 year ago

      By that measure they never will be. Because people are never going to stop fucking with them.

      I took a ride in a self-driving car. Three different people just walked right out in front of it; one of them crossed the street, then turned around and walked back in front. People really enjoy fucking with them. This is why we can’t have nice things.

    • dom@lemmy.ca · 1 year ago

      They won’t be ready to be on the streets until they’ve had a lot of time on the streets.

      It’s a catch-22.

    • hayander@lemmyngton.au · 1 year ago

      I wouldn’t expect a human driver to move either if their view were obstructed or there were objects on the vehicle.

      • theluddite@lemmy.ml · 1 year ago

        Humans are capable of assessing and clearing the obstruction themselves; these cars, meanwhile, remain disabled until outside assistance arrives.