A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running on the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could be applied to other self-driving cars.

  • phdepressed

    Because humans have more accountability. Also it has implications for military/police use of self-guided stuff.

    • lolcatnip@reddthat.com

      What is the purpose of accountability other than to force people to do better? If the lack of accountability doesn’t stop a computer from outperforming a human, why worry about it?

      • medgremlin@midwest.social

        The lack of accountability means that there is nothing and no one to take responsibility when the robot/computer inevitably kills someone. A human can be faced with legal ramifications for their actions; the companies that make these computers have shown thus far that they are exempt from such consequences.

        • Turun@feddit.de

          That is true for most current “self driving” systems, because they are all just glorified assist features. Tesla is misleading its customers massively with its advertising, but on paper it’s very clear that the car will only assist in safe conditions; the driver needs to be able to react immediately at all times and is therefore also liable.

          However, Mercedes (I think it was them) have started to roll out a feature where they will actually take responsibility for any accidents that happen due to this system. For now it’s restricted to nice weather and a few select roads, but the progress is there!

          • medgremlin@midwest.social

            The driverless robo-taxis are also a concern. When one of them killed someone in San Francisco, there was no clear responsible entity to charge with the crime.

        • lolcatnip@reddthat.com

          That is simply not true. The law has, since basically forever, held that manufacturers are liable if their product malfunctions and hurts someone while it’s being operated in accordance with their instructions.

          Edit: I hope all y’all who think the rule of law doesn’t exist are gonna vote against the felony party.

          • Kanzar

            Excuse us for being sceptical that businesses will actually be held accountable. We know legally they are, but will forced arbitration or delayed court proceedings mean people too poor to afford a good lawyer for long will have to fuck off?

          • medgremlin@midwest.social

            The current court cases show that the manufacturers are trying to fob off responsibility onto the owners of the vehicles by way of TOS agreements with lots of fine print. Tesla in particular is getting slammed for false advertising about the capabilities of its self-driving features while it simultaneously tries to force all legal liability onto the drivers who believed that advertising.