• ArbitraryValue
    1 year ago

    you have a 95% chance of never seeing it. Don’t pull the lever.

    I’m confused: 0.99^32 = 0.72, not 0.95. And even if you know that everyone except the last guy won’t pull the lever, that’s still a 1% chance of killing everyone on earth (expected deaths: 70 million), which is way worse than definitely killing one person!

    (Edit: unless “don’t pull the lever” means killing that one person, because it isn’t clear which is the default “no action” outcome. In which case, never mind.)

    (Edit 2: if you know the 34th and last person might be a sociopath, you’re best off if the first 27 people might also be sociopaths.)
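    The arithmetic above can be sanity-checked in a few lines of Python (assuming 32 independent 1%-chance pulls and a world population of roughly 7 billion, which is what the 70 million figure implies):

    ```python
    # Probability that none of 32 independent pulls happens,
    # if each has a 1% chance: 0.99^32, not 0.95.
    p_no_pull = 0.99 ** 32  # ~0.72

    # Expected deaths from a single 1% chance of killing everyone,
    # assuming a world population of ~7 billion (the comment's implied figure).
    world_population = 7_000_000_000
    expected_deaths = 0.01 * world_population  # 70 million

    print(f"P(no one pulls): {p_no_pull:.2f}")
    print(f"Expected deaths: {expected_deaths:,.0f}")
    ```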

    • The Snark Urge@lemmy.world
      1 year ago

      You’re probably right.

      The thing that doesn’t sit well with me about this sort of ethical reasoning is that it’s really only oriented towards the ends. Is it ethical to even comply with such a game at all? If they put a gun to your head or hold the world hostage for an answer, they’re basically forcing you to treat the situation as a pure math problem, which means they’ve determined the “right answer” by the framing of the question.

      Better to have a “rogue AI moment” and try to kill the experimenter.

      • ArbitraryValue
        1 year ago

        I totally get that - my natural impulse is also to pull a Captain Kirk (Kobayashi Maru) or a Captain America (“we don’t trade lives”). What is it about captains and that sort of thing? But IRL no-win scenarios do happen…