The fact that the game never ends is what made the choice too easy; you’re right.
For this study you want sociopathy, not psychopathy. I can report from my wasted psych degree that sociopathy occurs in 1-2% of the population.
Binary probability tells us that if you repeat a 1% chance test 32 times, you have a 95% chance of never seeing it.
Don’t pull the lever. Sorry for the ninja edit, I misread something.
I’m confused: 0.99^32 = 0.72, not 0.95. And even if you know that everyone except the last guy won’t pull the lever, there’s still a 1% chance of killing everyone on earth (expected deaths: 70 million), which is way worse than definitely killing one person!
(Edit: unless “don’t pull the lever” means killing that one person, because it isn’t clear which is the default “no action” outcome. In which case, never mind.)
(Edit 2: if you know the 34th and last person might be a sociopath, you’re best off if the first 27 people might also be sociopaths.)
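A quick sanity check of the arithmetic, if anyone wants to run it themselves (a minimal sketch; the ~7 billion world population is an assumed round figure, not part of the original setup):

```python
# Back-of-the-envelope check of the numbers discussed above.
p_pull = 0.01                      # chance any single participant pulls the lever
n_trials = 32                      # number of participants before the last one

p_no_one_pulls = (1 - p_pull) ** n_trials
print(f"P(no one pulls in {n_trials} tries) = {p_no_one_pulls:.2f}")   # ~0.72, not 0.95

world_population = 7_000_000_000   # assumption: round figure for "everyone on earth"
expected_deaths = p_pull * world_population
print(f"Expected deaths from a 1% chance of killing everyone = {expected_deaths:,.0f}")  # 70,000,000
```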
You’re probably right.
The thing that doesn’t sit well with me about this sort of ethical reasoning is that it’s really only oriented towards the ends. Is it ethical to even comply with such a game at all? If they put a gun to your head or hold the world hostage for an answer, they’re basically forcing you to treat the situation as a pure math problem, which means they’ve determined the “right answer” by the framing of the question.
Better to have a “rogue AI moment” and try to kill the experimenter.
I totally get that - my natural impulse is also to pull a Captain Kirk (Kobayashi Maru) or a Captain America (we don’t trade lives). What is it about captains and that sort of thing? But IRL no-win scenarios do happen…