What if the machine were 10x more complex? 100x more complex? What if part of the machine could misbehave in such a way that the person at the end isn't harmed with certainty, but only with some probability? What if the machine has 10 switches, each of which must be flipped by a different individual before the machine starts and harms the person? Do you hold any of those individuals responsible for the harm?
#morality #ethics #TrolleyProblem
(*) Trolley problems are silly because they are decontextualized, and so are the proposed Rube Goldberg ones. I am satirizing them all, in part, though I do think that if you're going to play around with thought experiments, a Rube Goldberg machine is a bit closer to modern lived reality than a runaway trolley.
