Ethical Dilemma. What would you do? Trolley Problem.
View Poll Results: What would you do? (Ethical Dilemma Question 1)
I would never throw the switch. We have no right to play God or decision maker here. (4 votes, 44.44%)
I would throw the switch. The needs of the many outweigh the needs of the few. (5 votes, 55.56%)

07-05-2021 , 04:26 PM
Destroying homes in Commerce, CA is correct almost no matter what the circumstances.
07-05-2021 , 06:44 PM
Quote:
Originally Posted by BrianTheMick2
Destroying homes in Commerce, CA is correct almost no matter what the circumstances.
Well, as long as the casino is safe.
07-06-2021 , 01:13 AM
Quote:
Originally Posted by Pokerlogist
Diverting the trolley is correct, while letting 50 people die is wrong.

An event like this has happened in real life, where a runaway train headed for metro Los Angeles was purposely derailed into a less populated housing area to prevent a greater disaster. Thankfully the right thing was done and only a few people were injured.

"Railroad switched cars off main line, knowing derailment 'likely'. Saturday, June 21, 2003
A freight train derails and scatters its load of lumber into homes in Commerce, California. Residents watch train demolish homes

COMMERCE, California (CNN) -- A runaway freight train carrying lumber through Southern California derailed after being switched to a side track Friday in suburban Commerce, which sent its cargo crashing into three homes and left 13 people injured, the Los Angeles Fire Department said."

https://edition.cnn.com/2003/US/West...ils/index.html
In my opinion, your real-life example isn't relevant to the Trolley Dilemma. In the Trolley Dilemma, the switch operator knows ahead of time what the exact outcome will be based on his decision.

In the real-life example above, it is possible that nobody would have been injured if the train had not been re-directed. There is no guesswork in the Trolley Dilemma.

addendum: An example that Dennis Prager likes to use is (I'm paraphrasing), "Sometimes people die because they are wearing a seat belt, but in general wearing a seat belt saves a lot more lives than if seat belts weren't being worn."
07-06-2021 , 05:44 AM
To me, the bigger question that arises from the trolley problem is whether to accept the principles of utilitarianism, and for that question I answer with a resounding, "no!"

Obviously the more extreme the example, the more the gut reaction becomes to save the majority, whether we're talking about saving hundreds of millions vs. one person's death or even 50 vs 4.

The real question is whether we base the morality of an action on the net positive and negative effects like a sort of EV equation.
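To make that "EV equation" framing concrete, here is a toy sketch (purely illustrative; the function and the numbers are my own, and it assumes each death counts as exactly -1 with nothing else mattering, which is exactly the reduction I'm questioning):

# Naive utilitarian "EV" comparison: each death is -1, nothing else counts.
def utilitarian_choice(deaths_if_no_action, deaths_if_action):
    ev_no_action = -deaths_if_no_action
    ev_action = -deaths_if_action
    return "throw the switch" if ev_action > ev_no_action else "do nothing"

# 50 die if we do nothing, 4 different people die if we act.
print(utilitarian_choice(deaths_if_no_action=50, deaths_if_action=4))  # -> throw the switch

The only thing that calculation ever sees is the count.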

When you follow this line of thinking to its logical conclusion, the justifiable actions can become very disturbing, because you're basically saying there is no such thing as a moral absolute.

So potentially ANY action can be justified IF the pay off is big enough.

Is it acceptable to murder one million? Only if it saves one million one.
Is genocide acceptable? Only if it creates a better future society.
Is it morally acceptable to murder political dissidents? Only if the benefit to society is large enough.

That's how utilitarian thinking plays out in the real world. It almost always seems to benefit some powerful group at the expense of less powerful people. It is often used to justify some sort of atrocity.

To me we are better off as a society if we just agree that murder is wrong period. That's why I voted against pulling the switch. I believe that justice, fairness and other concerns are also relevant, and not just the net number of fatalities.

Anyway, I felt like I needed to further explain my views.

To DS, who asked about Truman's decision to drop the A-bomb: I do not think this action was justifiable on utilitarian grounds. However, the right to self defense against an aggressive attack is also relevant. Knowing only what Truman knew when he made that decision, I think the decision was morally justifiable at the time. Obviously we now know a lot more about the effects of nuclear weapons, and a decision to use them today would be inconceivable.
07-06-2021 , 09:20 AM
Quote:
Originally Posted by John21
Most everyone would be fine if trolley switches were programmed to derail the car with the fewest people.
But being with the majority isn't a convincing argument. It's actually a fallacy (appeal to popularity).

GreatWhiteFish nailed it.

The issue with forming an equation on these matters is where to start and where to stop.

Considering overpopulation, environmental damage and the like, I'm sure you could easily argue that even the most atrocious acts in our history were +EV in terms of "lives saved".
07-06-2021 , 10:12 AM
Quote:
Originally Posted by David Sklansky
Again. The question was not supposed to have details that people can hang their hat on. Change it to:

Somehow you know that five (no need for it to be fifty) people are about to die through no fault of their own. Unless you push a button. If you do, it reduces to four DIFFERENT people. YOU HAVE ZERO OTHER INFORMATION. That essentially reduces the question to "Is the fact that you did something that causes the death of four totally random people somehow a reason to not do it even though it saves five random people?"

To me the only two reasons not to push the button are "God's will" or "I don't want to feel bad" (though of course you should if you are an atheist who doesn't push.)

By the way this isn't a silly hypothetical question. It has happened several times in an almost completely analogous way. I speak of the situation pilots have found themselves in when they think their plane is going down. Regardless of whether they can bail out, they have usually first steered the plane into an area where it figures to kill fewer people even though they have changed the path that nature intended.

I have run into this issue in many threads on this forum, in the debates (arguments) I have gotten into.

That is, many people cannot and do not recognize an inconsistency in their logic. They state their position as an absolute one that they believe proves their case, but when it is tested with an 'extreme' example, it completely falls apart, and they end up taking the opposite side of what they said before.

And instead of acknowledging that flaw, they say (or think) that because the example is extreme it is not applicable.

So in this case they might say, 'It is absolutely morally wrong for a person to make a choice to kill the 4 people in the car to save the 5 in the train. Man should not be making those types of calls, ever.' But when you change it to 4 in the car versus 100k in the train, their view changes, and they cannot or do not see that their initial logic has therefore been invalidated by their own answer.

I think that 'logic flaw' is the single biggest blind spot most people have. They do not understand that if their position is absolute, it must hold up under the most extreme situations one could theorize. If it fails that, the argument is relative, and you must make the case for why it is correct in this instance.

I am certain that if in your example you used 4 in the car versus 5 in the train, you would find most people would not throw the switch. I would guess 90%+ would not throw the switch. Change that to 4 versus 100k and it would be 90% the other way.

My goal in choosing 4 and 50 was to try to pick a point where people would find themselves more challenged to answer, and to show that this issue (a person choosing who lives and who dies) may seem clear at the extreme numbers but can divide people widely at more middling points.
07-06-2021 , 11:48 AM
Quote:
Originally Posted by GreatWhiteFish
To me, the bigger question that arises from the trolley problem is whether to accept the principles of utilitarianism, and for that question I answer with a resounding, "no!"

Obviously the more extreme the example, the more the gut reaction becomes to save the majority, whether we're talking about saving hundreds of millions vs. one person's death or even 50 vs 4.

The real question is whether we base the morality of an action on the net positive and negative effects like a sort of EV equation.

When you follow this line of thinking to its logical conclusion, the justifiable actions can become very disturbing, because you're basically saying there is no such thing as a moral absolute.

So potentially ANY action can be justified IF the pay off is big enough.

Is it acceptable to murder one million? Only if it saves one million one.
Is genocide acceptable? Only if it creates a better future society.
Is it morally acceptable to murder political dissidents? Only if the benefit to society is large enough.

That's how utilitarian thinking plays out in the real world. It almost always seems to benefit some powerful group at the expense of less powerful people. It is often used to justify some sort of atrocity.

To me we are better off as a society if we just agree that murder is wrong period. That's why I voted against pulling the switch. I believe that justice, fairness and other concerns are also relevant, and not just the net number of fatalities.

Anyway, I felt like I needed to further explain my views.

To DS, who asked about Truman's decision to drop the A-bomb: I do not think this action was justifiable on utilitarian grounds. However, the right to self defense against an aggressive attack is also relevant. Knowing only what Truman knew when he made that decision, I think the decision was morally justifiable at the time. Obviously we now know a lot more about the effects of nuclear weapons, and a decision to use them today would be inconceivable.
A utilitarian would not just consider the number of fatalities except in the almost impossible eventuality that there was no other information.

Meanwhile the trolley problem isn't even about that. It is about whether an individual is being unethical if he switches things so that the group of deaths that occurs is the less tragic one, GOING UNDER THE ASSUMPTION THAT EVERYONE AGREES THAT ONE GROUP OF DEATHS IS INDEED MORE OF A TRAGEDY, regardless of what philosophy you are using. That could even mean that the decision involves switching to a group with a higher number of people.

Given the above, I maintain that not switching to the less tragic result MUST be the less ethical choice for the non-religious. To show that, imagine a psychopath, who you somehow know is not lying, tells you he will use the gun he is holding to kill either everyone in group A or everyone in group B, depending on who you choose. If you don't choose, everyone dies. Reluctantly you choose the group whose deaths are unquestionably less tragic (e.g. 90-year-old terminal cancer patients versus a bunch of children).

But now change the problem slightly. Suppose he tells you that he had ALREADY DECIDED to kill the children but that he will SWITCH if and only if you tell him to. There are non-religious posters on this thread who are making a distinction between the scenarios and arguing that in the second case (but presumably not the first) you should say nothing merely because the outcome had been decided without your input. (Actually, I'm sure that even most religious people would say such a stance is wrong.)
07-06-2021 , 01:10 PM
Quote:
Originally Posted by David Sklansky
...

But now change the problem slightly. Suppose he tells you that he had ALREADY DECIDED to kill the children but that he will SWITCH if and only if you tell him to. There are non-religious posters on this thread who are making a distinction between the scenarios and arguing that in the second case (but presumably not the first) you should say nothing merely because the outcome had been decided without your input. (Actually, I'm sure that even most religious people would say such a stance is wrong.)
I find this question very intriguing and agree it forces people to confront some of the positions they perceive as absolute.

I would love to see any of the 'abstainers' answer this one.
07-06-2021 , 01:46 PM
Presumably those voting to kill the car occupants would hope to employ a legal defence of justifiable homicide, though I'm not sure which excusing condition covers this case.

https://en.wikipedia.org/wiki/Justifiable_homicide
07-06-2021 , 07:10 PM
Quote:
Originally Posted by Cuepee
I think that 'logic flaw' is the single biggest blind spot most people have.
It is not. Fundamental attribution error is bigger.

Also, try to keep in mind that people mostly* "know" what feels morally right/wrong and then rationalize from there. That they do not come up with bulletproof rationalizations should be expected.

*There is one unconfirmed case of someone not doing this in September 1843 so could not claim "always"
07-06-2021 , 08:24 PM
Quote:
Originally Posted by David Sklansky
A utilitarian would not just consider the number of fatalities except in the almost impossible eventuality that there was no other information.

Meanwhile the trolley problem isn't even about that. It is about whether an individual is being unethical if he switches things so that the group of deaths that occurs is the less tragic one, GOING UNDER THE ASSUMPTION THAT EVERYONE AGREES THAT ONE GROUP OF DEATHS IS INDEED MORE OF A TRAGEDY, regardless of what philosophy you are using. That could even mean that the decision involves switching to a group with a higher number of people.

Given the above, I maintain that not switching to the less tragic result MUST be the less ethical choice for the non-religious. To show that, imagine a psychopath, who you somehow know is not lying, tells you he will use the gun he is holding to kill either everyone in group A or everyone in group B, depending on who you choose. If you don't choose, everyone dies. Reluctantly you choose the group whose deaths are unquestionably less tragic (e.g. 90-year-old terminal cancer patients versus a bunch of children).

But now change the problem slightly. Suppose he tells you that he had ALREADY DECIDED to kill the children but that he will SWITCH if and only if you tell him to. There are non-religious posters on this thread who are making a distinction between the scenarios and arguing that in the second case (but presumably not the first) you should say nothing merely because the outcome had been decided without your input. (Actually, I'm sure that even most religious people would say such a stance is wrong.)

I agree with your first point. A utilitarian might argue that Einstein's life is worth more than the lives of 100 welfare recipients. I was simply focusing on the number of fatalities to avoid adding another layer of complexity.

As for your example, I do think there is a subtle difference between the two scenarios. I won't go so far as to draw any absolute conclusions, but the first scenario is basically an attempt at mitigating a tragedy that cannot be avoided. People are going to die, and you're just trying to minimize the loss.

In the second scenario, the issue lagtight brought up on the first page could be relevant.

To illustrate the point, suppose there are two groups of three people. We suppose the deaths of either group would be almost equally tragic, except the deaths of group A would be .00000001% more tragic.

Now suppose group A is hang-gliding, while group B is enjoying a picnic in a park. Group A is about to crash, and we are presented with the choice of using group B to break the fall, supposing this will certainly save group A while killing group B.

Now I realize this is sort of a dumb example, but isn't it relevant that group A has knowingly chosen to engage in a risky activity? If the deaths of either group are almost equally tragic, wouldn't you agree that group A should suffer the natural consequences of their own actions?

I think it would be wrong in this scenario for us as a third party to intervene to save group A at the expense of group B, even after accepting that the deaths of group A would be ever-so-slightly more tragic
07-06-2021 , 08:38 PM
The Trolley Problem measures the relative strength of two personal values or impulses, "Mind Your Own Business" and "Be a Good Samaritan" (or Be a Hero). The more people in the train required to get you to switch, the stronger the first impulse is compared to the second. While you may make your choice based on how you feel, that doesn't mean the feeling isn't there for good reasons. Although, even if the feeling has developed for good reasons, that doesn't mean those reasons justify acting according to the feeling in this case.


PairTheBoard
07-06-2021 , 11:50 PM
Quote:
Originally Posted by GreatWhiteFish
I agree with your first point. A utilitarian might argue that Einstein's life is worth more than the lives of 100 welfare recipients. I was simply focusing on the number of fatalities to avoid adding another layer of complexity.

As for your example, I do think there is a subtle difference between the two scenarios. I won't go so far as to draw any absolute conclusions, but the first scenario is basically an attempt at mitigating a tragedy that cannot be avoided. People are going to die, and you're just trying to minimize the loss.

In the second scenario, the issue lagtight brought up on the first page could be relevant.

To illustrate the point, suppose there are two groups of three people. We suppose the deaths of either group would be almost equally tragic, except the deaths of group A would be .00000001% more tragic.

Now suppose group A is hang-gliding, while group B is enjoying a picnic in a park. Group A is about to crash, and we are presented with the choice of using group B to break the fall, supposing this will certainly save group A while killing group B.

Now I realize this is sort of a dumb example, but isn't it relevant that group A has knowingly chosen to engage in a risky activity? If the deaths of either group are almost equally tragic, wouldn't you agree that group A should suffer the natural consequences of their own actions?

I think it would be wrong in this scenario for us as a third party to intervene to save group A at the expense of group B, even after accepting that the deaths of group A would be ever-so-slightly more tragic
Yes, of course the group that was knowingly risking their lives has an attribute that works against them in the calculation as to which group most deserves saving.

But yet again I have to point out that this was not, I don't think, supposed to be the point of the trolley problem. Rather it was to contrast sins of omission to sins of commission. It assumes that for whatever reason, including the "taking on risk" reason, there is agreement as to which group would be preferable to see live on. And it asks whether a decision to not intervene (assuming no legal downsides if you do) to get that preferred result, is ethical if the only reason you have to do nothing is that you think sins of omission are no big deal.
07-07-2021 , 05:47 AM
Quote:
Originally Posted by David Sklansky
Yes, of course the group that was knowingly risking their lives has an attribute that works against them in the calculation as to which group most deserves saving.

But yet again I have to point out that this was not, I don't think, supposed to be the point of the trolley problem. Rather it was to contrast sins of omission to sins of commission. It assumes that for whatever reason, including the "taking on risk" reason, there is agreement as to which group would be preferable to see live on. And it asks whether a decision to not intervene (assuming no legal downsides if you do) to get that preferred result, is ethical if the only reason you have to do nothing is that you think sins of omission are no big deal.
Hey, DS, just thought I'd warn you that BrianTheMick2 will probably alert you to the fact that you're in the wrong subforum for using the term sins.
07-07-2021 , 08:17 AM
I do nothing, then kill the family some other way.
07-07-2021 , 08:51 AM
Quote:
Originally Posted by BrianTheMick2
It is not. Fundamental attribution error is bigger.

Also, try to keep in mind that people mostly* "know" what feels morally right/wrong and then rationalize from there. That they do not come up with bulletproof rationalizations should be expected.

*There is one unconfirmed case of someone not doing this in September 1843 so could not claim "always"
Ya that logic error probably ranks higher.

I should have said in my statement 'one of the biggest logic errors...' instead of the biggest.

This is another of the common 'biggest logic errors' so common that Carlin immortalized it in a joke routine.

The number of times I have dealt with people who assume that 'since that makes sense to me and that is how I would decide...' which then translates to "you are wrong if you disagree" and not "we have to agree to disagree" is astounding. There are a vast number of people who would defend to the death that they are correct and right based on the 'speed' they think makes sense while telling everyone else driving slower or faster they are wrong.
07-07-2021 , 09:01 AM
Quote:
Originally Posted by GreatWhiteFish
I agree with your first point. A utilitarian might argue that Einstein's life is worth more than the lives of 100 welfare recipients. I was simply focusing on the number of fatalities to avoid adding another layer of complexity.

As for your example, I do think there is a subtle difference between the two scenarios. I won't go so far as to draw any absolute conclusions, but the first scenario is basically an attempt at mitigating a tragedy that cannot be avoided. People are going to die, and you're just trying to minimize the loss.

In the second scenario, the issue lagtight brought up on the first page could be relevant.

To illustrate the point, suppose there are two groups of three people. We suppose the deaths of either group would be almost equally tragic, except the deaths of group A would be .00000001% more tragic.

Now suppose group A is hang-gliding, while group B is enjoying a picnic in a park. Group A is about to crash, and we are presented with the choice of using group B to break the fall, supposing this will certainly save group A while killing group B.

Now I realize this is sort of a dumb example, but isn't it relevant that group A has knowingly chosen to engage in a risky activity? If the deaths of either group are almost equally tragic, wouldn't you agree that group A should suffer the natural consequences of their own actions?

I think it would be wrong in this scenario for us as a third party to intervene to save group A at the expense of group B, even after accepting that the deaths of group A would be ever-so-slightly more tragic
I think many of the questions in this thread are difficult for most people and not clear-cut, and in a poll many would differ.

I think the bolded above is not difficult for most and would yield almost 100% agreement in most situations (such as the one you present).

For instance, take Richard Branson (billionaire businessman and extreme risk taker) skydiving: his parachute only partially opens, and you know he will die upon impact with the earth as he slams into an object. But you have the option to have him first hit a homeless person, who cushions and absorbs most of the energy, such that Branson lives but the homeless person dies. I don't think many (if any) would argue that is a 'just', 'moral', or 'right' thing to do.

I understand that some might argue 'getting on a train' has known risks just as skydiving does, but I think the gulf between them deserves a distinction. I am curious whether others would disagree.
07-07-2021 , 09:33 AM
Quote:
Originally Posted by David Sklansky
Yes, of course the group that was knowingly risking their lives has an attribute that works against them in the calculation as to which group most deserves saving.



But yet again I have to point out that this was not, I don't think, supposed to be the point of the trolley problem. Rather it was to contrast sins of omission to sins of commission. It assumes that for whatever reason, including the "taking on risk" reason, there is agreement as to which group would be preferable to see live on. And it asks whether a decision to not intervene (assuming no legal downsides if you do) to get that preferred result, is ethical if the only reason you have to do nothing is that you think sins of omission are no big deal.
The algorithm is that we (tend to) place greater moral weight on committing bad behaviors than on neglecting good behaviors.

"Failure to render aid" laws tend to be rare and newsworthy because of this gap. We also tend to think worse of a person who nabs a tip than of one who fails to tip, and worse of someone who lies than of someone who withholds the truth. The trolley problem simply points out that our algorithms don't always lead to the best results, which is apparently surprising to idiots.
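To put made-up numbers on that algorithm (a toy sketch only; the weights and names below are invented, not measured from anything):

# Toy model of the commission/omission asymmetry with invented weights.
COMMISSION_WEIGHT = 2.0  # harm you actively cause "feels" this much worse
OMISSION_WEIGHT = 1.0    # harm you merely allow

def felt_badness(deaths, by_action):
    # The same death count is weighted differently depending on whether it
    # results from acting (commission) or from doing nothing (omission).
    weight = COMMISSION_WEIGHT if by_action else OMISSION_WEIGHT
    return weight * deaths

print(felt_badness(4, by_action=True))    # 8.0  (switching kills 4 by commission)
print(felt_badness(50, by_action=False))  # 50.0 (doing nothing lets 50 die by omission)

Even with the act weighted more heavily, the intuition flips once the group sizes diverge enough, which is all the trolley problem is pointing at.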
07-07-2021 , 11:19 AM
Then there's the guy who's convinced the train is going to derail and pulls the switch to kill the 4 people when in fact the train engineer had the situation well in hand with plenty of brake to avoid the derailment.


PairTheBoard
07-07-2021 , 11:32 AM
Even "Thou Shalt Not Kill", written in stone according to myth, allows an exception for self defense. Similarly, practical morality in general provides a large allowance for self interest. As it must. If it didn't, people would reject it.


PairTheBoard
07-07-2021 , 02:21 PM
Quote:
Originally Posted by PairTheBoard
Even "Thou Shalt Not Kill", written in stone according to myth, allows an exception for self defense
That is one of many permissible exceptions. "Thou shalt not kill members of your group unless they do something annoying," is the rule.
07-07-2021 , 05:11 PM
Quote:
Originally Posted by Cuepee
Ya that logic error probably ranks higher.

I should have said in my statement 'one of the biggest logic errors...' instead of the biggest.

This is another of the common 'biggest logic errors' so common that Carlin immortalized it in a joke routine.

The number of times I have dealt with people who assume that 'since that makes sense to me and that is how I would decide...' which then translates to "you are wrong if you disagree" and not "we have to agree to disagree" is astounding. There are a vast number of people who would defend to the death that they are correct and right based on the 'speed' they think makes sense while telling everyone else driving slower or faster they are wrong.
George Carlin is my favorite stand-up comic of all time.

Unfortunately, in his later years his "humor" became rather mean-spirited. He often came across as being bitter and angry.

Even so, he was a comic genius. Let me restate that: He was the comic genius.