A Simple Version Of The Trolley Problem I Haven't Seen

03-21-2017 , 07:37 PM
The train will kill two people and you can save only one. You are not one of those people who thinks it's OK to walk away or even to indulge whims. You want to save the one that is ethically more important to save. Neither victim can be expected to save other lives, nor to take other lives, if he lives. So your choice depends on other attributes. (Both victims figure to live awhile if saved.)

In some cases the difference in attributes is such that almost everyone would agree that the person who should be saved is x rather than y.

But I now change the question in one important way. The track is set up so that there are THREE people about to die, and you can either save the one x or the two y's with lesser attributes. Does this change your answer? Does it depend on the degree of difference in the attributes? Might you save the one rather than the two, but not if you could save twenty y's?

Edit: Let's eliminate from consideration those y's whose attributes are so bad that their death might be considered desirable.
03-21-2017 , 08:06 PM
Not to rain on your party, but two serial killers and a 9 year old girl make this a very easy answer for your THREE question, for 95% of people imo.

Without special nastiness in any of the participants, I think people choose killing the lower number though. There's a reluctance to judge and an even larger reluctance to judge and execute. Most people cower from this task unless they can fit someone into the "nasty" box, then it's open season. Not sure what that says about human nature, but it can't be good.
03-21-2017 , 08:41 PM
Quote:
Originally Posted by ToothSayer
Not to rain on your party, but two serial killers and a 9 year old girl make this a very easy answer for your THREE question, for 95% of people imo.

Without special nastiness in any of the participants, I think people choose killing the lower number though. There's a reluctance to judge and an even larger reluctance to judge and execute. Most people cower from this task unless they can fit someone into the "nasty" box, then it's open season. Not sure what that says about human nature, but it can't be good.
I edited the OP to deal with your points.
03-21-2017 , 09:31 PM
I'd save a normal person over 2 downies, slam dunk.
03-21-2017 , 09:36 PM
Thread just got real. Sklansky needs to make a new thread. Perhaps involving Down Syndrome, relative IQs, months left to live, etc.
03-21-2017 , 10:03 PM
One better-quality x is worth at least twenty mediocre y's, or two disease-riddled human parasites, or two lower-order humans, as H.L. Mencken would put it.
03-23-2017 , 04:39 PM
A better version: If you do nothing, the trolley will run over one normal person. If you pull the lever, it will run over 5 people who really like to talk about trolley problems.
03-23-2017 , 07:13 PM
Quote:
Originally Posted by BrianTheMick2
A better version: If you do nothing, the trolley will run over one normal person. If you pull the lever, it will run over 5 people who really like to talk about trolley problems.
TTLM, dude. "Normal" is a relative term.

Spoiler:
Trolley Talkers Lives Matter
03-23-2017 , 08:58 PM
Came in expecting supermarket trolley problems.

Leaving dissatisfied.

In a difficult dilemma the universe knows best, so you ought to leave it to chance and flip a coin.
This I do not consider 'indulging in a whim'.
03-23-2017 , 10:25 PM
One person's x is another man's y.
Edit: The OP was "designed" exactly to guard against this... which may be a problem for reaching consensus.

Last edited by drowkcableps; 03-23-2017 at 10:32 PM.
03-24-2017 , 12:19 AM
Quote:
Originally Posted by Pokerlogist
TTLM, dude. "Normal" is a relative term.

Spoiler:
Trolley Talkers Lives Matter
My operational definition of "normal" included everyone not in the other group.
03-24-2017 , 12:20 AM
Quote:
Originally Posted by drowkcableps
One person's x is another man's y.
Edit: The OP was "designed" exactly to guard against this... which may be a problem for reaching consensus.
It could have been shortened to "do you prefer people you prefer?"
03-24-2017 , 12:39 AM
Quote:
Originally Posted by David Sklansky
The train will kill two people and you can save only one. You are not one of those people who thinks it's OK to walk away or even to indulge whims. You want to save the one that is ethically more important to save. Neither victim can be expected to save other lives, nor to take other lives, if he lives. So your choice depends on other attributes. (Both victims figure to live awhile if saved.)

In some cases the difference in attributes is such that almost everyone would agree that the person who should be saved is x rather than y.

But I now change the question in one important way. The track is set up so that there are THREE people about to die, and you can either save the one x or the two y's with lesser attributes. Does this change your answer? Does it depend on the degree of difference in the attributes? Might you save the one rather than the two, but not if you could save twenty y's?

Edit: Let's eliminate from consideration those y's whose attributes are so bad that their death might be considered desirable.
Is this supposed to tug on our intuitions about whether to maximize average or aggregate utility? Sort of a Repugnant Conclusion, but for Trolley Problems?
03-24-2017 , 12:54 AM
Quote:
Originally Posted by Original Position
Is this supposed to tug on our intuitions about whether to maximize average or aggregate utility?
My guess is "no."
03-24-2017 , 05:21 AM
Always save the one you know best to be a decent human, because the avg human is not exactly that lol. And 2 of them will not make things better for the system.

Clearly who "you" save is irrelevant because there is no free will anyway (but who you save is relevant, without the quotes lol). The universe decided for you, and you are part of it but not all of it. Your "decision" belongs to a greater system that created it, and your brain as well!

A rare good-hearted, well-educated human is always an easy "choice" (from the point of view of a remote observer). The avg human is actually detrimental to the planet, even if not exactly a terrible person. A "good" individual can have a greater overall impact simply by inspiring others constantly every day instead of agitating them. The avg ahole out there will create amazing damage over a lifetime.

That is a sad reality, and true throughout all of evolution even. There is, however, clearly a point where saving the bigger number of avg people (read avg as in unknowns), even people with "bad" qualities, is the better choice. After all, mankind is a net plus-EV game of big numbers landing rare victories through individuals and the system that "made" them, precisely because multi-body interactions build wisdom over time and the future turns things around, even from bad beginnings. So saving all the Taliban is better than saving me. So don't save me then.

Make sure when you save one to tell them that they are now playing for two, and so are you!


PS: Any group that includes Trump and Putin is an easy choice too (as the reject). You get extra bonus points from "heaven" lol.

Last edited by masque de Z; 03-24-2017 at 05:41 AM.
03-24-2017 , 06:20 AM
See it as a simple probability model too. Let's say the exceptional individual has chance p1 of doing something remarkable that benefits the system by living, and a random person has chance p.

Then for a group of N, ignoring nonlinear interactions among them (which a better model of reality could not truly ignore), the chance of the great thing happening among the many is 1 - (1-p)^N, and clearly

1 - (1-p)^N > p1 for N large enough, namely N > log(1-p1)/log(1-p) (taking logs flips the inequality because log(1-p) is negative).

E.g. if someone has chance 0.01 of making a miracle "gift" to mankind and the avg human 0.000001, then you need N > 10050 before choosing the many to live.

If it is p1 = 0.001 vs p = 0.0001, then N > 10.

If p1 = 0.5 and p = 0.1, then N > 6.

If p1 = 0.5 and p = 0.3, then N >= 2 is already the better choice to save, for example.
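
For concreteness, here is a minimal Python sketch of the break-even calculation above, under the post's assumption of independent, identical chances p for each of the N average people (the function name and the printout are just illustrative):

Code:
import math

def breakeven_threshold(p1, p):
    """Value that N must exceed for a group of N average people, each with
    independent chance p of the remarkable outcome, to beat one exceptional
    person with chance p1, i.e. 1 - (1 - p)**N > p1."""
    return math.log(1 - p1) / math.log(1 - p)

# (p1, p) pairs from the examples above.
for p1, p in [(0.01, 0.000001), (0.001, 0.0001), (0.5, 0.1), (0.5, 0.3)]:
    t = breakeven_threshold(p1, p)
    n_min = math.floor(t) + 1  # smallest integer N with 1 - (1-p)**N > p1
    print(f"p1={p1}, p={p}: N must exceed {t:.2f}, so N >= {n_min}")
The printed thresholds line up with the figures quoted above; as noted next, interactions and offspring would shrink them further.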

Since there can be a number of good things a given "good" individual can deliver, and the same is true for the group of others (especially their offspring too), one has to include all of them in the calculation. One must also include nonlinear effects from multi-person interactions. So the above number becomes much smaller if we include these. E.g. 100 people surviving will have families etc. and "live" longer in terms of impact. The offspring consequence is not trivial in the calculation, but true qualities in human civilization are hard to quantify too...

The thread would be of more value if we made it more mathematical and did not necessarily target people, but other things, as selection criteria in better-understood systems.


An even better model treats mankind and the 2 choices as a combined system (not each group isolated from the world), runs forward predictions for both outcomes, and decides from those. A good person may have only a 0.1% chance of making the miracle happen on their own, but they can also increase the chance for others in their lives to make it, simply by interacting with them over time. This is why I am super confident I choose the better person easily over a small number of randoms. Being a decent human has a dramatically positive effect on the system over time. It is what saves it pretty often, actually. The world survives on very few good people lol, if you think about it, who hold the system together at critical points every day.

But a big number of randoms also has long-term positive consequences, because they have many more interactions and opportunities to eventually create wisdom for the system. Do not underestimate either the work all people offer to society in a variety of ways, even boring ones. So a big number of them can eventually cover a big number of risks better than an individual, even a good one (in a system that values humans even better).

E.g. in an early Mars colony, losing 1 very good member, even the best (unless all lives depend on that one somehow, but that is poor mission planning), is better than losing 5 avg ones if the colony has only, say, 20-30 members.

Last edited by masque de Z; 03-24-2017 at 06:34 AM.
03-24-2017 , 07:39 AM
People shouldn't be hanging around on train tracks. Maybe they failed the IQ test of life, and we should let them cull themselves from the herd.
03-24-2017 , 06:06 PM
Definitely depends on the degree of difference in attributes and how much more likely x is to have a meaningful impact on humanity than the y's.

The interesting question would be at what threshold saving the one x is no longer the better choice. How many y's would there need to be? What percentage of the population would the y's have to make up before one decides that not saving the x is worth it... 1/4 of the population? 1/5 of the population? Etc.

Basically, how much of our population would an individual trade for 1 x, assuming the 1 x has a much higher probability of having a meaningful impact on humanity?

Still, the degree of difference in attributes between x and y is extremely important and will greatly impact the choice depending on how many y's are being traded for the x...

Last edited by Wealth$; 03-24-2017 at 06:29 PM.
03-25-2017 , 02:56 AM
I'm saving the 2.

Sorry for not providing a lengthy answer.
03-25-2017 , 07:37 AM
Quote:
Originally Posted by Wealth$
Definitely depends on the degree of difference in attributes and how much more likely x is to have a meaningful impact on humanity than the y's.
9 people in wheelchairs
8 able bodied people

Which group do you save?
03-25-2017 , 08:29 AM
Women and children first IMO.

03-25-2017 , 08:42 AM
I'd love to see an intellectual justification for female lives being saved before male lives. Or for the idea that a man should meekly accept his impending death so that a woman he doesn't know might live.

Youth vs age at least makes sense as an argument (the young have much life left to live; the old have had a decent run of life already), but women before men?
03-25-2017 , 09:13 AM
I would probably be frozen with indecision and end up getting run over myself.
03-25-2017 , 11:19 AM
Quote:
Originally Posted by ToothSayer
I'd love to see an intellectual justification for female lives being saved before male lives. Or for the idea that a man should meekly accept his impending death so that a woman he doesn't know might live.

Youth vs age at least makes sense as an argument (the young have much life left to live; the old have had a decent run of life already), but women before men?
Boobs.
03-25-2017 , 11:22 AM
Boobs aren't much good when you're dead.

Also, why are granny boobs included? Surely granny boobs aren't worth much. Probably negative net worth.

I declare your attempted intellectual justification for this horrific sexism a
