Question about Trolley problems

10-01-2009 , 09:30 AM
Quote:
Originally Posted by durkadurka33
You'd have to be VERY careful how you justified this, though. The 1 person would have to be some super genius or special person that would provide >5x the happiness of the 5 people being killed.

What you said was pretty much an empty tautology if that's all you meant. If we assume ceteris paribus that the 5 people and the 1 fat guy are effectively equal, then you're wrong. There's no 'cost' to justifying the killing of the one to save the many...which is why utilitarianism is viewed as 'cold.' They'd make the move without a second thought.
That the one person could be "better" or "more valuable" than the 5 others doesn't even have to be among the considerations needed to take my statement as true. You're never going to get agreement on a metric for such things anyway, so it doesn't really matter.

The reason utilitarianism is portrayed as cold is that it is usually either misunderstood or willfully misrepresented, as in your post.

It doesn't take brilliance to understand that accepting the sacrifice of innocent bystanders carries costs that are unrelated to the specific scenario at hand.

Last edited by tame_deuces; 10-01-2009 at 09:35 AM.
10-01-2009 , 10:28 AM
Quote:
Originally Posted by tame_deuces
That the one person could be "better" or "more valuable" than the 5 others doesn't even have to be among the considerations needed to take my statement as true. You're never going to get agreement on a metric for such things anyway, so it doesn't really matter.

The reason utilitarianism is portrayed as cold is that it is usually either misunderstood or willfully misrepresented, as in your post.

It doesn't take brilliance to understand that accepting the sacrifice of innocent bystanders carries costs that are unrelated to the specific scenario at hand.
+1

I don't have On Liberty on hand, but Mill has a great quote in it where he says something like "I regard utility as the ultimate appeal on all ethical questions; but it must be utility in the largest sense, grounded on the permanent interests of man as a progressive being."

It's funny because, to take the stance that numbers of people don't matter (or matter less than usual) in this trolley problem, you're basically saying that the costs of taking a numbers approach outweigh the benefits... which is, to me, essentially just a more nuanced application of utilitarianism.
10-01-2009 , 06:10 PM
I'm definitely not misrepresenting utilitarianism in my post.
10-01-2009 , 06:28 PM
Quote:
Originally Posted by durkadurka33
No, the point is that we have conflicting intuitions!

Deontology and utilitarianism both draw on moral intuitions. That's the source of the paradox. If you subscribe to a single system (either deontology or utilitarianism), then the paradox is immediately dissolved.

The utilitarian will say to do whatever maximizes expected utility: save as many lives as possible. So, the trolley driver must switch tracks to kill the one and save the many. Also, you should shove the one man to save the many.

The deontologist will have (usually) different answers. In the first case, the answer is unclear. I think the deontologist would say that the driver shouldn't switch tracks, since doing so may treat the one man as a means to save the lives of the many. It's REALLY unclear what a deontologist would say here. The second case is clear, though...do NOT shove the one man to save the many: you would be using him as a mere means, and that is wrong.

So, no paradox/dilemma if you're clearly a deontologist or utilitarian...the paradox comes when you have the conflicting intuitions coming from both theories.
This isn't conflicting intuitions; this is a conflict between moral intuition and moral reasoning. That's what I was trying to say. Moral reasoning is all well and good, it just doesn't really reflect much of what we actually do when we make moral decisions. Regardless of whether you are a deontologist or a utilitarian, or something else, unless you've spent some time thinking specifically about these very types of problems, and sort of molded your intuition, you are going to pick basically the same answer as every other human being on earth. And that's because we mostly act without checking whether our decision fits the moral philosophy we have stamped onto our business cards. We just go by instinct. And the human moral instinct is, for the most part, pretty universal. That's pretty interesting to me.
10-01-2009 , 06:28 PM
Quote:
Originally Posted by durkadurka33
It doesn't take much imagination to see how it could conflict.

I will say this, though, if that's really what you mean, then you're going to have to be very very careful how you define "foreseeable" as well as qualify it as "reasonably foreseeable." That's no small task. Is it REALLY foreseeable that the trolley driver would have to make this sort of decision and have ready-at-hand, when the situation arises, a well-formed decision? No, that's implausible at best...absurd at worst. I'm not saying that you're completely off the trolley tracks with your idea...but it's a lot more difficult than you seem to think.
I don't want to get off on the wrong foot with you, but "It doesn't take much imagination to see..." is kind of a douchey thing to say when someone asks a simple question prefaced by an admission of general ignorance of academic philosophy. Also, is there a requirement that, for something to be moral, a good person be able to arrive at the correct moral conclusion at a moment's notice? Can't we instead ask that people do their best while being guided by some basic principles (such as: harm that befalls someone who had a reasonable chance of avoiding it is preferable to harm forced upon, or befalling, someone who did not have that same opportunity of avoidance), and, when someone falls short of an optimal decision in an honest attempt to apply the principle, choose to educate rather than punish the transgressor?
10-01-2009 , 06:29 PM
Quote:
Originally Posted by durkadurka33
You'd have to be VERY careful how you justified this, though. The 1 person would have to be some super genius or special person that would provide >5x the happiness of the 5 people being killed.

What you said was pretty much an empty tautology if that's all you meant. If we assume ceteris paribus that the 5 people and the 1 fat guy are effectively equal, then you're wrong. There's no 'cost' to justifying the killing of the one to save the many...which is why utilitarianism is viewed as 'cold.' They'd make the move without a second thought.
This is why I'm a utilitarian. It lets me do whatever I want. Pretty sweet moral philosophy.

But yeah, I agree with tame. The best way to illustrate it is, ironically, by using the thought experiment that most people think is extremely damaging to utilitarians. Namely, the organ harvest one. It is quite obvious to me that the utility gained by sacrificing one person in order for 5 to live doesn't even remotely outweigh the costs.
10-01-2009 , 06:37 PM
Quote:
Originally Posted by durkadurka33
I'm definitely not misrepresenting utilitarianism in my post.
By stating that "the 1 person would have to be some super genius or special person that would provide >5x the happiness of the 5 people being killed" for the utilitarian to not change the tracks, you definitely have.

This notion of 'happiness' is not necessarily confined to the people immediately involved in the situation, such that a utilitarian can easily conclude that changing the tracks is ethically improper, even if the life saved is a bum and the five killed are Nobel Prize winners. Without a deeper understanding of utility, the utilitarian as you're painting him would be going around harvesting healthy people's organs to save those in need, especially if the person in need of a vital organ is "some super genius or special person."

Edit: Credit to the people that beat me to the post lol... the organ example is very standard and good.
10-04-2009 , 02:52 PM
To hopefully clarify what others are saying: Act utilitarianism 'fails' at trolley problems. Rule utilitarianism does not.

In a vacuum where there are no other consequences, a utilitarian would be forced to push the fat man. (I actually wouldn't have a problem with this if we truly were in a vacuum.) But clearly, any society where people are killing each other constantly is going to have lower overall utility, because people can't live their lives without the constant fear of being killed. Thus, you can't do it.

One interesting question might be: assuming everyone else followed the rules and didn't kill people in similar scenarios, and assuming my actions would not be made known to others (the man would be thought to have died in an accident), would I then be forced to push the fat man? I think likely so.
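
Just to make the arithmetic explicit, here's a toy sketch in Python. Every number in it is invented purely for illustration; the only point is that the act-utilitarian sum flips sign once you charge the act with a rule-level cost:

Code:
# Toy numbers, invented for illustration only.
UTILITY_PER_LIFE = 100     # hypothetical units of utility per life
RULE_EROSION_COST = 600    # hypothetical disutility from normalizing the
                           # killing of innocent bystanders

def act_delta(saved, killed):
    # In-vacuum act utilitarianism: only the lives in the scenario count.
    return (saved - killed) * UTILITY_PER_LIFE

def rule_delta(saved, killed):
    # Rule utilitarianism: breaking "don't kill bystanders" also costs
    # everyone who must now fear being sacrificed.
    return act_delta(saved, killed) - RULE_EROSION_COST

print(act_delta(5, 1))   # +400: push the fat man
print(rule_delta(5, 1))  # -200: don't push

In a true vacuum the second term disappears and you're back to pushing, which is exactly the distinction above.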
02-18-2017 , 12:16 PM
bump
02-18-2017 , 02:49 PM
I'm sure this has been asked before, but since someone bumped this very old thread does anyone know what became of durkadurka? It doesn't look like he has posted since 2012.
02-19-2017 , 03:58 AM
Quote:
Originally Posted by Philo
I'm sure this has been asked before, but since someone bumped this very old thread does anyone know what became of durkadurka? It doesn't look like he has posted since 2012.
http://forumserver.twoplustwo.com/sh...86&postcount=2
02-19-2017 , 08:49 PM
Quote:
Originally Posted by TomCowley
http://forumserver.twoplustwo.com/sh...86&postcount=2
Does this mean no one knows?
02-19-2017 , 10:00 PM
Correct.
02-19-2017 , 11:00 PM
The inside story is that he took a trip to South America and while travelling on a bus it plunged over a cliff, bounced down into a river valley, and then was washed over a waterfall. Durkadurka was the only survivor. Based on this miracle, he started his own religion and made several billion dollars which he stashed away in secret Swiss bank accounts.

He now works for Trump. As any utilitarian should.
02-20-2017 , 12:51 AM
Quote:
Originally Posted by Philo
Does this mean no one knows?
He disappeared with whatever the account/password issue was around that time and hasn't been seen since.
02-20-2017 , 01:25 AM
I missed this thread the first time, and the actual discussion is ridiculous at some points.

Google's AI car is now a perfect example of this. This is actually a real problem they are facing in programming the car:

background
- car is self-driving; the driver has no control and no way to regain control at this exact moment
- car's brakes fail
- the car is headed towards a busy crosswalk where the people walking have the right of way (so they accepted 0 risk in crossing the crosswalk when they have a green light). Say there's 5 people who would be hit by the car if it continues.
- there's 1 person on the sidewalk that the car can steer towards rather than hit the 5 people. There's nowhere else for the car to go (if you want to be a nit about it, imagine 1 person in every conceivable place the car could steer towards if it wants to avoid the crosswalk). So no matter what, the car either steers INTO 1 person, or it does nothing and lets the car hit 5 people.

issue
How do programmers deal with this issue? Either the car has to make a specific decision to hit 1 person, or the car can be instructed to make no decision and let the crosswalk people get hit.
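
For what it's worth, here is a hypothetical sketch of how a planner could encode that choice. The Option type, the casualty estimates, and the swerve_penalty knob are all invented for illustration; this is not how Google actually programs the car:

Code:
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    expected_casualties: float
    is_active_swerve: bool

def choose(options, swerve_penalty=0.0):
    # swerve_penalty = 0 gives the pure act-utilitarian body count;
    # a large penalty biases the car toward "make no decision".
    def cost(opt):
        return opt.expected_casualties + (swerve_penalty if opt.is_active_swerve else 0.0)
    return min(options, key=cost)

options = [
    Option("continue into crosswalk", expected_casualties=5.0, is_active_swerve=False),
    Option("steer onto sidewalk", expected_casualties=1.0, is_active_swerve=True),
]
print(choose(options).name)                       # utilitarian: "steer onto sidewalk"
print(choose(options, swerve_penalty=10.0).name)  # deontological tilt: "continue into crosswalk"

Note that "make no decision" is still a policy: it's just the choice you get when the penalty on actively swerving is large enough.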
02-20-2017 , 01:30 AM
Quote:
Originally Posted by TomCowley
He disappeared with whatever the account/password issue was around that time and hasn't been seen since.
This makes me have much more respect for him than I had during my arguments with Durka.
02-20-2017 , 01:34 AM
Quote:
Originally Posted by UpHillBothWays
I missed this thread the first time, and the actual discussion is ridiculous at some points.

Google's AI car is now a perfect example of this. This is actually a real problem they are facing in programming the car:

background
- car is self-driving; the driver has no control and no way to regain control at this exact moment
- car's brakes fail
- the car is headed towards a busy crosswalk where the people walking have the right of way (so they accepted 0 risk in crossing the crosswalk when they have a green light). Say there's 5 people who would be hit by the car if it continues.
- there's 1 person on the sidewalk that the car can steer towards rather than hit the 5 people. There's nowhere else for the car to go (if you want to be a nit about it, imagine 1 person in every conceivable place the car could steer towards if it wants to avoid the crosswalk). So no matter what, the car either steers INTO 1 person, or it does nothing and lets the car hit 5 people.

issue
How do programmers deal with this issue? Either the car has to make a specific decision to hit 1 person, or the car can be instructed to make no decision and let the crosswalk people get hit.
Yep, totally odd that a discussion taking place in 2009 about trolley problems didn't talk much about self-driving cars.

Fortunately, self-driving cars don't have to worry about trolley problems. They can be programmed to completely avoid being in such situations.
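
As a rough sketch of that "avoid the situation entirely" idea: cap the car's speed so that it can always stop within the distance its sensors can see to be clear. The deceleration and reaction-time figures below are invented, and of course the brake failure in the quoted scenario defeats exactly this kind of guarantee:

Code:
import math

def max_safe_speed(clear_distance_m, decel_mps2=6.0, reaction_s=0.2):
    # Largest v satisfying v*reaction + v^2/(2*decel) <= clear_distance,
    # i.e. the positive root of (1/(2*decel))*v^2 + reaction*v - clear_distance = 0.
    a = 1.0 / (2.0 * decel_mps2)
    b = reaction_s
    c = -clear_distance_m
    return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

print(round(max_safe_speed(60.0), 1))  # ~25.7 m/s given 60 m of clear view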
02-20-2017 , 02:33 AM
Quote:
Originally Posted by TomCowley
He disappeared with whatever the account/password issue was around that time and hasn't been seen since.
I see, thanks. I suspect he may have been disillusioned by the woeful academic job market.
02-22-2017 , 01:06 AM
Rumor has it that he and madnak spend all their time texting arguments about free will back and forth.

PairTheBoard