Quote:
Originally Posted by David Sklansky
Although I am not an atheist I think you were talking to me. Why did you think I would disagree with you.
I haven't followed your views too closely, so my reply wasn't really aimed at how I thought you in particular would think. I thought people might disagree with me because, to the extent that I keep up with atheist blogs, public-access call-in shows, books, New Atheists in pop culture, etc., there seems to be a lot of hand-waving to the effect that morality is not subjective, and I am essentially claiming that it is.
But as someone else in the thread pointed out, we don't have to go to that level; we can discuss morality on a practical level.
If we accept that morality is essentially arbitrary, though perhaps consistently derived from a set of axioms, I can only assume that the X% of people who would not feel comfortable diverting the train -- or, in your new example, who would let 5 people die instead of 1 -- must hold an axiom implying that it is better to allow 5 people (or 500, or all of humanity) to die through inaction than to cause one person to die through your own action.
I don't think you can criticize that position for being unreasonable or illogical, since it is axiomatic. Many people seem to genuinely feel this way; I suppose the point of all these discussions is to figure out whether they would acknowledge holding this kind of axiom, or whether their expressed choice (don't divert the train) actually conflicts with their stated moral principles.
Sort of like how people have an asymmetrical tolerance for winning and losing, even when the same amount of money is involved. Maybe that isn't rational, maybe it is based purely on emotion, but it is universally a leak in someone's game to react more strongly to losing 3 buy-ins in a session (say, by quitting, even if it is a good game) than they would to winning 3 buy-ins.
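The asymmetry above can be sketched as a toy value function. This is just my own illustration of loss aversion, and the coefficient 2.25 is a commonly cited illustrative value, not anything established in this thread:

```python
# Toy sketch: losses "feel" larger than equivalent wins.
# LAMBDA is a hypothetical loss-aversion coefficient for illustration.

def felt_value(buyins):
    LAMBDA = 2.25  # a loss stings ~2.25x as much as a win of equal size
    return buyins if buyins >= 0 else LAMBDA * buyins

print(felt_value(3))   # winning 3 buy-ins feels like +3
print(felt_value(-3))  # losing 3 buy-ins feels like -6.75
```

The leak, on this toy model, is that the emotional response (-6.75 vs +3) diverges from the monetary EV (-3 vs +3), and decisions like quitting a good game track the former.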
But then again, it is easy to evaluate things like that in poker, because in poker we know the object is to win money. With moral questions, we first have to decide what the rules are and what the objective is, and only then can we decide the best course of action.
I know I'm rambling, but it seems pretty clear: either you have some moral axiom prohibiting your own direct action (as distinct from inaction) from causing harm, and you rank it above the general axiom that it is good for people not to get killed, OR you can't justify not saving the people.
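The either/or above amounts to a lexicographic ordering of axioms. Here is a toy model of that structure -- purely my own illustration of how such a ranking would decide the trolley case, not a claim about what anyone actually believes:

```python
# Toy model: compare outcomes lexicographically. First minimize deaths
# caused by your own direct action; only then minimize total deaths.
# Each outcome is (deaths_caused_by_my_action, total_deaths).

def rank(outcome):
    # Tuple comparison in Python is already lexicographic, so the
    # "don't cause harm by direct action" axiom dominates the
    # "fewer deaths is better" axiom.
    return outcome

divert = (1, 1)      # I act: 1 death caused by me, 1 death total
do_nothing = (0, 5)  # I don't act: 0 caused by me, 5 deaths total

choice = min([divert, do_nothing], key=rank)
print(choice)  # (0, 5) -- this axiom ordering prefers letting 5 die
```

Someone who ranks the axioms the other way (total deaths first) gets (1, 1) instead, which is the whole disagreement in miniature.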
But maybe I should study some of the arguments made by people who have thought about all this for more than 3 minutes, so I'm not totally talking out of my ass?