Robot ethics in times of war

04-03-2008 , 09:49 AM
I created this thought experiment, in which I bump into some fundamental questions. I hope you can tell me what you think about it.

Picture, in the near future, a military base on enemy ground. The Americans and Russians are to work together to defend this compound and plan their joint missions from there. Both the Russians and the Americans have brought a robot unit with guns mounted on it. For this experiment you are free to imagine these robotic units as kickass as you want.

Dusk starts to set in. Two soldiers are placed in guard towers and the robot units are placed on the lawn with their guns facing the perimeter.

The general of the base has given the same orders to the soldiers and the robot units, namely:
  • Fire 4 warning shots at any intruders
  • Then shoot to kill any intruders
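For concreteness, the robots' side of these orders could be imagined as something like the following (a purely hypothetical Python sketch; every name is made up, and nothing here is real weapons software):

```python
# Hypothetical sketch of the general's standing orders as a robot's
# firing routine. All names are invented for this thought experiment.
def respond_to_intruder(warning_shots_fired):
    """Return the action the orders dictate for the next shot."""
    WARNING_SHOTS = 4
    if warning_shots_fired < WARNING_SHOTS:
        return "warning_shot"
    return "shoot_to_kill"

# Note the orders leave no branch for "the intruders are civilians":
first_six_actions = [respond_to_intruder(n) for n in range(6)]
```

The point of the sketch is that, as written, the orders contain no test for who the intruders are, only a shot counter.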

In the distance a large group of people is approaching fast. When they near the fence they become visible to the eyes of the soldiers and the scanners of the robots. It looks like a large group of men, women and children being chased out of a nearby village by an enemy tank. They are clearly civilians seeking refuge and protection.

As the first of the group make it over the fence, the soldiers and the robot units all start firing warning shots and flares.

This does not deter the group, because with the tank closing in they can't go back anymore. They run onwards... then this happens:

- Soldier 1: The Russian soldier bites his tongue and starts firing live rounds.
- Soldier 2: The American soldier thinks of how his mother used to call him a good boy and keeps firing warning shots, thereby disobeying a direct order.
- Unit 1: The Russian robot hits an escape clause written into its code by a rogue programmer (triggered when the robot scans midgets or children), so it stops firing altogether and returns a malfunction error.
- Unit 2: The American robot starts firing live rounds into the intruding crowd.
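Read as software, the gap between the two units could be as small as one injected guard clause (again a hypothetical sketch, not real code from any actual system):

```python
# Hypothetical firing logic for the two units. Unit 1 carries the rogue
# programmer's escape clause; Unit 2 runs the orders as given.
def unit_fire_decision(target_profile, rogue_clause=False):
    if rogue_clause and target_profile == "child":
        # The escape clause halts the unit and reports a fault, so the
        # refusal to fire looks like an ordinary breakdown.
        return "malfunction_error"
    return "fire_live_rounds"

unit_1 = unit_fire_decision("child", rogue_clause=True)   # stops, reports error
unit_2 = unit_fire_decision("child", rogue_clause=False)  # fires into the crowd
```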

None of the civilians survive. When the enemy tank comes within range, both robot units fire a missile that obliterates it.

A journalist in the area has the whole encounter on tape and hands it over to a war tribunal. Justice is demanded.

Who is morally at fault for this encounter? Who is punishable under the laws of war?

Do you think nobody is to blame, morally or legally, in times of war?
Are the manufacturers of the robot units to blame?
The mothers of the soldiers?
Only the general?

What is your opinion on this?
04-03-2008 , 12:37 PM
First and foremost, the moron who placed a defensive compound on enemy territory.

The following are my opinions on the issue:

Well, jokes aside. If one suspects there could be actual threats amongst the crowd, then opening fire is acceptable. However, both for the civilians' sake and for the future of the troops, other options should be weighed considerably. If the orders do not allow for such a thing in a situation like this, then clearly command has royally messed up.

And yes, war is a grisly thing, btw.

Once you're saying 'they're clearly civilians', you have really answered the question. If there is not a shred of doubt that there are no threats within the crowd, then opening fire is clearly wrong. Warning shots and crowd control are fine; you don't really want a bunch of civilians running around creating havoc - odds are it would make the situation even more dire for them. Knowing that for certain is, however, not very realistic in a situation like this.

Depending on your view of war, the whole scenario may of course be immoral to start with for every soldier/robot/manufacturer involved.

Last edited by tame_deuces; 04-03-2008 at 12:47 PM.
04-03-2008 , 01:12 PM
A base in enemy territory with advanced robots defending the perimeter, but no surveillance and reconnaissance ability to determine who is running at the base or to detect that they are being chased by an enemy tank? Oh, and this enemy tank is making a run directly at this US base with no regard for itself, in a mad dash to kill women and children?

You really spent time thinking about this scenario?

BTW, you might be interested to know that the US is developing non-lethal crowd control measures such as the active denial system (google ray gun) to deal with situations where friend and foe cannot easily be distinguished, but keeping them away from a checkpoint or perimeter is important.
04-03-2008 , 01:28 PM
Quote:
Originally Posted by yossarian lives
BTW, you might be interested to know that the US is developing non-lethal crowd control measures such as the active denial system (google ray gun) to deal with situations where friend and foe cannot easily be distinguished, but keeping them away from a checkpoint or perimeter is important.
Nice, thanks for the keywords. If this thing lives up to its intentions, it could be a complete revolution.
04-03-2008 , 01:54 PM
Ok, if you are having trouble imagining this, I'm sorry. Please feel free to make it fit into a plausible scenario where both soldiers and robotic units get the same orders.

The scenario was loosely based on an enclave of Dutch U.N. soldiers in Bosnia.

Do you think soldier 2 (disobeyed orders) and unit 1 (disobeyed orders) should be tried the same way? Is unit 1 responsible for its own actions, or should every robotic unit be appointed a legal 'guardian' who takes the brunt?
04-03-2008 , 02:01 PM
Let's just skip any moral discussion and pretend there IS a trial and that the shooting was a huge error, which will make the question much easier:

The answer would depend on the robot. Sentient/advanced neural-network machines could very well stand trial for their own actions without much problem. With more simplistic machinery you would have to look elsewhere.

As for fair distribution of responsibility: command should bear the brunt of the burden, with punishments distributed as one sees fit to the individuals on the receiving end of orders. It is hard to say no to orders, even bad ones (there is a lot of material on this issue, so this is a solid statement), and that leaves officers with a greater degree of responsibility.
04-03-2008 , 02:24 PM
Blame is a joke. It's very easy to come up with situations in which blame makes no sense, because blame makes no sense in the first place. It's probably part of human biology, but it's not logical.

With that said, in this case blame makes some degree of sense because it can provide incentive to do better next time, eliminate dangerous, irresponsible, or unethical elements, and so on.

Who is at fault? The soldier who fires, the general giving the order, and either the programmers, the design team, or someone else in control of the robots. We don't know what happens behind the scenes in terms of robot programming and deployment, so it's hard to say. If the policy of the entire army is to leave tricky ethical decisions to robots incapable of anything more sophisticated than a simple "if/then" clause, the policy is borked and whoever invented it is responsible.

Obviously if robots are incapable of thought, much less judgment, then there must exist human over-rides and the robots must be monitored constantly while deployed. You don't leave a machine 100% incapable of understanding the consequences of its actions to unilaterally make important decisions.
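A human-in-the-loop design of the kind I mean could be as crude as this (a hypothetical sketch, with all names invented):

```python
# Hypothetical sketch of a monitored robot that cannot escalate to
# lethal force on its own authority.
def engage(threat_detected, human_authorized_lethal):
    if not threat_detected:
        return "hold"
    if human_authorized_lethal:
        return "lethal_force"
    # Without an explicit human decision, the machine is limited to
    # non-lethal measures and escalates the call to its operator.
    return "warn_and_alert_operator"
```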
04-03-2008 , 02:26 PM
Quote:
Originally Posted by madnak
Obviously if robots are incapable of thought, much less judgment, then there must exist human over-rides and the robots must be monitored constantly while deployed. You don't leave a machine 100% incapable of understanding the consequences of its actions to unilaterally make important decisions.
One thing popped into my head when I read this: where do you fit mines?

Just curious, I know they're not robots but I still thought it was a good question.
04-03-2008 , 02:30 PM
Quote:
Do you think soldier 2 (disobeyed orders) and unit 1 (disobeyed orders) should be tried the same way? Is unit 1 responsible for its own actions, or should every robotic unit be appointed a legal 'guardian' who takes the brunt?
Blame never has use except when applied to agents. These robots sound like they have extremely simplistic programming and are not agents.

Sentient robots thinking in the same terms as human beings should be tried on the same basis as human beings, but with consideration given to their unique situations. (Do they feel emotions? If not, then "punishment" can have no effect. Can they be reprogrammed? If so, then that is a better solution than destruction or "life" imprisonment. Is it possible to assign some kind of negative feedback or incentive to their processes, preventing them from making this mistake again? The ability to "learn" ethics with 100% reliability should exempt the machines from moral consideration. Etc.)

But neither the unit nor the soldier who disobeyed orders should be tried. Only those that followed the orders. The idea of "follow any order, no matter what" is absurd, and obviously unjust or absurd orders should not be followed. But the general giving the orders/programming should receive the bulk of the accountability.
04-03-2008 , 02:32 PM
Quote:
Originally Posted by tame_deuces
One thing popped into my head when I read this; Where do you fit mines?

Just curious, I know they're not robots but I still thought it was a good question.
Okay, okay. Sometimes exceptions may have to be made. War is war, after all. And personnel may be limited. But I think this is a good example - a simplistic robot is very similar to a mine. And it's equally reasonable to subject it to blame as it is to subject a mine to blame. Obviously if a bunch of innocent villagers run over a minefield, they'll die (unless they're extremely lucky). That's not the fault of the mines; it's the fault of the person who set them.

Of course, in some cases it may be the fault of the villagers - I mean, they did run into the minefield/base, so it's hardly My Lai or anything.
04-03-2008 , 03:26 PM
Quote:
Originally Posted by 46:1
Do you think soldier 2 (disobeyed orders) and unit 1 (disobeyed orders) should be tried the same way? Is unit 1 responsible for its own actions, or should every robotic unit be appointed a legal 'guardian' who takes the brunt?
Are you seriously asking whether a robot should be tried for a war crime?

.... Okay, I'm out. Enjoy the rest of your thread.
04-03-2008 , 03:28 PM
Quote:
Originally Posted by tame_deuces
One thing popped into my head when I read this; Where do you fit mines?

Just curious, I know they're not robots but I still thought it was a good question.
(Okay, not quite out...)

Land mines should be court martialed for murder, ldo. And if found guilty, we should have a land mine prison where the land mines sit behind bars for years. We can put it right next to the robot prison where robots are making license plates.

(Now I'm really out.)
04-03-2008 , 04:52 PM
Yeah, yossarian, this was partially what I was aiming at.

If you deploy only sentient robots onto a battlefield and nobody is to blame after stuff goes wrong (the robots can't be put behind bars, and the generals/manufacturers just gave vague orders/rules), you can shift the blame away from all war crimes, just because no human soldier was involved.

To me, any robot, landmine or soldier is simply an extension, a tool, of command. Whether you set a mine to explode or order a soldier to shoot anything that moves is, to me, quite the same.
04-04-2008 , 03:12 PM
I don't really think that the soldier or robot that did the fighting was wrong at all. In fact, the rogue robot or the disobedient soldier should probably be removed from combat.

The soldiers in this case are essentially robots. They aren't supposed to use their own emotional judgment. They were given instructions that they have to follow regardless of the outcome. Just because the instructions were bad doesn't make those who followed them bad.

Doesn't following orders in the military absolve you of any blame? If your commander tells you to kill this little girl because she's a spy, and you do it, I can't possibly imagine you getting in trouble.

A bigger issue is what if a robot had been told to "get lost!" by its commander in his most severe voice and took it literally. Also, these robots are programmed so as to allow humans to be hurt through their own inaction.
04-04-2008 , 03:42 PM
Quote:
Originally Posted by PrekiGeo
I don't really think that the soldier or robot that did the fighting was wrong at all. In fact, the rogue robot or the disobedient soldier should probably be removed from combat.

The soldiers in this case are essentially robots. They aren't supposed to use their own emotional judgment. They were given instructions that they have to follow regardless of the outcome. Just because the instructions were bad doesn't make those who followed them bad.

Doesn't following orders in the military absolve you of any blame? If your commander tells you to kill this little girl because she's a spy, and you do it, I can't possibly imagine you getting in trouble.

A bigger issue is what if a robot had been told to "get lost!" by its commander in his most severe voice and took it literally. Also, these robots are programmed so as to allow humans to be hurt through their own inaction.
You are very wrong.

Every American serviceman is required to take annual Law of Armed Conflict (LOAC) training, which would show your example to be a war crime. Servicemen are responsible for their actions to a certain degree. It is their DUTY to disobey unlawful orders (which is why they receive LOAC training each year: to know the difference between shooting a combatant and shooting, in cold blood, a little girl who no longer poses a threat).

From http://usmilitary.about.com/cs/wars/a/loac.htm

Quote:
Three important LOAC principles govern armed conflict—military necessity, distinction, and proportionality.

...

Military necessity: Military necessity requires combat forces to engage in only those acts necessary to accomplish a legitimate military objective. Attacks shall be limited strictly to military objectives. In applying military necessity to targeting, the rule generally means the United States Military may target those facilities, equipment, and forces which, if destroyed, would lead as quickly as possible to the enemy’s partial or complete submission.

...

Distinction: Distinction means discriminating between lawful combatant targets and noncombatant targets such as civilians, civilian property, POWs, and wounded personnel who are out of combat. The central idea of distinction is to only engage valid military targets. An indiscriminate attack is one that strikes military objectives and civilians or civilian objects without distinction. Distinction requires defenders to separate military objects from civilian objects to the maximum extent feasible.

...

Proportionality: Proportionality prohibits the use of any kind or degree of force that exceeds that needed to accomplish the military objective.

...

Military members who violate the LOAC are subject to criminal prosecution and punishment. Criminal prosecutions may take place in a national or international forum. In theory, US Armed Forces could be prosecuted by courts-martial under the UCMJ or through an international military tribunal, such as those used in Nuremberg and Tokyo after WWII or in Yugoslavia and Rwanda. The defense, “I was only following orders,” has generally not been accepted by national or international tribunals as a defense in war crime trials. An individual airman/soldier/sailor/marine remains responsible for his or her actions and is expected to comply with the LOAC.
04-04-2008 , 04:38 PM
Quote:
Originally Posted by yossarian lives
You are very wrong.

Every American serviceman is required to take annual Law of Armed Conflict (LOAC) training, which would show your example to be a war crime. Servicemen are responsible for their actions to a certain degree. It is their DUTY to disobey unlawful orders (which is why they receive LOAC training each year: to know the difference between shooting a combatant and shooting, in cold blood, a little girl who no longer poses a threat).

From http://usmilitary.about.com/cs/wars/a/loac.htm
Oh, my mistake. I didn't know that.

Why is that? Can't they order their troops to shoot anyone that poses a threat, but not civilians?

What if it is so important that even civilians must be shot if they trespass?
04-04-2008 , 06:13 PM
Quote:
Originally Posted by PrekiGeo
Oh my mistake I didn't know that.

Why is that? Can't they order their troops to shoot anyone that poses a threat, but not civilians?

What if it is so important that even civilians must be shot if they trespass?
Defending a position and giving adequate warning (signs, shots, w/e) before the use of lethal force is understandable, but that wasn't your scenario with the captured girl. There may be a need for the use of deadly force with little or no warning, but it needs to be a military necessity with little alternative, or it violates the LOAC.

As for a robot or human soldier being told to kill anything that moves in an area that is not a war zone (like the OP base), there needs to be some warning or discretion used in the application of lethal force or you violate the LOAC principle of distinction.

You can't have an automated tank at a US army base in say Missouri or even Afghanistan that blows away any car that makes a wrong exit on the highway and approaches the gate to ask for directions or is confused. There need to be appropriate warning measures. Incidentally, the military has been working on several nonlethal measures for restraint, and improved measures for warning, for this situation at checkpoints since killing innocents does more harm than good in a battle for hearts and minds as in a counterinsurgency in Iraq.
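As a rough illustration of that kind of graduated warning (purely hypothetical; these steps are invented for the example and are not any real rules of engagement):

```python
# Hypothetical escalation-of-force ladder for a checkpoint: each rung
# is tried before the next, and force stops if the approach stops.
ESCALATION = ["visual_warning", "audible_warning", "warning_shot",
              "disabling_fire", "lethal_force"]

def next_step(current_index, still_approaching):
    """Advance one rung only while the approach continues."""
    if not still_approaching:
        return None  # they turned back; no further force is justified
    return ESCALATION[min(current_index + 1, len(ESCALATION) - 1)]
```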
04-04-2008 , 06:56 PM
Ah, I didn't know there was a LOAC principle of distinction, making this scenario highly unlikely. I understand now why yossarian might find the OP absurd.

What I found peculiar in the LOAC, though, is the following sentence:
Criminal prosecutions may take place in a national or international forum

Did the http://en.wikipedia.org/wiki/America...Protection_Act (ASPA) take the 'international' out of the LOAC?

It authorizes the President to use “all means necessary and appropriate to bring about the release of any US or allied personnel being detained or imprisoned by, on behalf of, or at the request of the International Criminal Court”.

Does the LOAC still hold in Guantanamo Bay? I believe that there in Cuba they are not even considered true prisoners of war who qualify for the treatment described in the Geneva Conventions.

And torture still remains a gray area. Military necessity, distinction and proportionality seem to have eroded into meaninglessness at some black-site bases and in Afghanistan and Iraq. Right now we still reprimand and criticise those in command for giving ambiguous orders, and punish the soldiers for immoral behaviour. If torture chambers could be automated with robots, or contracted to a firm that does not operate under the LOAC, only the reprimands/critique and the (false) confessions would remain; there would be no immoral agent left to punish.

Last edited by 46:1; 04-04-2008 at 07:03 PM.
04-04-2008 , 07:09 PM
Quote:
Originally Posted by 46:1
What I found peculiar in the LOAC, though, is the following sentence:
Criminal prosecutions may take place in a national or international forum

Did the http://en.wikipedia.org/wiki/America...Protection_Act (ASPA) take the 'international' out of the LOAC?

It authorizes the President to use “all means necessary and appropriate to bring about the release of any US or allied personnel being detained or imprisoned by, on behalf of, or at the request of the International Criminal Court”.
The President may waive this and send somebody to face an international court if it is in US interests.

Quote:
Does the LOAC still hold in Guantanamo Bay? I believe that there in Cuba they are not even considered true prisoners of war who qualify for the treatment described in the Geneva Conventions.
The LOAC applies everywhere. The Geneva Conventions' provisions for POWs are not part of the LOAC as used here (although the LOAC stems from the Geneva and Hague Conventions, it is just a subset).

Quote:
And torture still remains a gray area. Military necessity, distinction and proportionality seem to have eroded into meaninglessness at some black-site bases and in Afghanistan and Iraq.
There may be abuses, but if there are, then they are criminal. Torturing innocent people for no military necessity is not legal under any circumstances, not even in this gray area.

Quote:
Right now we still reprimand and criticise those in command for giving ambiguous orders, and punish the soldiers for immoral behaviour. If torture chambers could be automated, only the reprimands/critique and the (false) confessions would remain; there would be no immoral agent left to punish.
There's always a moral agent. Robots don't build, program, and deploy themselves.
04-06-2008 , 01:19 PM
Once the civilians breach the compound, they can no longer be considered civilians but combatants. That could be written into their rules of engagement too.
The general's orders are ******ed. No grey areas. He definitely eats this one.

The soldiers can't simply let the 'civilians' onto the base immediately, because maybe they are all suicide bombers. They need to be searched. It seems like there is no time for a search, though.

Ideally the robots fire their missiles at the tank, obliterate it, and maybe the civilians calm down. With some warning shots coming at them from the direction of the base, I imagine they do.

The general will eat it hard on this one. The soldiers will probably get shat on too, but not legally, and the robot programmers may lose their contracts because they designed stupid robots that couldn't think on their feet.

Nothing precludes one's right to self-defense. The soldiers could easily claim they feared for their lives because they figured people with bombs strapped under their shirts were coming to blow them up. That wouldn't really be a bad tactic for the enemy. And we all know how crazy some of these people are, so some would be the designated bombers, and others would be designated as people to get shot at by their own guys to make it look more convincing.
04-06-2008 , 01:27 PM
Quote:
Originally Posted by PrekiGeo
Doesn't following orders in the military absolve you of any blame? If your commander tells you to kill this little girl because she's a spy, and you do it, I can't possibly imagine you getting in trouble.
This is untrue. There are such things as unlawful commands. If you're ordered to shoot your friend because he screwed up, that's an unlawful command, and you'll be done for if you follow it. You should also report it. Another example of an unlawful command is a commander having you wash his car; he can't tell you to do something for his personal gain. Yes, there's a chain of command, and yes, a lot of what you're told to do sucks, but you have to use common sense.

Before I went overseas we did at least 5 hours of rules-of-engagement scenarios in a classroom, not to mention countless months of training exercises, to try to make it clear to everyone when they could shoot.