Tuesday, June 18, 2013

TED Talk by Daniel Suarez: The kill decision shouldn't belong to a robot

"As a novelist, Daniel Suarez spins dystopian tales of the future. But on the TEDGlobal stage, he talks us through a real-life scenario we all need to know more about: the rise of autonomous robotic weapons of war. Advanced drones, automated weapons and AI-powered intelligence-gathering tools, he suggests, could take the decision to make war out of the hands of humans." 



I can see what Daniel Suarez is saying about robots taking the humanity out of war, but if we can decrease the number of casualties, maybe using drones or other robots would be a better alternative than sending humans off to war with the possibility of being killed. Soldiers shoot enemy soldiers and terrorists in war. Do the people doing the shooting really think about who they're shooting, especially when it's self-defense in wartime? I don't know; that's why I'm asking. Is it realistic to think a person would change his mind about shooting an individual who is the enemy? That probably depends on the situation, but I would suggest that in most situations our soldiers would not stop pulling the trigger.

Soldiers are put at risk in war zones, but they knowingly, in an act of courage, put themselves at risk when they sign up for the military and go off to war. Does this mean our soldiers should be put at risk needlessly if there is another, better option for fighting the enemy, like robots or drones? Do you honestly think a rogue nation like Iran, or a lawless group like Al Qaeda, is going to follow a global treaty? Should we not make use of technological advances? Is it possible for an autonomous drone or robot to differentiate between an ally and an enemy? Maybe it could be programmed to distinguish between the two, but to me that seems impossible. This is one reason I think it is unlikely that there will be fully autonomous robots in the near future, or ever. It seems to me there will always need to be a human behind the robot making the decision whether or not to kill.

Should we currently be using drones at all to assist us in war, even with a human behind the kill decision?


2 comments:

Constitutional Insurgent said...

"Do the people who are shooting really think about who they're shooting, especially in a situation where it is self defense in wartime?"

Not at the time...but we certainly think about it after.

We use certain passive identification, determined by electronic means, to identify friend vs. foe...but that's only good if the ID is present and visible. Taking humans out of the loop will increase the odds of fratricide, without a doubt. Computers have also proven a poor substitute for that innate sense that something isn't quite right, and for making an on-the-fly decision.

That said, we should take advantage of our technology...as long as the override capability is ever present.

Teresa said...

Thanks for your comment, Constitutional Insurgent.

Soldiers thinking about the shooting afterward makes sense. That's probably one of the underlying issues that causes vets to develop PTSD.

For some reason I have mixed feelings about taking advantage of certain technologies.