
Moral Battle Robots for War

PostPosted: Tue Nov 25, 2008 4:27 pm
by Penguin
Paging Philip K Dick, Horselover Fat, come in...Second variety? Moral?

http://www.nytimes.com/2008/11/25/scien ... ?_r=2&8dpc


ATLANTA — In the heat of battle, their minds clouded by fear, anger or vengefulness, even the best-trained soldiers can act in ways that violate the Geneva Conventions or battlefield rules of engagement. Now some researchers suggest that robots could do better.

“My research hypothesis is that intelligent robots can behave more ethically in the battlefield than humans currently can,” said Ronald C. Arkin, a computer scientist at Georgia Tech, who is designing software for battlefield robots under contract with the Army. “That’s the case I make.”

Robot drones, mine detectors and sensing devices are already common on the battlefield but are controlled by humans. Many of the drones in Iraq and Afghanistan are operated from a command post in Nevada. Dr. Arkin is talking about true robots operating autonomously, on their own.

He and others say that the technology to make lethal autonomous robots is inexpensive and proliferating, and that the advent of these robots on the battlefield is only a matter of time. That means, they say, it is time for people to start talking about whether this technology is something they want to embrace. “The important thing is not to be blind to it,” Dr. Arkin said. Noel Sharkey, a computer scientist at the University of Sheffield in Britain, wrote last year in the journal Innovative Technology for Computer Professionals that “this is not a ‘Terminator’-style science fiction but grim reality.”

He said South Korea and Israel were among countries already deploying armed robot border guards. In an interview, he said there was “a headlong rush” to develop battlefield robots that make their own decisions about when to attack.

“We don’t want to get to the point where we should have had this discussion 20 years ago,” said Colin Allen, a philosopher at Indiana University and a co-author of “Moral Machines: Teaching Robots Right From Wrong,” published this month by Oxford University Press.

Randy Zachery, who directs the Information Science Directorate of the Army Research Office, which is financing Dr. Arkin’s work, said the Army hoped this “basic science” would show how human soldiers might use and interact with autonomous systems and how software might be developed to “allow autonomous systems to operate within the bounds imposed by the warfighter.”

“It doesn’t have a particular product or application in mind,” said Dr. Zachery, an electrical engineer. “It is basically to answer questions that can stimulate further research or illuminate things we did not know about before.”

And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work “is ultimately in the hands of the commander in chief, and he’s obviously answerable to the American people, just like we are.”

In a report to the Army last year, Dr. Arkin described some of the potential benefits of autonomous fighting robots. For one thing, they can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness, Dr. Arkin wrote, and they can be made invulnerable to what he called “the psychological problem of ‘scenario fulfillment,’ ” which causes people to absorb new information more easily if it agrees with their pre-existing ideas.
Rest at link.

I think there is already a good code for robots.
That penned by Asimov.
A robot shall not cause harm to a living being, nor allow harm to come to a living being through inaction. If I remember it right.

PostPosted: Tue Nov 25, 2008 4:30 pm
by Penguin
I so hope something like this happens ...

http://blog.wired.com/defense/2007/10/r ... on-ki.html

"Robot Cannon Kills 9, Wounds 14

We're not used to thinking of them this way. But many advanced military weapons are essentially robotic -- picking targets out automatically, slewing into position, and waiting only for a human to pull the trigger. Most of the time. Once in a while, though, these machines start firing mysteriously on their own. The South African National Defence Force "is probing whether a software glitch led to an antiaircraft cannon malfunction that killed nine soldiers and seriously injured 14 others during a shooting exercise on Friday."

SA National Defence Force spokesman brigadier general Kwena Mangope says the cause of the malfunction is not yet known...

Media reports say the shooting exercise, using live ammunition, took place at the SA Army's Combat Training Centre, at Lohatlha, in the Northern Cape, as part of an annual force preparation endeavour.

Mangope told The Star that it “is assumed that there was a mechanical problem, which led to the accident. The gun, which was fully loaded, did not fire as it normally should have,” he said. “It appears as though the gun, which is computerised, jammed before there was some sort of explosion, and then it opened fire uncontrollably, killing and injuring the soldiers.” [More details here -- ed.]

Other reports have suggested a computer error might have been to blame. Defence pundit Helmoed-Römer Heitman told the Weekend Argus that if “the cause lay in computer error, the reason for the tragedy might never be found."


More fun stuff at the link:
ALSO:
* Roomba-Maker unveils Kill-Bot
* New Armed Robot Groomed for War
* Armed Robots Pushed to Police
* Armed Robots Go Into Action
* Cops Demand Drones
* First Armed Robots on Patrol in Iraq
* Unmanned "Surge": 3000 More Robots for War
* Taser-Armed 'Bot Ready to Zap Pathetic Humans
* Top War Tech #5: Talon Robots
* More Robot Grunts Ready for Duty
* Israel's Killer 'Bot: Safe Enough for War?
* Inside the Baghdad Bomb Squad

Moral fucks.

PostPosted: Tue Nov 25, 2008 4:41 pm
by NeonLX
[image]

{SHRUG} I dunno.

PostPosted: Tue Nov 25, 2008 4:43 pm
by Penguin
Some who have studied the issue worry, as well, whether battlefield robots designed without emotions will lack empathy. Dr. Arkin, a Christian who acknowledged the help of God and Jesus Christ in the preface to his book “Behavior-Based Robotics” (MIT Press, 1998), reasons that because rules like the Geneva Conventions are based on humane principles, building them into the machine’s mental architecture endows it with a kind of empathy.


Wtf...Jesus built my moral killing machine?
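For what it's worth, Arkin's proposal as the article describes it boils down to encoding rules like the Geneva Conventions as hard constraints in the control architecture: any triggered rule vetoes the lethal action. A toy sketch of that veto-layer idea in Python follows; every name here is a made-up illustration, not anything from his actual system.

```python
# Toy sketch of "rules of engagement as hard constraints": a lethal action
# is permitted only if no encoded rule vetoes it. Hypothetical names only.
from dataclasses import dataclass


@dataclass
class Target:
    is_combatant: bool
    near_protected_site: bool  # e.g. a hospital or school


def roe_vetoes(target: Target) -> list[str]:
    """Return the encoded rules that forbid engaging this target."""
    vetoes = []
    if not target.is_combatant:
        vetoes.append("noncombatant")
    if target.near_protected_site:
        vetoes.append("protected site")
    return vetoes


def may_engage(target: Target) -> bool:
    # The governor is purely a veto layer: any triggered rule blocks the act.
    return not roe_vetoes(target)


print(may_engage(Target(is_combatant=True, near_protected_site=False)))   # True
print(may_engage(Target(is_combatant=False, near_protected_site=False)))  # False
```

Whether a list of if-statements amounts to "a kind of empathy" is, of course, exactly what the skeptics in the thread are doubting.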

Re: Moral Battle Robots for War

PostPosted: Tue Nov 25, 2008 4:57 pm
by Luposapien
And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work “is ultimately in the hands of the commander in chief, and he’s obviously answerable to the American people, just like we are.”


Not precisely on topic, but what planet does this fellow hail from?

PostPosted: Tue Nov 25, 2008 5:13 pm
by brekin
God, can you only imagine the tech support calls of the future?
"Yeah, so we upgraded a drone to Vista 14.1.3, rebooted, he came up with the blue screen of death in his eyes and now he's on a murderous rampage throughout the city. Please advise."

Thought this was appropriate:

[image]

PostPosted: Tue Nov 25, 2008 5:18 pm
by Penguin
I hope they add a "no military use whatsoever allowed" clause to open source licences. Wishful thinking - I bet most of those robots run embedded *nix systems.

I know some coders do add such a clause if they don't want their free code to be used in killing machines.

PostPosted: Tue Nov 25, 2008 5:52 pm
by Uncle $cam
Jesus built my moral killing machine


haha, that's pithy... would make a great band name. Excellent point about adding a "no military use whatsoever allowed" clause to open source licenses...

Gangster Computer God Worldwide Secret Containment Policy
http://www.youtube.com/watch?v=yJLhnts9-oQ

If hackers ran the world, there'd be no war--lots of accidents, maybe
- Unknown

Always keep your clothes and your weapons where you can find them in the dark
- Robert Heinlein

PostPosted: Tue Nov 25, 2008 5:58 pm
by beeline
This only brings us one step closer to our robot masters......OK, I'm gonna be the one to say it first:

HAIL ROBOTS!

Re: Moral Battle Robots for War

PostPosted: Tue Nov 25, 2008 6:44 pm
by Code Unknown
Penguin wrote:I think there is already a good code for robots.
That penned by Asimov.
A robot shall not cause harm to a living being, nor allow harm to come to a living being through inaction. If I remember it right.


Unfortunately, if the military-industrial complex can: it will. Sorry, Asimov. Good rule, though.

Luposapien wrote:
And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work “is ultimately in the hands of the commander in chief, and he’s obviously answerable to the American people, just like we are.”


Not precisely on topic, but what planet does this fellow hail from?


Seriously.

beeline wrote:This only brings us one step closer to our robot masters......OK, I'm gonna be the one to say it first:

HAIL ROBOTS!


Never that.

PostPosted: Tue Nov 25, 2008 6:48 pm
by Code Unknown
Almost more disturbing than the primary subject matter of the article:

His report drew on a 2006 survey by the surgeon general of the Army, which found that fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents. More than one-third said torture was acceptable under some conditions, and fewer than half said they would report a colleague for unethical battlefield behavior.

PostPosted: Tue Nov 25, 2008 6:56 pm
by IanEye
[url=http://store.irobot.com/corp/index.jsp]When you work at iRobot, you know you are making a difference. Whether you’re designing robots that help keep people safer in dangerous situations, getting kids excited about technology or developing the next generation of practical, affordable robots for home use, everyone at iRobot has an important part to play. Join a company where your colleagues are passionate about finding innovative solutions for everyday problems and are not afraid to be just a little bit different.

We’re looking for smart, creative, energetic talent. Every day we take on challenges that have never before been approached, and we are committed to delivering the right robots for the job.
[/url]

I drive by their headquarters from time to time. It has a sort of "Spacely's Space Sprockets" vibe about it.

PostPosted: Tue Nov 25, 2008 7:42 pm
by barracuda
The OP has a good point. How much worse could the robots be than what we've already got goin' on?

[image]

Hmm, don't answer that one.

[image]

I guess things could always be worse.

[image]

PostPosted: Tue Nov 25, 2008 7:45 pm
by Penguin
If you've read Dick's "Second Variety", you know things can be really, really much worse...especially regarding robots.

And this was a long, long time before any wussy Terminator. Dick kicked ass.

PostPosted: Tue Nov 25, 2008 8:00 pm
by jingofever
Back in the day, on one of those TV shows about robots and the men who love them, they were profiling a particular robot company and its employees. One of the guys said that his dream was that when the robots inevitably take over and hunt down the humans, the robot tasked to exterminate him would recognize that he was their creator, shake his hand, say thanks, and then blow him away. *sniff* It's beautiful.