Moral Battle Robots for War

Moderators: Elvis, DrVolin, Jeff

Do you think battle robots are a thrilling prospect?

No: 9 (50%)
Yes: 3 (17%)
Skynet goes online on (tell us): 6 (33%)

Total votes: 18

Moral Battle Robots for War

Postby Penguin » Tue Nov 25, 2008 4:27 pm

Paging Philip K. Dick, Horselover Fat, come in... Second Variety? Moral?

http://www.nytimes.com/2008/11/25/scien ... ?_r=2&8dpc


ATLANTA — In the heat of battle, their minds clouded by fear, anger or vengefulness, even the best-trained soldiers can act in ways that violate the Geneva Conventions or battlefield rules of engagement. Now some researchers suggest that robots could do better.

“My research hypothesis is that intelligent robots can behave more ethically in the battlefield than humans currently can,” said Ronald C. Arkin, a computer scientist at Georgia Tech, who is designing software for battlefield robots under contract with the Army. “That’s the case I make.”

Robot drones, mine detectors and sensing devices are already common on the battlefield but are controlled by humans. Many of the drones in Iraq and Afghanistan are operated from a command post in Nevada. Dr. Arkin is talking about true robots operating autonomously, on their own.

He and others say that the technology to make lethal autonomous robots is inexpensive and proliferating, and that the advent of these robots on the battlefield is only a matter of time. That means, they say, it is time for people to start talking about whether this technology is something they want to embrace. “The important thing is not to be blind to it,” Dr. Arkin said. Noel Sharkey, a computer scientist at the University of Sheffield in Britain, wrote last year in the journal Innovative Technology for Computer Professionals that “this is not a ‘Terminator’-style science fiction but grim reality.”

He said South Korea and Israel were among countries already deploying armed robot border guards. In an interview, he said there was “a headlong rush” to develop battlefield robots that make their own decisions about when to attack.

“We don’t want to get to the point where we should have had this discussion 20 years ago,” said Colin Allen, a philosopher at Indiana University and a co-author of “Moral Machines: Teaching Robots Right From Wrong,” published this month by Oxford University Press.

Randy Zachery, who directs the Information Science Directorate of the Army Research Office, which is financing Dr. Arkin’s work, said the Army hoped this “basic science” would show how human soldiers might use and interact with autonomous systems and how software might be developed to “allow autonomous systems to operate within the bounds imposed by the warfighter.”

“It doesn’t have a particular product or application in mind,” said Dr. Zachery, an electrical engineer. “It is basically to answer questions that can stimulate further research or illuminate things we did not know about before.”

And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work “is ultimately in the hands of the commander in chief, and he’s obviously answerable to the American people, just like we are.”

In a report to the Army last year, Dr. Arkin described some of the potential benefits of autonomous fighting robots. For one thing, they can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness, Dr. Arkin wrote, and they can be made invulnerable to what he called “the psychological problem of ‘scenario fulfillment,’ ” which causes people to absorb new information more easily if it agrees with their pre-existing ideas.
Rest at link.

I think there is already a good code for robots.
That penned by Asimov.
A robot shall not cause harm to a living being, nor allow harm to come to a living being through inaction. If I remember it right.
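
Taken half-seriously, the Three Laws amount to a strict priority ordering over constraints: don't harm, don't allow harm through inaction, obey, preserve yourself, in that order. A toy sketch of what that looks like as an action filter, in Python, with every name and the whole "predicted consequences" model purely hypothetical (nothing to do with any real robotics software):

[code]
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Action:
    """A candidate action and its predicted consequences (toy model)."""
    name: str
    harms_human: bool           # would this action injure a human?
    averts_harm_to_human: bool  # would it prevent harm a human would otherwise suffer?
    obeys_order: bool           # does it comply with a standing human order?
    preserves_self: bool        # does it keep the robot intact?

def choose_action(candidates: List[Action]) -> Optional[Action]:
    """Pick an action under a strict Asimov-style priority ordering:
    First Law (no harm / no harm through inaction) > Second (obey) > Third (self-preserve)."""
    lawful = [a for a in candidates if not a.harms_human]  # First Law, acting clause
    if not lawful:
        return None  # refuse outright: every option would harm a human
    # Lexicographic priority: avert harm first, then obedience, then self-preservation.
    return max(lawful, key=lambda a: (a.averts_harm_to_human, a.obeys_order, a.preserves_self))

if __name__ == "__main__":
    options = [
        Action("open fire",       harms_human=True,  averts_harm_to_human=False, obeys_order=True,  preserves_self=True),
        Action("stand down",      harms_human=False, averts_harm_to_human=False, obeys_order=False, preserves_self=True),
        Action("shield civilian", harms_human=False, averts_harm_to_human=True,  obeys_order=False, preserves_self=False),
    ]
    print(choose_action(options).name)  # -> "shield civilian"
[/code]

The catch, of course, is everything the sketch waves away: something has to fill in those booleans, and that something is exactly the judgment being automated.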
Last edited by Penguin on Tue Nov 25, 2008 4:39 pm, edited 1 time in total.
Penguin
 
Posts: 5089
Joined: Thu Aug 23, 2007 5:56 pm
Blog: View Blog (0)

Postby Penguin » Tue Nov 25, 2008 4:30 pm

I so hope something like this happens ...

http://blog.wired.com/defense/2007/10/r ... on-ki.html

"Robot Cannon Kills 9, Wounds 14

We're not used to thinking of them this way. But many advanced military weapons are essentially robotic -- picking targets out automatically, slewing into position, and waiting only for a human to pull the trigger. Most of the time. Once in a while, though, these machines start firing mysteriously on their own. The South African National Defence Force "is probing whether a software glitch led to an antiaircraft cannon malfunction that killed nine soldiers and seriously injured 14 others during a shooting exercise on Friday."

SA National Defence Force spokesman brigadier general Kwena Mangope says the cause of the malfunction is not yet known...

Media reports say the shooting exercise, using live ammunition, took place at the SA Army's Combat Training Centre, at Lohatlha, in the Northern Cape, as part of an annual force preparation endeavour.

Mangope told The Star that it “is assumed that there was a mechanical problem, which led to the accident. The gun, which was fully loaded, did not fire as it normally should have," he said. "It appears as though the gun, which is computerised, jammed before there was some sort of explosion, and then it opened fire uncontrollably, killing and injuring the soldiers." [More details here -- ed.]

Other reports have suggested a computer error might have been to blame. Defence pundit Helmoed-Römer Heitman told the Weekend Argus that if “the cause lay in computer error, the reason for the tragedy might never be found."


More fun stuff at the link:
* Roomba-Maker unveils Kill-Bot
* New Armed Robot Groomed for War
* Armed Robots Pushed to Police
* Armed Robots Go Into Action
* Cops Demand Drones
* First Armed Robots on Patrol in Iraq
* Unmanned "Surge": 3000 More Robots for War
* Taser-Armed 'Bot Ready to Zap Pathetic Humans
* Top War Tech #5: Talon Robots
* More Robot Grunts Ready for Duty
* Israel's Killer 'Bot: Safe Enough for War?
* Inside the Baghdad Bomb Squad

Moral fucks.
Penguin
 
Posts: 5089
Joined: Thu Aug 23, 2007 5:56 pm
Blog: View Blog (0)

Postby NeonLX » Tue Nov 25, 2008 4:41 pm

[image]

{SHRUG} I dunno.
Last edited by NeonLX on Tue Nov 25, 2008 4:43 pm, edited 1 time in total.
User avatar
NeonLX
 
Posts: 2293
Joined: Sat Aug 11, 2007 9:11 am
Location: Enemy Occupied Territory
Blog: View Blog (1)

Postby Penguin » Tue Nov 25, 2008 4:43 pm

Some who have studied the issue worry, as well, whether battlefield robots designed without emotions will lack empathy. Dr. Arkin, a Christian who acknowledged the help of God and Jesus Christ in the preface to his book “Behavior-Based Robotics” (MIT Press, 1998), reasons that because rules like the Geneva Conventions are based on humane principles, building them into the machine’s mental architecture endows it with a kind of empathy.


Wtf...Jesus built my moral killing machine?
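
The idea the quoted paragraph gestures at, encoding the rules of engagement as hard constraints sitting between targeting and trigger, is roughly what Arkin's published work describes as an "ethical governor": the rules can only forbid a proposed engagement, never generate one. A minimal sketch of that shape, with entirely made-up predicates (this is not Arkin's actual software, just an illustration of the veto-only structure):

[code]
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Engagement:
    """A proposed use of force, as a targeting system might describe it (toy model)."""
    target_is_combatant: bool    # positively identified as a lawful military target?
    near_protected_site: bool    # hospital, school, place of worship in the effects radius?
    expected_civilian_harm: int  # predicted noncombatant casualties (arbitrary scale)
    military_necessity: int      # rough value of the target (same arbitrary scale)

def ethical_veto(e: Engagement) -> Tuple[bool, str]:
    """Return (permitted, reason). The governor is a pure filter:
    it can only suppress a proposed engagement, never initiate one."""
    if not e.target_is_combatant:
        return False, "discrimination: target not positively identified as a combatant"
    if e.near_protected_site:
        return False, "protected site within effects radius"
    if e.expected_civilian_harm > e.military_necessity:
        return False, "proportionality: expected civilian harm outweighs military necessity"
    return True, "no encoded rule violated"

if __name__ == "__main__":
    proposed = Engagement(target_is_combatant=True, near_protected_site=False,
                          expected_civilian_harm=3, military_necessity=1)
    permitted, reason = ethical_veto(proposed)
    print(permitted, "-", reason)  # False - proportionality: ...
[/code]

Whether reducing the Geneva Conventions to a handful of if-statements really amounts to "a kind of empathy" is, I suppose, the whole argument.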
Penguin
 
Posts: 5089
Joined: Thu Aug 23, 2007 5:56 pm
Blog: View Blog (0)

Re: Moral Battle Robots for War

Postby Luposapien » Tue Nov 25, 2008 4:57 pm

And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work “is ultimately in the hands of the commander in chief, and he’s obviously answerable to the American people, just like we are.”


Not precisely on topic, but what planet does this fellow hail from?
User avatar
Luposapien
 
Posts: 428
Joined: Mon Nov 13, 2006 2:24 pm
Location: Approximately Austin
Blog: View Blog (0)

Postby brekin » Tue Nov 25, 2008 5:13 pm

God, can you only imagine the tech support calls of the future?
"Yeah, so we upgraded a drone to Vista 14.1.3, rebooted, he came up with the blue screen of death in his eyes and now he's on a murderous rampage throughout the city. Please advise."

Thought this was appropriate:

[image]
User avatar
brekin
 
Posts: 3229
Joined: Tue Oct 09, 2007 5:21 pm
Blog: View Blog (1)

Postby Penguin » Tue Nov 25, 2008 5:18 pm

I hope they add a "no military use whatsoever allowed" clause to open source licenses. Wishful thinking - I bet most of those robots run embedded *nix systems.

I know some coders do add such a clause when they don't want their free code used in killing machines.
Penguin
 
Posts: 5089
Joined: Thu Aug 23, 2007 5:56 pm
Blog: View Blog (0)

Postby Uncle $cam » Tue Nov 25, 2008 5:52 pm

Jesus built my moral killing machine


haha, that's pithy... would make a great band name. Excellent point about adding a "no military use whatsoever allowed" clause to open source licenses...

Gangster Computer God Worldwide Secret Containment Policy
http://www.youtube.com/watch?v=yJLhnts9-oQ

If hackers ran the world, there'd be no war--lots of accidents, maybe
- Unknown

Always keep your clothes and your weapons where you can find them in the dark
- Robert Heinlein
Suffering raises up those souls that are truly great; it is only small souls that are made mean-spirited by it.
- Alexandra David-Neel
User avatar
Uncle $cam
 
Posts: 1100
Joined: Fri Nov 03, 2006 5:11 pm
Blog: View Blog (0)

Postby beeline » Tue Nov 25, 2008 5:58 pm

This only brings us one step closer to our robot masters......OK, I'm gonna be the one to say it first:

HAIL ROBOTS!
User avatar
beeline
 
Posts: 2024
Joined: Wed May 21, 2008 4:10 pm
Location: Killadelphia, PA
Blog: View Blog (0)

Re: Moral Battle Robots for War

Postby Code Unknown » Tue Nov 25, 2008 6:44 pm

Penguin wrote:I think there is already a good code for robots.
That penned by Asimov.
A robot shall not cause harm to a living being, nor allow harm to come to a living being through inaction. If I remember it right.


Unfortunately, if the military-industrial complex can, it will. Sorry, Asimov. Good rule, though.

Luposapien wrote:
And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work “is ultimately in the hands of the commander in chief, and he’s obviously answerable to the American people, just like we are.”


Not precisely on topic, but what planet does this fellow hail from?


Seriously.

beeline wrote:This only brings us one step closer to our robot masters......OK, I'm gonna be the one to say it first:

HAIL ROBOTS!


Never that.
Code Unknown
 
Posts: 665
Joined: Sat Oct 27, 2007 5:54 am
Blog: View Blog (0)

Postby Code Unknown » Tue Nov 25, 2008 6:48 pm

Almost more disturbing than the primary subject matter of the article:

His report drew on a 2006 survey by the surgeon general of the Army, which found that fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents. More than one-third said torture was acceptable under some conditions, and fewer than half said they would report a colleague for unethical battlefield behavior.
Code Unknown
 
Posts: 665
Joined: Sat Oct 27, 2007 5:54 am
Blog: View Blog (0)

Postby IanEye » Tue Nov 25, 2008 6:56 pm

[url=http://store.irobot.com/corp/index.jsp]When you work at iRobot, you know you are making a difference. Whether you’re designing robots that help keep people safer in dangerous situations, getting kids excited about technology or developing the next generation of practical, affordable robots for home use, everyone at iRobot has an important part to play. Join a company where your colleagues are passionate about finding innovative solutions for everyday problems and are not afraid to be just a little bit different.

We’re looking for smart, creative, energetic talent. We take on challenges that have never before been approached every day and are committed to delivering the right robots for the job.
[/url]

I drive by their headquarters from time to time. It has a sort of "Spacely's Space Sprockets" vibe about it.
User avatar
IanEye
 
Posts: 4865
Joined: Tue Jan 17, 2006 10:33 pm
Blog: View Blog (29)

Postby barracuda » Tue Nov 25, 2008 7:42 pm

The OP has a good point. How much worse could the robots be than what we've already got goin' on?

[image]

Hmm, don't answer that one.

[image]

I guess things could always be worse.

[image]
The most dangerous traps are the ones you set for yourself. - Philip Marlowe
User avatar
barracuda
 
Posts: 12890
Joined: Thu Sep 06, 2007 5:58 pm
Location: Niles, California
Blog: View Blog (0)

Postby Penguin » Tue Nov 25, 2008 7:45 pm

If you've read Dick's "Second Variety", you know things can be really, really much worse... especially regarding robots.

And that was a long, long time before any wussy Terminator. Dick kicked ass.
Penguin
 
Posts: 5089
Joined: Thu Aug 23, 2007 5:56 pm
Blog: View Blog (0)

Postby jingofever » Tue Nov 25, 2008 8:00 pm

Back in the day, on one of those TV shows about robots and the men who love them, they were profiling a particular robot company and its employees. One of the guys said his dream was that when the robots inevitably take over and hunt down the humans, the robot tasked with exterminating him would recognize that he was its creator, shake his hand, say thanks, and then blow him away. *sniff* It's beautiful.
User avatar
jingofever
 
Posts: 2814
Joined: Sun Oct 16, 2005 6:24 pm
Blog: View Blog (0)
