"Robot Cannon Kills 9, Wounds 14
We're not used to thinking of them this way. But many advanced military weapons are essentially robotic -- picking targets out automatically, slewing into position, and waiting only for a human to pull the trigger. Most of the time. Once in a while, though, these machines start firing mysteriously on their own. The South African National Defence Force "is probing whether a software glitch led to an antiaircraft cannon malfunction that killed nine soldiers and seriously injured 14 others during a shooting exercise on Friday."
SA National Defence Force spokesman brigadier general Kwena Mangope says the cause of the malfunction is not yet known...
Media reports say the shooting exercise, using live ammunition, took place at the SA Army's Combat Training Centre, at Lohatlha, in the Northern Cape, as part of an annual force preparation endeavour.
Mangope told The Star that it "is assumed that there was a mechanical problem, which led to the accident. The gun, which was fully loaded, did not fire as it normally should have," he said. "It appears as though the gun, which is computerised, jammed before there was some sort of explosion, and then it opened fire uncontrollably, killing and injuring the soldiers." [More details here -- ed.]
Other reports have suggested a computer error might have been to blame. Defence pundit Helmoed-Römer Heitman told the Weekend Argus that if "the cause lay in computer error, the reason for the tragedy might never be found."
Some who have studied the issue worry, as well, whether battlefield robots designed without emotions will lack empathy. Dr. Arkin, a Christian who acknowledged the help of God and Jesus Christ in the preface to his book “Behavior-Based Robotics” (MIT Press, 1998), reasons that because rules like the Geneva Conventions are based on humane principles, building them into the machine’s mental architecture endows it with a kind of empathy.
And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work "is ultimately in the hands of the commander in chief, and he's obviously answerable to the American people, just like we are."
Jesus built my moral killing machine
Penguin wrote:I think there is already a good code for robots.
That penned by Asimov.
A robot shall not cause harm to a living being, nor allow harm to come to a living being through inaction -- if I remember it right.
Luposapien wrote:And Lt. Col. Martin Downie, a spokesman for the Army, noted that whatever emerged from the work "is ultimately in the hands of the commander in chief, and he's obviously answerable to the American people, just like we are."
Not precisely on topic, but what planet does this fellow hail from?
beeline wrote:This only brings us one step closer to our robot masters......OK, I'm gonna be the one to say it first:
HAIL ROBOTS!
His report drew on a 2006 survey by the surgeon general of the Army, which found that fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents. More than one-third said torture was acceptable under some conditions, and fewer than half said they would report a colleague for unethical battlefield behavior.