
Call for an International Ban on Autonomous Killer Robots

ED 209's 'incident' in the boardroom pretty much killed off autonomous killer robots without the need for a treaty. We won't even arm static defence robots, where the situation can be a lot more controlled, because we don't trust them.

YouTube video (warning: graphic content): ED 209 Just Won't Stop
 
Crantor said:
Didn't Robocop have directives programmed into him?  I can see all sorts of things that make this a plus.  Program robots with all the steps of battle procedure.  Program all the steps from SHARP training.  Perhaps all of the defence ethics ethos.  Heck, even range standing orders.  The potential is limitless... :geek:

Robocop is a cyborg, whole other set of rules.
 
DEAL! Can we trade the landmine, cluster bomb, incendiary weapon, and deforming bullet treaties for this one? Common Sense for The Win!

I've been saying we needed this for years; it's a huge philosophical quagmire.
 
Retired AF Guy said:
Autonomous Killer Robots have been with us for a long time - they're called "suicide bombers."

Not entirely accurate.  Those models still suffer from the software glitch called "morality" that permits them to deviate from their programming from time to time.

The sentry robots from the director's cut of "Aliens" are but another example of autonomous killers, albeit with some remote control.
 
UN report wants to terminate killer robots, opposes life-or-death powers over humans
By: Peter James Spielmann, The Associated Press Posted: 05/2/2013
http://www.winnipegfreepress.com/breakingnews/un-report-wants-to-terminate-killer-robots-opposes-life-or-death-powers-over-humans-205847021.html

Killer robots that can attack targets without any human input "should not have the power of life and death over human beings," a new draft U.N. report says.

The report for the U.N. Human Rights Commission posted online this week deals with legal and philosophical issues involved in giving robots lethal powers over humans, echoing countless science-fiction novels and films. The debate dates to author Isaac Asimov's first rule for robots in the 1942 story "Runaround": "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

Report author Christof Heyns, a South African professor of human rights law, calls for a worldwide moratorium on the "testing, production, assembly, transfer, acquisition, deployment and use" of killer robots until an international conference can develop rules for their use.

His findings are due to be debated at the Human Rights Council in Geneva on May 29.

According to the report, the United States, Britain, Israel, South Korea and Japan have developed various types of fully or semi-autonomous weapons.

In the report, Heyns focuses on a new generation of weapons that choose their targets and execute them. He calls them "lethal autonomous robotics," or LARs for short, and says: "Decisions over life and death in armed conflict may require compassion and intuition. Humans — while they are fallible — at least might possess these qualities, whereas robots definitely do not."

He notes the arguments of robot proponents that death-dealing autonomous weapons "will not be susceptible to some of the human shortcomings that may undermine the protection of life. Typically they would not act out of revenge, panic, anger, spite, prejudice or fear. Moreover, unless specifically programmed to do so, robots would not cause intentional suffering on civilian populations, for example through torture. Robots also do not rape."

The report goes beyond the recent debate over drone killings of al-Qaida suspects and nearby civilians who are maimed or killed in the air strikes. Drones do have human oversight. The killer robots are programmed to make autonomous decisions on the spot without orders from humans.

Heyns' report notes the increasing use of drones, which "enable those who control lethal force not to be physically present when it is deployed, but rather to activate it while sitting behind computers in faraway places, and stay out of the line of fire.

"Lethal autonomous robotics (LARs), if added to the arsenals of States, would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill - and their execution," he wrote.
a little more on link
 
Canada is being asked to lead this initiative.  The hope is we will pick up where we left off in pushing the landmine ban.

... I enjoy the apparent need to mention the Terminator whenever this topic comes up.  I suppose it is good for sensationalizing the topic.
Keep killer autonomous drones off the battlefield, activists say
Campaign to Stop Killer Robots wants Canada to be a leader in efforts to ban autonomous weapons

The Canadian Press
29 April 2014

Canada is being urged to lead a new international effort to ban so-called "killer robots" — the new generation of deadly high-tech equipment that can select and fire on targets without human help.

The Campaign to Stop Killer Robots is pushing for a new international treaty to ban such weapons from the battlefields of the future.

Paul Hannon, head of Mines Action Canada, said the development of such autonomous weapons — primitive versions of the Terminator of Hollywood fame — signals a profound change in the very nature of warfare.

Hannon's organization is one of nine international groups that are calling on Canada to take the lead in the banning of the weapons, as it was in the campaign against landmines in the 1990s.

"It was not that long ago that the world considered the landmine to be the perfect soldier. It is now banned because of the humanitarian harm it has created," Hannon said Tuesday on Parliament Hill.

"Canada led the movement to ban that weapon; it is one of the most successful international treaties of our era."

It would be far better to squelch the development of the weapons before they are actually built and deployed, Hannon said; once the weaponization genie is out of the bottle, it is much harder to get it back in.

Hannon also said his group isn't opposed to the use of robotics by the military for non-combat uses such as transportation.

The coalition has no evidence to indicate any Canadian companies are working on such weaponry, and the Department of National Defence has provided assurances it hasn't contracted any research on the subject.

"That doesn't mean there aren't, because there's not a lot of transparency on this," Hannon said.

Six countries are known to be working on the technology: the United States, Britain, Israel, China, Russia and South Korea.

Autonomous weapons don't actually exist yet, but with the rapid advancements being made in robotics, there are troubling signs, said Peter Asaro, co-founder of the New York-based International Committee for Robot Arms Control.

He cited the 2010 "flash crash" on the New York Stock Exchange that was ignited by a frenzy of computerized trading that drove down the stock prices of major companies.

At least two companies have created prototypes of unmanned combat aircraft that are deemed to be autonomous. Another company has a partially autonomous tracking and machine-gun system on the border between North and South Korea.

Mary Wareham, a Washington-based arms expert with Human Rights Watch, said one of the aircraft makers, BAE Systems, sponsored a recent two-day symposium on the weapons in London.

"I think they realize that if they don't show interest and at least agree to be a bit more transparent about the systems that are being developed, then that will increase suspicion, so it's in their interest to be transparent," Wareham said.

Ian Kerr, a law and ethics professor at the University of Ottawa, said removing humans from the decision to kill people poses a serious moral and philosophical problem.
http://www.cbc.ca/news/politics/keep-killer-autonomous-drones-off-the-battlefield-activists-say-1.2625748
 
busconductor said:
Our beloved among our kins inside the household, our "bitterest enemy in public places: The Political Left." Since time immemorial have they been acting as agents of influence for foreign communist countries. Not fomenting reading 'survival guides' here. "It's worse than Sodom"- God.

Hey! Remember what I told you, busconductor?

We have to keep quiet for now!!!  Really, really quiet!  These people will never understand us.

But wait patiently...our Moment approaches!
 
Two reasons autonomous killer robots won't be coming any time soon:

Short term: Hackers. If robots have rigid, built-in programming to defeat hackers, they will be predictable and relatively easy to neutralize or defeat.

Long term: Robots with sufficiently flexible programming to operate autonomously in the real world will be complex, adaptive systems with unpredictable reactions to changing situations.
 
The second UN meeting on autonomous killing robots will take place next week.  While the tendency still seems to lean in favour of Terminator jokes, the pro narrative sounds a lot more like the Robocop remake.
'Killer robots' pose risks and advantages for military use
Human rights groups to push for ban on lethal autonomous weapons at UN meeting next week
Kathleen Harris
CBC News
09 Apr 2015

As Canada prepares to take part in international talks on so-called "killer robots" next week, documents obtained by CBC News show defence officials see risks but also military advantages to deploying deadly autonomous weapons.

Records released under the Access to Information Act show officials at Foreign Affairs and National Defence are keeping an open mind as they carve out a Canadian position on the controversial systems — in spite of growing calls for a pre-emptive global ban.

Lethal autonomous weapons systems (LAWS) are not currently in use, but could eventually have the ability to select, target and engage in deadly attacks without human intervention.

Censored emails, reports and briefing papers released to CBC were prepared last spring when the first United Nations meeting was convened on the issue. One 17-page report outlines the Defence Department's "initial thinking" on the military, strategic, diplomatic and ethical implications, flagging moral questions but also potential benefits.

That paper, which will help shape Canada's position on the issue, says the weapons "clearly promise" many of the same benefits as unmanned, human-controlled systems now in use — including limiting risks to military personnel, driving down costs, allowing penetration of enemy lines with little risk, and circumventing human shortcomings with faster response times and no fatigue or boredom.

"In short, weapons at various stages of autonomy, including LAWS, offer military advantage in several clear — and perhaps also unforeseen — ways," the report reads.

On Monday, officials and experts from around the world will meet again at the UN in Geneva. Defence spokeswoman Ashley Lemire confirmed Canadian representatives will attend.

While Canada is not currently developing any lethal fully autonomous weapons systems, Lemire said Defence Research and Development Canada (DRDC) has an active research program on unmanned systems that informs policy on the opportunities and threats the technologies could pose.

"As part of this work, DRDC is initiating research activity into ways to counter and/or defend against non-lethal and lethal autonomous weapons systems," she said, adding: "We would not want to speculate about potential applications of lethal autonomous weapons systems at this point."

Walter Dorn, a professor at the Royal Military College of Canada, has urged limits to ensure there is always an element of human decision-making in carrying out lethal force. No matter how advanced the technology, there is always the potential for glitches and malfunctions that could harm soldiers or civilians.

"There is potential for great utility and great danger," he said.

But an international coalition of human rights activists, academics and security experts called the Campaign to Stop Killer Robots says that because technology is advancing so rapidly, world leaders must adopt a treaty to ban the weapons. Alex Neve, secretary general of Amnesty International Canada, said lethal weapons without human control — whether they're used for policing or military purposes — would violate international humanitarian law.

"Allowing robots to have power over life and death decisions crosses a fundamental moral line: the killing of humans by machines is an ultimate indignity in a certain sense, and humans should not be reduced to mere objects," he said.

He sees a ban as the "only real solution."

"Taking a wait-and-see approach could lead to further investment by states in the development of these weapons systems and their rapid proliferation in a new arms race," he warned.

The Defence Department documents point out that countries like China and Russia are "rapidly moving toward developing unmanned and autonomous systems," and that changes could revolutionize modern warfare.

"Depending how technology progresses, it may be possible to fight a war without ever leaving North America or Europe," it reads, noting the potential for major shifts in power for high-tech countries in military alliances.

The report also notes how the U.S. used robots in Iraq to clear explosive devices, and suggests those kinds of applications could be enhanced in future.

"Fitted with weapons, robots could be used for house-to-house clearance operations in modern urban combat," the paper reads. "Unlike human soldiers, they could be programmed to 'shoot second' with high accuracy — or even give an enemy the opportunity to surrender after the enemy has fired his weapon — thus potentially decreasing civilian casualties and increasing the chance of capturing enemy combatants."

The report also raises concerns about a potential arms race — and flags the danger if weapons wind up in the hands of non-state actors or repressive governments. But legal concerns might be "somewhat misplaced," the report says, adding the driving force for a ban is more likely "an under-explored moral unease over giving machines the power to kill."

Steven Groves, senior research fellow with the Washington-based Heritage Foundation, said a ban would be premature. He said those pushing for a ban depict the worst-case scenario of an evil humanoid "Terminator-type" robot roaming through crowded cities, which is likely a long way off.

He points to clear advantages machines have over human fighters.

"There are a number of ways these things can be deployed where they may even be more accurate than humans. They don't get scared. They don't get mad. They don't respond to a situation with rage — and we've seen some tragic consequences of that happening in Iraq and elsewhere where … a very human soldier reacts to a terrible situation and ends up killing a lot of civilians."

Groves said the history of warfare is about putting more distance from the enemy — from the bow-and-arrow, to the cannon, to bombs, to drones. And while there are legitimate concerns it could escalate conflict, the opposite could prove to be true.

"If the U.S. has developed weapons that are so accurate, so deadly, so thorough and can deploy them in a way that no U.S. soldiers are put at risk to effectively seek out and destroy the enemy — I wonder if I would start a war with the United States. I wouldn't start it very lightly."
http://www.cbc.ca/news/politics/killer-robots-pose-risks-and-advantages-for-military-use-1.3026963
 
It's been mentioned before somewhere, but I wonder why everyone gets worked up over "killer" RPAs which have an entire crew flying them, but no one says boo about CIWS, which are essentially autonomous.
 
I think they would have more to say if a CIWS was mounted in a guard tower or on a side of a hill and set to shoot anything that moves.
 
Robert0288 said:
I think they would have more to say if a CIWS was mounted in a guard tower or on a side of a hill and set to shoot anything that moves.

The systems have been deployed on land for over a decade, and no one says boo about them.

http://en.wikipedia.org/wiki/Counter_Rocket,_Artillery,_and_Mortar
 
I didn't mean in a counter rocket/mortar role, but in an area denial/anti-pers role similar to this: https://www.youtube.com/watch?v=ZFjGbOyd2ek
 
Robert0288 said:
I didn't mean in a counter rocket/mortar role, but in an area denial/anti-pers role similar to this: https://www.youtube.com/watch?v=ZFjGbOyd2ek

I'm no expert on CIWS, but I'd assume that if it was able to autonomously track and engage cruise missiles, mortar rounds, small boats, etc. then it's not a big jump to use it in an area denial role.
 
Two notable updates, including one about autonomous QF-16s in the second article quoted below:

NATO Association of Canada

Lethal Autonomous Weapons Systems - Changing the Environment of Warfare

March 24, 2016
Sandra Song

The advancement of military technology is inching towards a robotic revolution that was once only imaginable in science fiction.

In recent years, the development of lethal autonomous weapons systems (LAWS), often called ‘killer robots’, has sparked a number of debates. The main concern focuses on the ethical considerations, followed by legality and efficacy in warfare. Although these fully autonomous weapons have not, as far as we know, been deployed thus far, the possibility is not too far on the horizon.

LAWS are weapons that would act on the basis of artificial intelligence (AI), which would be capable of selecting and firing upon a target without any human intervention. An example of one of these weapons would be armed quadcopters that could chase after enemy combatants in a city and eliminate them based on the program that has been installed.

(...SNIPPED)

Not mere drones, but self-piloting QF-16s:

Fortune

The Pentagon Wants Autonomous Fighter Jets to Join the F-35 in Combat


March 30, 2016, 6:22 PM EDT

The pilotless aircraft could take to the skies before driverless vehicles hit the road.

The U.S. Air Force Research Lab is moving ahead with an initiative to turn aging F-16 fighter jets into unmanned, autonomous combat aircraft. The pilotless planes will fly alongside newer aircraft like the F-35 Joint Strike Fighter.

Speaking at a forum in Washington on Wednesday, U.S. Deputy Secretary of Defense Robert Work said he expects to see the autonomous aircraft plying the skies alongside manned jets before driverless vehicles enter service on the ground. Work spoke specifically about U.S. Air Force efforts to create autonomous wingmen for its fighter pilots, giving new life to older planes imbued with autonomous piloting technologies and teaming them with next-generation aircraft.


(...SNIPPED)


But remotely controlled aircraft and autonomous piloting are two vastly different things. The Air Force isn’t just looking for an unmanned aircraft that can be piloted from afar, but a robotic aircraft that can pilot itself, taking cues from a human pilot in another aircraft. “The onboard autonomy must be sufficient for the Loyal Wingman to complete all basic flight operations untethered from a ground station and without full-time direction from the manned lead,” the Air Force Research Lab (AFRL) explained in a request for industry input published earlier this month.

(...SNIPPED)
 