View Full Version : Soon, Drones May Be Able to Make Lethal Decisions on Their Own




CaseyJones
10-08-2013, 12:14 PM
http://www.nationaljournal.com/national-security/soon-drones-may-be-able-to-make-lethal-decisions-on-their-own-20131008


Scientists, engineers and policymakers are all figuring out ways drones can be used better and more smartly: more precisely, less damaging to civilians, with longer range and better staying power. One method under development is increasing the autonomy of the drone itself.

Eventually, drones may have the technical ability to make even lethal decisions autonomously: to respond to a programmed set of inputs, select a target and fire their weapons without a human reviewing or checking the result. Yet the idea of the U.S. military deploying a lethal autonomous robot, or LAR, is sparking controversy. Though autonomy might address some of the current downsides of how drones are used, it introduces new downsides policymakers are only just learning to grapple with.

The basic conceit behind a LAR is that it can outperform and outthink a human operator. "If a drone's system is sophisticated enough, it could be less emotional, more selective and able to provide force in a way that achieves a tactical objective with the least harm," said Purdue University Professor Samuel Liles. "A lethal autonomous robot can aim better, target better, select better, and in general be a better asset with the linked ISR [intelligence, surveillance, and reconnaissance] packages it can run."

Anti Federalist
10-08-2013, 12:25 PM
Mankind will be superfluous in 100 years.

The future is fail.

aGameOfThrones
10-08-2013, 12:27 PM
Cops already do.

mad cow
10-08-2013, 12:39 PM
http://www.youtube.com/watch?v=xMMyVKm9BjM

69360
10-08-2013, 12:42 PM
http://www.oxfamblogs.org/fp2p/wp-content/uploads/Rise-of-the-machines-300x240.jpg

dannno
10-08-2013, 12:48 PM
Who is steering this boat :confused:

jkr
10-08-2013, 12:57 PM
the W.O.P.E.R

DGambler
10-08-2013, 01:52 PM
Why would ANYONE think this is a good idea? I'm not a luddite by any means, but this scares the shit out of me. People joke about Skynet, but this is a step in that direction.

I also find it funny that the article states "Any autonomous weapons system is unlikely to be used by the military, except in extraordinary circumstances".... bullshit, if they get it, they'll want to use it.

hxxp://www.defenseone.com/technology/2013/10/ready-lethal-autonomous-robot-drones/71492/#



Scientists, engineers and policymakers are all figuring out ways drones can be used better and more smartly: more precisely, less damaging to civilians, with longer range and better staying power. One method under development is increasing the autonomy of the drone itself.

Eventually, drones may have the technical ability to make even lethal decisions autonomously: to respond to a programmed set of inputs, select a target and fire their weapons without a human reviewing or checking the result. Yet the idea of the U.S. military deploying a lethal autonomous robot, or LAR, is sparking controversy. Though autonomy might address some of the current downsides of how drones are used, it introduces new downsides policymakers are only just learning to grapple with.

The basic conceit behind a LAR is that it can outperform and outthink a human operator. "If a drone’s system is sophisticated enough, it could be less emotional, more selective and able to provide force in a way that achieves a tactical objective with the least harm," said Purdue University Professor Samuel Liles. "A lethal autonomous robot can aim better, target better, select better, and in general be a better asset with the linked ISR [intelligence, surveillance, and reconnaissance] packages it can run."

Though the pace of drone strikes has slowed down -- only 21 have struck Pakistan in 2013, versus 122 in 2010, according to the New America Foundation -- unmanned vehicles remain a staple of the American counterinsurgency toolkit. But drones have built-in vulnerabilities that military planners have not yet grappled with. Last year, for example, an aerospace engineer told the House Homeland Security Committee that with some inexpensive equipment he could hack into a drone and hijack it to perform some rogue purpose.

Drones have been hackable for years. In 2009, defense officials told reporters that Iranian-backed militias used $26 of off-the-shelf software to intercept the video feeds of drones flying over Iraq. And in 2011, it was reported that a virus had infected some drone control systems at Creech Air Force Base in Nevada, leading to concerns about the security of unmanned aircraft.

It may be that the only way to make a drone truly secure is to allow it to make its own decisions without a human controller: if it receives no outside commands, then it cannot be hacked (at least not as easily). And that’s where LARs might be most attractive.

Though they do not yet exist, and are not possible with current technology, LARs are the subject of fierce debate in academia, the military and policy circles. Still, many treat their development as an inevitability. But how practical would LARs be on the battlefield?

Heather Roff, a visiting professor at the University of Denver, said many conflicts, such as the civil war in Syria, are too complex for LARs. “It’s one thing to use them in a conventional conflict,” where large militaries fight away from cities, “but we tend to fight asymmetric battles. And interventions are not only military campaigns -- the civilian effects matter.”

Roff says that because LARs are not sophisticated enough to meaningfully distinguish between civilians and militants in a complex, urban environment, they probably would not be effective at achieving a constructive military end -- if only because of how a civilian population would likely react to self-governing machines firing weapons at their city. “The idea that you could solve that crisis with a robotic weapon is naïve and dangerous,” she said.

Any autonomous weapons system is unlikely to be used by the military, except in extraordinary circumstances, argued Will McCants, a fellow at the Brookings Saban Center and director of its project on U.S. Relations with the Islamic World. “You could imagine a scenario,” he says, “in which LAR planes hunted surface-to-air missiles as part of a campaign to destroy Syria’s air defenses.” It would remove the risk to U.S. pilots while exclusively targeting war equipment that has no civilian purpose.

But such a campaign is unlikely to ever happen. “Ultimately, the national security staff,” he said, referring to personnel that make up the officials and advisers of the National Security Council, “does not want to give up control of the conflict.” The politics of the decision to deploy any kind of autonomous weaponry matters as much as the capability of the technology itself. “With an autonomous system, the consequences of failure are worse in the public’s mind. There’s something about human error that makes people more comfortable with collateral damage if a person does it,” McCants said.

That’s not to say anyone is truly comfortable with collateral damage. “They’d rather own these kinds of decisions themselves and be able to chalk it up to human error,” McCants said. Political issues aside, B.J. Strawser, assistant professor at the Naval Postgraduate School, says that LARs simply could not be used effectively in a place like Syria. “You’d need exceedingly careful and restrictive ROEs [rules of engagement], and I worry whether anyone could carry that out effectively, autonomous weapon or not,” he said.

“I don’t think any actor, human or not, is capable of carrying out the refined, precise ROEs that would enable an armed intervention to be helpful in Syria.”

BarryDonegan
10-08-2013, 03:22 PM
Defense contractors and military scientists in over 77 countries have begun working on lethal autonomous robots that make their own decisions about who to kill. The Department of Defense has set guidelines regulating and allowing for the creation of these artificial intelligence guided drones. It appears that governments around the world are racing to create automated, killer robots like those from The Terminator series.

http://silverunderground.com/2013/10/coming-soon-autonomous-drones-that-choose-who-to-kill/

JK/SEA
10-08-2013, 03:28 PM
Any Sarah Connors in here?

chudrockz
10-08-2013, 03:42 PM
Why would ANYONE think this is a good idea? I'm not a luddite by any means, but this scares the shit out of me. People joke about Skynet, but this is a step in that direction.

I also find it funny that the article states "Any autonomous weapons system is unlikely to be used by the military, except in extraordinary circumstances".... bullshit, if they get it, they'll want to use it.

hxxp://www.defenseone.com/technology/2013/10/ready-lethal-autonomous-robot-drones/71492/#

"Unlikely to be used by the military, except in extraordinary circumstances" is indeed bullshit.

If it's autonomous, won't it just decide to use ITSELF?

robert9712000
10-08-2013, 04:27 PM
what could possibly go wrong?

Barrex
10-08-2013, 04:44 PM
Screw them. We genetically designed war seagulls. We sacrifice a few virgins to them every now and then. They will defend us against this threat.
http://www.lolbrary.com/content/421/i-for-one-welcome-our-new-bird-overlords-40421.jpg

WM_in_MO
10-08-2013, 04:54 PM
I found the leaked data that will be used to identify threats:
http://urbangrounds.com/wp-content/uploads/journalists-guide-to-guns-1.jpg

Henry Rogue
10-08-2013, 07:46 PM
How convenient, soulless decision makers. "We didn't murder anyone, it's the machine's fault."

Kords21
10-08-2013, 07:54 PM
I guess these scientists and engineers have never seen any version of Battlestar Galactica. In what movie/tv series has this ever turned out to be a good idea?

DFF
10-08-2013, 09:08 PM
Instead of constructing Terminator drones, perhaps a better use of money would be to repair America's crumbling, out-dated infrastructure?

Just a thought....

Mani
10-08-2013, 11:45 PM
Instead of constructing Terminator drones, perhaps a better use of money would be to repair America's crumbling, out-dated infrastructure?

Just a thought....


I know I'm more likely to die from collapsing bridges and infrastructure problems, but those terrorists are SCURRRY!!!!!!!!!!

Mani
10-08-2013, 11:48 PM
I guess these scientists and engineers have never seen any version of Battlestar Galactica. In what movie/tv series has this ever turned out to be a good idea?



What could possibly go wrong with building one of these to think on their own????


http://modernsurvivalonline.com/wp-content/uploads/2012/05/Drones.jpg

DamianTV
10-09-2013, 12:37 AM
Automation of Obedience is part of the Architecture of Oppression.

DamianTV
10-09-2013, 12:39 AM
What could possibly go wrong with building one of these to think on their own????


http://modernsurvivalonline.com/wp-content/uploads/2012/05/Drones.jpg

That's where it starts.

http://media.moddb.com/images/mods/1/8/7308/58513.png

This is where it ends.

MRK
10-09-2013, 01:52 AM
Looks like Americans now have evidenced legitimate reasons to own RPGs and other missiles: in case of rogue autonomous robots.

Mani
10-09-2013, 01:57 AM
Looks like Americans now have evidenced legitimate reasons to own RPGs and other missiles: in case of rogue autonomous robots.

I want my second amendment right to include a Bazooka. Just in case Skynet beta droid accidentally gets hacked....Or begins to think. :eek:

DamianTV
10-16-2013, 05:36 AM
I want my second amendment right to include a Bazooka. Just in case Skynet beta droid accidentally gets hacked....Or begins to think. :eek:

NEVER ask for your Rights. You TAKE your Rights and respect the Rights of others. When you ask to have Rights, you will be denied when they know push is coming to shove. By asking for a Right, you give them the ability to control and a Right is no longer a Right, it is PERMISSION.

Aratus
10-26-2013, 09:43 AM
i don't jest about SKYNET arriving