Sep 22, 2011, 03:14 AM
Registered User
Thread OP

The Future For Drones: Automated Killing

Just read a news piece about the evil side of drones. Well, what can we say, we made it happen...

- A future for drones: Automated killing (Washington Post, Sep. 20, 2011):

One afternoon last fall at Fort Benning, Ga., two model-size planes took off, climbed to 800 and 1,000 feet, and began criss-crossing the military base in search of an orange, green and blue tarp.

The automated, unpiloted planes worked on their own, with no human guidance, no hand on any control.

After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look.

Target confirmed.

This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial “Terminators,” minus beefcake and time travel.

The Fort Benning tarp “is a rather simple target, but think of it as a surrogate,” said Charles E. Pippin, a scientist at the Georgia Tech Research Institute, which developed the software to run the demonstration. “You can imagine real-time scenarios where you have 10 of these things up in the air and something is happening on the ground and you don’t have time for a human to say, ‘I need you to do these tasks.’ It needs to happen faster than that.”
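The color-cue step described above is, at its simplest, a thresholding problem: count how many camera pixels fall inside the tarp's color bands. Here is a minimal, purely illustrative Python sketch of that idea; the actual Georgia Tech software is not public, and the RGB ranges, function names and thresholds below are assumptions, not the real system.

```python
# Illustrative only: flag a camera frame as a possible tarp sighting by
# counting pixels that fall inside hypothetical RGB bands for the tarp's
# orange, green and blue panels.

def in_band(pixel, lo, hi):
    """True if each RGB channel of `pixel` lies within [lo, hi]."""
    return all(l <= p <= h for p, l, h in zip(pixel, lo, hi))

def tarp_candidate(frame, min_fraction=0.01):
    """Return (flag, per-color pixel fractions) for a frame given as
    a list of rows of (R, G, B) tuples."""
    bands = {  # assumed color ranges, not taken from the real system
        "orange": ((200, 80, 0), (255, 170, 80)),
        "green":  ((0, 120, 0), (100, 255, 100)),
        "blue":   ((0, 0, 150), (90, 90, 255)),
    }
    total = len(frame) * len(frame[0])
    hits = {name: sum(in_band(px, lo, hi) for row in frame for px in row) / total
            for name, (lo, hi) in bands.items()}
    return any(f >= min_fraction for f in hits.values()), hits

# Synthetic 100x100 grey frame with a 20x20 "orange tarp" patch.
frame = [[(128, 128, 128)] * 100 for _ in range(100)]
for r in range(40, 60):
    for c in range(40, 60):
        frame[r][c] = (230, 120, 30)

flag, hits = tarp_candidate(frame)   # flag is True; hits["orange"] == 0.04
```

A real system would of course work in a lighting-robust color space, cluster matching pixels into blobs, and cross-check between aircraft, which is exactly the second-plane confirmation step the article describes.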

The demonstration laid the groundwork for scientific advances that would allow drones to search for a human target and then make an identification based on facial-recognition or other software. Once a match was made, a drone could launch a missile to kill the target.

Military systems with some degree of autonomy — such as robotic, weaponized sentries — have been deployed in the demilitarized zone between South and North Korea and other potential battle areas. Researchers are uncertain how soon machines capable of collaborating and adapting intelligently in battlefield conditions will come online. It could take one or two decades, or longer. The U.S. military is funding numerous research projects on autonomy to develop machines that will perform some dull or dangerous tasks and to maintain its advantage over potential adversaries who are also working on such systems.

The killing of terrorism suspects and insurgents by armed drones, controlled by pilots sitting in bases thousands of miles away in the western United States, has prompted criticism that the technology makes war too antiseptic. Questions also have been raised about the legality of drone strikes when employed in places such as Pakistan, Yemen and Somalia, which are not at war with the United States. This debate will only intensify as technological advances enable what experts call lethal autonomy.

The prospect of machines able to perceive, reason and act in unscripted environments presents a challenge to the current understanding of international humanitarian law. The Geneva Conventions require belligerents to use discrimination and proportionality, standards that would demand that machines distinguish among enemy combatants, surrendering troops and civilians.

“The deployment of such systems would reflect a paradigm shift and a major qualitative change in the conduct of hostilities,” Jakob Kellenberger, president of the International Committee of the Red Cross, said at a conference in Italy this month. “It would also raise a range of fundamental legal, ethical and societal issues, which need to be considered before such systems are developed or deployed.”

Drones flying over Afghanistan, Pakistan and Yemen can already move automatically from point to point, and it is unclear what surveillance or other tasks, if any, they perform while in autonomous mode. Even when directly linked to human operators, these machines are producing so much data that processors are sifting the material to suggest targets, or at least objects of interest. That trend toward greater autonomy will only increase as the U.S. military shifts from one pilot remotely flying a drone to one pilot remotely managing several drones at once.

But humans still make the decision to fire, and in the case of CIA strikes in Pakistan, that call rests with the director of the agency. In future operations, if drones are deployed against a sophisticated enemy, there may be much less time for deliberation and a greater need for machines that can function on their own.

The U.S. military has begun to grapple with the implications of emerging technologies.

“Authorizing a machine to make lethal combat decisions is contingent upon political and military leaders resolving legal and ethical questions,” according to an Air Force treatise called Unmanned Aircraft Systems Flight Plan 2009-2047. “These include the appropriateness of machines having this ability, under what circumstances it should be employed, where responsibility for mistakes lies and what limitations should be placed upon the autonomy of such systems.”

In the future, micro-drones will reconnoiter tunnels and buildings, robotic mules will haul equipment and mobile systems will retrieve the wounded while under fire. Technology will save lives. But the trajectory of military research has led to calls for an arms-control regime to forestall any possibility that autonomous systems could target humans.

In Berlin last year, a group of robotics engineers, philosophers and human rights activists formed the International Committee for Robot Arms Control (ICRAC) and said such technologies might tempt policymakers to think war can be less bloody.

Some experts also worry that hostile states or terrorist organizations could hack robotic systems and redirect them. Malfunctions also are a problem: In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.

The ICRAC would like to see an international treaty, such as the one banning antipersonnel mines, that would outlaw some autonomous lethal machines. Such an agreement could still allow automated antimissile systems.

“The question is whether systems are capable of discrimination,” said Peter Asaro, a founder of the ICRAC and a professor at the New School in New York who teaches a course on digital war. “The good technology is far off, but technology that doesn’t work well is already out there. The worry is that these systems are going to be pushed out too soon, and they make a lot of mistakes, and those mistakes are going to be atrocities.”

Research into autonomy, some of it classified, is racing ahead at universities and research centers in the United States, and that effort is beginning to be replicated in other countries, particularly China.

“Lethal autonomy is inevitable,” said Ronald C. Arkin, the author of “Governing Lethal Behavior in Autonomous Robots,” a study that was funded by the Army Research Office.

Arkin believes it is possible to build ethical military drones and robots, capable of using deadly force while programmed to adhere to international humanitarian law and the rules of engagement. He said software can be created that would lead machines to return fire with proportionality, minimize collateral damage, recognize surrender, and, in the case of uncertainty, maneuver to reassess or wait for a human assessment.

In other words, rules as understood by humans can be converted into algorithms followed by machines for all kinds of actions on the battlefield.

“How a war-fighting unit may think — we are trying to make our systems behave like that,” said Lora G. Weiss, chief scientist at the Georgia Tech Research Institute.

Others, however, remain skeptical that humans can be taken out of the loop.

“Autonomy is really the Achilles’ heel of robotics,” said Johann Borenstein, head of the Mobile Robotics Lab at the University of Michigan. “There is a lot of work being done, and still we haven’t gotten to a point where the smallest amount of autonomy is being used in the military field. All robots in the military are remote-controlled. How does that sit with the fact that autonomy has been worked on at universities and companies for well over 20 years?”

Borenstein said human skills will remain critical in battle far into the future.

“The foremost of all skills is common sense,” he said. “Robots don’t have common sense and won’t have common sense in the next 50 years, or however long one might want to guess.”
Sep 22, 2011, 09:11 AM
Registered User
BlackCat_'s Avatar
Keep your yellow and blue tarps well hidden...
Aug 09, 2015, 07:14 AM
"Uh, Oh that ch was reversed."
frissy's Avatar
Of course they are doing it. I mean, facial recognition is becoming fast enough. Is it horrible and terrible? No. Why? Because it will only make the process faster. Let's face it... would they stop killing them if they didn't have drones? No. With these the process is faster and requires fewer people in the area of danger. I just consider this an efficient and safer way to whack people. Sinister opinion? Yup, but if the choice is to send a sniper/death squad or a drone... I would send a drone for everyone's sake.

Stop killing people? Stop making hit lists. As long as you have hit lists... make sure the only one in danger is the person on the list. Thus these aerial terminators are actually a safer option. As long as they don't go bonkers and become Skynet.
Aug 09, 2015, 07:22 AM
"Uh, Oh that ch was reversed."
frissy's Avatar
Just to add... oh, the fun when the system gets hacked. I'm rather amazed how much they are willing to link to each other. Efficient, of course, but bloody dangerous too. One hack and suddenly your whole army isn't yours anymore. I know it's not that simple, but in theory.
Aug 09, 2015, 07:40 AM
Registered User
PatR's Avatar


You obviously are not someone who has been involved with a kill chain that employed a drone. Perhaps your opinion would change had you been. I've watched innocents die because a drone was used in a termination strike. I've seen the same when manned aircraft were used. Explosives do not discriminate.

I. Asimov was correct when he asserted that all robots should be programmed never to harm a human.
Aug 09, 2015, 01:36 PM
"Uh, Oh that ch was reversed."
frissy's Avatar
That's the point. Same thing happens manned or unmanned. A pilot in the air is a pilot in the air. Unmanned doesn't have a pilot (at least not in the same sense).

Tools change. Bad ideas generally stay as bad ideas.
Aug 18, 2015, 01:22 AM
Registered User
megmaltese's Avatar
Originally Posted by frissy
That's the point. Same thing happens manned or unmanned. A pilot in the air is a pilot in the air. Unmanned doesn't have a pilot (at least not in the same sense).

Tools change. Bad ideas generally stay as bad ideas.
There's a small detail about drones: they are inexpensive, and the fact that they are unmanned makes them perfect for anonymous killings.
You can reach the target from kilometers away, and nobody can know who is behind it.
Drones will be the next weapon of choice for professional killers.
Of course, only for the ones that can fly
Aug 22, 2015, 12:43 PM
"Uh, Oh that ch was reversed."
frissy's Avatar
Doubt it. Too visible.
Aug 29, 2015, 12:51 PM
Registered User
megmaltese's Avatar
Imagine when this technology ends up in Muslim fanatics' hands... and it's WHEN, not IF.
Aug 29, 2015, 02:52 PM
Registered User
PatR's Avatar
How about when it arrives in local police departments? North Dakota has legalized the use of armed drones by police.
