TRUTH BE TOLD
WORLD NEWS EVERY DAY

Friday, November 30, 2012

Autonomy in Weapon Systems

November 26th, 2012 by Steven Aftergood

The Department of Defense issued a new Directive last week establishing DoD policy for the development and use of autonomous weapon systems.
An autonomous weapon system is defined as “a weapon system that, once activated, can select and engage targets without further intervention by a human operator.”
The new DoD Directive Number 3000.09, dated November 21, establishes guidelines that are intended “to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements.”
“Failures can result from a number of causes, including, but not limited to, human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield,” the Directive explains.
An “unintended engagement” resulting from such a failure means “the use of force resulting in damage to persons or objects that human operators did not intend to be the targets of U.S. military operations, including unacceptable levels of collateral damage beyond those consistent with the law of war, ROE [rules of engagement], and commander’s intent.”
The Department of Defense should “more aggressively use autonomy in military missions,” urged the Defense Science Board last summer in a report on “The Role of Autonomy in DoD Systems.”
The U.S. Army issued an updated Field Manual 3-36 on Electronic Warfare earlier this month.


Greater Autonomy for Unmanned Military Systems Urged

September 6th, 2012 by Steven Aftergood

The Department of Defense should focus on increasing the autonomy of drones and other unmanned military systems, a new report from the Defense Science Board said.
DoD should “more aggressively use autonomy in military missions,” the Board report said, because currently “autonomy technology is being underutilized.”  See “The Role of Autonomy in DoD Systems,” Defense Science Board, dated July 2012 and released last week.
“Autonomy” in this context does not mean “computers making independent decisions and taking uncontrolled action.”  The Board is not calling for the development of Skynet.  Rather, autonomy refers to the automation of a particular function within programmed limits.  “It should be made clear that all autonomous systems are supervised by human operators at some level,” the report stressed.
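To make that distinction concrete, here is a minimal sketch of one function automated within programmed limits, with a human operator keeping the final say. Everything in it (the names, the confidence threshold, the approval step) is a hypothetical illustration, not a description of any actual DoD system:

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Track:
        track_id: str
        confidence: float  # automated classifier confidence, 0.0 to 1.0

    CONFIDENCE_FLOOR = 0.95  # programmed limit: below this, the system does nothing

    def propose_action(track: Track) -> Optional[str]:
        # Automate exactly one narrow function: nominating a track for review.
        if track.confidence < CONFIDENCE_FLOOR:
            return None  # stay within programmed limits
        return f"nominate {track.track_id} for operator review"

    def supervised_step(track: Track, operator_approves: Callable[[str], bool]) -> str:
        # Supervised autonomy: no proposal takes effect without a human decision.
        proposal = propose_action(track)
        if proposal is None:
            return "no action"
        return proposal if operator_approves(proposal) else "vetoed by operator"

    # Example: the operator, not the software, makes the final call.
    print(supervised_step(Track("T-17", 0.97), operator_approves=lambda p: True))

The point of the sketch is simply that the software’s discretion is narrow and bounded: it may nominate, but only a person decides.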
Increased autonomy for unmanned military systems “can enable humans to delegate those tasks that are more effectively done by computer… thus freeing humans to focus on more complex decision making.”
“However, the true value of these systems is not to provide a direct human replacement, but rather to extend and complement human capability by providing potentially unlimited persistent capabilities, reducing human exposure to life threatening tasks, and with proper design, reducing the high cognitive load currently placed on operators/supervisors.”
But all of that is easier said than done.
“Current designs of autonomous systems, and current design methods for increasing autonomy, can create brittle platforms” that are subject to irreversible error.  There are also “new failure paths associated with more autonomous platforms, which has been seen in friendly fire fatalities…. This brittleness, which is resident in many current designs, has severely retarded the potential benefits that could be obtained by using advances in autonomy.”
The Defense Science Board report discusses the institutional challenges confronting a move toward increasing autonomy, including the obstacles posed by proprietary software.  It offers an extended discussion of conflict scenarios in which the enemy employs its own autonomous systems against U.S. forces.  The authors describe China’s “alarming” investment in unmanned systems, and encourage particular attention to the relatively neglected topic of the vulnerability of unmanned systems.
The report includes some intriguing citations, such as a volume on “Governing Lethal Behavior in Autonomous Robots,” and presents numerous incidental observations of interest.  For example:
“Big data has evolved as a major problem at the National Geospatial Intelligence Agency (NGA).  Over 25 million minutes of full motion video are stored at NGA.”
But new sensors will produce “exponentially more data” than full motion video, and will overwhelm current analytical capabilities.
“Today nineteen analysts are required per UAV orbit [i.e., per 24-hour operational cycle].  With the advent of Gorgon Stare, ARGUS, and other Broad Area Sensors, up to 2,000 analysts will be required per orbit.”
The government “can’t hire enough analysts or buy enough equipment to close these gaps.”
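A rough back-of-envelope calculation shows why. In the sketch below, the 25-million-minute archive and the 19 versus 2,000 analysts-per-orbit figures come from the report; the assumed video bitrate and the number of concurrent orbits are illustrative guesses, not report values:

    # Figures marked "report" are from the Defense Science Board study;
    # the bitrate and orbit count are assumptions for illustration only.

    MINUTES_ARCHIVED = 25_000_000    # full motion video stored at NGA (report)
    ASSUMED_BITRATE_MBPS = 4         # hypothetical per-stream video bitrate

    archive_pb = MINUTES_ARCHIVED * 60 * ASSUMED_BITRATE_MBPS * 1e6 / 8 / 1e15
    print(f"FMV archive at {ASSUMED_BITRATE_MBPS} Mbps: ~{archive_pb:.1f} PB")

    ANALYSTS_TODAY = 19              # analysts per UAV orbit today (report)
    ANALYSTS_WIDE_AREA = 2_000       # per orbit with broad area sensors (report)
    ASSUMED_ORBITS = 60              # hypothetical number of concurrent orbits

    print(f"Today: {ANALYSTS_TODAY * ASSUMED_ORBITS:,} analysts "
          f"for {ASSUMED_ORBITS} orbits")
    print(f"Wide-area: {ANALYSTS_WIDE_AREA * ASSUMED_ORBITS:,} analysts "
          f"for {ASSUMED_ORBITS} orbits")

Even under these loose assumptions, the archive alone approaches a petabyte, and wide-area sensors push the analyst requirement from roughly a thousand people to more than a hundred thousand, which is the gap the report says cannot be closed by hiring or procurement.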
