
The increased use of drones raises not only questions related to efficiency and reliability, but also questions about ethics, human rights, legitimacy, sovereignty, and the morality of war.

Drones now make it easier to wage war without any declaration. But there are limits drones can never cross: machines can never handle sociopolitical contradictions, and initiating counter-moves against political maneuvers is beyond their capacity.

Drones, which a section of politicians consider unerringly safe and deadly, are being used as a foreign policy tool; some tout them as an arm of ideological crusade. The reality, however, whether the task is implementing foreign policy or waging an ideological Armageddon, is relative: relative to the laws of nature and to the laws of social contradictions. This relativity imposes limits on the flight paths drones follow.

The impulses and compulsions of geostrategy and geotactics are proliferating aerial robotic technology at bewildering speed. Armed drones flying from bases in Afghanistan, Ethiopia, the Seychelles and Yemen, and dominating the skies of countries including Afghanistan, Pakistan and Yemen, are now regular news. More countries, Libya among them, are on the list of dominated skies. As conflicts escalate and competition among contending capitals intensifies, more countries will be added to the drone strike list.

The drone is claimed to be a precise and effective weapon. Its “ability to compute and then act at digital speed,” as Brookings Institution analyst Peter Singer writes in Wired for War, his book on robot warfare, provides it with a “robotic advantage”. But this is only a tactical advantage, and wars are not won by tactical moves alone. Waging war with a drone strike force in noncombatant countries and killing citizens with supposedly precision weaponry is a tactical edge, and sometimes a tactical weakness; it provides no strategic advantage.

Today, military drones are operated by Israel, Italy, the UK and the US. Many states possess unarmed flying robots. China, France, Germany, Iran, Russia and Sweden are developing weapons-carrying models. (David Axe, “Deadlier drones are coming”, Global Post, Sept. 23, 2012)

An average unmanned aerial vehicle (UAV) costs a mere 10 percent of the price of an F-16 fighter jet. With these flying machines, writes Jason Berry, author of Render unto Rome: The Secret Life of Money in the Catholic Church, there is no risk to a pilot if the machine is intercepted. (“Inside America's drone war, a moral black box”, Global Post, Sept. 26, 2012) John O. Brennan, then Homeland Security and Counterterrorism Assistant to the US President, claimed that drones “can be a wise choice” as they “dramatically reduce the danger to US personnel, even eliminating the danger altogether.” (“The Efficacy and Ethics of US Counterterrorism Strategy”, remarks at the Woodrow Wilson International Center for Scholars, Washington DC, Apr. 30, 2012)

“Billions upon billions of dollars”, Medea Benjamin, antiwar activist, writes in Drone Warfare: Killing by Remote Control, “have been spent from America to Asia on machinery, software and workers whose only purpose is building a better flying death robot.” The Pentagon allocated $95 million for drone purchases last year. Israel is the number-two exporter of drones, and sells many of them to Russia. (Jason Berry, op. cit.) Citing estimates of the Teal Group, an aerospace research firm, David Axe reported that worldwide military UAV spending could almost double over the next decade, from $6.6 billion in 2012 to $11.4 billion in 2022, in constant dollars. Referring to a 2011 US Air Force planning document, he also reported that the current force of around 250 armed drones would be more than doubled in the next decade.

Equipped with higher levels of technology, future drones will be faster, smarter, bloodier, more “intelligent” and autonomous, more powerful and more heavily armed, and their operators, human indeed, will be, it is claimed, less involved. These drones will possess the capacity to “reason”, the capacity to draw conclusions on the basis of data. This improved capacity will allow future drones to “plan and execute attacks with less human participation”, and “[g]reater robot autonomy could herald a major expansion of the drone war.” (David Axe, op. cit.)

David Axe cited a 30-year drone development plan of the US Air Force: “Advances in AI (artificial intelligence) will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.” The Air Force, according to Axe, “is already working to loosen those policy constraints, clearing a path for smarter, more dangerous drones.”

Military analysts and experts on the future of warfare apprehend that robotic drones could raise “the specter of a whole new kind of conflict which would essentially remove the human element – and human decision-making – from the theater of war.” (ibid.)

No one can contend that drone attacks are flawless. The Stanford Report says: “This narrative”, that drones are effective and precise, “is false.” Operators’ repeated mistakes in targeting their enemies, mistakes for which civilians have paid with blood, are also well-recognized facts. “Today roughly a quarter of all the people killed in […] drone strikes are innocent bystanders.” (David Axe, op. cit.) From June 2004 through mid-September 2012, reports the Bureau of Investigative Journalism, an independent journalist organization, drone strikes killed 2,562-3,325 persons, including 176 children, in Pakistan. (Covert War on Terror) Other countries have their own casualty counts.

Drones with sensitive sensors, computers, bombs and missiles have generated two sets of questions: technical and sociopolitical. Questions related to technology revolve around efficiency, reliability and the like, while sociopolitical questions, charged as they are with political force, revolve around ethics, human rights, fundamental freedoms, legitimacy, the sovereignty of countries and peoples, the morality and moral standards of war, and theories of war.

TECHNICAL THROTTLE

David Axe refers to experts’ opinions that tell of drones’ technical limits, limits that can turn deadly. Cummings, an MIT professor, says: “In the future we’re going to see a lot more reasoning put on all these vehicles.” Ryan Calo, a Stanford University researcher, foresees: “There’s no plan for humans to be totally out of the loop.” Patrick Lin, another Stanford researcher, opines: “Military robots are potentially indiscriminate.”

Robots “aren’t going to replace the need for a thinking human being to make decisions that are influenced by experience in a wide range of situational considerations that you just can’t program into a machine,” Carl Johnson, a Northrop vice president, told Global Post in 2011. “Even though it’s possible for a [UAV] to find a target, identify it and give those coordinates electronically to a weapon, it won't do that unless it’s told to,” Johnson said. “The technology is there, but there is still a need for a human in the loop.” “Humans contribute the things humans are good at, and robots contribute what robots are good at,” is the way MIT’s Seth Teller describes the dynamic to Global Post.

Highly autonomous robots could pose big problems, and not just legally, Calo and Lin warn. While remote, there is a chance that a highly sophisticated drone could go rogue in combat. “Autonomous robots are likely to be learning robots, too,” Lin says. “We can’t always predict what they will learn and what conclusions they might draw on how to behave.” “We’re reasonably confident that a human can act ethically, to distinguish right from wrong, but we have no basis yet for this confidence about robots,” Lin cautions.

There will be machine-“reasoning”, programmed with “genetic algorithms”: a capacity built up by a human operator/programmer who inserts a huge volume of commands based on only a fraction of human reasoning. The machine can turn mad if it is fed disinformation, confusing data or data unknown to it, or if its “reasoning” is distorted by misperception, wrongly summed experience or disturbed sight. The airborne killer can turn deviant, can even degenerate into the suicidal.
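To see why, consider how a genetic algorithm actually “reasons”. The following is a minimal, purely illustrative sketch, not a depiction of any actual drone system: the sensor readings, labels and thresholds are all invented for the example. The algorithm evolves a decision rule against whatever data it is given, so poisoned data yields a confidently wrong rule.

```python
import random

# Toy "sensor readings" in [0, 1], labeled True ("target") when the
# reading exceeds 0.7. Entirely hypothetical data for illustration.
CLEAN_DATA = [(r, r > 0.7) for r in (random.uniform(0, 1) for _ in range(200))]

def fitness(threshold, data):
    # Fitness is simply agreement with the labels in the supplied data.
    # The algorithm has no notion of whether those labels are true.
    return sum((reading > threshold) == label for reading, label in data)

def evolve(data, generations=50, pop_size=30):
    # Standard genetic-algorithm loop: keep the fittest half as parents,
    # refill the population with mutated offspring.
    population = [random.uniform(0, 1) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda t: fitness(t, data), reverse=True)
        parents = population[: pop_size // 2]
        children = [min(1.0, max(0.0, random.choice(parents) + random.gauss(0, 0.05)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda t: fitness(t, data))

print("rule evolved from clean data:   ", evolve(CLEAN_DATA))

# Disinformation: mislabel a whole band of readings as "target".
# The same loop now optimizes agreement with corrupted "experience".
POISONED = [(r, True if 0.5 < r <= 0.7 else l) for r, l in CLEAN_DATA]
print("rule evolved from poisoned data:", evolve(POISONED))
```

The first run converges near the true 0.7 boundary; the poisoned run drifts down toward 0.5, silently widening what the rule treats as a “target”. The algorithm optimizes agreement with its data, not truth, which is precisely the limit described above.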

It can’t be expected that the armed airborne robots of the future will handle every problem encountered during a combat mission. The problems include changes in terrain, weather and camouflage, and the appearance of counter-technology built to combat the drone. As such technology develops, drones’ efficacy in detecting targets will turn increasingly limited. Doesn’t the history of arms development, of the bayonet, the rifle, artillery, tanks and all their later cousins in the breed, and of the tactics and strategies developed over centuries, confirm this?

A machine, by its basic nature and whatever software algorithms it uses, can never handle sociopolitical reality. It can never take into consideration the “surprises” and incidents that demand flexibility and compromise in decision making, or the possible and probable impacts and implications that may follow a tactical or strategic hit. This reality brings drones’ decision-making capacity down to zero. The decision to hit a target is basically part of a political decision, including its diplomatic, legal and even economic dimensions, and it is a process at the socio-economic-political level.

By authorizing a machine to take a decision, even if partially, to kill, to trigger the weapons-release process, a human being takes sole responsibility for the killing. A machine can be authorized to make combat decisions mechanically, but the burden of those decisions, with their legal, ethical, social and political consequences, is borne by the political and military leadership.

The efficiency of drones, or of any machine, can never be translated into a tool or mechanism for handling the social contradictions that breed the forces drone owners consider antagonistic.

TRAMMEL OF SOCIOPOLITICS

The flying kill-machine encounters an array of moral, ethical, legal and sociopolitical issues that it can’t ignore, can’t face and can’t handle.

In mid-July 2012, The New York Times, in a story titled “The Moral Case for Drones”, argued that these airborne weapon systems “offer marked moral advantages over almost any tool of warfare”.

John Brennan, in his April 2012 speech, defended the use of drones as legal under domestic and international law, ethical according to the standards of war, wise because it limits risk to US personnel and foreign civilians, and subject to a complex and thorough review process. Among drones’ advantages, he claimed, is that they help the US satisfy the “principle of humanity”.

Other arguments favoring drones include: (1) the drone provides scope for a “more humane type of war”; (2) it is “a last resort after exhausting all feasible alternatives”; (3) “reasonable necessity”; (4) “resorting to justifiable force”. To a section of ideologues, the use of drones “is a struggle to defeat an ideology.”

These arguments accompanied the concept of “just war” that Barack Obama outlined in his 2009 Nobel Prize speech.

However, the arguments are being debated and questioned. Legal experts challenge the legality of drone strikes across sovereign borders and of targeted killings, although international law so far sets no limits on drones’ area of operation. “Proportional response and the right to human life are cornerstones of just war theory and central to the debate over drones [...]” (Jason Berry, op. cit.) The inviolable sovereignty of peoples and peoples’ inviolable right to peace are fundamental issues that drone operations can never resolve.

According to Daniel R. Brunstetter, professor of political science at the University of California, Irvine, the “2010 National Security Strategy – the document that outlines the foreign policy threats facing the US and the way the administration plans to deal with them – echoes [a] cautious war philosophy. The language of pre-emptive war that predominated Bush’s National Security Strategy of 2002 and 2006 was removed, and a more cautious language that echoed the notion of last resort was employed: ‘While the use of force is sometimes necessary, we will exhaust other options before war whenever we can, and carefully weigh the costs and risks of inaction.’ The document goes on to emphasize the importance of using force in a way that ‘reflects our values and strengthens our legitimacy’ and stresses the need for ‘broad international support.’” The document, as Brunstetter quoted, asserts: “The United States must reserve the right to act unilaterally if necessary to defend our nation and our interests, yet we will also seek to adhere to the standards that govern the use of force.” “This leads us”, Brunstetter observed, “to the dilemmas posed by drones.” (“Can We Wage a Just Drone War?”, The Atlantic, July 19, 2012)

THE OPPOSITE REALITY

The reality that emerges from drone operations is the opposite of these claims and expectations. Serious concerns are being raised about the counter-productive results of drone strikes. The Stanford Report found “evidence of the civilian harm and counter-productive impacts of US targeted killings and drone strikes in Pakistan.”

“Drones”, The New York Times reported, “have replaced Guantánamo as the recruiting tool of choice for militants.” (Jo Becker and Scott Shane, “Secret ‘Kill List’ Proves a Test of Obama’s Principles and Will”, May 29, 2012) A Pew Research Center study found 74 percent of Pakistanis consider the US an enemy. (“Pakistani Public Opinion Ever More Critical of U.S.: …”, 2012)

The Stanford Report provides a broader reality. It said:

(1) “[N]egative impacts US policies [...] on the civilians living under drones.” (2) The presence of drones “terrorizes men, women, and children, giving rise to anxiety and psychological trauma among civilian communities. Those living under drones have to face the constant worry that a deadly strike may be fired at any moment, and the knowledge that they are powerless to protect themselves. These fears have affected behavior. The US practice of striking one area multiple times […] makes both community members and humanitarian workers afraid or unwilling to assist injured victims. Some community members shy away from gathering in groups, including important tribal dispute-resolution bodies, out of fear that they may attract the attention of drone operators. Some parents choose to keep their children home, and children injured or traumatized by strikes have dropped out of school.” (3) Drone strikes “have undermined cultural and religious practices”, and “families who lost loved ones or their homes in drone strikes now struggle to support themselves.”

Daniel Brunstetter states the hard fact: “[I]t takes only one civilian death to fuel negative perceptions of the US in some parts of the world and all but guarantee a steady flow of terrorist recruits.” (op. cit.)

Efforts to hide the victims from the rest of the world and to operate with a low-key posture tell of a basic weakness: the operation is not acceptable to the wider world, is not acceptable even within the norms that the interests of the status quo propagate and practice, and is devoid of legitimacy.

The air assault concentrates entirely on the internal: on its friends’ and foes’ strengths and weaknesses. It interprets, misinterprets or circumvents laws and legal bindings, diplomatic tangles, international relations and the rest, but it fails to consider the objective social condition.

The flying machine exposes the utter weakness of its owners, whatever the raison d’être for waging a secret war or for the secret killing of individuals, whether it follows the anticipatory war doctrine of the Bush era or a doctrine of anticipatory drone strikes, whether it is a pre-emptive war or a preventive war.

The weakness is exposed when drone strikes are presented as the threshold of last resort and claimed to lead to peace. The absence of accountability, transparency, debate and a legal basis for the killing missions completes the exposure.

The situation turns complicated and grave if an ally does not turn accomplice to the flying machine’s mission, if an ally appears unreliable, if it is not possible to segregate an “ideological” foe from civilians, or if an assault makes the civilian population hostile, for this means failure in the deeper zones of politics, diplomacy and inter-state relations. That failure has roots of its own, and a machine can never overcome this failure or limit.

IF

What can today’s drones or tomorrow’s super-drones do if a populace becomes aware, gets organized, rises in peaceful defiance, disobedience and non-cooperation, doesn’t resort to arms, doesn’t step into provocations, doesn’t walk into a tactical trap? Can drones or any other machine sense, survey or map the inner dynamics of a defiant people, their alliances, their leadership, the management of their rising? This is a disability machines bear because they are “born” of human labor: machines’ “labor” and human labor are not the same, and the two do not produce the same result. Otherwise the history of machines’ development and the history of humanity’s journey would have been different, and highly efficient machines would have surpassed humanity. History presents the opposite evidence, however little mechanical minds may love it.

There is another important “if”. Thomas Powers, author of Intelligence Wars, is blunter. “Drones are an unreliable and conspicuous way of killing individuals,” he told Global Post. “What seems inevitable today is going to cause you trouble tomorrow. Ask yourself if the United States would accept the right of another country to decide who among Americans they would kill. There are probably people in Arizona allied with drug cartels. Would we allow Mexican forces to use drones against them? Hell, no.”

This “if” posed by Thomas Powers has no answer, or has an answer that is considered anomalous and is despised by all rational beings.

To capital, war is justified as long as it appears necessary for its expansion and appropriation. Capital, having its own morality and ethics just as feudal lords and slave owners had theirs, provides itself the justification and rationale to murder innocents, invade countries and demolish the lives of peace-loving people. Capital always finds the use of force not only necessary, but also morally justified.

But capital’s morality and logic of justification is also its limit, the limit that its machines, killing machines or ruling machines, can never cross, since sociopolitical dynamics can never be dominated and manipulated by mechanical force. Capital thus manufactures its killing machines in its temple of absurdity, with a hollow hope.
