Photo: U.S. Marines in Helmand Province, Afghanistan, March 3, 2011 (Adek Berry/Getty Images).
Some people worry about the use of drones (unmanned aerial vehicles) in warfare. Others are concerned about the potential deployment of lethal autonomous robots programmed to identify, track, and destroy targets and people without a human decision-maker in the loop. Many fret about cyberconflict. Some fear the battlefield effects of cognitive enhancers: pharmaceuticals that reduce the sense of risk troops carry with them, enable the manipulation of memories, or change moral postures. Others wonder about the implications of augmented cognition as the complexity and speed of combat environments grow beyond human perceptual and cognitive capabilities. And there are other war technologies worth discussing, some of which already exist and some of which may never come to fruition: self-guiding bullets, exoskeletons for burdened soldiers, "telepathic" brain-reading helmets, hummingbird- and insect-size cyborg platforms for surveillance or attack, software to protect your networks and to attack those of your enemy.
It's more than enough to feed technophobic dystopian fantasies or, alternatively, to generate techno-optimistic dreams of cultural conquest, especially, it seems, among those who have never served in military organizations. But the hyperventilation about specific technologies such as UAVs or lethal autonomous robots obscures two more subtle but fundamental challenges. The first is the need to understand, and manage, not just individual technologies but an accelerating rate of change across the entire technological frontier; indeed, each individual technology is only a small part of this much more complex reality. The second is the destabilizing impact these technologies have on the rules and norms that have evolved over centuries to try to reduce the awful impacts of war, especially on civilian populations.
Worse yet for those who would rather do their thinking in terms of bumper stickers, these challenges arise in an environment where the institutions, cultures, and practices of combat are themselves in fundamental flux. Currently, international behavior in cases of war is governed by the so-called "laws of war," which specify when a state may go to war, how to carry out such a conflict, and, increasingly, how to end it ethically and legally. For example, a state may protect itself or, under some circumstances, clearly signal a potential strike, but it may not simply attack another state out of spite. In conducting a war, the force used may not be more than is necessary to achieve military objectives, and noncombatants may not be targeted or attacked, although collateral damage resulting from the otherwise lawful and necessary pursuit of military objectives is permitted. These laws of war have been developed over many centuries, with contributions from many civilizations, and are embodied in treaties and agreements such as the Geneva Conventions and the U.N. Charter, as well as in customary international law.
But consider these rules and their underlying assumptions in light of current conditions. For example, today's practices and international laws assume a state-based regime (the traditional wars among countries), but a very different pattern emerges when nonstate global actors, such as radical jihadist Islam, become a major source of conflict. Worse yet, since nations are losing their traditional monopoly on military technologies, nonstate actors and even individuals are gaining the ability to impose damage on a far larger scale. The laws of war tell you when a country can respond to an attack from another country, but they say nothing about how to respond to attacks by terrorist networks woven into the fabric of nations around the world. The geographical assumption that ties combatant status to a particular physical battlefield, core to the existing framework of the laws of war, is questionable in a world of global terrorism and cyberspace confrontation. Similarly, does building software bombs or software backdoors into a potential opponent's Internet systems, civilian and military, constitute an attack that may be responded to by force, even if they remain unactivated? And what happens when nonstate actors use the laws of war against those that consider themselves bound by them (a practice called "lawfare")?
We are also seeing an erosion of the clear differences between a state of peace and a state of war, creating substantial institutional confusion. Drones flown by a single government such as the United States may be operated by the military or by intelligence entities, with private contractors involved to some extent. The U.S. armed forces operate under codified laws of war and strict rules of engagement, while intelligence organizations abide by very different standards (a complication that also exists in the cyberconflict domain). And it is not at all clear what formal rules govern other players (private military contractors, for example, or nongovernmental organizations intervening in conflict zones for their own purposes). Sophisticated cyberattacks are not only difficult to attribute to any particular actor; they often involve quasi-governmental organizations that may or may not be completely under the control, or even the sponsorship, of a particular government (like the pro-Putin youth organization Nashi). Contrary to popular belief, privateering is not just an 18th-century phenomenon.
There is an ongoing revolution in the nature of conflict as the battlefield, freed from the constraint of traditional weapons, extends globally through cyberconnections. The terrorist on leave from the Middle East may be tracked down and terminated in Peru by a drone operated from Nevada. Combat mixes with counterinsurgency mixes with policing mixes with nation-building, each a physically, culturally, and psychologically very different space subject to varying norms and rules. There is a revolution in civilian systems as well, as technologies developed for combat ooze into the civilian sector and societies re-evaluate the balance between security and personal freedom in spaces ranging from airports to social networking sites. Cartoons or films, protected free speech in Western societies, cause explosions of violence in Islamic societies on the other side of the world. The military itself is not immune. How, for example, does a military culture of obedience and discipline fit itself to gamers and cybergeeks?
The assumptions built into existing rules and norms, about the clearly bounded geographies of battlefields, or the simple status of "combatant" vs. "noncombatant," or the dominant role of states in international affairs, or even about what conflict actually is, are, at best, unstable. At worst, they have been overthrown, with no obvious substitutes. Perhaps this is inevitable given the accelerating rate of disruptive technological change and the human desire to cling to institutions that may be growing obsolete but at least worked in the past. But given the importance of trying to reduce the impacts of warfare, this area requires far more serious attention than it has received to date. It isn't that the established laws of war, many centuries in development, have suddenly become completely inapplicable; at the least, some future conflicts will still consist of traditional state-to-state combat. But, clearly, technological evolution and concomitant changes in military, cultural, and social domains have rendered virtually all of the fundamental assumptions underlying the laws of war at least potentially contingent. It is unlikely that such fundamental and pervasive change will leave untouched doctrines and principles that were formulated and tested under very different and less complex conditions.
We need to develop a sophisticated and adaptive institutional capability to recognize critical change as it happens, understand the implications across multiple domains, and respond in ways that are rational, ethical, and responsible. Calls for new treaties regarding specific technologies, whether cyberconflict, lethal autonomous robots, or drones, to some extent reflect this inchoate need. But useful as they may be as learning experiences, they are mere attempts to update an already obsolete international regime. They neither appreciate nor respond to the enormity of the challenge before us: to create new and viable laws of conflict that represent a modern, sentient, and moral response to the human condition known as war.
This article was inspired by the 2012 Chautauqua Council on Emerging Technologies and 21st Century Conflict, sponsored by Arizona State University's Consortium for Emerging Technologies, Military Operations, and National Security and Lincoln Center for Applied Ethics, and held at the Chautauqua Institution in New York. Future Tense is a partnership of Arizona State, the New America Foundation, and Slate magazine.