Are we on the brink of robot wars?
Recently in Pakistan, former cricket legend and now opposition party leader Imran Khan addressed a rally organised to protest against US military drone (unmanned aerial vehicle, or UAV) strikes, which have increased significantly since the Obama administration took office. Khan and many others believe UAV strikes fuel terrorism, as disillusioned Pakistanis seek revenge for the deaths of innocent civilians caused by UAVs incapable of distinguishing military from civilian targets.
How will the world react when a military superpower unleashes a robot army with full autonomous decision authority to strike enemy targets? This science fiction scenario is almost within technological reach. What is not clear is the degree of importance being attached to the ethical dilemmas posed by fully autonomous weapon systems.
The primary argument in favour of weapon systems with autonomous decision capability is the potential for reduced human casualties as robots replace human soldiers on the frontline. Furthermore, robots are already faster, stronger and sometimes smarter than humans. “They don’t get hungry. They’re not afraid. They don’t forget their orders.”
Currently, UAVs are defenceless against incoming missiles because they are controlled by personnel thousands of miles away, causing a critical delay in response time. Giving drones decision capability would allow instant strikes against hostile aircraft.
Arguments against focus on the impossibility of autonomous robots distinguishing civilians from enemy combatants with certainty, which is contrary to international legal and humanitarian standards, and on the inability of robots to make judgements such as limiting the use of excessive force when military objectives can be achieved by less aggressive means.
These arguments are unlikely to lead to consensus on either side. The most interesting ethical argument I have read supporting the use of autonomous military robots is that, although less than perfect, robots will be capable of adhering to higher ethical standards than human soldiers. Ron Arkin, professor of computer science and director of the Mobile Robot Laboratory at the Georgia Institute of Technology in Atlanta, uses such an argument as moral support for developing this technology. He establishes his case by describing the reality of warfare, which takes an enormous psychological toll on soldiers.
Modern warfare is waged with minimal personnel and sometimes no clearly identifiable enemy. Traumatic stress disorders result from battlefield fatigue, in some cases causing hysteria, confusion, anxiety and obsessive-compulsive disorders. The enemy is dehumanised with derogatory names. The Laws of War are perceived as unrealistic constraints. The cumulative result of these factors is errors of judgement and ethical violations, which become more likely the longer a soldier spends at war, and especially after a soldier loses a mate.
Arkin argues that robots potentially have longer endurance; are immune to emotional and psychological trauma and to chemical and biological weapons; can integrate more information from multiple sources much faster, enabling a better-informed decision on whether or not to strike; can be equipped with greater sensory capability (advanced optics, radar, acoustics); and, having no need to protect themselves, can be programmed for self-sacrifice rather than reflexive self-defence when confronted with an ambiguous target.
Initially, there will be an enormous investment, much of it funded by taxpayers, to develop autonomous weapon systems into economically and politically efficient killing machines. The consequences of enemy and civilian deaths within the war zone will be further removed from military personnel. War might no longer be a last resort but rather a low-cost, quick option.
Arkin believes war is inevitable and that robots are potentially better decision-makers than human soldiers weighed down by the stress and trauma of battle. However, he ignores more ethical ways to spend billions of dollars than developing ever more technologically advanced methods of killing each other. Some of those alternatives would address the scarce resources (land, water, food and mineral wealth) over which wars are fought.
Geoff Lamberton is a senior lecturer in ethics and sustainability at Southern Cross University.