Australasian Science: Australia's authority on science since 1938

Will Enhanced Soldiers Fight a Just War?

Revision Military’s prototype TALOS suit has a powered lower-body exoskeleton supporting a body armour system that can protect 60% of the body from rifle rounds. To relieve weight, motorised actuators pick up each leg and move them. The weight of the helmet, armour and vest is supported by a rigid articulated spine. The suit’s power pack has a cooling fan, and a cooling vest pumps water through 3 metres of tubing under the suit. Credit: Revision Military

By Adam Henschke

Technologies may be able to enhance a soldier’s strength, endurance, stress tolerance and cognitive ability, but could they reduce their moral capacity to follow the laws of armed conflict?

As combinations of nano-, bio-, info- and cognitive technologies converge and combine, humanity is increasing its capacity to actively change and direct our physical nature. In contrast to evolution by natural selection, human enhancement involves the use of technological interventions to shape us as individuals in ways that we have selected.

The military context is one area where such technological enhancements are being extensively researched. The guiding thought is that technologically enhanced soldiers can increase a military force’s chances of winning.

While wars are fought to be won, the tradition of a “just” war has applied moral principles to determine when and how war should be fought. These moral principles can both guide and complicate what technological interventions can be used to enhance soldiers during wartime.

This article focuses on ethical issues around research and the impacts of enhancement during conflict. The particular worry is that enhancements could impact on the behaviour of soldiers during conflict in ways that are morally relevant. The article will frame the moral discussions by reference to the tradition of a just war, specifically the discrimination and proportionality criteria that look at how one can fight a war justly.

Enhancements include exoskeletons to augment strength and endurance, “metabolic dominance” to alter a soldier’s eating and nutrition, and neurological intervention and stimulation to increase a soldier’s capacity to operate under stress and decrease the need for sleep. Lest this seem like mere science fiction or technological speculation to the point of fantasy, one need only look up the Tactical Assault Light Operator Suit (TALOS), search for human enhancement research supported by the US Defense Advanced Research Projects Agency (DARPA), or read Mind Wars by Jonathan Moreno to see that exoskeletons, direct biological interventions and cognitive enhancement receive extensive interest and funding for military purposes.

In parallel with this research there are a host of ethical discussions on the ethics of enhancement generally and on the ethics of military enhancement specifically. One of the most highly regarded centres for ethics research, The Oxford Uehiro Centre for Practical Ethics, states: “Over the last decade, biomedical enhancement has become the focus of one of the liveliest and widest-ranging debates in practical ethics”.

The just war tradition, an extensive set of discussions about the ethics of warfare, goes back at least 2500 years. It has long been concerned with the conditions that must be met in order for a war to be justified. For this article, the criteria of discrimination and proportionality are the main interest.

We start with the notion that, in situations like wars of self-defence, those posing a threat to innocent civilians are legitimate targets of lethal force. The discrimination criterion then focuses on who is a legitimate target and why. Proportionality, on the other hand, focuses on ensuring that the particular harms of the lethal violence are bound by a set of good outcomes.

There are at least three ethical issues that are of interest.

  • Is there a moral responsibility to enhance a soldier?
  • Would those enhancements undermine the soldier’s capacity to follow the laws of armed conflict?
  • Would the fact of enhancement have any negative impact on how the adversary treats soldiers?

On the first issue, comprehensive discussions of the general ethics of enhancement suggest that we have a responsibility to enhance people to make them more moral. Arguing that there is an urgent imperative to enhance the moral character of humanity, ethicists Ingmar Persson and Julian Savulescu suggest in the Journal of Applied Philosophy: “If genetic and biomedical means of enhancement could counter such natural [racist] tendencies, they could have a crucial role to play in improving our moral character”. Persson and Savulescu’s position is highly controversial, but if this is at all possible then perhaps some enhancements might be morally obligatory in the military context. For instance, if certain cognitive enhancements enable a soldier to make decisions based on legitimate threat, and to disregard irrelevant features like physical morphologies associated with race, then it might follow that cognitive enhancements could be used to improve the soldier’s conduct in war.

On a different line of reasoning, research into the pharmaceutical propranolol suggests it could be helpful in reducing the likelihood and impact of post-traumatic stress disorder (PTSD). Propranolol is thought to work by reducing the intensity of fear-based emotional memories associated with a traumatic incident. Rather than focusing on the soldier’s conduct, this approach to enhancement reduces the long-term psychological burden of conflict on veterans and their families.

However, increased cognitive capacity and decreased PTSD bring us to one of the big concerns around deliberate technological enhancements to warfighting. Will such interventions decrease the capacity of a soldier to follow the laws of armed conflict? For instance, while increasing someone’s cognitive capacity might be tactically useful and could potentially lead to increased moral behaviour, changing how people think can have a host of unintended side-effects.

Neurostimulation, for example, is a technology of interest for use in conflict. In Mind Wars, Jonathan Moreno notes that “DARPA has given grants to see if neurostimulation can improve impaired cognitive performance and reduce other effects of sleep deprivation on soldiers”. Direct neurostimulation by deep brain implants is a potentially useful intervention for patients with Parkinson’s disease, but it has unwanted side-effects ranging from speech disturbances and memory impairment to increased aggression, hypomania, depression and suicide. It’s important to recognise that the numbers vary across studies (1.5–25% displayed depression), and some of the numbers are relatively low (increased aggression was observed in only 2% of cases in one study). Nevertheless, the impacts of increased aggression and depression in warfare give us reason to be very careful about how such enhancements are used in practice.

Similarly, while reducing PTSD is undoubtedly a good thing, some ethicists worry that reducing the emotional impact of traumatic events in wartime could lead to an increase in the number of wartime atrocities committed. In her assessment of the ways that enhancements impact moral responsibility, Jessica Wolfendale states in The American Journal of Bioethics: “Propranolol, it seems, modifies subjects’ capacity to respond to and assess information relevant to rational decision-making, and as a result it would arguably affect the degree of moral responsibility we could assign to them”.

In case this comes across as overly pessimistic, this is not to suggest that cognitive enhancements or efforts to reduce PTSD ought to be discarded; that would be throwing the baby out with the bathwater. Instead, the point is to recognise that these technological interventions are not simple, and the context of decision-making in warfare is very complex. Things often don’t go as planned. What is needed is a component of the research and development process that actively examines any such intervention’s possible impacts on, and reduction of, the capacity to follow the laws of armed conflict.

Furthermore, as the side-effects of deep brain stimulation show, long-term and in-conflict monitoring of the enhanced soldiers is needed to see if the interventions actually work as hoped, and/or if they have any demonstrable or recognisable effects on following the laws of armed conflict. Much like a phase IV clinical trial, the interventions must be continually studied in their application. Moreover, thought needs to be given to who is legally culpable should such an intervention reduce a soldier’s adherence to the laws of armed conflict. It is, perhaps, an open question about who is morally responsible: individual soldiers, their direct commanders, those who implemented the enhancement program, or the designers and providers of the product? Questions of legal culpability need clear answers.

There is also an equally complicated set of questions about the way that enhancements could affect the adversary’s treatment of one’s own soldiers. Enhancements that increase endurance and reduce sensitivity to pain are of obvious military interest, so if the enemy military and civilians hear that incoming soldiers have been enhanced in such ways, those soldiers may come to be perceived as something more than, or perhaps less than, human. Similarly, if a group of soldiers encased in exoskeletons bears down upon the enemy, those soldiers may be perceived as something other-than-human.

In Humanity: A Moral History of the Twentieth Century, Jonathan Glover writes: “There are far weaker social pressures against hostile treatment of members of other groups. And in war the pressures often support group hostility.” Seeing other soldiers as enhanced half-robots, and potentially as not human, can make hostile and harmful treatment by the other group both easier and more likely. The worry here is that enhancement negatively impacts proportionality because of the way it changes the adversary’s perception.

This is not to say that we should jettison the laws of armed conflict, or lower the legal criteria around discrimination, proportionality or treatment of prisoners of war. Rather, it is to recognise that significantly changing our soldiers could affect how our enemies treat them. Hence there is a case to be made for anthropological research in the military context, to get some idea of how those we are fighting understand the enhancements and what these different understandings mean in practice.

As a final note, it is important to recognise that much of what has been discussed here is speculative. While some of these enhancement technologies are being used and trialled, others are still in the research stage.

Moreover, the concerns that I’ve been pointing to are “what if” scenarios. What if these technologies cause a decline in adherence to the laws of armed conflict? What if the enemy starts treating all our soldiers like they can’t feel pain?

In this sense, what I’m speaking of is speculative. However, simply because this is speculative doesn’t mean it is fantasy. First, the military interest in enhancement technologies exists and is substantial. Second, in terms of the ethics of such future technologies, we can take an approach that technology philosopher Philip Brey calls “anticipatory ethics for emerging technologies”.

Rather than waiting to see what happens with these technologies in practice, and then patching any problems after they have caused unjustified death and destruction, we can, given the known interest in these technologies, anticipate them and attend to the ethical concerns in a way that is evidence-based and pre-emptive. Attention at the research stage to the full range of impacts of such technologies, including complicated issues like adherence to the law and adversarial perception of enhanced soldiers, is a necessary element of the development of any enhancements.

Adam Henschke is an ethicist at the National Security College, the Australian National University. This research was supported by the Brocher Foundation in Geneva.