Killer robots? It’s only a matter of time
Artificial intelligence (AI) is everywhere in the news at the moment, but there is little discussion of an area where huge sums are being spent by all the major powers: automated intelligent weapons — or killer robots, as they are usually called.
Some strategists now envision military operations conducted entirely by automata. Such fully automated forces need have no relation to the human form — they could be huge flocks of small boats or submersibles, drones in the air, or fleets of automated armoured cars, all equipped with powerful weapons.
Present-day developments carry eerie echoes of Stanley Kubrick’s 1964 film Dr Strangelove, which introduced (as satire) both a “Doomsday machine” that would respond automatically, without human intervention, to an atomic attack, and the figure of the Doctor himself, played by Peter Sellers: a psychopath determined to kill on a huge scale.
US experts have recently conceded that the Doomsday Machine was nearer to the truth than was ever admitted at the time, and it is even closer now. It’s a terrifying prospect, but it may in fact still be better to place such weapons under automated control, because of the failings of human psychology.
There really are quasi-psychopaths like Dr Strangelove involved in military planning and in battle. For example, Frederick Lindemann, Churchill’s chief scientist in the Second World War, not only planned the mass bombing of Germany but wanted all German males castrated (advice that Churchill wisely ignored). And among the troops themselves, while most soldiers do not want to kill and go to great lengths to avoid doing so, 10 per cent of them do almost all the killing.
Clearly, careful vetting would be necessary to keep anyone with unsavoury psychological tendencies away from the development and deployment of such weapons, and from the computer systems that control them. But assuming that is possible, automated weapons could address both of the problems above: taking high-stakes decisions out of the hands of would-be Strangeloves, and leaving the actual killing to a machine (which will be better at it) rather than to a reluctant human.
The additional — obvious — advantage of automated weapons would be the lack of human casualties on “our” side.
That’s the case for. Against, we have the ethical quandaries.
Ethical worries about particular weapons long predate AI — it is said that the Church forbade crossbows in the Middle Ages as “abhorrent to Christians”. The more powerful the technology, the more powerful the case against using it.
The main argument against AI-based weapons, pressed by activists such as my colleague Professor Noel Sharkey in his campaign for a UN ban, is that AI technology is not yet advanced enough to distinguish combatants from non-combatants as targets, and so war crimes will inevitably be committed if it is deployed.
But this was always the case with weapons guided by humans, as the bombings of Dresden, London and Hiroshima showed on a vast scale. And, indeed, the concern about AI’s lack of discrimination may be temporary. If automated cars can successfully distinguish cyclists from pedestrians (and they can), we may hope that such discrimination will soon be possible on the battlefield too, and may even prove better than that of human soldiers.
So if we are about to trust automated cars on our roads to make life-and-death decisions about humans, why would we not eventually trust such decisions to a flock of armoured cars using similar technology? Might such cheap weapons not be a better investment than a £3bn aircraft carrier that has no planes and that we cannot defend against cheap hypersonic missiles?
One final warning, though, which is not about ethics but practicalities. Cheap assassination drones already exist on the battlefield, but as automated weapon technology develops, we may find them being used elsewhere. Are we really prepared for small drones with guns that can look into the windows of Downing Street or Buckingham Palace?
The history of military technology shows that it cannot be kept out of private hands, but we have to hope that killer robots are an exception.