The armed forces bill is heading back to the Commons with two important amendments from the Lords. One would hand serious crimes committed in the UK – including rape, domestic violence and murder – over to the civilian courts. The other would hold national government to the same standards of honouring the military covenant as local government.
What isn’t being sent back is a crucial amendment, discussed late last Tuesday evening, that we could well look back on as a huge missed opportunity. It was tabled by Lord Browne of Ladyton, with the backing of two of the House’s military stalwarts, Lord Houghton of Richmond and Lord Craig of Radley. It addressed the issues around what are formally known as lethal autonomous weapons (LAWs) and the broader place of artificial intelligence (AI) in warfare.
Killer robots: lethal autonomous weapons
Colloquially, LAWs are often called ‘killer robots’, as they were in a recent Times ‘Stories of Our Times’ podcast. They’re also going to be the subject of the upcoming Reith Lectures.
In the Lords debate, I referred to a book by Azeem Azhar: Exponential: How Accelerating Technology is Leaving Us Behind and What to Do About It. Its thesis is that there is an exponential gap: technologies are taking off at an exponential rate, but society is only evolving incrementally. And that’s clearly the case with AI military technology. As the Times podcast made clear, weapons not under direct human control are already killing people. Yet there are currently no international measures covering the issue.
Thankfully, the real world hasn’t yet got as far as another book, AI 2041, which is described as ‘scientific fiction’. This book posits the possibility of large numbers of drones learning to form swarms, acting with teamwork and redundancy. A swarm of 10,000 drones could wipe out half a city and theoretically cost as little as $10m.
The subject is on the international agenda. UN Secretary General António Guterres said, “The prospect of machines with the discretion and power to take human life is morally repugnant”.
UK embracing killer robots
Yet the UK seems to be heading in the direction of such weapons.
The United Nations Association of the UK has been working on this issue, and communicating with the government on it. In February, the government told the UN Association that UK weapons systems “will always be under human control” – known as ‘human in the loop’. But the government’s language has already got weaker – down to there being “context-appropriate human involvement”.
The recent integrated review of security, defence, development and foreign policy barely touched the issue. This is despite the fact that a crucial international arms control meeting – under the Convention on Certain Conventional Weapons – is being held this week, addressing this exact issue.
In the debate, Lord Browne highlighted the reassurance previously given by the Ministry of Defence that it is “alert to” the complex issues … and “has worked extensively on them over the … last 18 months”. That work, however, is publicly invisible and subject to no democratic oversight.
The government likes to talk, endlessly, about being world-leading, but in this field, as in many others, it is New Zealand that truly fits that category. It has just announced a call for a ban, although campaign groups there say it should go even further. In total, 68 countries have called for a treaty regulating these weapons.
The threats to global security
This is one way in which the integrated review has failed to set the strategic scene for the next decade. There are many others.
It’s one reason why the alternative security review (ASR), launched at a recent meeting at which I spoke, is so important. The ASR is consulting civil society on what matters to it, highlighting particularly the climate emergency, the impact of poverty and inequality, and the failure of militaristic approaches to security. And with the international talks, and military and academic disquiet about ‘killer robots’, now emerging at full pelt into the public realm, this subject is likely to reach the ASR agenda too.
These look like disparate issues, but what ties them all together is the huge number of threats to global security we face. The one underlying force behind all of these threats is easily identifiable in the title of the government’s review. It’s called ‘Global Britain in a Competitive Age’. And talk of being world-leading, ahead of others, winning races, is woven through it.
World leaders for good, or for bad?
Our foreign policy needs to have cooperation as its foundation, not competition. The Doomsday Clock stands at 100 seconds to midnight – a warning as real for the people of China, Russia and Iran as for all of us – and those countries are not going to be forced or outcompeted into working with us to face the challenges. As I wrote for the Green European Journal, we need ‘steward states’ to provide leadership, backing the increasingly powerful efforts of civil society, on everything from an ecocide law, to Magnitsky-style sanctions, to ‘killer robots’.
And the UK should – like the Scandinavians, New Zealand and a handful of others – be taking a leading role in supporting these civil society campaigns. Instead, we’re a minor player reliving past colonial ‘glories’ in dangerous 19th-century-style ‘Great Games’, with nuclear, AI and ‘conventional’ weapons.
There’s a petition, involving Amnesty International, calling for governments to act on autonomous weapons.