It’s strange how many things that “shouldn’t even need to be said” actually urgently need to be said. Here is something that shouldn’t need to be said: building a swarm of deadly armed robots, and then giving those robots the autonomous ability to decide who to kill, is one of the worst ideas imaginable. Fortunately, it’s such a colossally stupid and suicidal concept that human beings would never actually pursue it, let alone spend billions of dollars trying to make the robots as efficiently deadly and free of meaningful control as possible. This is the stuff of Black Mirror episodes, not reality, where (as Steven Pinker assures us) life on earth is getting more peaceful, more intelligent, and less horrifying all the time. Right?

Hah. Actually the Defense Department is proudly boasting of its swarm program, and investing considerable resources into making sure that the robots will not only be able to kill as many things as possible, but decide for themselves which things are worth killing. As the unsettlingly cheerful man in this DARPA video explains, the military is seeking to expand its drone swarm capabilities for “operations in urban environments” and is soliciting proposals that will answer the questions “What disruptive new capabilities does the proposed technology provide to a swarm?” and “How does the proposed technology scale to provide exponential advantage by implementing it in a swarm?” (Funny how the ubiquitous language of “scaling” and “disruption” is applied to building flying death robots.) The program is well underway and there have been plenty of tests (the Navy has been firing off swarms of drones as part of a program that is literally called LOCUST). As a FOX News article that reads like a Defense Department press release, called “How deadly drone swarms will help US troops on the frontline,” explains:

No enemy would want to face a swarm of drones on the attack. But enemies of the United States will have to face the overwhelming force of American drone teams that can think for themselves, communicate with each other and work together in hundreds to execute combat missions…. Say you have a bomb maker responsible for killing a busload of children, our military will release 50 robots – a mix of ground robots and flying drones…Their objective? They must isolate the target within a 2 square city blocks within 15 to 30 minutes max… It may sound farfetched – but drone swarm tech for combat already exists and has already been proven more than possible. In 2016, the US military launched a drone swarm of 103 Perdix drones from three F/A-18 Super Hornet jet fighters.

We can see exactly why this is horrifying. It’s worth thinking about where the “urban environments” in question will end up being, and who the suspected “bomb makers” to be executed without trial will be. Drones, as they are currently used, have already inflicted unnecessary death and suffering on civilians; Micah Zenko of Foreign Policy reported that the Obama administration distorted and underreported the number of civilian casualties in drone strikes. That’s the Obama administration, with ordinary drones rather than swarms of autonomous ones, and we can imagine how much worse the situation will be under a president being advised by John Bolton, a man who seems to have made it his personal life’s ambition to start World War III. This should especially concern those who recognize how easily the United States devalues and destroys brown and black lives. The United States military has a presence in 50 out of 54 African countries, where it conducts nearly 10 missions per day (hardly ever discussed, thanks news media). The “urban environments” where the autonomous death robots will be tested are going to be those whose residents have no recourse when their wedding parties are bombed.

The scary thing here is not just the “swarming” aspect of the new drone technology, though that’s alarming enough. (In fact, the Black Mirror episode about a woman on the run from deadly robots distorted the situation for dramatic effect; in reality, humans would never have that much of a chance of survival.) More concerning is the fact that the Defense Department’s new call for proposals emphasizes “autonomy,” i.e. allowing the drones to use their own judgment free of human control. The military wants to be able to release a swarm of drones, give it a mission, and send it off on its way. Considering that nowhere in DARPA’s promotional materials can I find a discussion of the importance of making sure the drones don’t massacre a wedding party, robot autonomy is a concerning prospect.

This is why the Campaign to Stop Killer Robots (an amusing name, but a deadly serious cause) strongly emphasizes the need for international prohibitions on autonomous weapons systems. There’s a good xkcd comic emphasizing how odd it is that there is so much public fretting over “sentient artificial intelligence” and so comparatively little about “the self-controlled deadly drone swarms that are literally being built right now.” The Campaign to Stop Killer Robots seeks a “comprehensive, pre-emptive prohibition on the development, production and use of fully autonomous weapons–weapons that operate on their own without human intervention.” This is a good idea. But there has been little international pressure for such an agreement, and securing one will take a coordinated global campaign. An international treaty is the only way forward, though. Russia and China are building their own autonomous weapons systems, and as with nuclear weapons, no country is going to unilaterally disarm.

I don’t think there are many issues more important than arms control. Before all else, we need to make sure humanity doesn’t wipe itself out, and as weapons systems become more and more sophisticated, the potential for new forms of horrific violence multiplies. I think activists ought to see this as part of a broader agenda of protecting marginalized people, and campus progressives at the University of Maryland and Carnegie Mellon (both of which have partnered with the military on drone swarms) would do well to protest their universities’ involvement with the development of new weapons technology that will likely be used on Middle Easterners and Africans. Shaming the institutions that agree to work on these projects is important. (Personally, I think the chirpy guy in the video also deserves to have daily pickets outside his house.)

Nuclear weapons remain the most significant threat we face, and the reappearance of John Bolton should trouble anybody who prefers continuing to live over dying in a fireball. But creating a swarm of deadly robots and giving them the power to determine who should live or die is… also not something any country should be doing, and everyone should be part of the campaign to stop the insanity of killer robot programs. We have to find a way to spend our time and money on making life on Earth pleasant and peaceful, rather than racing to expend as many resources as possible on the worst idea imaginable.