The Silicon Valley “war king” thinks an uncontrolled arms race is the route to peace. He is so, so wrong.
Again, I’m not quite sure I believe Luckey when he says he’s simply carrying out a socially necessary function. He admits that he also thinks weapons are cool, and people come to work for him in part because they enjoy working on cool weapons. Business Insider says he builds augmented reality headsets for the military that turn “warfighters into[…] ‘technomancers’ who slay [real-life] opponents while peering at a screen that looks straight out of ‘Call of Duty.’” Luckey began his career as a designer of video game equipment, and for a gamer who wants more of a rush, war is the ultimate thrill, because you’re playing with real lives. Luckey allegedly owns the world’s largest video game collection and is an enthusiastic Dungeons and Dragons player (his character is a “chaotic-neutral wizard named Nilrim V”). In childhood, his favorite manga character “was an anti-hero named Seto Kaiba who inherits a weapons empire and then ‘proceeds to kick everyone’s ass using this incredible technology.’” Now, he boasts, “I build cruise missiles, and I post on X.” Indeed, like fellow billionaire J.K. Rowling, he tends to get in more Twitter-reply arguments than you would expect of a person in his position.
He even once created a virtual reality headset that kills its user in real life if they lose the game they are playing—a bizarre, horrible device he described as a piece of “thought-provoking” “office art.” According to Luckey, “Pumped up graphics might make a game look more real, but only the threat of serious consequences can make a game feel real to you and every other person in the game.” So there you have it: someone’s brain exploding is little more than a “thought-provoking” consequence for losing a game. You’ll have to forgive me for thinking this man views real-life humans as little more than tiny, plastic figurines for him to creatively wipe off the map. After all, war is much like playing Risk for the people who get to wage it without incurring any personal danger.
Luckey presents the usual case that the more heavily armed the United States, the safer its citizens. In a TED talk last year, he said that he is all about preventing World War III by “deterring” (an annoying word that means “threatening to murder”) others. The more we can be assured that any opponents of the U.S. will suffer instant and agonizing death, the less likely they are to do anything to challenge us. Luckey believes it is his duty to fight and win the arms race, and explicitly sees himself as preparing for a potential war against China in the near future. (According to MoneyWeek, “One of his teams has an in-house slogan: ‘China 27.’ It signifies that any products or features not ready for potential conflict in 2027 must be cast aside.”)
Of course, a weapons manufacturer has a strong personal financial interest in claiming that the best way to assure a peaceful human future is to manufacture as many weapons as possible. If Palmer Luckey does not successfully convince others that we urgently need to spend billions developing autonomous killer robots, his company’s anticipated IPO will not be as lucrative. (The Iran war has been great for him.) Luckey, who once studied journalism, has also said that he no longer thinks with a journalist’s regard for truth: “Absolutely not, I’m a propagandist[…] I’ll twist the truth. I’ll put forward only my version if I think that that’s going to propagandize people to what I need them to believe.”
He also appears to have become a weapons maker in part because he was cast out of Silicon Valley’s mainstream, and wanted a kind of vengeance by turning to the dark side. Back in 2017, Mark Zuckerberg fired him from Facebook (now Meta), where Luckey worked on virtual reality, allegedly because Luckey had donated to a pro-Trump group. Luckey has since hosted multiple fundraisers for Trump at which attendees paid up to $150,000, and donated over $15,000 to groups backing JD Vance in his Ohio Senate campaign. (As a venture capitalist, Vance invested in Anduril.) Weapons manufacturing and military work were still quite taboo in Silicon Valley, and “defense was a dirty word in tech,” as Business Insider puts it. At the time, “about 4,000 Google workers signed a letter asking the company to cancel Project Maven, an intelligence-gathering Pentagon program, and to get out of ‘the business of war.’” In contrast, Luckey seemed to almost relish the fact that by starting Anduril, in his words, “everyone was going to think I was evil.” He describes himself as “a crusader for vengeance” rather than a “crusader for truth.” Anduril co-founder Trae Stephens says Luckey has “a very innate sense of justice,” by which he does not mean that Luckey has a social conscience, but that “if he has been wronged by you, he has a long memory.” Luckey has suggested that as he pivoted from VR to weapons, he was also developing an “obsession” with “malice” after being “stabbed in the back”:
“[With Oculus] I was just trying to build toys that delighted people. There was no vengeance, there was no malice, there was no killer instinct involved. I think I didn’t develop that obsession or framing until I felt I was stabbed in the back by a lot of people who should have treated me better.”
Even though personal psychology seems more important than careful reasoning in explaining Luckey’s choice of profession, we should note the problems with his argument that “autonomous weapons ultimately promote peace by scaring adversaries away.” Luckey insists he’s helping us “stop World War III” rather than fighting it, using the same reasoning that justifies nuclear arsenals: when the consequences of war are too horrific to contemplate, there is a powerful incentive to avoid war.
There are two problems with the argument. The first is that it’s not at all clear an arms race will prevent a catastrophic war. Even if wars would be disastrous for all parties involved, they can still occur due to miscalculation or the perceived need to avoid the dishonor of backing down. The catastrophe of World War I, which erupted over a small matter that should theoretically have been solvable diplomatically, and which ended up being immensely costly for all sides, shows that countries can blunder their way into war anyway. The U.S. and the Soviet Union came very close to an all-out nuclear war during the Cuban Missile Crisis, because neither side felt it could back down, and high-ranking figures in the Kennedy administration advocated courses of action that would certainly have led to nuclear war. That close call showed just how alarmingly plausible it is that a chain of events could lead two superpowers to conclude they have no choice but to wage a nuclear war. And, as Andrew Bacevich, a former U.S. Army colonel, notes, investment in military power can make it more likely that military solutions to problems will be used: “Belief in the efficacy of military power almost inevitably breeds the temptation to put that power to work. ‘Peace through strength’ easily enough becomes ‘peace through war.’”
The second problem with Luckey’s argument is that it assumes that it is good for the U.S. to be able to impose its will on others. Luckey assumes that U.S. military power will be used defensively, and not for war crimes and aggression. But the history of U.S. foreign policy, from the napalming of Vietnamese villages to the recent missile attack on an Iranian elementary school, shows that U.S. weapons are not simply used to deter attacks by external aggressors. They are used to pursue what is called the “U.S. national interest,” and there is a long track record of U.S. military power being used in brutal, illegal, and outright horrifying ways. Of course, weapons developed for the U.S. will also likely be shared with Israel, a genocidal state that has used its military power to wipe out tens of thousands of Palestinians and enforce an apartheid regime that has been universally condemned by human rights groups. You will not be surprised to learn that Luckey describes himself as a “radical Zionist,” defending Israel’s right to a formal two-tiered system in which Jews are valued over non-Jews. Acknowledging that people call this “problematic” or “ethnostate adjacent” (or even, in the words of human rights groups, apartheid), Luckey’s response is blunt: “I don’t care.”
So one major problem with Luckey’s belief in giving the U.S. as much military strength as possible is that recent history indicates that power will not be used toward “good aims,” but to overthrow inconvenient governments. (We have already overthrown and kidnapped the Venezuelan head of state, and it looks like the president might invade Cuba before his term is up.) But perhaps Luckey believes companies should stop selling weapons to the U.S. government if it uses them for war crimes?
Nope. He believes that weapons manufacturers should not second-guess the president on policy, and he is staunchly critical of the AI corporation Anthropic for trying to restrict how the military used its products. Anthropic wanted a guarantee that its products would not be used for lethal autonomous weapons or for the mass surveillance of Americans. In response, the Department of War blacklisted Anthropic and lambasted the company as a national security risk. Luckey is firmly on the side of the Trump administration in this dispute, saying that any attempt by a government contractor to add ethical guidelines to its work is an affront to democracy. (Luckey has also wrongly suggested that he cannot impose such restrictions on government use of his products.) His reasoning is that because the president is elected and corporate CEOs are not, such restrictions amount to rule by corporations: any company rules governing the ethical use of its products give “more power to corporate executives than the president of the United States.”
He asks: “Do you want to live in a corporatocracy where Big Tech CEOs decide foreign and military policy?” Personally, I think this is sophistry: we can draw a meaningful distinction between “rule” by corporations and the belief that corporations should have some ethical baselines for their work that they are not willing to violate just to improve their profits. I do not want to live in a corporatocracy, but neither do I want to live in a world where CEOs hold to the Friedman doctrine that the only social responsibility of business is to increase its profits.
But Luckey frequently pretends not to be able to make distinctions when it comes to evaluating ethics. For instance, on the question of whether companies should make offensive weapons without any human decision-maker in the loop, Luckey goes full Jordan Peterson: “What defines offensive? What defines no human in the loop? What defines weapon? What defines use?” Likewise, he rejects the idea that a weapons company should stop supplying weapons if they are used to target innocent civilians: “Well, what is targeting? Like is it inadvertent versus explicit? What is innocence? Is it determined by the courts? Is it determined by the Hague? Is it determined in Brussels? Is it determined by Dario [Amodei]? What is a civilian? And who decides? What if there are civilians mixed in with other things?” This is classic fallacious reasoning, assuming that because the lines between categories can often be fuzzy, we cannot make meaningful distinctions between the categories. The Geneva Conventions are quite real, many violations of them are obvious, and it is the obligation not just of every corporate executive but of every person within any system to refuse to participate in war crimes. Luckey pretends that it’s impossible to know what a war crime looks like in order to escape his basic moral responsibility as a human being.
Peter Thiel has called Anduril “the company that can save Western civilization.” Personally, I think it’s the single company most likely to hasten our doom. I cannot think of anything more dangerous than the U.S. government contracting people whose worldview was shaped almost entirely by video games to design the deadliest weapons possible. The U.S. track record of gruesome violence is extreme—we are, after all, the only country to have dropped a nuclear bomb on a civilian population. Luckey shows none of the understanding of the horror of war that military veterans like Andrew Bacevich and J.R.R. Tolkien have had. I am sure he only ever imagines his weapons being used on Chinese people and Iranians, not his own wife and child. I fear the kind of world that such a morally frivolous gamer billionaire will drag us into, and if we are to have peace, men like this must not build our future.