American entrepreneur Palmer Luckey has contended that there will be “no moral high ground” if Western countries stall autonomous systems development and usage.
The virtual reality developer and founder of military contractor Anduril Industries made the comments during his TED Talk, “The AI arsenal that could stop World War III”, with technologist Bilawal Sidhu on 8 April this year.
The pair discussed the global arms race to build AI-powered weapons, “killer robots” and autonomous fighter jets at scale.
When questioned about the ethics of building fully autonomous systems or “killer robots”, Luckey responded with a quip.
“I love killer robots,” he said.
“The thing that people have to remember is that this idea of humans building tools that divorce the design of the tool from when the decision is made to enact violence, it’s not something new.
“We’ve been doing it for thousands of years. Pit traps, spike traps and a huge variety of weapons even into the modern era.
“Think about anti-ship mines, even purely defensive tools that are fundamentally autonomous.”
Luckey argued that much of the current opposition to using artificial intelligence in weapons came from people who “haven’t usually examined the problem”.
“There’s people who say things that sound pretty good, like, ‘You should never allow a robot to pull the trigger.’ ‘You should never allow AI to decide who lives and who dies.’ I look at it in a different way,” he said.
“I think that the ethics of warfare are so fraught, and the decisions so difficult, that to artificially box yourself in and refuse to use sets of technology that could lead to better results is an abdication of responsibility.
“There’s no moral high ground in saying, ‘I refuse to use AI, because I don’t want mines to be able to tell the difference between a school bus full of children and Russian armour’.
“There’s a thousand problems like this. The right way to look at this is problem by problem. Is this ethical? Are people taking responsibility for this use of force?
“It’s not to write off an entire category of technology, and in doing so, tie our hands behind our backs and hope we can still win. I can’t abide by that.
“Usually non-technical people will say things like, ‘Why not just make it all remote control?’ And they don’t recognise the scale of these conflicts we’re talking about. They don’t lend themselves to a one-to-one ratio of people to systems.
“To say nothing of the fact that if you’re a remotely piloted system, all you have to do is break the remote part and everything falls apart. There’s no moral high ground either in saying all you have to do is figure out how to jam us and you win.
“There is another point. It’s usually not one that I make on a stage, but I’ll get confronted by journalists who say, ‘Oh well, we shouldn’t open Pandora’s box.’
“And my point to them is Pandora’s box was opened a long time ago with anti-radiation missiles that seek out surface-to-air missile launchers. We’ve been using them since the pre-Vietnam (War) era.
“Our destroyers’ Aegis systems are capable of locking on and firing on targets totally autonomously. Almost all of our ships are protected by close-in weapon systems that shoot down incoming mortars, missiles and drones.
“I mean, we’ve been in this world of systems that act out our will autonomously for decades. And so the point I would make to people is you’re not asking to not open Pandora’s box. You’re asking to shove it back in and close it again, and the whole point of the allegory is that such cannot be done.”