In this new age of AI-enabled warfare, Australia’s Defence Department needs to rethink the way it procures military equipment, says Professor Toby Walsh, Scientia Professor of Artificial Intelligence at the University of NSW’s School of Computer Science and Engineering. “The ADF typically puts out a tender, a specification, and throws a lot of money at it,” he says. “As opposed to the agile way you actually get AI technologies to work, which is to build prototypes and have a rapid cycle of iteration.”
Funding more of the work produced by nimble start-ups rather than the offerings of large defence procurement companies might be the answer, he adds.
“The platforms are changing from very sophisticated and very expensive in small numbers, to very small, very cheap large numbers of nevertheless smart platforms, whether they be aerial drones, underwater drones or surveillance sensors,” Walsh says.
The ADF is now coming around to the massive potential of AI-enabled small expendable pieces of equipment including autonomous unmanned drones and submersibles, along with nanosatellites and sensors, he adds. “I think we’re spending more on smaller, lighter, agile drones than we were.”
In September, the government announced it was investing $1.7bn over five years to acquire an unspecified number of extra-large, long-range autonomous submersibles called Ghost Sharks.
“They’re still quite an expensive piece of kit,” Walsh says. “They’re not as disposable as the aerial drones that you see. But it’s much more realistic than the manned AUKUS nuclear submarines that we will probably never receive.”
Last year the government said it would spend more than $10bn on drones in the next decade, including at least $4.3bn on uncrewed aerial systems. The AI-piloted MQ-28A Ghost Bat unmanned aerial vehicle or drone has been developed by Boeing Australia in partnership with the RAAF to fly alongside crewed aircraft for reconnaissance and electronic warfare, and potentially to carry weapons. In June, a single operator on board an airborne E-7A Wedgetail controlled two Ghost Bat aircraft in a mission against an airborne target.
In other fields, the ADF has deployed AI-enabled autonomous robots – mannequin-type humanoids bearing weapons, dressed in camouflage and mounted on mobile platforms. Produced by Marathon Targets, these robots can move autonomously and communicate with one another, providing realistic moving targets for live-fire training. AI can massively reduce the human effort required for offensive and defensive military manoeuvres, Walsh says. “You can do things at scale that previously would require lots and lots of humans. You can do things at scale that previously were actually impossible. You can do it at response times that humans don’t have, and you can do it potentially without any humans as well.”
He is increasingly concerned about the moral, legal and security challenges of handing over more and more decision-making to machines. As one of the 18 experts appointed to the Global Commission on Responsible Artificial Intelligence in the Military Domain, he says humankind has never before embraced a technology in which life-or-death decisions can be handed over to machines with so little oversight.
AI is not like human intelligence, he adds, noting it is impossible to provide guarantees about the performance of AI systems.
In September, Foreign Minister Penny Wong addressed the UN Security Council on the dangers of AI in war. “AI’s potential use in nuclear weapons and unmanned systems challenges the future of humanity,” she said. “Nuclear warfare has so far been constrained by human judgment, by leaders who bear responsibility and by human conscience. AI has no such concern, nor can it be held accountable.”
Professor Sarath Kodagoda, director of the Robotics Institute at the University of Technology Sydney, says unaccountable and potentially lethal use of AI is a major concern for anyone seriously considering the technology’s potential.
“We don’t want AI to decide everything and do it by itself,” he adds. “I think there should be some sort of human intervention to take control, if it is required, and influence the decisions.”
Nevertheless, AI enables important tools for the ADF, Kodagoda says.
“One major area where robotics and AI are contributing is surveillance, either using satellite images, radars or drones or multiples of them,” he says. “We call it data fusion or sensor fusion.”
That surveillance data is then fed through AI processing, he adds, to identify and track targets and to recognise certain scenarios, such as gatherings or harmful activities.
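The data fusion Kodagoda describes can be illustrated with a toy sketch. This is not any real ADF or UTS system: the sensors, readings and inverse-variance weighting scheme below are all assumptions chosen to show the basic idea of combining several noisy estimates of the same quantity into one more precise estimate.

```python
# Toy illustration of sensor (data) fusion: combining noisy estimates of a
# target's range from several sources. Sensor names and numbers are invented.

def fuse_estimates(readings):
    """Fuse (value, variance) pairs into a single (value, variance) estimate.

    Each reading is weighted by the inverse of its variance, so more
    precise sensors contribute more to the fused result.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    variance = 1.0 / total  # fused estimate is tighter than any single input
    return value, variance

# Three hypothetical sensors report a target's range (km) with differing noise.
readings = [
    (10.2, 0.5),   # satellite image estimate, high uncertainty
    (10.0, 0.1),   # radar estimate, low uncertainty
    (10.4, 0.25),  # drone camera estimate, medium uncertainty
]
fused, var = fuse_estimates(readings)
```

The fused range here lands close to the radar's reading, since it is the most precise sensor, and the fused variance is smaller than any individual sensor's. Real fusion systems track moving targets over time (for example with Kalman filters) rather than fusing one static snapshot.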
Twenty years ago, Kodagoda points out, offensive and defensive military actions used air, sea and ground equipment, with many personnel running separate operations. He believes those three domains are now increasingly intertwined and, with AI, require less and less human oversight.
AI tools can give ADF supervisors a picture of the battlefield, Kodagoda says, with information about the movement and objectives of equipment such as drones and land vehicles, while also tracking the enemy’s moves.
“The defence industry can develop very realistic simulators,” he adds. “They can plan scenarios and test them even before going into a particular scenario.”
ADF supervisors can use AI to input the type of battlefield, the enemy and its strengths, and then ask for ways to defeat the enemy or at least hold the enemy back. “You can run that simulator on those plans and see where the bottlenecks are, what the problems are beforehand, and then your logistics can be arranged based on the outcomes,” Kodagoda says.
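The bottleneck-hunting Kodagoda describes can be sketched in miniature. The toy model below is an assumption for illustration only, not any real defence simulator: it sweeps a single plan parameter (the number of resupply trucks supporting a convoy) and shows how repeated simulation reveals the point beyond which adding resources no longer helps.

```python
# Toy sketch of sweeping a simulator over plans to expose a logistics
# bottleneck. The model and all numbers are invented for illustration.
import random

def simulate_advance(trucks, days=10, seed=0):
    """Return distance advanced (km) in a crude supply-limited model."""
    rng = random.Random(seed)   # fixed seed keeps runs comparable
    fuel = 0.0
    distance = 0.0
    for _ in range(days):
        fuel += trucks * rng.uniform(0.8, 1.2) * 100  # litres delivered today
        burn = min(fuel, 500)                         # daily consumption cap
        fuel -= burn
        distance += burn / 5                          # 5 litres per km
    return distance

# Sweep plans with 1..8 trucks: past the bottleneck, extra trucks
# stop buying extra distance because consumption is the binding limit.
results = {t: simulate_advance(t) for t in range(1, 9)}
```

Plotting `results` would show distance climbing with each extra truck and then flattening once fuel delivery outpaces daily consumption: that plateau is the bottleneck moving from logistics to the vehicles themselves, which is exactly the kind of insight a planner wants before committing real assets.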
The ADF should stay abreast of all AI developments and use them fully but carefully, he says. “We have to be on top of it, because the way the world is shaping, the countries which have invested earlier have a much greater advantage.”