Working to create a future-ready, future-proofed ADF

Scientists from Defence Science and Technology Group and the end users across the Australian Defence Force are collaborating closely to build trust in autonomous systems, a critical step towards the widespread introduction of the technology across the branches of the ADF.

Autonomy gives a robot the ability to become a teammate, working with human operators and other robotic systems. Achieving such a capability would be incredibly beneficial, but its realisation remains elusive.

While Army released its Robotic and Autonomous Systems Strategy in 2018, the Royal Australian Air Force has sought to accelerate the use of artificial intelligence (AI) and autonomy through Plan Jericho and the Navy is pursuing multiple acquisition programs that rely on autonomy.

Defence scientist Robert Hunjet said the promise of autonomous systems has been discussed for decades. "Teleoperation – the concept of a remote-controlled vehicle, drone or tank – is not representative of autonomy, as the vehicle has no ability to make its own decisions or task itself. With recent advances in drone technology, the concept of swarming has attracted a lot of interest."

To make these machines genuinely intelligent, Defence is undertaking research in areas including contextual awareness, active perception, path planning, multi-agent system control and swarming. Critically, improving a robot’s ability to work intelligently requires more than investment in machine learning; it is equally about investment in the critical enabling systems that allow autonomous platforms to work together.

Dr Hunjet added, "We observe swarming in nature, for example in the way birds flock and fish school. The individuals interact only with others in close proximity and the cascading effect of these local interactions provides a global behaviour through the swarm as a whole.

"Within robotics, we can emulate the creation of global behaviours in a similar fashion through local interactions with neighbouring systems, offering a potential scalable approach to generate mass with minimal computational complexity or communications overheads."

In order to be considered robust, autonomous systems must be able to operate in difficult or contested situations. Algorithms must be stable in the face of unexpected system inputs. Defence is also investigating approaches that would allow robotic systems to share their position and orientation information with others that would then fuse these estimates with their own data, enabling enhanced positioning accuracy.
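A common, generic way to fuse a shared position estimate with one's own is inverse-covariance weighting, sketched below. This assumes the two estimates have independent errors and is an illustration of the principle rather than the specific fusion approach Defence is investigating.

import numpy as np

def fuse_estimates(x_a, cov_a, x_b, cov_b):
    """Fuse two independent position estimates (mean, covariance) by
    inverse-covariance weighting; the fused covariance is smaller than
    either input, i.e. positioning accuracy improves."""
    info_a, info_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov_f = np.linalg.inv(info_a + info_b)
    x_f = cov_f @ (info_a @ x_a + info_b @ x_b)
    return x_f, cov_f

# Own (uncertain) estimate fused with a neighbour's shared, more confident estimate
own = (np.array([10.0, 4.0]), np.diag([4.0, 4.0]))
neighbour = (np.array([11.0, 3.5]), np.diag([1.0, 1.0]))
fused_pos, fused_cov = fuse_estimates(*own, *neighbour)

When the correlation between the two estimates is unknown, as is often the case when robots repeatedly exchange information, more conservative techniques such as covariance intersection are typically used instead.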

Research is aiming to address how AI might be able to explain its decisions to a human operator in a manner that takes into account the operator’s state. That is, the machine would seek to provide an appropriate level of detail based on its understanding of the operator’s current cognitive load.
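A highly simplified sketch of that idea follows: the level of explanation detail is selected from an estimate of the operator's cognitive load. The thresholds, field names and load scale are hypothetical, chosen only to make the concept concrete.

def explain_decision(decision, evidence, operator_load):
    """Tailor explanation detail to an estimated operator cognitive load
    in [0, 1]: a busy operator gets a one-line summary, an idle operator
    the supporting evidence as well (illustrative thresholds only)."""
    summary = f"Recommended action: {decision['action']} (confidence {decision['confidence']:.0%})"
    if operator_load > 0.7:          # high load: headline only
        return summary
    if operator_load > 0.3:          # moderate load: add the top reason
        return f"{summary}. Main factor: {evidence[0]}"
    # low load: full rationale
    return summary + ". Factors: " + "; ".join(evidence)

print(explain_decision({"action": "reroute", "confidence": 0.87},
                       ["obstacle detected ahead", "fuel margin adequate"],
                       operator_load=0.5))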

Trust is also gained through observation of repeated good performance. To ensure that its technology, and the platforms currently under acquisition or development, work effectively and as expected, Defence is conducting research into verifiable autonomy.

Dr Hunjet added, "Interaction between entities no doubt plays a large part in human trust. As such, the interface between a human operator and a machine should be designed to assist the human and reduce cognitive load."

The concept of verification from the perspective of test and evaluation is also something to consider. Because many AI-based systems are specifically designed to learn and evolve, they do not necessarily behave in the same manner when presented with identical inputs, such as sensor information. In such systems, traditional regression-based approaches to testing are not appropriate.
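One alternative is a statistical acceptance test: rather than asserting identical outputs run after run, the system is exercised repeatedly over a set of scenarios and its success rate is compared against a threshold. The sketch below is a generic illustration; the planner, scenarios and thresholds are hypothetical placeholders.

import random, statistics

def acceptance_test(system, scenarios, runs=30, required_rate=0.95):
    """Score a stochastic system over repeated runs of each scenario and
    compare the mean success rate against a competency threshold, instead
    of asserting bit-identical outputs as a regression test would."""
    rates = []
    for scenario in scenarios:
        successes = sum(bool(system(scenario)) for _ in range(runs))
        rates.append(successes / runs)
    return statistics.mean(rates) >= required_rate, rates

# Hypothetical stand-in for a learning-based planner that occasionally fails
def demo_planner(scenario):
    return random.random() < 0.97

passed, per_scenario = acceptance_test(demo_planner, ["open field", "urban canyon"])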

Future testing processes may need to be more akin to the issuance of a driver’s licence, where a curriculum is designed and competency assessed, allowing for future improvement while performing a task. This concept is known as machine education.
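A licence-style assessment might look something like the following sketch, in which the agent must demonstrate a required pass rate on each task in an ordered curriculum before being certified for the next. The tasks, thresholds and agent here are hypothetical, included only to illustrate the curriculum idea.

import random

def assess_curriculum(agent, curriculum, trials=20, pass_rate=0.9):
    """Licence-style assessment: competency must be demonstrated on each
    task in an ordered curriculum before the agent progresses."""
    for name, task in curriculum:
        score = sum(bool(task(agent)) for _ in range(trials)) / trials
        if score < pass_rate:
            return False, f"failed '{name}' (pass rate {score:.0%})"
    return True, "all competencies demonstrated"

dummy_agent = object()
curriculum = [
    ("waypoint following", lambda agent: random.random() < 0.98),
    ("obstacle avoidance", lambda agent: random.random() < 0.95),
]
certified, detail = assess_curriculum(dummy_agent, curriculum)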

Collaboration is at the heart of Defence’s pursuit of autonomy for future robotic platforms. Defence funds collaboration with Australian academic institutions and international partner organisations through its Trusted Autonomous Systems strategic research initiative.
