ECMS funding success across intelligence and security research
Congratulations to the ECMS researchers who have been awarded Australian Research Council grants under the National Intelligence and Security Discovery Research Grants scheme.
The NISDRG scheme covers Security Challenges and Intelligence Challenges, with only 10 awards made nationally (5 per challenge area). The University of Adelaide is the only South Australian university to receive funding under the scheme.
Congratulations to our researchers for their fantastic achievement!
Professor Dusan Losic
School of Chemical Engineering and Advanced Materials
Development of Wearable Wireless Sensors for Chemical Warfare Agents (CWA) Detection – $597,791
The 21st century soldier needs to be equipped with a new generation of wearable sensors to stay informed of all threats, both visible and invisible, such as deadly chemical warfare agents (CWAs). Thanks to the exceptional properties of graphene and other 2D materials, together with advanced printing technologies, it is now possible to design sensing devices for the rapid detection of low concentrations of toxic volatile organic chemicals. The aim of this project is to develop new wearable chemo-resistive sensors based on graphene that are wireless, low-cost, and provide rapid, continuous information about deadly airborne CWAs. This project will give Australian Defence access to new cutting-edge technology and capabilities, ready for future 2040 Detection challenges.
Professor Debi Ashenden
School of Computer Science
Defending Machine Learning Operations (MLOps) across the Human-Machine Interface – $542,482 over 4 years
This project aims to address the socio-technical cybersecurity risks of operationalising machine learning models. It will generate new knowledge in computer security and human-computer interaction by using a transdisciplinary research approach that brings together social and behavioural science, computer science, and data science. The outputs from the research will be models of the behavioural risks to machine learning operations; a tool for facilitating experiments to manage the risks of human-machine teaming; and novel algorithms that can be used to both defend and attack machine learning operations. The benefits arising from the research will be increased trust in the operationalisation of machine learning models.