Algorithms won’t solve the ADF’s recruitment crisis
By Huon Curtis
Using automated systems, such as those enabled by artificial intelligence, to assist with recruitment can create efficiencies, but the increasingly complex interplay of machines, ownership and control of data, contract performance and human factors makes the effects difficult to predict. While solutions may be found by integrating hiring technology with deep knowledge of organisational needs and culture, and by adapting skills standards, recruitment is only one part of the workforce crisis facing Australia’s defence organisation.
These issues should be front of mind in the transition of the Australian Defence Force recruiting contract from Manpower to Adecco, which comes amid recruitment shortfalls. Issues with the Manpower contract have not been canvassed in public, but there are, without doubt, commonalities with contracting in other countries, such as the UK, which has also faced myriad recruitment challenges.
Recruitment methods, of course, rely increasingly on data, analytics and algorithms, and globally there are efforts to push legacy personnel systems into the information age.
The UK Ministry of Defence has released an AI strategy, one pillar of which is transforming the armed forces into an ‘AI ready’ organisation by addressing human talent.
The UK strategy comes after a long period of underperformance in recruiting. A parliamentary committee inquiry criticised both the British Army and its contractor, Capita. The army achieved only 60% of its recruitment targets in 2018–19 and struggled to recruit for future skills and capabilities, such as digital technology and cybersecurity.
The inquiry found that Capita had underestimated the complexity of defence recruiting and the specialist expertise needed. It concluded that Capita’s off-the-shelf commercial solution was not fit for purpose, and noted that a bespoke application took longer to develop than anticipated.
An overly prescriptive contract containing 10,000 specifications negatively affected innovation in the recruitment process. The recruitment system was not owned by the British Army and was hosted on the contractor’s servers.
Problems clearly remain. The system went offline for six weeks this year after a possible hack. The information of 120 recruits was found on the dark web.
The UK issues will no doubt resonate with those familiar with Australia’s problems.
The ADF and the Department of Defence are both below their budgeted personnel numbers. Deputy Prime Minister and Defence Minister Richard Marles said last month that the recruitment issue must be dealt with urgently. He said the ADF was almost 3,000 people below its allocated force strength and the Department of Defence was more than 1,000 people below its budgeted size.
The contract between the ADF and the new contractor, Adecco, will likely reflect the revised defence intellectual property strategy that outlined a new regime for collaboration with industry.
The ownership of data and online systems by contractors creates dependencies that may limit the scope for employers to gain insights from that data and to innovate. The experience of the British Army is instructive. Despite multiple years of missed targets, the British Army didn’t amend the contract with Capita to secure the intellectual property for the online recruitment system that it had co-developed and co-funded.
In recruitment systems, there are good reasons to quarantine candidates’ files—which may contain the results of aptitude tests and other personal information—from other defence systems. But air-gapped data creates other points of vulnerability—the ADF confirmed in October that hackers had attacked an external service provider used by military personnel and public servants.
During Australia’s first Covid-19 shutdown, defence recruitment applications surged, rising 42% in 2020 according to the ABC. But converting interest into boots on the ground has proved difficult. Digitising the recruitment process also seems to have driven an increase in applications from ‘unsuitable’ candidates.
The ADF’s three services have each apparently insisted on tailoring standards—including medical, physical and psychological assessments—to job roles and on speeding up the application process. They essentially want more control over who is deemed ‘suitable’.
This potential change to suitability criteria will undoubtedly attract internal and external criticism about supposedly declining standards. The UK inquiry found that the British Army wasn’t lowering its entry standards but widening its entry criteria. The ADF has already adapted its recruitment process to assess the physical differences between men and women more fairly, while also relaxing its grooming standards to reflect changing social norms. And as different defence requirements emerge—cyber is a classic example—defence has had to adjust.
Addressing retention is also a key part of solving the workforce problem, as ASPI’s Marcus Hellyer argued recently. While some elements of the defence workforce model will always remain fixed, a cookie-cutter model of the ‘right’ sailor, soldier or aviator is no longer organisationally beneficial.
Data might help Defence determine why candidates fail at different points, and equally where management is failing. But those insights will be limited if only partial details are shared between the contractor and Defence, and if senior management isn’t monitoring performance, as was the case in the UK.
Equally, recruitment methods need to be adapted to the needs of different groups. Evidence from those involved in recruiting Indigenous candidates, for example, suggests that advancement often occurs where candidates have strong peer support. Indigenous candidates haven’t historically been recruited at high rates, so machine-learning systems trained on historical data could reinforce the same exclusions. The same is true for other groups.
In various contexts, AI or advanced analytics are being proposed to deliver a range of benefits, including improving organisations’ ability to conduct background checks, assess candidates’ values, beliefs and attitudes, and predict behavioural patterns relevant to job fit and performance. But these systems can lead to perverse outcomes and legal risks, and in some instances can look like 19th-century-style pseudo-science. Efforts to identify ‘trustworthiness’ and ‘criminality’ in people’s facial structures are modern-day phrenology, no matter how many data points are processed or how sophisticated the algorithm.