--- Begin Message ---

[ I posted this article by Jeremy Nadel on 27 Feb:
Centrelink trials AI to detect fraud, prioritise debts
Controversial move prompts calls for transparency.
Jeremy Nadel
Information Age
Feb 27 2025 09:37 AM
https://ia.acs.org.au/article/2025/centrelink-trials-ai-to-detect-fraud--prioritise-debts.html

[ That was picked up by a Parliamentarian for another hearing, and a follow-on article is below.

[ Nadel's got Services Aust on the treadmill. They're busily misleading Parlt by pretending that the uses of AI are good for clients, when it's clear that a primary motivation is, as always in the agency, fraud and waste prevention and detection.

[ They're providing assurances about there being a human in the loop. But of course those assurances are valueless, because there's no formal control mechanism in place - and no retribution against anyone if the assurances are breached. ]


Centrelink defends AI pilots
Sheds light on fraud detection, debt allocation and Copilot trials.
Jeremy Nadel
Mar 04 2025 01:09 PM
https://ia.acs.org.au/article/2025/centrelink-defends-ai-pilots.html

Services Australia has defended its trial of artificial intelligence (AI) models and provided further detail on their use in fraud detection and sorting through Centrelink’s “potential debts backlog”.

Late last Thursday, officials addressed questions about the models’ functions, training data, risks and safeguards after Senator Penny Allman-Payne tabled an Information Age report which was published just hours prior.

“I can see from this article that part of the issue here is there being a sense that we're doing stuff secretly and people don't know about it, etcetera, and that’s not our intention,” CEO David Hazlehurst said.

He added that the agency was still “a long, long way from being able to deploy” anything to production and that it would “talk more about our use of AI and how we use it appropriately, and what we use it for” in its AI transparency statement, which Services Australia published the following day.

Fraud detection
Deputy CEO for payments and integrity Chris Birrer said machine learning was piloted to assist in vetting disaster relief claims by predicting “where a claim might potentially be fraudulent…then staff look at it” in manual “pre-payment checks”.

Birrer cited only the example of models that detect identity theft when Allman-Payne asked what data the AIs were trained on.

“The archetypal example here would be indicators that the person whose name the claim is submitted in has not actually submitted it; that somebody else has — either by tricking that person or other forms of identity theft — taken their identity.”

UNSW Senior Lecturer Dr Scarlet Wilcock, who researched a discontinued model that Centrelink built in 2014 “to identify demographic characteristics associated with welfare claimants that had been overpaid”, said “there are genuine questions about Services Australia’s ability to deliver AI systems lawfully and ethically, especially when it comes to fraud.”


Senator Penny Allman-Payne tabled an Information Age news report on Centrelink's AI trials. Image: YouTube

Entitlements decisions
Hazlehurst said that there were “no current plans to use AI” in “customer facing activities; as in making decisions about entitlements”.

Minister for Government Services Katy Gallagher added that “there would be some level of government decision making” required “to move into that space.”

One of Services Australia’s safeguards for its more invasive, high-risk technology (such as its use of telecommunications metadata and password-cracking software) is to use it only when fraudulent claims trigger criminal investigations, not when investigating less serious forms of non-compliance such as failing to report minor changes to income.

The comments at estimates did not address whether there were plans to develop or trial AI in detecting non-compliance.

However, consistent with the whole-of-government standard, Services Australia’s transparency statement listed the domains that apply to its use of AI, including: “compliance and fraud detection: we use AI to detect hidden patterns in data and refer them to staff to investigate.”

“We have human oversight of AI in our compliance, auditing and decision-making processes,” it read.

Debt backlog reduction
One of the piloted AIs sorted through a backlog of potential debts and predicted a subset likely to be “finalised as no debt”, estimated to be around seven percent of debt determinations.

Birrer said that these cases were then allocated to more junior staff because they are among the easiest to resolve.

“So, if there’s new staff within the payments and integrity group it’s a good thing for them to do as they’re on their skills ladder.”

The system “also helps” allocate more complex potential debts, which may need to be recovered, to more experienced staff.

“If you’ve got a very large and complex potential debt that relates to something that happened historically, where there might be different policy settings and they need to do a retrospective calculation there, that’s a much more complex task,” Birrer said.

Welfare advocate Tom Studans said it was good that “final decisions…rest with a trained human professional”, but said Services Australia should explain on what basis the models are “identifying matters that are likely to result in no debt.”

“What characteristics are identified?” he asked.

“How are these matters analysed? What kind of machine learning software or products are being used?”

--
Roger Clarke                            mailto:[email protected]
T: +61 2 6288 6916   http://www.xamax.com.au  http://www.rogerclarke.com

Xamax Consultancy Pty Ltd 78 Sidaway St, Chapman ACT 2611 AUSTRALIA
Visiting Professorial Fellow                          UNSW Law & Justice
Visiting Professor in Computer Science    Australian National University

--- End Message ---
_______________________________________________
apf-media-archive mailing list
[email protected]
https://lists.privacy.org.au/mailman/listinfo/apf-media-archive