Deloitte Suffers From AI Hallucinations in Australia

Thom Weidlich 10.16.25


So-called hallucinations caused by artificial intelligence are such a big topic of conversation you would think people would be on high alert to avoid crises caused by them. Apparently, you would be wrong. Our latest episode took place in Australia, brought to us by Big Four accounting and advisory firm Deloitte.

Earlier this month, the company admitted that a 237-page report it put together for the Australian government contained errors, which included citations to non-existent academic works and the misquotation of a judge (in other words, hallucinations). Deloitte agreed to refund the last payment it was owed for the report, whose total cost was A$440,000 (US$286,000).

The report, which examined automated penalties in Australia’s welfare system, appeared in July on the website of the Department of Employment and Workplace Relations. A revised version without the errors — but with the same substance, the department said — was published on Oct. 3, according to the Associated Press.

‘Fabricated References’

What gave rise to the revision was work by Chris Rudge, a Sydney University researcher of health and welfare law, who, according to the AP, “alerted the media that the report was ‘full of fabricated references.’” Rudge said he found about 20 errors. The first one to jump out at him was the contention that a law professor had written a nonexistent book outside her field of expertise.

The department said in an Oct. 7 statement that Deloitte “confirmed some footnotes and references were incorrect,” according to the AP. Deloitte wouldn’t say if the mistakes were due to AI, but the updated version of the report included a disclosure that Microsoft’s Azure OpenAI was used in writing it, the AP reported. “The matter has been resolved directly with the client” was all Deloitte would say.

The department said the amount Deloitte is to pay back will be made public once it is reimbursed. One politician said Deloitte should refund the full cost. The report included “the kinds of things that a first-year university student would be in deep trouble for,” said Senator Barbara Pocock.

PwC Scandal

As Bloomberg Opinion columnist Catherine Thorbecke pointed out, Deloitte should have been sensitive to the potential embarrassment (i.e., the crisis) because the incident occurred “at a time when Australians’ trust in government use of private consulting firms was already fraught.” Thorbecke was referring to the 2023 PwC tax-advisory scandal, in which a partner at the Deloitte competitor, while advising the government on new tax rules, leaked drafts to colleagues and clients.

The moral of the story is that, while AI has its uses, what it spits out is, frankly, not to be trusted — at least until it’s verified. Crises arising from AI hallucinations must be added to everyone’s list of potential crisis scenarios.

Photo Credit: Karolis Kavolelis/Shutterstock

Sign up for our free weekly newsletter on crisis communications. Each week we highlight a crisis story in the news or a survey or study with an eye toward the type of best practices and strategies you can put to work each day.