
How Originality.ai Could Have Prevented the $440k AI Mistake in the Deloitte Report Scandal

Deloitte is set to refund the Australian government a portion of the $440k (Australian Dollars) it was paid for a report that contained apparent AI errors. It’s a clear example of why checking content quality isn’t optional in 2025; it’s a must.

Deloitte is set to partially refund the Australian government for a report that cost $440,000 (Australian Dollars). 

This news was covered by major media outlets, including ABC News, Business Insider, and The Register, and comes in response to the discovery of apparent AI hallucinations in a report prepared by Deloitte in July 2025 for the Department of Employment and Workplace Relations.

It’s not the first time that AI hallucinations have made the headlines; earlier this year, several US newspapers published a summer book list with AI hallucinations — the AI book list scandal.

AI hallucinations are a real problem that impacts authenticity, reputation, and brand image… but could this have been prevented in the first place?

Let’s take a closer look.

Key Takeaways (TL;DR):

  • Deloitte is set to partially refund the Australian government for a report that cost $440,000 (Australian Dollars). 
  • The 237-page report was first published on the Department of Employment and Workplace Relations website in July 2025, cost $440k, and took 7 months to prepare.
  • Dr. Christopher Rudge (Sydney University) spotted up to 20 errors in the report, which he suspected were AI hallucinations.
  • The report’s errors (apparent AI hallucinations) included a misquoted/fabricated quote from a court case and a citation for a fake book attributed to a real university professor.
  • An updated version of the report was released, dated September 26, 2025, that acknowledged Azure OpenAI GPT-4o was used to help prepare it.

With AI slop making its way into reports published by governments, using tools like Originality.ai to identify AI text that requires additional review and fact-checking is a must.

The Deloitte Report — What It Is and Where It Went Wrong

According to Business Insider, Deloitte’s report was a lengthy project taking seven months to complete.

The Deloitte report:

  • Was first published in July 2025 on the Department of Employment and Workplace Relations website.
  • Reviewed the Targeted Compliance Framework (TCF), which is incorporated into the IT system that provides welfare and benefits payments.
  • Ran 237 pages in full.

So, where did it go wrong?

  • AI use was not disclosed in the report’s initial publication.
  • The initial report contained errors (apparent AI hallucinations).
    • Back in July, when the report was first published, Dr Christopher Rudge (Sydney University) identified up to 20 “errors” that he suspected were AI hallucinations, as reported by ABC News.
  • The errors included a misquoted/fabricated quote from a court case and a citation for a fake book attributed to a real university professor (Lisa Burton Crawford, Sydney University).

What happened next

Following Deloitte’s investigation into the errors:

  • Deloitte acknowledged that there were, in fact, errors.
    • Deloitte "confirmed some footnotes and references were incorrect” on the review…” - Business Insider.
  • Deloitte will provide the Australian government with a partial refund for the report, which cost the government $440,000 Australian dollars.
  • An updated version of the report, dated September 26, 2025, was released on the Department of Employment and Workplace Relations (DEWR) website.

Not only did Deloitte update the report to correct the “errors”...

Image Source: Targeted Compliance Framework, Final Report, page 2

…but they also included a section within the report’s methodology noting that AI (specifically Azure OpenAI GPT-4o) was used in its preparation.

Image Source: Targeted Compliance Framework, Final Report, page 147

Using Originality.ai AI Detection + Fact Checking Could Have Prevented It

Our industry-leading Originality.ai AI Detector accurately identifies AI-generated content from leading LLMs (learn more in our AI Accuracy Study).

Scanning the report with the Originality.ai AI Checker could have helped reviewers quickly identify sections of AI-generated text that required additional quality checks to ensure accuracy and authenticity were maintained.

For example, an excerpt from the updated report’s Executive Summary (page 6) is flagged as Likely AI with 100% Confidence by the Originality.ai AI Detector.
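To make that scanning step concrete, here is a minimal sketch of how a review team could batch-check report sections before publication using the Originality.ai API. The endpoint URL, header name, and response fields below are assumptions drawn from the public API documentation rather than anything confirmed in this article, so treat it as an illustration, not a drop-in integration.

```python
# Minimal sketch: flag report sections that score as likely AI-generated so they
# can be routed for extra fact-checking before publication.
# Assumptions (not confirmed in this article): the endpoint URL, the X-OAI-API-KEY
# header, and a response shaped like {"score": {"ai": 0.98, "original": 0.02}, ...}.
import requests

API_KEY = "YOUR_ORIGINALITY_AI_API_KEY"
SCAN_URL = "https://api.originality.ai/api/v1/scan/ai"  # assumed endpoint

def ai_score(text: str) -> float:
    """Return the detector's AI-likelihood score (0 to 1) for one block of text."""
    response = requests.post(
        SCAN_URL,
        headers={"X-OAI-API-KEY": API_KEY, "Content-Type": "application/json"},
        json={"content": text},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["score"]["ai"]

# Hypothetical report sections; in practice these would be extracted from the document.
sections = {
    "Executive Summary": "...",
    "Methodology": "...",
}

REVIEW_THRESHOLD = 0.5  # anything at or above this gets a human fact-check
for name, text in sections.items():
    if ai_score(text) >= REVIEW_THRESHOLD:
        print(f"{name}: likely AI-generated, route for review and fact-checking")
```

A high score here isn’t a verdict on its own; it simply tells reviewers which sections deserve a closer manual read and source verification.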


The initial report, which contained the AI hallucinations/errors, has since been updated, so unfortunately we can’t run the original, inaccurate version through our fact checker.

However, if you’d like to see our fact checker in action, read our article on the AI Book List Scandal, where the fact-checking software clearly flags numerous AI hallucinations. You can also read more about Originality.ai’s fact-checking accuracy in our Fact-Checking Study.

If fact-checking software had been used, it could have supported:

  • The Deloitte team that prepared the report by catching the errors before submission.
  • The Department of Employment and Workplace Relations in reviewing the submitted study and verifying what information was correct — and what wasn’t.

Accuracy Matters in 2025

AI slop is polluting the internet with low-quality content that can contain AI hallucinations and rapidly spread misinformation.

In this case, that meant a report that a government was supposed to be able to rely on contained inaccuracies that could have caused serious problems.

As Dr Christopher Rudge, who spotted the errors, stated in ABC’s coverage, “That’s about misstating the law to the Australian government in a report that they rely on. So I thought it was important to stand up for diligence.”

A quick check could have helped both Deloitte and the Australian government identify AI-generated content and catch the errors. 

Skipping that step came with a very real and high price tag.

Those errors are now costing Deloitte a partial refund of the $440,000 fee.

In 2025, content quality checks aren’t optional. They are essential.

If you are a journalist, reporter, reviewer, or editor and want to avoid scenarios where credibility and integrity are jeopardized, start using Originality.ai today. 

Interested in reading more about AI? Check out our AI Studies.

Madeleine Lambert


Madeleine Lambert is the Director of Marketing and Sales at Originality.ai, with over a decade of experience in SEO and content creation. She previously owned and operated a successful content marketing agency, which she scaled and exited. Madeleine specializes in digital PR—contact her for media inquiries and story collaborations.
