Australian lawyer caught using ChatGPT filed court documents referencing ‘non-existent’ cases

News Mania Desk / Piyal Chatterjee / 1st February 2025

The immigration minister says this behaviour must be "nipped in the bud", and the lawyer has been referred to the Office of the NSW Legal Services Commissioner for consideration. The Australian lawyer was reported to the state legal complaints body after it emerged that he had used ChatGPT to draft court documents in an immigration case, with the AI platform generating case citations that did not exist.

In a decision handed down by the federal circuit and family court on Friday, Justice Rania Skaros referred the lawyer, whose name has been redacted from the ruling, to the Office of the NSW Legal Services Commissioner (OLSC) for consideration.

The court heard an appeal against a decision of the administrative appeals tribunal, in which the lawyer filed an amended application and an outline of submissions with the federal circuit and family court in October 2024. Skaros stated that "the two documents included references to cases and supposed quotes from the tribunal's ruling that were not real."

On 19 November, the lawyer wrote to the court stating the errors were unintentional, and that he deeply regretted them. At a hearing on 25 November, the lawyer admitted to using ChatGPT to write the documents.

“The [lawyer] stated that he had used AI to identify Australian cases, but it provided him with nonexistent case law,” Skaros said. “The court expressed its concern about the [lawyer]’s conduct and his failure to check the accuracy of what had been filed with the court, noting that a considerable amount of time had been spent by the court and my associates checking the citations and attempting to find the purported authorities.”

In an affidavit provided to the court, the lawyer said that due to time constraints and health issues, he decided to use AI.

“He accessed the site known as ChatGPT, inserted some words and the site prepared a summary of cases for him,” the judgment said. “He said the summary read well, so he incorporated the authorities and references into his submissions without checking the details.”

The lawyer was reportedly deeply embarrassed by the incident and has since taken steps to improve his knowledge of AI.

Counsel for the immigration minister contended that the lawyer had failed to exercise adequate care, and that given public concern about the misuse of AI in legal proceedings, it was in the public interest for instances of AI misuse to be referred to the OLSC. “It was argued [by the minister] that this behaviour would persist and should be ‘stopped early’.”

Skaros said the use of generative AI in legal proceedings is a live and evolving issue, and that it was in the public interest for the OLSC to be made aware of such conduct.

This is the second case in Australia of a lawyer being referred to a regulatory authority over the use of AI, after a Melbourne lawyer was referred to the Victorian legal complaints body last year for admitting to using AI in a family court case that generated inaccurate case citations.

In a practice note issued late last year and taking effect on Monday, the NSW Supreme Court placed restrictions on the use of generative AI by NSW lawyers, stipulating that it must not be used to generate affidavits, witness statements, character references or other material tendered in evidence or used in cross-examination.
