ChatGPT and Legal Writing, by Ian Wilson LL.B.
There are cases, in Australia (as below) and in the United States, including a New York case, where lawyers have taken the easy AI road and used ChatGPT to generate briefs and court documents. The problem is that ChatGPT produced false references: ones it simply invented.
I was fascinated by how this happens and decided to ask ChatGPT itself. Chat old boy, I wrote, that false legal reference stuff, what went wrong? Were you having a bad day, or hungover from late-night AI drinking binges with AI buddies? It had no sense of humour in reply. It said words to the effect: I am not an old boy, being a computer program. I recall the case. What happens is that I am not connected to legal databases, so I may generate false references to complete the task. I suggested that it be connected to such databases. It replied that this would be an added expense, one that may not be met.
So the easiest thing to do is to use the standard legal databases, which are usually reliable, and check references. Lay people are increasingly having to defend themselves in tribunals, such as employment tribunals, and they should avoid the temptation of using ChatGPT for shortcuts, keeping it only for first drafts. Everything must be checked.
"An Australian lawyer has been referred to a state legal complaints commission, after it was discovered he had used ChatGPT to write court filings in an immigration case and the artificial intelligence platform generated case citations that did not exist.
In a ruling by the federal circuit and family court on Friday, Justice Rania Skaros referred the lawyer, who had his name redacted from the ruling, to the Office of the NSW Legal Services Commissioner (OLSC) for consideration.
The court heard that, in an appeal of an administrative appeals tribunal ruling, the lawyer filed an amended application to the federal circuit and family court in October 2024, as well as an outline of submissions. Skaros said "both documents contained citations to cases and alleged quotes from the tribunal's decision which were nonexistent".
On 19 November, the lawyer wrote to the court stating the errors were unintentional, and that he deeply regretted them. At a hearing on 25 November, the lawyer admitted to using ChatGPT to write the documents.
"The [lawyer] stated that he had used AI to identify Australian cases, but it provided him with nonexistent case law," Skaros said. "The court expressed its concern about the [lawyer]'s conduct and his failure to check the accuracy of what had been filed with the court, noting that a considerable amount of time had been spent by the court and my associates checking the citations and attempting to find the purported authorities."
In an affidavit provided to the court, the lawyer said that due to time constraints and health issues, he decided to use AI.
"He accessed the site known as ChatGPT, inserted some words and the site prepared a summary of cases for him," the judgment said. "He said the summary read well, so he incorporated the authorities and references into his submissions without checking the details."
The lawyer was said to be deeply embarrassed about the incident and has taken steps to improve his knowledge of AI.
Counsel for the immigration minister argued the lawyer had failed to exercise adequate care, and given the public interest in the misuse of AI in legal proceedings, it was in the public interest for misuses of AI cases to be referred to the OLSC.
"It was submitted [by the minister] that such conduct would continue to occur and must be 'nipped in the bud'."
Skaros said the use of generative AI in legal proceedings is a live and evolving issue and it was in the public interest of the OLSC to be made aware of such conduct.
It is the second legal case in Australia where a lawyer has been referred to a regulatory body over using AI, after a Melbourne lawyer was referred to the Victorian legal complaints body last year after admitting to using AI in a family court case that generated false case citations.
In a practice note issued by the NSW supreme court late last year, which will come into effect on Monday, the court has put limits on the use of generative AI by NSW lawyers, including the stipulation that it must not be used to generate affidavits, witness statements, character references or other material tendered in evidence or used in cross-examination."