The courts warned both lawyers and non-lawyers that information produced by generative AI tools can be inaccurate
The Supreme Courts of Victoria and Queensland have published guidelines for the use of AI in court.
One requirement set out by the Victorian Supreme Court is that lawyers and parties who use AI in litigation should inform the court as well as the other parties.
“The use of AI programs by a party must not indirectly mislead another participant in the litigation process (including the Court) as to the nature of any work undertaken or the content produced by that program. Ordinarily parties and their practitioners should disclose to each other the assistance provided by AI programs to the legal task undertaken”, the court said on its website. “Where appropriate (for example, where it is necessary to enable a proper understanding of the provenance of a document or the weight that can be placed upon its contents), the use of AI should be disclosed to other parties and the court”.
Self-represented litigants and witnesses are also expected to advise the court and other parties if generative AI has been used to help draft documents. The court added that lawyers are responsible for ensuring that AI-generated information is up to date, accurate, complete, applicable to the jurisdiction and unbiased.
“Generative AI and Large Language Models create output that is not the product of reasoning. Nor are they a legal research tool. They use probability to predict a given sequence of words. Output is determined by the information provided to it and is not presumed to be correct. The use of commercial or freely available public programs, such as ChatGPT and Google Gemini, is more likely to produce results that are inaccurate for the purpose of current litigation”, the court wrote. “Generative AI does not relieve the responsible legal practitioner of the need to exercise judgment and professional skill in reviewing the final product to be provided to the Court”.
The court warned lawyers and parties to be especially careful when using generative AI to prepare affidavits and witness statements. Moreover, the court emphasised that AI “is not presently used for decision making nor used to develop or prepare reasons for decision because it does not engage in a reasoning process nor a process specific to the circumstance before the court”.
The Queensland Supreme Court’s guidelines, which are aimed specifically at non-lawyers, note that while generative AI tools can be useful for explaining laws and legal principles, they are “not a substitute for a qualified lawyer” and “have been known to provide inaccurate information on Australian law”.
“The currently available Generative AI chatbots have limited ‘training’ on Australian law and court procedure. Even when the training for Generative AI chatbots improves, there will be a limitation based on the currency of the data on which they have been trained”, the court indicated in the guidelines.
The court also warned generative AI users to refrain from providing “private, confidential, suppressed or legally privileged information” to a chatbot.
“Some Generative AI chatbots will remember every question that you ask them, as well as any other information you put into them. That information could then be repeated in response to queries from other users”, the court wrote. “As a result, anything you put into a Generative AI chatbot could become publicly known. This could result in you unintentionally breaching suppression orders, or accidentally disclosing your own or someone else’s private or confidential information”.