The Supreme Court of NSW recently released Practice Note SC Gen 23, outlining guidelines for using generative artificial intelligence (Gen AI) in legal proceedings. Intended to provide clarity and safeguard the integrity of the legal process, the note has ignited debate within the legal tech community.

The practice note, issued by Chief Justice Andrew Bell, highlights potential risks of Gen AI, such as “hallucinations,” data inaccuracies, and confidentiality breaches. It limits AI use in drafting affidavits, witness statements, and expert reports.

Professor Mimi Zou, Head of the School of Private & Commercial Law at UNSW Law & Justice, welcomes the practice note, acknowledging the need for clarity and guidance in this emerging area – effectively providing rules for how practitioners in NSW should be using or not using generative AI programs.

However, Zou acknowledges it has sparked debate in the legal tech world – a community she is well plugged into. Critics argue the restrictions are overly broad, stifle innovation, and fail to differentiate between general-purpose AI chatbots and specialised legal AI platforms.

The practice note takes a conservative stance compared to other jurisdictions, restricting the use of both open-source and proprietary AI programs across common litigation activities. This has drawn criticism from the technology sector, with legal tech providers expressing concerns that the blanket restrictions limit AI’s potential benefits in improving efficiency and access to justice.

“Legal tech companies have raised concerns about the practice note because, in its current form, it takes a more conservative approach to the use of generative AI programs, both open-source and closed-source large language models,” Zou says.

“I think law, in many ways, is still quite a conservative profession, and being risk averse, the approach taken by the Chief Justice is understandable.”

Samuel Junghenn, CEO of AI Legal Assistant, argues these restrictions are based on a misunderstanding of Gen AI’s capabilities and limitations.

“The practice note lumps together general-purpose AI chatbots like ChatGPT with purpose-built enterprise-level applications designed specifically for legal practice,” Junghenn states in a recent open letter to the Supreme Court on LinkedIn.

“This is akin to suggesting that a junior lawyer fresh out of law school performs the ‘same job’ as a specialist lawyer with 25 years of experience.”

In the 2,700-word letter, Junghenn criticises the lack of consultation with technology specialists in drafting the practice note. He argues that this omission has led to “factually inaccurate assumptions” and a failure to recognise the robust privacy measures employed by specialised legal AI platforms.

Furthermore, the letter contends that the practice note’s focus on risk undermines AI’s potential benefits in enhancing efficiency and accuracy in legal practice. “The problem is not the tool itself but rather a lack of education and the actions of bad practitioners within the industry,” Junghenn states.

Zou says that legal tech companies are “a bit anxious about how the practice note will restrict the use of Gen AI programs in the legal sector.”

While she understands the legal tech community’s concerns, Zou believes the practice note at least provides some ground rules. She suggests viewing the practice note as a work in progress, open to refinement as AI technology evolves.

“[The practice note] is not fixed, but at least it provides some guidance which has been lacking until now,” Zou says.

Zou says that while some vendors may feel the consultation with technology specialists in developing the practice note was insufficient, broader consultation is difficult because the legal profession remains divided over the use of some AI programs.

She refers to the potential exacerbation of the digital divide where larger law firms with greater resources can invest in sophisticated AI tools, giving them a significant advantage over smaller practices and self-represented litigants with limited resources.

The practice note does not appear to offer a solution to this issue. During the practice note’s briefing earlier this month, Chief Justice Bell said he was “conscious” of the digital divide but noted that it was not the court’s role to solve inequities in people’s access to generative AI programs.

“I don’t think we can regulate the use of that technology on an equity basis. I don’t think our remit extends that far legitimately,” Chief Justice Bell said.

If it is not the court’s role to address this inequity problem, whose role is it?

While advanced legal tech AI tools are currently focused on serving large law firms, Zou says there’s a real need for investment in AI solutions that cater to individuals and improve access to justice.

Zou’s years of leadership in the UK and Asia-Pacific legal tech communities have given her insight into the growing trend of AI legal tech tools for consumers and those seeking legal aid. In her view, Australia needs more of these initiatives. She emphasises the need for government-backed, industry-led programs and support, similar to LawtechUK (of which Zou is a panel member), to cultivate the sector in Australia.

Without a concerted effort to develop and deploy AI technology for broader use, the benefits of AI will remain largely confined to large law firms, which may further widen the access to justice gap.

The practice note, which comes into effect at the beginning of the 2025 law term, is a crucial first step, but it’s just the beginning of the conversation. Drawing on international experiences can help inform future iterations of responsible AI use.