
Gen AI will likely become more prevalent in legal practice, but if principals and supervisors fail to adopt appropriate processes to monitor its use, then solicitors and legal practices could be at risk of breaching their ethical and regulatory obligations and being subject to disciplinary action.

The case of Murray on behalf of the Wamba Wemba Native Title Claim Group v State of Victoria[1] (Wamba Wemba) is one of the latest in an expanding number of cases dealing with the drafting and filing of legal documents containing ‘hallucinated’ material. The notable difference in Wamba Wemba is that the solicitor who drafted the offending documents held a practising certificate subject to the statutory condition of supervised legal practice (also known as ‘Condition 2’), and her supervisor failed to review those documents before they were filed.

This case serves not only as another example of the growing use of generative AI (Gen AI) in legal practice but, more importantly, as a warning that the use of Gen AI must be supervised.

Case summary

In Wamba Wemba, the substantive proceedings were a native title determination application brought on behalf of the Wamba Wemba native title claim group. During these proceedings, the applicant’s legal representative, Massar Briggs Law, filed two court documents containing multiple footnotes referencing anthropological and historical reports. When the matter reached case management, questions were raised about these footnotes, and the Court tasked First Nations Legal and Research Services (FNLRS) with producing the footnoted documents. FNLRS found that most of the documents either did not exist or existed but were incorrectly cited. FNLRS concluded that the citations were “hallucinated”, that is, produced by Gen AI in a way that looks accurate and reliable but is not based in fact.

Two solicitors from Massar Briggs Law gave evidence about how these documents were produced. One was a junior solicitor whose practising certificate was subject to the condition of supervised legal practice (the supervised solicitor). She deposed that she had drafted the documents while working remotely and, as a result, did not have physical access to the footnoted documents held in the office. She further deposed that she searched for the footnotes using Google Scholar and, after considering the list of search results, used the first result as the footnote, believing it to be the correct citation. She also gave evidence that when FNLRS raised concerns about the footnotes, she ran the same searches in Google Scholar, but this time it produced different results.

The second solicitor who gave evidence was Mr Briggs, the principal of the law practice and the supervisor of the supervised solicitor. He deposed that the work to produce the documents had been done “collaboratively” between multiple team members, but that he was not aware of whether anyone had checked the supervised solicitor’s work. He accepted that it was his error, first, to allow the work to be performed remotely and, second, not to ensure that the work was checked.

Decision

The Court acknowledged that the supervised solicitor made the following errors:

  • preparing citations for court documents while working remotely and without access to the documents being cited;
  • using an (apparently AI-assisted) research tool to produce citations; and
  • failing to check and verify the output of that research tool.

The Court considered that the principal solicitor had made the error of failing to have systems in place to ensure that the supervised solicitor’s work was appropriately supervised and checked.

The Court did not consider it necessary to refer either solicitor’s conduct to the regulatory body, because Massar Briggs Law had addressed the problem as soon as it became aware of it, and had apologised and expressed regret.

The Court did, however, acknowledge that the actions of Massar Briggs Law had caused cost, inconvenience, and delay, and had compromised the effective administration of justice. Accordingly, the Court ordered that Massar Briggs Law pay, on an indemnity basis, the costs incurred by the respondent’s solicitors as a result of the applicant’s solicitors’ use of Gen AI.[2]

Existing supervision rules in NSW

Although this matter occurred in Victoria, it nonetheless illustrates how important it is for NSW practitioners to appropriately supervise the use of Gen AI in legal practice.

In NSW, the principal of a law practice holds ultimate responsibility and liability for all work performed in the law practice.[3] This not only includes work done by a solicitor who must only engage in supervised legal practice (as per Condition 2 of their practising certificate),[4] but also extends to work, including research using Gen AI, by a solicitor who is not subject to Condition 2.

Condition 2 is a statutory condition known as supervised legal practice (SLP). Section 6 of the Legal Profession Uniform Law (Uniform Law) defines SLP as legal practice by a solicitor working under the supervision of either an authorised principal[5] or another solicitor who holds a practising certificate authorising them to supervise others.[6] For the first two years of full-time work (or the part-time equivalent), a solicitor can only engage in SLP.[7] Once the solicitor has completed the necessary period of SLP, they can apply to the Law Society of NSW for removal of Condition 2 from their practising certificate.

Rule 37 of the Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 (Conduct Rules) provides that a solicitor with designated responsibility[8] for a matter must exercise reasonable supervision over solicitors working under them.[9] What constitutes reasonable supervision will vary according to the employee’s experience, qualifications and role, and with the type and complexity of the work.[10] In Wamba Wemba, a junior lawyer subject to supervision was drafting and footnoting court documents. As the Court’s findings indicate, it would have been reasonable to exercise a higher level of supervision in that context, given that the solicitor was inexperienced and performing the tasks remotely.

The Law Society of NSW encourages solicitors subject to Condition 2 to enter into a remote supervision plan with their supervisors, including where the firm allows hybrid work or where supervision is conducted remotely. In Wamba Wemba, the filing of misleading evidence may have been avoided if the supervisor and the supervised solicitor had established and followed a remote supervision plan that referred to the law practice’s policy and processes for using Gen AI, and if the supervisor had ensured that all work performed by the supervised solicitor was thoroughly reviewed.

Why supervisors must recognise the growing influence of Gen AI

Owing to technological advancements and the inherent pressures of the legal profession, more solicitors are likely to turn to Gen AI tools. This technology can save time and costs, and perhaps reduce stress, by assisting with repetitive tasks that can be automated, such as researching applicable laws, summarising relevant cases, generating rough first drafts, and footnoting.

Gen AI does, of course, come with risks, such as those repeatedly showcased in the courts. In the matter of JNE24 v Minister for Immigration and Citizenship,[11] hallucinations featured prominently, and the Court used some colourful metaphors to warn lawyers that although Gen AI offers the “attraction” of helping to navigate a “labyrinth” of legal content, it can also prove to be a “dangerous mirage”.[12]

In a similar vein, Chief Justice Bell recently identified eight “species” of “hallucinated content”[13] that repeatedly feature in court matters, illustrating the rapidly expanding reach of this technology. We suggest reading His Honour’s address to better understand the various complexities inherent in using Gen AI in the legal profession.

Risky though it may be, the truth is that Gen AI is putting down deep roots in our profession, and all lawyers, especially principals and supervisors, must be attuned to it. This was emphasised in Ayinde,[14] a recent English matter, in which the Court suggested that lawyers with “leadership responsibility”[15] should start implementing practical measures to ensure that they, and other lawyers in their practice, uphold their duties to the court and to clients when using Gen AI.[16]

If the matter of Wamba Wemba has taught us anything, it is that Gen AI must be given due consideration in law practice supervision plans and protocols.

Law practice systems and processes must incorporate supervision of Gen AI

Principals and supervisors should review and, where necessary, update their supervision systems and processes to ensure Gen AI is used ethically and within the legal framework. This may involve:

  • Emphasising to staff that Gen AI is primarily a tool to help with initial research and drafting, never to produce ‘final work’;
  • Establishing protocols on producing, recognising, checking, and correcting Gen AI content;
  • Introducing new checks and balances for drafting and reviewing work, both one’s own and that of others, to ensure that hallucinations do not slip through the cracks;
  • Ensuring that solicitors who work remotely do not rely heavily on Gen AI in place of their colleagues and other office resources;
  • Rethinking office culture and normalising Gen AI use so that solicitors do not conceal their use of it because they associate it with shame or a poor work ethic; and
  • Initiating internal education programs to help staff understand how Gen AI works and identify both its regulatory and ethical limitations.

Gen AI will likely become more prevalent in legal practice, and Wamba Wemba demonstrates that its use must be given due consideration in law practice supervision plans and protocols. Principals and supervisors who fail to adopt appropriate processes to monitor the use of Gen AI put their solicitors and their practices at risk of breaching their ethical and regulatory obligations and of facing disciplinary action.


Endnotes

[1] [2025] FCA 731.
[2] Wamba Wemba Native Title Claim Group v State of Victoria [2025] FCA 731 at [5].
[3] Sections 34 and 35 of the Legal Profession Uniform Law (NSW).
[4] As per sections 6 and 49(1) of the Legal Profession Uniform Law (NSW), and rule 7 of the Legal Profession Uniform General Rules 2015 (NSW).
[5] Section 6 of the Legal Profession Uniform Law (NSW).
[6] Rule 7 of the Legal Profession Uniform General Rules 2015 (NSW).
[7] Section 49(1) of the Legal Profession Uniform Law (NSW).
[8] “Solicitor with designated responsibility” means the solicitor ultimately responsible for a client’s matter, or responsible for supervising the solicitor with carriage of a client’s matter: Glossary of the Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 (NSW).
[9] Rule 37 of the Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 (NSW).
[10] Legal Services Commissioner v Michael Vincent Baker [2005] LPT 002 at [42].
[11] [2025] FedCFamC2G 1314.
[12] JNE24 v Minister for Immigration and Citizenship [2025] FedCFamC2G 1314 at [21].
[13] The Hon A S Bell CJ, ‘Change at the Bar and the Great Challenge of Gen AI’, Address to the Australian Bar Association, 29 August 2025: https://supremecourt.nsw.gov.au/about-us/speeches/chief-justice.html.
[14] Ayinde v London Borough of Haringey [2025] EWHC 1383 (Admin).
[15] Ibid at [9].
[16] Ibid at [8].