Smart wills, intelligent sentencing, and automated due diligence are only the beginning of what artificial intelligence can do to improve legal practice.
Artificial intelligence (AI) is both closer and further away than most lawyers think. The Australian Law Reform Commission hosted a webinar in August about automated decision-making in administrative law, pointing to a potential inquiry. Machines make decisions with legal outcomes every day, and there are significant advantages to implementing this technology in many practice areas.
It enables faster outcomes, increased consistency, reduced red tape, and lower costs, and it will change the way law is practised in Australia. However, the consequences of getting automation wrong can be very serious, which is why experts say it’s time for lawyers to learn how AI actually works.
Democratising legal processes
Wills are a good example of AI capability. About six years ago, Adelaide tax lawyer Adrian Cartland saw an opportunity to transform the process with the help of AI.
“I came to the realisation that not only is automation possible, it’s inevitable,” he tells LSJ. “I looked around at various technological advances that have happened over time and thought we were right at the cusp of integrating AI and law.”
He left his job, started his own firm and his own technology company, and “bet it all on a harebrained idea”.
He created the Artificially Intelligent Legal Information Research Assistant, affectionately known as Ailira. Clients can ask Ailira for help with legal problems, and it can point them to helpful information. It can generate tailored wills, help with business structuring, and conduct legal research. Cartland has big plans for its future. He’s already opened two Law Firms Without Lawyers, one in Darwin and one in Karratha, with more to come, where people can simply pop into a shop and find answers to pressing legal questions.
“Anyone can draft a will themselves – they don’t need a lawyer – but if they pick up a post office will kit, there might be 50 pages of questions, and 50 pages of definitions, and it becomes too hard. People want something simple,” he explains.
“Where we have Law Firm Without Lawyers offices, they’re in low socio-economic areas … demographics that might not have access to legal services.”
A huge part of Cartland’s mission is making legal information accessible, and his company has processed “thousands and thousands of wills and other documents” so far. There is a technological assistant at each office, so visitors can speak to a real person, but Ailira provides the information. It’s important here to note the difference between information and advice, because only a lawyer can give the latter. Ailira functions as a legal search engine, creating forms and providing legal information in a user-friendly way.
Applications of AI don’t have to be complicated. British-American entrepreneur Joshua Browder created a mobile chatbot called DoNotPay in 2015, which he calls “the world’s first robot lawyer”. It identifies the most likely reason that someone’s ticket may have been issued incorrectly, and tells them how to fight it effectively. Like Ailira, DoNotPay thrives on completing simple and repetitive tasks, making legal information accessible, and empowering people to take simple matters into their own hands.
The fear of the unknown is what we want to address. Have we missed something? We want to let people have the fullest range of information in terms of legal research and tactics.
Shan Mukerjee, LexisNexis
What it takes to create ‘automagic’
AI also has enormous benefits in legal research. LexisNexis is at the cutting-edge of this field, using AI and machine learning (an application of AI that allows machines to learn from data without being specifically programmed) to create a range of innovative analysis tools. Shan Mukerjee, Executive Manager of Core Product at LexisNexis, says these processes are key to addressing pain points.
“The fairytale persists that if we use the right keywords to search, or browse the right database, we’ll find the perfect case. The fear of the unknown is what we want to address. Have we missed something? We want to let people have the fullest range of information in terms of legal research and tactics,” she says. “Where you can’t find the magic case, you’re looking for the closest match – someone who has run an argument similar to what you want to run with a different fact matrix.”
It’s not as simple as consuming data and spitting out answers, which Mukerjee calls “automagic”. Algorithm-crunching wouldn’t be possible without an enormous contribution from real people. Blending the technical expertise of developers with the legal expertise of lawyers presents a huge opportunity for further growth, and it has created a new subset of legal practice: legal knowledge engineering.
“We need people who know the law, and have the kind of metacognition to know how they know it, and explain how they know it, so the patterns and processes can be taught to computers,” she says. “If you think about the principles we take for granted, even as a lawyer it can be difficult to identify what the ratio actually is, and which decisions were a finding of fact or a finding of law. Those are basic principles, and you can identify either end of the spectrum with ease, but it gets grey very fast in the middle.”
Consequences of machine failure
Despite the benefits, AI implementation does come with serious risks. It only takes the term “robo-debt” to raise eyebrows about automated decision-making processes in Australia. AI takes a rigid approach to problem-solving, which can have detrimental effects on everything from finance to freedom.
In 2016, an investigation by American news organisation ProPublica found that software used to streamline sentencing decisions in a number of US courts was biased against people of colour. The program generated scores that predicted the likelihood of recidivism by comparing an individual’s circumstances against years of court data. The risk assessments were intended to reduce sentencing bias, but the developers failed to realise the machine was only as good as the data it was fed. It didn’t know that black people were arrested, convicted and imprisoned at disproportionately higher rates than white people. It recognised a pattern, and followed it precisely, with disastrous results.
It has happened in myriad other contexts. Microsoft had similar troubles in 2016 with a chatbot called Tay, which was designed to talk like a teenage girl. Tay was trained on huge amounts of data from Twitter, which gave it a crash course in bigotry. Within just 24 hours of being exposed to the internet, its tweets spiralled from “humans are super cool” to “Ricky Gervais learned totalitarianism from Adolf Hitler, the inventor of atheism”. Tay was a public relations disaster. Meanwhile, in 2018, Amazon was forced to scrap a sexist AI recruiting tool. It had consumed a decade’s worth of hiring data in order to mechanise the search for top talent; however, developers realised the system had taught itself to prioritise men. In fact, it actively penalised resumes that included the word “women’s”, because in a heavily male-dominated industry, the algorithm grabbed onto the trend and wouldn’t let go.
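The mechanism behind these failures is simple enough to sketch in a few lines. The following is a minimal Python illustration using invented toy data – not Amazon’s actual system – showing how a naive model that scores words by historical hiring outcomes will penalise a word like “women’s” if past decisions were skewed:

```python
from collections import defaultdict

# Hypothetical "hiring" data. The skew is in the historical labels,
# not in the algorithm: in this toy sample, resumes mentioning
# "women's" happen to have been rejected in the past.
training = [
    ("software engineer chess club", 1),            # 1 = hired
    ("software engineer rowing team", 1),
    ("software engineer women's chess club", 0),    # 0 = rejected
    ("software engineer women's coding society", 0),
    ("data analyst debating society", 1),
]

# Score each word by the average historical outcome of resumes
# containing it -- the model "learns" whatever pattern the data holds.
totals, counts = defaultdict(float), defaultdict(int)
for text, hired in training:
    for word in set(text.split()):
        totals[word] += hired
        counts[word] += 1

weights = {word: totals[word] / counts[word] for word in totals}

print(weights["women's"])  # 0.0 -- penalised outright
print(weights["chess"])    # 0.5 -- mixed outcomes, neutral score
```

The algorithm itself contains no prejudice; it faithfully reproduces the pattern in its training data, which is exactly what happened at scale in the sentencing and recruiting systems described above.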
In some ways AI is like looking into a mirror, in that it uses data to identify and continue trends in people’s behaviour. It raises the question: how can organisations implement AI while protecting themselves against the risk that it will pull together all the worst traits of humanity?
There are a lot of things lawyers will be needed for, and a lot of things that technology can’t do, and won’t be able to do, for the foreseeable future.
Felicity Bell, Research Fellow, Future of Law and Innovation in the Profession, UNSW
The techno-legal age is here
Professor Dan Hunter is Executive Dean of the Faculty of Law at Queensland University of Technology and Chief Investigator at the Australian Research Council Centre of Excellence for Automated Decision Making and Society. He says traditionally, it has taken law a long time to dive into technology and embrace change. He wrote his first research paper on AI back in 1992, and says increases in processing power over the last decade have changed things from a rule-based to a data-centric approach.
In legal decisions, the data usually comes in the form of text. Natural language processing allows systems to predict what words come next, which makes AI technology relevant to almost every aspect of law. Hunter says it will be used more and more, which is why lawyers will need to learn how it works. The line between legal practice and technology is blurring, and although lawyers don’t necessarily need to become technologists, they need to understand how techno-legal systems will impact clients.
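The next-word prediction Hunter describes can be illustrated with a deliberately tiny sketch. The following Python example builds a bigram model – the simplest form of the statistical prediction that underlies modern language models – from an invented three-sentence “corpus” of legal text:

```python
from collections import Counter, defaultdict

# Toy corpus; a real system would train on millions of documents.
corpus = (
    "the court held that the contract was void "
    "the court found that the contract was valid "
    "the court held that the claim was dismissed"
).split()

# Count which word follows each word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word):
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict("court"))  # "held" -- seen twice, vs "found" once
print(predict("the"))    # "court" -- the most common continuation
```

Real systems use vastly more data and far richer models, but the principle is the same: the prediction is only a reflection of the patterns in the text the system was trained on.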
“Lawyers will have to be part of it,” he says. “They’ll have to engage with [technology] in different ways than they’re used to. If a lawyer is supposed to represent their client and write a contract, we know they’ll be principled, but these days, smart contracts are essentially built by engineers. Lawyers say the blockchain is very complicated, and it is, but it has an impact in the real world… If lawyers don’t dig under the hood a little bit and understand how it’s supposed to work, they’re relying on engineers.”
Understanding the intersection between data and AI is also key to understanding appeal mechanisms, because while judges give reasons, machines may not give explanations. Automated decisions can’t work unless lawyers understand how they were made, and how to appeal them. A large part of this will come down to data literacy. “Computers are good at making decisions, but they’re only as good as their data, so we need to get people interested in this,” Hunter says. “Lawyers need to be able to interrogate the data and determine if it’s correct or incorrect, or whether it failed to account for X, Y, or Z.”
Robo-lawyers vs real lawyers
AI will have a big impact on the legal sector. Automation won’t replace courts, or judges, but it will impact a lot of everyday legal decision-making. McKinsey Global Institute, a management consulting firm based in the US, estimates 22 per cent of a lawyer’s job, and 25 per cent of a law clerk’s job, could be automated. However, the legal community is divided on exactly what this will mean in practice.
“I believe that AI will become critical to law firms, and we’re in an interesting phase where hype is being replaced by real examples of implementation. Implementation has its own challenges, mainly understanding the limitations and capabilities of AI systems,” says Connor James, Principal at Law Quarter in Sydney. He tells LSJ his firm developed a system called Titan, which reviews contracts and other legal documents to identify potential errors and omissions and promote efficiency.
“In our experience, machine learning systems are quickly outpacing [lawyers] in their capacity to deliver legal services. Many people say that AI is not going to replace lawyers. We largely disagree with that sentiment; our differentiating factor is that we have actually developed a system and are using it.”
Felicity Bell, a research fellow for the Law Society of NSW’s Future of Law and Innovation in the Profession (FLIP) team, has a different view. She has co-written a book called Artificial Intelligence and the Legal Profession, which will be published in November. The book covers a lot of ground, exploring different applications of AI in legal practice while bringing lawyers up to speed with evolving technology.
“What we became most interested in is the ethical issues for lawyers in using this technology, and some of the limitations of these types of systems,” she says. “There are a lot of things lawyers will be needed for, and a lot of things that technology can’t do, and won’t be able to do, for the foreseeable future.”
Defence lawyers, for example, have a strong role to play in questioning automated decisions in criminal matters. Commercial lawyers need to be able to explain to clients how AI systems might affect the outcome of their matter. Family lawyers will still need to negotiate the nuanced and emotional factors affecting their clients. All lawyers have a duty to facilitate access to justice in their communities.
And at the end of the day, she says, nothing will replace conversing on the courtroom steps.