The question of whether to accept or reject artificial intelligence (AI) in society is highly controversial and would probably result in a deadlock among jurors if it ever went to trial.
AI has both positive and negative implications for the legal field, as many experts have pointed out. Some of the advantages include automating writing tasks and analyzing large amounts of data quickly. However, some of the challenges include data bias, accuracy issues and accountability gaps.
AI also poses both a risk and an opportunity for legal professionals. The Law Society in the UK warned in 2021 that AI could cause a “savage reduction” in human jobs. And a 2021 study by three US universities ranked the legal sector as the most affected by AI.
On the other hand, AI can also help lawyers with research and case preparation. But there are examples of how AI can go wrong, such as the case of Steven Schwartz, a New York lawyer who used ChatGPT, a popular AI system, to find precedents for a personal injury lawsuit. He ended up citing six fictitious cases that the AI had invented.
Some lawyers may be wary of using such systems after this incident, but Ben Allgrove, the chief innovation officer at Baker McKenzie, an international law firm, thinks differently.
He says: “It’s not a technology story, it’s a lawyer story. You have to deal with the unprofessionalism [by Mr Schwartz] and the ethical breach first, before you get to the fact that he used a tool that was inappropriate.”
Since 2017, Baker McKenzie has been monitoring the progress of AI, and has formed a group of lawyers, data scientists and data engineers to evaluate the new systems that are emerging in the market.
Mr Allgrove believes that most of the AI usage in his firm will come from adopting the new AI-enhanced versions of existing legal software providers, such as LexisNexis and Microsoft’s 365 Solution for Legal.
LexisNexis introduced its AI platform in May, which can respond to legal questions, produce documents and summarise legal issues. In addition, Microsoft's AI tool, Copilot, will be available to commercial customers next month, as an extra-cost add-on for 365.
“We already use LexisNexis and Microsoft, and they will increasingly get capabilities driven by generative AI. And we will buy those things if they make sense and are at the right price.”
Generative AI is the type of AI everyone is talking about: systems that can generate text, images and music based on the data they were trained on.
The drawback is that currently, premium, paid-for versions of such tools are costly. Paying for Microsoft’s Copilot alone would “double our technology spend”, Mr Allgrove says.
The alternative is for law firms to pay a lower amount to access AI systems not specifically aimed at the legal market, such as Google’s Bard, Meta’s Llama, and OpenAI’s ChatGPT. The firms would connect to such platforms, and adapt them for their own legal use.
Baker McKenzie is already testing several. “We are going out to the market and saying we want to test the performance of these models,” says Mr Allgrove.
Such testing is essential, he explained, to "validate performance", because all of the systems will make errors.
RobinAI is a legal software system that uses an AI co-pilot to assist with creating and reviewing contracts, for both corporate legal departments and individuals.
The AI co-pilot is mainly based on an AI system developed by Anthropic, a company founded by a former OpenAI research VP and funded by Google.
RobinAI also has its own AI models that learn from contract law details. Every contract that the system handles is uploaded and labelled, and then used as a learning resource.
This allows the firm to have a large database of contracts, which Karolina Lukoszova, co-head of legal and product at RobinAI in the UK, believes will be essential for the application of AI in the legal field.
“Companies will need to train their own smaller models on their own data within the company,” she says. “That will give them better results and ones that are ringfenced.”
RobinAI pairs its AI with human lawyers, who check the platform's output for accuracy.
Alex Monaco, an employment lawyer, runs both a solicitor practice and a tech firm called Grapple.
Grapple provides the public with "an ontology of employment law", and gives guidance on various workplace issues such as bullying, harassment and redundancy. It can also create legal letters and summarise cases.
Mr Monaco believes that AI can make the legal profession more accessible to everyone.
“Most of the people who contact us are those who cannot afford lawyers,” he says.
But with free AI tools available online, people can now prepare their own legal cases. Anyone with an internet connection can use Bard or ChatGPT to help write a legal letter. It may not be as good as a lawyer’s letter, but it is free.
“AI is not taking over humans or lawyers. It is enhancing people’s knowledge and application of their legal rights,” he says.
And in a world where everyone is using AI, he says, that could matter a great deal.