How Legal Professionals Are Actually Using Generative AI
With stories like the lawyer who used ChatGPT to create a brief, but didn’t check the legal precedents the system made up before submitting it to a judge, it’s not surprising that the legal industry is leery about using AI. But an increasing number of legal professionals are taking advantage of AI tools.
Table of Contents
- How Many Legal Professionals Use AI?
- Top AI Use Cases in the Legal Industry
- The Need for Guardrails on AI in Law
- Case Study: Legal Aid Uses GenAI to Assist Operations
How Many Legal Professionals Use AI?
According to a 2025 report, the use of generative AI has nearly doubled year-over-year in the legal space, from 14% in 2024 to 26% in 2025. More than one-third of legal professionals (36%) are still considering whether to use it, while just 22% said they have no plans to use it.
Consider these quick facts:
- 45% of law firm respondents said they either currently use GenAI or plan to make it central to their workflows within one year
- 85% said they believe AI can be applied to industry work
- 59% said it should be applied to industry work, while 14% said it should not and 27% said they didn't know
The primary concern for legal professionals was that generative AI would support the unauthorized practice of law. Other concerns included job impact, reduced need for (or work for) attorneys and revenue impact.
Top AI Use Cases in the Legal Industry
According to the Thomson Reuters report, the most common use cases for GenAI in the legal industry include:
- Document review
- Legal research
- Document summarization
- Drafting memos & briefs
- Contracts
- Correspondence
These use cases were cited by 50% or more of legal industry users, the report noted, adding, “This means that if somebody is using GenAI, they likely are using it in several different ways.” The least common use case was environmental, social and governance (ESG) reporting.
Law firm respondents lag slightly behind other professional service populations among active GenAI users, but 69% said they use GenAI tools at least weekly. However, 48% of law firm professionals said they still lack formal GenAI policies.
The conservative legal industry is cautious about adopting new technology, the report noted, pointing out that “the American Bar Association was still releasing guidance around how to handle ‘reply all’ emails as recently as November 2022.”
The Need for Guardrails on AI in Law
As in any industry, it’s important to add guardrails and keep humans involved, said Paul Britton, CEO of London-based legal firm Britton & Time.
Human Oversight
“The most important aspect is human oversight when deploying AI in legal practice. Any AI-supported outcome must be subject to human review and a human appeal process. This is the only way to maintain fairness, especially when it comes to court hearings.”
In fact, AI could act as a backup to humans in the legal industry, Britton said. “An example would be if a judge drafted a judgement that used the wrong law or conflicting law — an AI could pick up on this and make recommendations. That approach would also reduce the number of cases that get appealed when a judge gets it wrong.”
The book "Noise: A Flaw in Human Judgment" points out a number of studies where judges were found to make inconsistent decisions based on random factors, such as the weather or the time of day, and notes that algorithms, such as those used in AI, provide more consistent results.
Mitigating Bias
At the same time, it’s important to remember that AI systems trained on historical data can be biased, and look for ways to mitigate that bias so that it isn't perpetuated. “AI models must, of course, be tested and retrained regularly to avoid perpetuating historic inequalities (e.g., racial or socio-economic bias),” Britton said. “Training data should be representative, and independent audits should be mandatory by the human operators.”
Transparency & Explainability
Another important element should be transparency and explainability, Britton added. “Legal AI systems must be auditable and explainable to the everyday person, not using technical or complicated explanations. Any decision or recommendation should come with a rationale that a human can interpret and challenge if necessary. ‘Black box’ (algorithmic models whose internal workings are not transparent or easily understandable) systems are incompatible with the principles of justice and should be outlawed from the start.”
Ultimately, AI in the legal system should be used as a tool, Britton said. “If the legal system is going to use AI, it should treat it not as a judge, but as a tool for justice. Helping reduce delay (by typing up judgments, for example) and inconsistency in the administration of justice, while never replacing the human conscience of the law.”
Case Study: Legal Aid Uses GenAI to Assist Operations
Right now, AI is used by an increasing number of the 129 Legal Aid Societies (LAS) in the United States. The Legal Aid Society of Middle Tennessee and the Cumberlands, for example, is using AI tools to streamline applying for expungements.
When a case is dismissed, the defendant is acquitted or the prosecution declines to pursue it, people assume that because they weren’t found guilty, nothing appears on their record, explained Zachary Oswald, senior deputy director of client services for the Nashville-based organization, whose eight offices cover about half the state. “But it still exists on their criminal record.” People have been denied housing, certifications and employment based on those records, he added.
Fortunately, residents of Tennessee and a number of other states are legally entitled to have that record erased. But as with most legal matters, paperwork is involved, often up to two hours’ worth. An LAS paralegal got the idea to use an AI system to fill out that paperwork instead, cutting that time down to just four minutes, Oswald said.
But there are more benefits than just saving time. First of all, it means LAS can offer same-day service. Previously, when LAS held clinics, people would have to pre-register two or three weeks in advance to get the documents from court clerks and find out what was expungable. In addition, GenAI provides a better experience for LAS volunteers and pro bono attorneys. Because expungement falls into a gap between civil and criminal practice, attorneys who don’t have experience with it can be intimidated, explained Oswald. “Providing them with the tool allows them to come to a clinic and feel more comfortable doing legal work than they would be otherwise.” It also allows the attorneys to interact more with the clients rather than focusing on filling out paperwork.
Now, LAS is working on expanding its tool to perform conviction expungements as well. In Tennessee, certain convictions can be expunged, but with a complex set of requirements based on the year of the offense, the type of offense, whether it was violent and so on, Oswald explained. LAS developed its own system, based on ChatGPT, and is teaching it to search for potential opportunities meeting those requirements. “This separates us from the fear a lot of people have,” he said. “We’re not asking it to interpret the law or be an attorney. We’re asking it to be a computer.”
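The “be a computer, not an attorney” approach Oswald describes is essentially a deterministic eligibility screen applied before any human review. As a minimal sketch of what such a screen might look like, here is a hypothetical Python example; the rule thresholds below are purely illustrative, not actual Tennessee expungement law, and the `Offense` fields and cutoff year are invented for demonstration:

```python
from dataclasses import dataclass

@dataclass
class Offense:
    year: int           # year of the offense
    offense_type: str   # e.g., "misdemeanor" or "felony"
    violent: bool       # whether the offense was classified as violent

def potentially_expungable(o: Offense) -> bool:
    """Illustrative rules-based screen. The thresholds are hypothetical,
    not real law; a flagged case would still require attorney review."""
    if o.violent:
        return False
    if o.offense_type == "misdemeanor":
        return True
    # Hypothetical rule: only older non-violent felonies pass the screen
    return o.offense_type == "felony" and o.year <= 2010

# Example: screen a docket of cases for follow-up by an attorney
docket = [
    Offense(2005, "felony", False),
    Offense(2018, "felony", False),
    Offense(2021, "misdemeanor", False),
]
candidates = [o for o in docket if potentially_expungable(o)]
```

The point of the sketch is that the rules are explicit and auditable, which is what separates this kind of tool from asking a model to interpret the law.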
LAS is presenting its GenAI processes at annual Legal Aid Society technology conferences and working with other societies to help them develop their own programs.