Lawyers say AI is already changing the practice of law

In 2019, a metal trolley slid down the aisle of an Avianca flight from El Salvador to New York and hit Roberto Mata in the knee. Mata hired two personal injury lawyers to sue the airline, resulting in a case that set a legal precedent — but not for the reason Mata or his lawyers had hoped. Instead, the two lawyers earned notoriety in 2023, when the judge in the case fined the pair $5,000 for citing six fake cases, invented by an artificial intelligence program, in a memorandum submitted to the court.
This penalty was the first of its kind and circulated widely in legal circles as a cautionary tale about the dangers of relying on artificial intelligence. The warning was not heeded. In the two years following the incident, no fewer than 139 legal briefs containing AI-invented citations were identified, including more than 30 in May alone. While many of these filings come from self-represented litigants, they also include briefs filed by prominent national law firms. In one case, a judge struck expert testimony provided by a data scientist from AI giant Anthropic — on the grounds that it contained “hallucinations” generated by Anthropic’s own software.
These stories of AI gone wrong in the courtroom are a lively source of conversation around law firms’ water coolers. But they are also extreme cases. The vast majority of lawyers who use AI do so responsibly — including by verifying that AI-generated citations actually exist before presenting them to a judge.
Many are also happy with how useful it is. Lawyers say AI excels at research and document review, and they’ve recently found it can handle more challenging tasks like drafting legal briefs. All of this progress has major implications for the future course of the legal profession and the business of law.
A growing set of artificial intelligence tools
Most lawyers came to appreciate the transformative power of AI at the same time as everyone else: November 2022. That’s when OpenAI released the initial version of ChatGPT to the general public, and it’s no surprise that about six months later the first wave of fake citations started appearing in courtrooms.
In fact, the idea of using AI to guide legal research has been around for more than a decade. Leaders in this space include companies such as Ironclad and DISCO, which respectively offer automated ways to create and manage contracts and accelerate the discovery process.
The difference today is that, as in other industries, the past three years have seen the legal profession acquire and deploy AI technology at a broader and faster pace than ever before – especially as generative AI becomes more readily available and more powerful.
In February, the Federal Bar Association published a survey of more than 2,800 law professionals: 21% said they used AI in their practices in 2024, and the number was much higher — 39% — for lawyers at large law firms. A separate survey of lawyers, published by legal transcription service Rev in April, found a much higher overall adoption rate, with 48% of respondents saying they had used AI in research.
Unsurprisingly, the number of AI offerings targeting the legal profession is increasing. There are services like Harvey, which is built on AI models from companies including OpenAI, Anthropic, and Google and specializes in drafting and research; and Clio, based in Vancouver, British Columbia, which offers practice management tools. Clio raised $900 million in 2024, valuing the company at $3 billion. In late June, Harvey raised $300 million in a financing round that values the company at $5 billion.
Currently, there is no clear data on how many lawyers deploying AI in their jobs are using these specialized tools, or whether they instead rely on popular general-purpose large language models created by the likes of OpenAI and Google, which have absorbed publicly available laws and court decisions.
According to Mark Lemley, a professor of intellectual property at Stanford University and an expert in legal technology, most law students and junior lawyers are likely to use public services like ChatGPT, since they are the most accessible.
In larger companies, a popular AI tool is CoCounsel, which was developed by legal startup Casetext. Data giant Thomson Reuters bought Casetext in 2023 and now offers CoCounsel as part of Westlaw, a popular but expensive service widely used by so-called big law firms.
Darrell Landy, an attorney at the national firm Morgan Lewis, says he likes CoCounsel in part because it is designed to account for the legal profession’s special obligations regarding client confidentiality and data protection. In practical terms, this means that the AI queries created by Landy and his colleagues do not become part of a training corpus that could inform research conducted by people outside his firm — an important safeguard in a profession where practitioners often have access to highly sensitive information about their clients.
Changing professional standards
For decades, most law firms conformed strictly to the “finders, minders, and grinders” model. The phrase describes a hierarchical structure in which a few rainmakers at the top bring in clients and hand the bulk of the work to a second tier of senior lawyers, who in turn oversee a larger group of associates.
Where does AI fit into the pyramid? It has been clear for some time that AI is well-suited to replace large swaths of mundane “grinder” work — and this is already happening with tasks such as document review, contract drafting, and basic research. In many respects this is a good thing. AI is eliminating some of the drudgery in legal work, much of which has traditionally come in the pursuit of billable hours — a notorious aspect of the profession that often leaves clients wondering how many hours on a bill represent busy work rather than indispensable tasks.
In light of this, it’s hard to argue with technology that dramatically reduces time spent on tedious tasks, reduces costs, and promises to free lawyers for deeper, more strategic matters. However, there are still disadvantages, most notably when it comes to training the next generation of lawyers.
“One way you get good legal instincts for making arguments is by putting in the reps — trying things out, doing them over and over again,” Lemley says. He is concerned that wider adoption of AI means young lawyers may not acquire some of the analytical skills traditionally built through extensive reading of case law and statutes.
Lemley also notes that AI is increasingly taking over tasks such as writing briefs and crafting arguments, which many lawyers consider a fun part of the job, and which typically fall to senior lawyers.
All of this raises questions not only about how lawyers will spend their time in the future, but also about how they will charge for their services. For years, clients have pushed — usually with limited success — for alternative fee arrangements in order to escape hourly billing for junior associates prone to falling down legal rabbit holes.
Lemley says the dynamic may reverse in the future, as clients — realizing that AI can perform many legal tasks in minutes rather than hours — demand flat fees and other fixed-price arrangements in place of hourly billing.
The legal profession has long been adept at protecting its business model, so such changes may not arrive soon. But even now, it is becoming clear that AI is saving lawyers something valuable: time.
According to a Thomson Reuters poll, widespread adoption of AI will likely save the average lawyer four hours a week, or roughly 200 hours a year — hours that can be used to expand a client base, develop additional skills, or, of course, stack billable hours.
Rule of law for artificial intelligence lawyers
Pablo Arredondo, who created the popular CoCounsel AI tool and is now a vice president at Thomson Reuters, frequently runs seminars for judges on the intersection of law and AI.
In the course of his work, he often thinks of a nearly century-old phrase that appears in Rule 1 of the Federal Rules of Civil Procedure — that courts must pursue the “just, speedy, and inexpensive” determination of every action.
“Those three words are not how anyone would describe American lawsuits,” Arredondo says dryly.
This may change if courts adopt AI tools in a widespread and effective manner. In fact, this adoption is already beginning, including in Massachusetts where the court system is adopting automated tools to help tenants facing eviction navigate complex legal forms.
“As much as this technology can be used to achieve better outcomes, there is an ethical imperative to use it,” Arredondo says. However, he cautions that any deployment of AI by courts and lawyers must be done under strict supervision.
Arredondo is not alone in this sentiment. The legal profession, like many other industries, is trying to balance the efficiency gains AI provides against its risks. In 2023, reflecting on how the technology was poised to disrupt the work of judges, Chief Justice John Roberts wrote: “Any use of AI requires caution and humility.”