The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam.
“The debacle that was the February 2025 bar exam is worse than we imagined,” Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. “I’m almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable.”
In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages, and could not copy and paste text, the Times reported earlier.
People pay to take this exam. Someone decided to pocket some of that money for their org and have an AI org do some of it instead of qualified professionals. They didn’t bother to check the output. It came out poorly and now they have to eat the cost of going back and fixing it. The students and proctors are not compensated for the added time and stress, but paid the same for an overall worse experience. It’s a microcosm of everything wrong with the way AI is being used.
“Although there might be public skepticism of the emerging technology in the legal profession at this time, we will be worried in the future about the competence of lawyers who don’t use these tools,” Perlman predicted.
It’s pretty clear from this shitshow they should be worried about lawyers who are using these tools.
Or more specifically, the non-lawyers writing bar exam questions with AI.
Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, “It’s a staggering admission.”
“The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam,” she said. “They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored.”
Non-lawyers writing questions for a competency exam for lawyers, with zero oversight by anyone in the profession?
As a lawyer—this is bullshit.
Considering the now multiple cases of LLMs citing non-existent case law, I thought we already were. Seems like using the “there are only two R’s in strawberry” machine to do your job would potentially be grounds for losing your professional license.
AI will never, ever make anyone better at their job. It might in some cases help them to be faster.
If competency is the concern, AI has no role here.
And there’s the crux of the issue. Capitalism doesn’t give a fuck if LLMs are right. It only cares if it raises its stock option payouts.
That’s extremely worrisome for me, as well. It’s bad enough AI was used already to argue cases badly, are lawyers now feeling entitled to their exorbitant fees without even the abuse of paralegals and researchers?
Man, lawyers just DO NOT UNDERSTAND THE LIMITATIONS OF LLMs.
That’s part of the problem here. Lawyers didn’t make up the questions on the exam.
“Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable.”
I’m perfectly fine with non-lawyers, and/or LLMs writing the questions. What was stupid was that it wasn’t independently validated and proofread by a third party.
“The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam,” she said. “They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored.”
It’s a bar exam. One of the most important tests these people will take in their entire career, and they half-assed the implementation.
It’s the most important test the test-takers will take. For the test-givers, it was Tuesday.
It seems like a failure on the part of the LLM creators for marketing these tools the way they do.
I’m in law school right now. AI is pretty helpful for researching briefs. You can ask it super specific legal questions and get answers with citations. But what you can’t do is just copy and paste shit and not check the citations to make sure they’re real.
The idea that they’d use AI to write the test questions and not independently verify them is ridiculous.