California’s February 2025 bar exam was, in technical terms, a hot-ass mess. Nearly 60 percent of post-exam survey respondents reported that the exam software froze, crashed, or otherwise became unresponsive during the multiple-choice portion of the exam. Over 60 percent of respondents said the wording of the multiple-choice questions did not align with standard legal terminology and phrasing. And for the written portion, 75 percent reported that the copy-and-paste functions didn’t work.
In response to these problems, the California State Bar issued a press release last week announcing that it was making “scoring adjustments” for the exam. The press release also contained an admission, unassumingly positioned more than halfway through the statement, that may at least explain some of the exam’s strange wording: Some of the multiple-choice questions were developed by nonlawyers using “artificial intelligence.”
This is—in technical terms, again—straight-up bonkers. Judges have repeatedly sanctioned lawyers across the country for citing fake, AI-generated cases in their legal arguments. California nonetheless thought it was a good idea to allow environment-destroying lying machines to help determine the employability of its 4,193 February examinees. (More than 5,600 people originally signed up to take the test, but many backed out two weeks before test day when the Bar, already spying trouble on the horizon, said it was “extremely sorry” for causing “a lot of frustration, confusion, and anxiety” and offered “all applicants who wish to withdraw from the February exam a full refund (less bank fees).”)
When putting together their bar exams, most state bar associations use multiple-choice questions developed by the National Conference of Bar Examiners, in addition to essays and some state-specific questions. And for years, the California State Bar has contracted with a “psychometric consulting company” called ACS Ventures to evaluate whether the breadth and depth of the exam’s content reflect the knowledge and abilities expected of entry-level attorneys. ACS Ventures also recommends California’s cut-off passing score, which should theoretically reflect the minimum competence necessary for entry-level attorneys in the state.
In February, though, California administered its own multiple-choice questions for the first time, having announced last summer that it planned to save money by turning away from the NCBE and instead contracting with a Kaplan subsidiary to create the state’s bar exam questions. That contract specified that “any AI-generated content shall be de minimis,” and that AI tools could be used solely “to enhance limited elements of existing human-created Work Product.”
That agreement notwithstanding, a recent presentation ACS Ventures gave to California’s Committee of Bar Examiners indicates that roughly 40 percent of the exam’s 171 scored questions came from sources other than Kaplan. Specifically, 48 questions came from the state’s First-Year Law Students’ Exam, which the state uses to assess law students who completed their first year of study at an unaccredited law school, and which is not intended to screen entry-level attorneys. The remaining 23 questions came from ACS Ventures itself. I want to stress what this means: the same company charged with evaluating the validity of bar exam questions was also writing some of those questions, and using AI in the development process.
Defenders of bar exams justify their existence by claiming that they fulfill an important gatekeeping role, barring unqualified people from entry to the legal profession. That purported purpose cannot be reconciled with incidents like this one, where would-be lawyers’ competence was tested in significant part by questions that were either intended for a different purpose or carelessly generated by nonlawyers with help from a computer program that can’t tell you how many Rs are in the word “strawberry.”
California’s debacle also underscores how useless the exam is in the first place—or rather, how ineffective it is at achieving its stated goal. (It was admittedly very effective at making me so stressed I missed menstrual cycles and gained 30 pounds.) Most of those who take it spend months learning the minute details of topics they will never need to think about again. At the same time, many of the practical skills those future lawyers will need—conducting legal research, effectively communicating with clients, and so on—don’t appear on the exam at all.
The bar is a good test of who can afford to take expensive prep courses and not work for three months; it is also a good test of rote memorization and speed-reading skills. But it is not a good test of one’s ability to competently practice law. Stuffing it full of AI slop doesn’t help.