Thousands of people sat for the new California bar exam in February, hoping to join the state's ranks of 195,000 lawyers.
But a series of failures by the institutions responsible for licensing lawyers has thrown thousands of early legal careers into frustrating limbo.
First, the testing software failed during the exam. Candidates had trouble logging in, the software crashed repeatedly, and it lacked basic features such as copy and paste, leaving many unable to complete the exam. The State Bar of California, the organization that administers the test, had to offer score adjustments and other remedies to test takers.
Then came the news that at least some of the multiple-choice questions had been developed with the help of artificial intelligence. For many who took the exam, that was hardly shocking: they already suspected AI had been used, based on questions they said struck them as strangely worded or legally unsound.
And now, future California lawyers may have to wait a little longer to find out if they've made the cut.
The State Bar said it would need more time to obtain approval from the California Supreme Court to adjust test scores to account for the problems. The results of the February exam were scheduled to be released on Friday, but that release could be delayed.
“I just wanted a fair chance to become a lawyer,” said Edward Brickell, a 32-year-old graduate of Southwestern Law School in Los Angeles, who took the test, in an interview. “And I feel like there's another thing every week that says, 'We didn't give you a fair chance.' ”
Brickell and other test takers have flooded Reddit and other social media sites with horror stories, plans to organize protests and demands for accountability. A handful of test takers used the public comment period at a State Bar committee meeting on Tuesday to voice their anger and frustration.
“You're an organization that determines whether we have the ability to make a living,” Dan Morina, a test taker, told the committee during the virtual meeting. “Finances are being destroyed, lives are being destroyed, and more are about to be destroyed.”
Because of its high threshold for passing, the California bar exam has long been considered one of the most challenging in the country, though that threshold has been lowered in recent years.
In October, the State Bar received approval from the California Supreme Court to introduce questions developed by a new contractor and to offer the option of taking the test remotely. The State Bar made the changes to save money.
The State Bar previously used exams developed by the National Conference of Bar Examiners, the organization behind the tests used in most states, which are widely considered the gold standard. The NCBE does not allow remote testing.
California candidates were told that they could prepare for the new exam in much the same way as for the NCBE version of the test, because the new exam would not require substantial changes in preparation.
In November, the State Bar held an experimental exam that served as a test run. People who took it reported technical difficulties. Test takers also said that the study guide contained errors; it was quietly revised and re-released a few weeks before the February exam.
Kaplan, the new contractor hired to develop the test questions, disputed that the study guide contained a significant number of errors.
In a sign that the State Bar was expecting some challenges, it offered the more than 5,000 registered test takers the option of postponing the exam until July, the next test date.
After the February exam, the State Bar acknowledged widespread technical failures.
“We recognize that these issues were, for those testing, unacceptable in scope and severity,” the State Bar of California said in a statement. “We apologize again, and we make no excuses for the failures that occurred.”
The State Bar added that it would assess whether Meazure Learning, the vendor that provided the technology and services used to administer the exam, had failed to fulfill its contractual obligations. It also said it would rely on psychometricians, experts who specialize in measuring intangible qualities such as knowledge and intelligence, to come up with score adjustments for test takers who experienced difficulties.
The State Bar's proposed score adjustments were announced last week. The proposal would significantly lower the raw passing score.
The recommendation was submitted to the state Supreme Court for approval on Tuesday, three days before results were due to be released. Given the late submission, the State Bar told test takers that the release of exam results could be delayed, prolonging a dizzying stretch of uncertainty for many.
Buried deep in the announcement about the scoring adjustments was a new revelation: some of the multiple-choice questions had been developed not by Kaplan but by ACS Ventures, the State Bar's psychometrics provider, with the help of artificial intelligence.
ACS Ventures did not respond to requests for comment.
The State Bar said the Committee of Bar Examiners, the body that oversees the exam, had not been made aware of the use of AI when the state Supreme Court approved the exam changes last year.
“The court, however, has not authorized, and the committee has not approved, the broader use of AI,” Alex Chan, the chair of the Committee of Bar Examiners, said in a statement. “AI could ultimately play a role in the future of exam development, but in the absence of specific guidance from the court, the committee has not considered or approved its use.”
The Supreme Court said it was not aware that the technology was being used to develop the tests and called for an investigation.
In the petition filed with the Supreme Court on Tuesday, the State Bar said that ACS Ventures had prompted an AI chatbot to create multiple-choice questions, and that ACS Ventures had not checked the accuracy of the questions generated by OpenAI's ChatGPT. Those and other questions were, however, sent to a State Bar panel for review.
For test takers like Brickell, the disclosure that AI had been used at all seemed to offer some explanation for their confusion. He and others who took the test said some questions did not read as though they had been drafted by humans, and some listed only incorrect multiple-choice answers.
Ceren Aytekin, an aspiring entertainment lawyer, said she also noticed something odd about a few questions but initially refused to believe that AI had been used.
“I initially thought, 'Maybe I'm wrong,'” Aytekin said. “Maybe I'm trying to hold the examiners accountable for something the organization would never do,” she added. “All the issues I found make much more sense now that I know AI was involved. We just didn't want to believe it.”
The organizations that administer bar exams in two other large states, New York and Illinois, said they have never used AI to develop exam questions. The NCBE, which prepares the exams used in New York, Illinois and most other states, said it has never used AI for that purpose.
April Dawson, associate dean of the Technology Law and Policy Center at North Carolina Central University School of Law, said the use of AI to develop test questions is not a problem in itself. The problem, she said, is that it was done without transparency.
“It's really embarrassing for a licensing agency to be engaged in this kind of irresponsible behavior,” she said.
If he does not pass, Brickell could take the exam again in July; those who failed the February exam can retake it free of charge. The State Bar says it will not use questions developed with AI on the July exam.
If the exam were not offered free of charge in July, Brickell said, he would consider taking it in another state.
“I don't want to spend the rest of my life as a lawyer giving them my bar fees,” Brickell said of the State Bar of California. “That's how much this has soured me.”

