Collecting Ultimate Bar Passage Data: Weighing the Costs and Benefits

The bar exam is an important outcome measure of whether our graduates are learning the basic competencies expected of new lawyers. As the ABA Managing Director reminded us in his memo of June 2015, however, it can no longer be the principal measure of student learning. We are therefore directed to look for other evidence of learning in our programs of legal education, hence the new focus on programmatic assessment.

Nevertheless, the ABA has wisely retained a minimal bar passage requirement in Standard 316, described in greater detail here. It is an important metric for prospective students. It is also an indicator of the quality of a school’s admission standards and, indirectly, its academic program. Indeed, it has been the subject of much debate recently. A proposal would have simplified the rule by requiring law schools to demonstrate that 75% of their graduates had passed a bar exam within two years of graduation. For a variety of reasons, the Council of the Section of Legal Education and Admissions to the Bar recently decided to postpone moving forward with this change and to leave Standard 316 as written.

With that background: on Friday afternoon the ABA Associate Deans’ listserv received a message from William Adams, Deputy Managing Director of the ABA. In it, he described a new process for collecting data on bar passage. A copy of the memo is on the ABA website. This change was authorized at the June 2017 meeting of the Council. Readers may remember that the June meeting was the one that led to a major dust-up in legal education, when it was later revealed that the Council had voted to make substantial (and, some would say, detrimental) changes to the Employment Questionnaire. When this came to light through the work of Jerry Organ and others, the ABA wisely backed off this proposed change and indicated it would study the issue further.

The change that the ABA approved in June and announced in greater detail on Friday is equally problematic.  In the past, schools would report bar passage as part of the Annual Questionnaire process. The bar passage section of the questionnaire asked schools to report first-time bar passage information. If a school was going through a site visit, it would also report this information on the Site Evaluation Questionnaire. If a school could not demonstrate compliance with Standard 316 with first-time bar passage, it was asked to show compliance using ultimate bar passage in the narrative section of the SEQ, specifically question 66, or as part of an interim monitoring or report-back process, described here (page 6).

Now, per the ABA, all schools—even those that can show that their graduates meet the minimums of Standard 316 through first-time passage data—must track, collect, and report ultimate bar passage information for two years after graduation. (There is a phase-in process, as outlined in the memo.) Hypothetically, assume that a school always has a first-time pass rate of 80% (for the sake of argument, with 100% of graduates reporting) in a state whose average pass rate is consistently 75%. The school is in compliance with Standard 316, but it must nevertheless track the 20% of graduates who did not pass on the first attempt to see whether they passed on subsequent attempts.

I have several problems with this change. As with the Employment Questionnaire issue, this change to the collection of bar passage data was made without notice and comment. While notice and comment is not required under the ABA rules for changes to questionnaires, a more open dialogue with schools would likely have highlighted the issues I raise below. Not all of us have the time to scour the ABA’s website for the agendas and minutes of the various entities involved with the accreditation process (Council, Accreditation Committee, Standards Review Committee). A change this significant should have been made with input from those of us—deans and vice/associate deans—who are on the front lines of ABA compliance.

From a substantive perspective, the change in data collection adds significant burdens without much benefit to the accreditation process. Tracking graduates two years out will not be easy, particularly for schools in states that do not release bar passage data to schools or the public. This is on top of the employment data that is collected every year, which is a significant undertaking if done correctly. Compliance with ABA Standards, state and federal Department of Education rules, state court rules (e.g., New York, which has a number of quasi-accreditation rules, such as its Skills Requirement and Pro Bono Requirement), and regional accreditors increasingly takes up much of the work of associate deans of law schools. Time spent on compliance and accreditation is time that could otherwise be spent managing our institutions, helping students, or teaching.

That said, if reporting is a means to achieve an important end, I’m all for it.  The disclosures related to graduate employment, for instance, are important to prospective students. Collecting such data takes time but serves valuable purposes in transparency. Much of the Standard 509 report is valuable to applicants when comparing schools, and I fully support the transparency that it promotes.

Here, though, requiring all schools to track ultimate bar passage serves little purpose. Most schools can satisfy the minimums of Standard 316 with first-time bar passage data. To comply with Standard 316 using first-time data, a fully approved school must demonstrate that, in three of the last five calendar years, its bar passage rate was no more than 15 points below the average first-time bar passage rate for graduates of ABA-approved law schools taking the bar exam in the same jurisdictions in the relevant years. This is a ridiculously easy standard for nearly all schools to meet. If I’m reading the ABA summary data on bar passage correctly, roughly 15-20 schools each year fall more than 15 points below the composite first-time rate and thus must use the ultimate bar passage calculations to demonstrate compliance. All others are in compliance under the first-time standard. Why, then, require all schools to go through the process of tracking down every graduate to see if he or she passed a bar exam within two years of graduation? How would that data serve the purposes of Standard 316? A better approach is to leave the status quo in place and require the more onerous ultimate bar passage data collection only of schools that cannot demonstrate compliance with the first-time standard.
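To make the arithmetic of the first-time test concrete, here is a minimal sketch with purely hypothetical numbers. (This is an illustration of the 15-point, three-of-five-years comparison described above, not the ABA's official methodology; the actual rule also involves weighting across jurisdictions and reporting details not modeled here.)

```python
# Sketch of the Standard 316 first-time test: a fully approved school
# complies if, in at least 3 of the last 5 calendar years, its first-time
# pass rate is no more than 15 points below the average first-time rate
# for graduates of ABA-approved schools in the same jurisdictions.
# All numbers below are hypothetical.

def meets_first_time_standard(school_rates, aba_average_rates,
                              max_gap=15.0, years_needed=3):
    """Rates are percentages for the last five calendar years, oldest first."""
    qualifying_years = sum(
        1 for school, average in zip(school_rates, aba_average_rates)
        if school >= average - max_gap
    )
    return qualifying_years >= years_needed

# The hypothetical school above: 80% every year in a state averaging 75%.
# It clears the 15-point threshold in all five years.
print(meets_first_time_standard([80, 80, 80, 80, 80], [75, 75, 75, 75, 75]))  # True

# A school more than 15 points below the average in most years does not.
print(meets_first_time_standard([55, 58, 60, 57, 56], [75, 75, 75, 75, 75]))  # False
```

As the sketch suggests, a school sitting comfortably within 15 points of the jurisdiction average satisfies the test every year, which is why tracking its non-passers for two more years adds work without changing the compliance outcome.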

On the other hand, there may be external benefits to collecting and reporting ultimate bar passage two years out. For schools that struggle with first-time passage, I suppose being able to report on ultimate passage will be helpful from a marketing perspective, but nothing in the existing system prevents them from doing so now. I worry about schools misrepresenting ultimate bar passage results—“Look how great we are! 99.5% of our grads pass the bar exam!”—with a tiny footnote explaining that this “great” result is based on reporting two years out. If I were a prospective student, first-time passage would be much more important in determining which school to attend. Because the bar exam is offered only twice a year, having to retake it two, three, or four times can be disastrous to one’s career.

There may also be value in collecting this data from a research perspective. With the proposed reforms to Standard 316 on hold for now, collecting ultimate bar passage data from all schools may help the ABA determine what the threshold should be. On the other hand, the ABA should develop standards based on what, in the Council’s professional judgment, schools should achieve, not on what they are achieving right now. Moreover, if the goal is to gather research for future amendments to the standards, the Council should be upfront that this is its goal, and it should consider a voluntary sampling of schools. Again, notice and comment would be helpful in this regard.

There is one aspect of the new Bar Passage Questionnaire that is a positive change. By de-coupling bar passage from the Annual Questionnaire, prospective students will have more timely information on this important outcome. Currently, the Annual Questionnaire, which is completed in October each year, asks for bar passage data from the previous calendar year. For example, had the process been left unchanged, this AQ season we would be reporting bar passage from calendar year 2016, even though most schools now have full 2017 data available.

I have great respect for the staff of the Managing Director’s Office. In these types of matters, they are the proverbial messenger, so I don’t fault them. I have three requests of the Council, however:

  1. The Council should give greater thought to the costs of data collection, particularly where it is unclear whether or how such data will translate into assessing compliance with the existing standards. The Council has done a terrific job of streamlining the data collected for site visits, but more work can be done on the AQ.
  2. The Council should withdraw its proposed implementation of the section of the new Bar Passage Questionnaire that asks all schools to report ultimate passage until these issues can be more fully aired.
  3. If significant changes to data questionnaires are proposed in the future, the Council should engage in a more open and collaborative process with law schools and the broader legal education community to gather feedback.

The Bar Exam and Assessment

Recently, the Standards Review Committee of the ABA announced that it will be proposing a change to ABA Standard 316, which governs the minimum bar exam passage rate that a law school must show in order to remain accredited. As has been reported elsewhere, the current rule provides law schools with a number of ways to demonstrate compliance. The proposed rule streamlines the path to compliance: 75% of a school’s graduates who take a bar exam must pass it within two years of graduation. First-time bar passage will still be reported on a school’s Standard 509 report.
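The proposed test reduces to a single ratio. A minimal sketch, with hypothetical graduate counts (how the ABA would handle edge cases such as a cohort with no bar takers is my assumption, not part of the proposal):

```python
# Sketch of the proposed streamlined Standard 316: at least 75% of a
# school's graduates who sat for a bar exam must pass within two years
# of graduation. Counts below are hypothetical.

def meets_proposed_standard(passed_within_two_years, took_bar, threshold=0.75):
    if took_bar == 0:
        return True  # no takers to measure (assumed treatment of this edge case)
    return passed_within_two_years / took_bar >= threshold

print(meets_proposed_standard(160, 200))  # 160/200 = 80% -> True
print(meets_proposed_standard(140, 200))  # 140/200 = 70% -> False
```

Note that graduates who never sit for a bar exam drop out of the denominator entirely, which is part of what makes the proposed rule simpler than the current multi-path standard.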

It is important to put this proposal into a broader context:

  • A declining applicant pool has led some schools to lower admissions standards.
  • Critics have argued that some law schools are admitting students who have little to no hope of passing the bar exam, at least not on the first attempt.
  • There has been public bickering between some deans and the National Conference of Bar Examiners about who is to blame for recent declines in bar passage nationwide, with the former blaming the test, and the latter blaming law schools for lowering admissions standards.

I argue that, along the way, there has been a silent rethinking of how we view the bar exam. Many schools take great pride in their bar passage rates, as they reflect well on their programs of legal education. In the past, a school’s bar exam passage rate was thought to be the measure of knowledge acquired in law school. But there are a few reasons why the bar exam does not fully assess the learning that goes on in law school: