Collecting Ultimate Bar Passage Data: Weighing the Costs and Benefits

The bar exam is an important outcome measure of whether our graduates are learning the basic competencies expected of new lawyers. As the ABA Managing Director reminded us in his memo of June 2015, however, it can no longer be the principal measure of student learning. We are thus directed to look for other evidence of learning in our programs of legal education, which explains the new focus on programmatic assessment.

Nevertheless, the ABA has wisely retained a minimal bar passage requirement in Standard 316, described in greater detail here. It is an important metric for prospective students. It is also an indicator of the quality of a school’s admission standards and, indirectly, its academic program. Indeed, it has been the subject of much debate recently. A proposal would have simplified the rule by requiring law schools to demonstrate that 75% of their graduates had passed a bar exam within two years of graduation. For a variety of reasons, the Council of the Section of Legal Education and Admissions to the Bar recently decided to postpone moving forward with this change and leave Standard 316 as written.

With that background in mind, on Friday afternoon the ABA Associate Deans’ listserv received a message from William Adams, Deputy Managing Director of the ABA. In it, he described a new process for collecting data on bar passage. A copy of the memo is on the ABA website. This change was authorized at the June 2017 meeting of the Council. Readers may remember that the June meeting was the one that led to a major dust-up in legal education, when it was later revealed that the Council had voted to make substantial (and, some would say, detrimental) changes to the Employment Questionnaire. When this came to light through the work of Jerry Organ and others, the ABA wisely backed off the proposed change and indicated it would study the issue further.

The change that the ABA approved in June and announced in greater detail on Friday is equally problematic. In the past, schools reported bar passage as part of the Annual Questionnaire process. The bar passage section of the questionnaire asked schools to report first-time bar passage information. If a school was going through a site visit, it would also report this information on the Site Evaluation Questionnaire. If a school could not demonstrate compliance with Standard 316 using first-time bar passage data, it was asked to show compliance using ultimate bar passage in the narrative section of the SEQ (specifically question 66) or as part of an interim monitoring or report-back process, described here (page 6).

Now, per the ABA, all schools—even those that can show that their graduates meet the minimums of Standard 316 through first-time passage data—must track, collect, and report ultimate bar passage information for graduates two years out. (There is a phase-in process, as outlined in the memo.) Hypothetically, assume that a school always has a pass rate of 80% (for the sake of argument, with 100% of graduates reporting) in a state with a consistent average of 75%. The school is in compliance with Standard 316, but it must nevertheless track the 20% of graduates who did not pass on the first attempt to see whether they passed on subsequent attempts.

I have several problems with this change. As with the Employment Questionnaire issue, this change to the collection of bar passage data was made without notice and comment. While notice and comment is not required under the ABA rules for changes to questionnaires, a more open dialogue with schools would likely have highlighted the issues I raise below. Not all of us have the time to scour the ABA’s website for the agendas and minutes of the various entities involved in the accreditation process (Council, Accreditation Committee, Standards Review Committee). A change this significant should have been made with input from those of us—deans and vice/associate deans—who are on the front lines of ABA compliance.

From a substantive perspective, the new data collection adds significant burdens without much benefit to the accreditation process. Tracking graduates two years out will not be easy, particularly for schools in states that do not release bar passage data to schools or the public. This is on top of the employment data that is collected every year, which is a significant undertaking if done correctly. Compliance with ABA Standards, state and federal Department of Education rules, state court rules (e.g., New York’s quasi-accreditation rules, such as its Skills Requirement and Pro Bono Requirement), and the requirements of regional accreditors increasingly takes up much of the work of associate deans of law schools. Time spent on compliance and accreditation is time that could otherwise be spent managing our institutions, helping students, or teaching.

That said, if reporting is a means to achieve an important end, I’m all for it. The disclosures related to graduate employment, for instance, are important to prospective students. Collecting such data takes time but serves a valuable transparency purpose. Much of the Standard 509 report is useful to applicants when comparing schools, and I fully support the transparency it promotes.

Here, though, requiring all schools to track ultimate bar passage serves little purpose. Most schools can satisfy the minimums of Standard 316 with first-time bar passage data. To comply with Standard 316 using first-time data, a fully approved school must demonstrate that, for three of the last five calendar years, its bar passage rate was no more than 15 points below the first-time bar passage rate for graduates of ABA-approved law schools taking the bar exam in the same jurisdictions in the relevant years. This is a ridiculously easy standard for nearly all schools to meet. If I’m reading the ABA summary data on bar passage correctly, roughly 15 to 20 schools each year fall more than 15 points below the relevant first-time average and thus must use the ultimate bar passage calculations to demonstrate compliance. All others are in compliance under the first-time standard. Why, then, require all schools to go through the process of tracking down every graduate to see if he or she passed a bar exam within two years of graduation? How would that data serve the purposes of Standard 316? A better approach is to leave the status quo in place and require the more onerous ultimate bar passage data collection only from schools that cannot demonstrate compliance with the first-time standard.
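To make the first-time calculation concrete, here is a minimal sketch in Python of the compliance check described above. The data layout, the taker-weighted averaging, and the function name are illustrative assumptions of mine, not the ABA’s official methodology; the authoritative calculation is the one set out in Standard 316 itself.

    # Minimal sketch of the Standard 316 "first-time" compliance check described
    # above. Data layout and taker-weighted averaging are illustrative assumptions.
    def first_time_compliant(school_years, threshold=15.0, years_needed=3):
        """school_years: one dict per calendar year, mapping each jurisdiction to a
        (school_passers, school_takers, aba_first_time_rate) tuple."""
        years_within = 0
        for year in school_years:
            takers = sum(t for _, t, _ in year.values())
            if takers == 0:
                continue
            # School's composite first-time pass rate across its jurisdictions.
            school_rate = 100.0 * sum(p for p, _, _ in year.values()) / takers
            # ABA-approved-school rate in the same jurisdictions, weighted by the
            # school's takers in each jurisdiction (my assumption, for illustration).
            aba_rate = sum(t * r for _, t, r in year.values()) / takers
            if school_rate >= aba_rate - threshold:
                years_within += 1
        return years_within >= years_needed

    # Hypothetical school from above: 80% pass rate in a state averaging 75%.
    years = [{"State X": (160, 200, 75.0)} for _ in range(5)]
    print(first_time_compliant(years))  # True: within 15 points in every year

On these hypothetical numbers, the school clears the 15-point threshold every year, which is why, under the prior practice, it never needed to assemble ultimate passage data at all.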

On the other hand, there may be external benefits to collecting and reporting ultimate bar passage two years out. For schools that struggle with first-time passage, I suppose being able to report on ultimate passage will be helpful from a marketing perspective, but nothing in the existing system prevents them from doing so now. I worry about schools misrepresenting ultimate bar passage results (“Look how great we are! 99.5% of our grads pass the bar exam!”), with a tiny footnote explaining that this “great” result is based on reporting two years out. If I were a prospective student, first-time passage would be much more important in determining which school to attend. With the bar exam offered only twice a year, having to retake it two, three, or four times can be disastrous to one’s career.

There may also be value in collecting this data from a research perspective. With the proposed reforms to Standard 316 on hold for now, collecting ultimate bar passage data from all schools may help the ABA determine what the threshold should be. On the other hand, the ABA should develop standards based on what, in the Council’s professional judgment, schools should achieve, not on what they are achieving right now. Moreover, if the goal is to gather data for future amendments to the standards, the Council should be upfront that this is its goal, and it should consider using a voluntary sample. Again, notice and comment would be helpful in this regard.

There is one aspect of the new Bar Passage Questionnaire that is a positive change. De-coupling bar passage from the Annual Questionnaire will give prospective students more timely information on this important outcome. Currently, the Annual Questionnaire, which is completed in October each year, asks for bar passage data from the previous calendar year. For example, had the process been left unchanged, this AQ season we would have been reporting bar passage from calendar year 2016, even though most schools now have full 2017 data available.

I have great respect for the staff of the Managing Director’s Office. In matters like these, they are the proverbial messengers, so I don’t fault them. I have three requests of the Council, however:

  1. First, the Council should give greater thought to the costs of data collection, particularly where it is unclear whether or how such data will be used to assess compliance with the existing standards. The Council has done a terrific job of streamlining the data collected for site visits, but more work can be done on the AQ.
  2. Second, until these issues can be more fully aired, the Council should withdraw the section of the new Bar Passage Questionnaire that asks all schools to report ultimate passage.
  3. Finally, if significant changes are proposed to data questionnaires in the future, the Council should engage in a more open and collaborative process with the law schools and the broader legal education community to get feedback.

Database on Law Schools’ Learning Outcomes

The Holloran Center at St. Thomas Law School (MN)—run by Jerry Organ and Neil Hamilton—has created a database of law schools’ efforts to adopt learning outcomes.  The center plans to update the database quarterly.  

One of the most helpful aspects of the database is that it is coded so that a user can filter by school and by learning outcomes that go above and beyond the ABA minimum. This will be a terrific resource as schools roll out the new ABA standards on learning outcomes. In addition, for those of us interested in assessment as an area of scholarship, it is a treasure trove of data.

Frustratingly, it looks like many schools have decided not to go beyond the minimum competencies set forth in ABA Standard 302, what the Holloran Center has categorized as a “basic” set of outcomes. The ABA’s list is far from exhaustive. Schools that have essentially copied and pasted from Standard 302 have missed an opportunity to make their learning outcomes uniquely their own by incorporating aspects of their mission that distinguish them from other schools. Worse, it may be a sign that some schools are being dragged into the world of assessment kicking and screaming. On the other hand, it may indicate a lack of training or a belief that the ABA’s minimums fully encapsulate the core learning outcomes that every student should attain. Only time will tell. As schools actually begin to assess their learning outcomes, we will have a better idea of how seriously law schools are taking assessment.

Assessment and Strategic Planning

Over at PrawfsBlawg, my friend Jennifer Bard, dean of Cincinnati Law School, has a post on “Learning Outcomes as the New Strategic Planning.” She points readers to Professors Shaw and VanZandt’s book, Student Learning Outcomes and Law School Assessment. The book is, to be sure, an excellent resource, although parts of it may be too advanced for schools that are just getting started with assessment. Still, it is a book that sits on the corner of my desk and is consulted often. (Dean Bard also gave a nice shoutout to my blog as a resource.)

Citing an article by Hanover Research, Dean Bard draws a key distinction between strategic planning activities of yesteryear and what’s required under the new ABA standards.

Traditionally, law school strategic plans focused on outcomes other than whether students were learning what schools had determined they should be learning. These often included things like faculty scholarly production, diversity, student career placement, fundraising, and admissions inputs. Former ABA Standard 203 required a strategic planning process (albeit not a strategic plan per se) aimed at improving the school across all of its goals:

In addition to the self study described in Standard 202, a law school shall demonstrate that it regularly identifies specific goals for improving the law school’s program, identifies means to achieve the established goals, assesses its success in realizing the established goals and periodically re-examines and appropriately revises its established goals.

The old standard used the term “assessment” in a broad sense, not just with respect to student learning. In contrast, new Standard 315 focuses on assessment of learning outcomes to improve the curriculum:

The dean and the faculty of a law school shall conduct ongoing evaluation of the law school’s program of legal education, learning outcomes, and assessment methods; and shall use the results of this evaluation to determine the degree of student attainment of competency in the learning outcomes and to make appropriate changes to improve the curriculum.

This is the “closing the loop” step of the assessment process: using the results of programmatic outcomes assessment to improve student learning.

So, what to do with the “old” way of strategic planning? Certainly, a school should still engage in a strategic planning process that focuses on all of the important outcomes and goals of the school, of which assessment of student learning is just one piece. To paraphrase a common expression, if you don’t measure it, it doesn’t get done. Indeed, one can interpret Standards 201 and 202 as still requiring a planning process of some kind, particularly to guide resource allocation.

Still, much of the way that some schools engage in strategic planning is wasteful and ineffective. Often, the planning cycle takes years and results in a beautiful, glossy brochure (complete with photos of happy students and faculty) that sits on the shelf. I’m much more a fan of quick-and-dirty strategic planning that efficiently sets goals and action items that can be accomplished over a relatively short time horizon. What matters is not the product (the glossy brochure) but having a process that is nimble, is updated often, guides the allocation of resources, and serves as a self-accountability tool. (Here, I have to confess, my views have evolved since serving on the Strategic Priorities Review Team of our University. I now see much more value in the type of efficient planning I have described.)

In this respect, strategic planning and learning outcomes assessment should share an emphasis on process, not product. Some of the assessment reports generated by schools for regional accreditation are truly works of art, but what is being done with the information? That, to me, is the ultimate test of the value of both processes.

Cultural Competency as a Learning Outcome in Legal Writing

Eunice Park (Western State) has a short piece on SSRN, featured in the SSRN Legal Writing eJournal and published in the AALS Teaching Methods Newsletter, about assessing cultural competency in a legal writing appellate advocacy exercise. Cultural competency is listed in Interpretation 302-1 as an example of a “professional skill” that would satisfy Standard 302’s requirement that a school’s learning outcomes include “[o]ther professional skills needed for competent and ethical participation as a member of the legal profession.”

Professor Park writes:

Legal writing courses provide an ideal setting for raising awareness of the importance of sensitivity to diverse cultural mores. One way is by creating an assignment that demonstrates how viewing determinative facts from a strictly Western lens might lead to an unfair outcome.

In writing a recent appellate brief problem, I introduced cultural competence as a learning outcome by integrating culturally-sensitive legally significant facts into the assignment.

She goes on to describe the appellate brief problem and how it helped meet the goal of enhancing students’ cultural competency.

Publishing Learning Objectives in Course Syllabi

The new ABA standards are largely focused on programmatic assessment: measuring whether students have, in fact, acquired the knowledge, skills, and values that we want them to attain by the end of the J.D. program. This requires a faculty to gather and analyze aggregated data across the curriculum. Nevertheless, the ABA standards also implicate individual courses and the faculty who teach them.

According to the ABA Managing Director’s guidance memo on learning outcomes assessment, “Learning outcomes for individual courses must be published in the course syllabi.”

The Bar Exam and Assessment

Recently, the Standards Review Committee of the ABA announced that it will propose a change to ABA Standard 316, which governs the minimum bar exam passage rate that a law school must demonstrate in order to remain accredited. As has been reported elsewhere, the current rule provides law schools with a number of ways to demonstrate compliance. The proposed rule streamlines the path to compliance: 75% of a school’s graduates who took a bar exam must pass within two years of graduation. First-time bar passage will still be reported on a school’s Standard 509 report.
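In arithmetic terms, the proposed test reduces to a single ratio. Here is a minimal sketch in Python of that calculation; the field names and the function are illustrative assumptions of mine, not the ABA’s reporting categories, and the proposal’s actual text governs which graduates count.

    # Minimal sketch of the proposed ultimate passage test: at least 75% of the
    # graduates who sat for a bar exam pass within two years of graduation.
    # Field names are illustrative assumptions, not official reporting categories.
    def ultimate_passage_compliant(graduates, minimum_rate=75.0):
        takers = [g for g in graduates if g["took_bar"]]
        if not takers:
            return None  # no graduates sat for a bar exam; the test would not apply
        passed = sum(1 for g in takers if g["passed_within_two_years"])
        return 100.0 * passed / len(takers) >= minimum_rate

    # Hypothetical class of 100 graduates: 95 took a bar exam, 78 passed within
    # two years of graduation.
    grads = ([{"took_bar": True, "passed_within_two_years": True}] * 78 +
             [{"took_bar": True, "passed_within_two_years": False}] * 17 +
             [{"took_bar": False, "passed_within_two_years": False}] * 5)
    print(ultimate_passage_compliant(grads))  # True: 78/95 is roughly 82%

Note that, under the proposal as described, graduates who never sit for a bar exam drop out of the denominator entirely, which is why the hypothetical class above passes comfortably even though only 78 of 100 graduates ultimately passed.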

It is important to put this proposal into a broader context:

  • A declining applicant pool has led some schools to lower admissions standards.
  • Critics have argued that some law schools are admitting students who have little to no hope of passing the bar exam, at least not on the first attempt.
  • There has been public bickering between some deans and the National Conference of Bar Examiners about who is to blame for recent declines in bar passage nationwide, with the former blaming the test, and the latter blaming law schools for lowering admissions standards.

I argue that, along the way, there has been a silent rethinking of how we view the bar exam. Many schools take great pride in their bar passage rates, as they reflect well on their programs of legal education. In the past, a school’s bar exam passage rate was thought to be the measure of knowledge acquired in law school. But there are a few reasons why the bar exam does not fully assess the learning that goes on in law school.

Links to ABA Resources Added

Under the Resources tab, I added a page with links to ABA resources, including the full text of the new ABA standards on outcomes and assessment, the Managing Director’s memo on the subject, and the “legislative history” behind Standards 301, 302, 314, and 315.

Why a Blog on Assessment in Legal Education?

When the American Bar Association first began discussing revision of its accreditation standards for the J.D. degree to include a full-blown assessment requirement, I was skeptical. I saw “assessment” as more higher-ed speak with no benefit to students. “We’re already assessing students – we give final exams, writing assignments, and projects, and we track bar passage and career outcomes, right?” Later, as I learned more about assessment—including the differences between course-level and programmatic assessment—I came to the conclusion that, stripped of its at-times burdensome lingo, it was a simple process with a worthy goal: improving student learning through data-driven analysis. The process, I learned, was rooted in a scholarly approach to learning: define outcomes, measure and analyze direct and indirect evidence of student learning, and then use that information to improve teaching and learning.

Legal education is one of the last disciplines to adopt an assessment philosophy. Looking at assessment reports from programs, such as pharmacy, that have used assessment for years can be daunting; those disciplines have come a long way in a relatively short period of time. There is a dearth of information about assessment in legal education, and hence this blog was born. My goal is to bring together resources on law school assessment in one place while also offering observations and practical insights that help keep assessment from drowning in lingo and endless report writing. I hope readers find it valuable.