Preparing for the NextGen Bar Exam: Questions to Consider

Last week, the National Conference of Bar Examiners released a preliminary set of content outlines for the NextGen Bar Exam. If the timeline holds, the new exam will be implemented in 2026. While that seems a long way off, it is not: part-time students who enroll this Fall will take the new exam. As a result, many schools are now considering how best to adapt to a bar exam that (1) tests fewer doctrinal subjects and (2) tests skills more heavily.

Here, I list the questions that I am thinking about both generally and for my school:

  1. What are our school’s goals besides bar passage? It may seem bizarre to start with this question in a blog post about the bar exam, but I think it is the most important. Law schools are not three-year bar preparation programs. We have other goals for our students and ourselves. Career placement, for instance, is an important outcome for law schools. Preparing students for careers in the legal profession requires a curriculum that may not align with one designed specifically for the bar exam. Entry-level employers may wish to see students who have knowledge, skills, and values that are not tested on a bar exam. In addition, lifelong success in the legal profession—however we define it—may warrant preparation that does not align with the bar exam. Knowing at the outset what our other goals are will help us balance what may be competing priorities for a limited number of credits in the curriculum.
  2. What are our ethical and consumer protection obligations to students? Professor Melissa Shultz (Mitchell Hamline) makes a compelling case in a forthcoming piece in the Journal of Legal Education that law schools have an obligation to prepare students for the new bar exam by taking action now so that students who will sit for that exam will be fully prepared for it. “These monumental changes to the bar exam,” Professor Shultz writes, “do not allow for the legal academy to take a tempered ‘wait-and-see’ approach before taking action.” Her article shares a number of helpful strategies for doing so.
  3. To what extent does our existing curriculum align with the Uniform Bar Exam (UBE)? If a school has high bar passage and its curriculum is not particularly aligned with the UBE subjects, it may be that only minor modifications are needed. These are likely schools that enroll students who are excellent test takers and will do well with any format of exam. At most schools, however, the curriculum is somewhat or significantly aligned with the UBE content outline. As a result, those schools will need to do a more significant re-alignment of their curricula to meet the new bar exam while still achieving other curricular goals (#1).
  4. To what extent does the NextGen bar exam differ from the UBE in the doctrinal subjects that are tested? We know that some subjects, such as Secured Transactions, are being dropped from the exam. Which others will be dropped? Within each subject that remains, how is coverage changing? What is added or dropped? We must all become experts on what the new bar exam is and is not so that we can speak and act thoughtfully on the subject.
  5. Where is the doctrine tested on the NextGen bar exam taught in our existing curriculum? Are there any curricular gaps? Since the NextGen exam largely removes doctrine rather than adding it, I imagine the answer at most schools is that there are no gaps in substantive and procedural law.
  6. Which skills will be tested (or tested more heavily) on the NextGen bar exam? This is the most significant change to the bar exam in my view—the heavy testing of lawyering skills.
  7. Where are those skills taught in the existing curriculum, if at all? Are they taught in-depth or just in a cursory fashion? How are they assessed? A curriculum map may be helpful, as are focus groups and surveys of faculty who teach what may currently be specialized electives with low enrollments. 
  8. How do we adapt the curriculum? This is the most significant question and requires consideration of: a school’s other goals (#1), its ethical and consumer protection obligations (#2), the extent to which a school focuses on bar passage (#3), the gaps in doctrine (#5) and skills (#7) between the current curriculum and the new bar exam, and whether there are gaps in faculty expertise that may require a new approach to hiring (#10).
  9. How should our teaching methods and assessments adapt to prepare students for the format of the new bar exam? The new bar exam will employ different assessment tools than we are used to. To what extent should we expose students to them while they are in law school?
  10. How does the NextGen bar exam impact faculty hiring? The number, type, and subject matter expertise of new faculty may need to be reconsidered if a major curriculum realignment is expected.  
  11. What timeline should we follow? Has the NCBE stayed on track with its timeline, suggesting that a 2026 implementation is likely? If so, what steps do we have to take, and when, to meet our ethical and consumer protection obligations (#2) and prepare our students for this new exam while still ensuring that current students taking the UBE in the interim are well-prepared for that exam?
  12. What should be the process of educating faculty, administrators, and students about the new bar exam and getting buy-in from constituent groups about the new exam? Education and buy-in are two separate considerations.
  13. Will our state’s supreme court adopt the NextGen bar exam? Just as law schools are thinking about the bar exam, state supreme courts are considering whether they will sign on to the NCBE’s new test or go in a different direction. If most of a law school’s graduates will sit for the bar exam in a state that does not adopt the NextGen bar exam, many of these considerations are moot. Even so, the school will still need to think about how it is preparing students for the bar exam that they will take. In addition, a school will need to consider the students who will sit for a NextGen bar exam out-of-state.
  14. Is the NextGen bar exam such a significant shift that it warrants rethinking our admissions criteria? ABA Standard 501(b) requires that law schools admit only students who appear capable of completing the school’s program of legal education and being admitted to the bar. If the bar exam changes, the predictors of success may change as well. Should we put different weight on the LSAT/GRE, UGPA, work experience, and references than we do now? Unfortunately, we will not have data on whether our existing admissions framework remains predictive until after the first few cohorts sit for the new bar exam.

There may be other questions a school should consider, and I will keep adding to the list as I think of them.

Cross-Cultural Competency as a Learning Outcome

The ABA received a flood of comments regarding a series of proposed changes to the Standards. The proposals involve professional identity formation and, broadly, the topic of diversity and inclusion. I added my two cents, suggesting areas where the proposals got it right and where they could use improvement. I noted an overarching concern about the ABA Standards being used to advance particular views of legal education that are better left to individual schools to decide whether to adopt. As I wrote, I can understand fully why schools may wish to specialize or distinguish themselves through professional identity formation or, for that matter, law and economics, public interest law, or international perspectives. That does not mean those views should be imposed on the other roughly 200 schools governed by the Standards as a matter of accreditation.

In any event, one of the proposals would require all law schools to provide training in “bias, cross-cultural competency, and racism” at two points in a student’s education: once at the beginning and at least once before graduation. If a student takes a clinic or externship, the second training must take place before (or concurrently with) enrollment in the clinic or externship. This proposal would add the requirement to Standard 303, the same Standard that requires (a) three broad course sequences (professional responsibility, legal writing, and the 6-credit experiential requirement) and (b) substantial opportunities for clinics, externships, and pro bono work.

In my comment, I suggested a different approach: adding cross-cultural competency to Standard 302.


Collecting Ultimate Bar Passage Data: Weighing the Costs and Benefits

The bar exam is an important outcome measure of whether our graduates are learning the basic competencies expected of new lawyers. As the ABA Managing Director reminded us in his memo of June 2015, however, it can no longer be the principal measure of student learning. We are therefore directed to look for other evidence of learning in our programs of legal education; hence the new focus on programmatic assessment.

Nevertheless, the ABA has wisely retained a minimum bar passage requirement in Standard 316, described in greater detail here. It is an important metric for prospective students. It is also an indicator of the quality of a school’s admissions standards and, indirectly, its academic program. Indeed, it has been the subject of much debate recently. A proposal would have simplified the rule by requiring law schools to demonstrate that 75% of their graduates had passed a bar exam within two years of graduation. For a variety of reasons, the Council of the Section of Legal Education and Admissions to the Bar recently decided to postpone moving forward with this change and leave Standard 316 as written.

With that background, Friday afternoon the ABA Associate Deans’ listserv received a message from William Adams, Deputy Managing Director of the ABA.  In it, he described a new process for collecting data on bar passage. A copy of the memo is on the ABA website. This change was authorized at the June 2017 meeting of the Council.  Readers may remember that the June meeting was the one that led to a major dust-up in legal education, when it was later revealed that the Council had voted to make substantial (and some would say, detrimental) changes to the Employment Questionnaire. When this came to light through the work of Jerry Organ and others, the ABA wisely backed off this proposed change and indicated it would further study the issue.

The change that the ABA approved in June and announced in greater detail on Friday is equally problematic.

Database on Law Schools’ Learning Outcomes

The Holloran Center at St. Thomas Law School (MN)—run by Jerry Organ and Neil Hamilton—has created a database of law schools’ efforts to adopt learning outcomes.  The center plans to update the database quarterly.  

One of the very helpful aspects of the database is that it has coding so that a user can filter by school and by learning outcomes that go above and beyond the ABA minimum.  This will be a terrific resource as schools roll out the new ABA standards on learning outcomes.  In addition, for those of us interested in assessment as an area of scholarship, it is a treasure trove of data.

Frustratingly, it looks like many schools have decided not to go beyond the minimum competencies set forth in ABA Standard 302, what the Holloran Center has categorized as a “basic” set of outcomes. The ABA’s list is far from exhaustive. Schools that have essentially copied-and-pasted from Standard 302 have missed an opportunity to make their learning outcomes uniquely their own by incorporating aspects of their mission that distinguish them from other schools. Worse, it may be a sign that some schools are being dragged into the world of assessment kicking and screaming. On the other hand, it may indicate a lack of training or a belief that the ABA’s minimums fully encapsulate the core learning outcomes that every student should attain. Only time will tell. As schools actually begin to assess their learning outcomes, we’ll have a better idea of how seriously law schools are taking assessment.

Assessment and Strategic Planning

Over at PrawfsBlawg, my friend Jennifer Bard, dean of Cincinnati Law School, has a post on “Learning Outcomes as the New Strategic Planning.” She points readers to Professors Shaw and VanZandt’s book, Student Learning Outcomes and Law School Assessment. The book is, to be sure, an excellent resource, although parts of it may be too advanced for schools that are just getting started with assessment.  Still, it’s a great book, one that sits on the corner of my desk and is consulted often.  (Dean Bard also gave a nice shoutout to my blog as a resource.)

Citing an article by Hanover Research, Dean Bard draws a key distinction between strategic planning activities of yesteryear and what’s required under the new ABA standards.

Traditionally, law school strategic plans focused on outcomes other than whether students were learning what the school had determined they should learn. These often included things like faculty scholarly production, diversity, student career placement, fundraising, and admissions inputs. Former ABA Standard 203 required a strategic planning process (albeit not a strategic plan per se) to improve all of the goals of a school:

In addition to the self study described in Standard 202, a law school shall demonstrate that it regularly identifies specific goals for improving the law school’s program, identifies means to achieve the established goals, assesses its success in realizing the established goals and periodically re-examines and appropriately revises its established goals.

The old standard used the term “assessment” in a broad sense, not just as to student learning. In contrast, new Standard 315 focuses on assessment of learning outcomes to improve the curriculum:

The dean and the faculty of a law school shall conduct ongoing evaluation of the law school’s program of legal education, learning outcomes, and assessment methods; and shall use the results of this evaluation to determine the degree of student attainment of competency in the learning outcomes and to make appropriate changes to improve the curriculum.

This is the “closing the loop” of the assessment process: using the results of programmatic outcomes assessment to improve student learning.

So, what to do with the “old” way of strategic planning? Certainly, a school should still engage in a  strategic planning process that focuses on all of the important outcomes and goals of the school, of which assessment of student learning is just one piece. Paraphrasing a common expression, if you don’t measure it, it doesn’t get done. Indeed, one can interpret Standards 201 and 202 as still requiring a planning process of some kind, particularly to guide resource allocation.

Still, much of the way that some schools engage in strategic planning is wasteful and ineffective. Often, the planning cycle takes years and results in a beautiful, glossy brochure (complete with photos of happy students and faculty) that sits on the shelf. I’m much more a fan of quick-and-dirty strategic planning that involves efficiently setting goals and action items that can be accomplished over a relatively short time horizon. What matters is not the product (the glossy brochure) but having a process that is nimble, updated often, used to guide the allocation of resources, and that serves as a self-accountability tool. (Here, I have to confess, my views on this have evolved since serving on the Strategic Priorities Review Team of our University. I now see much more value in the type of efficient planning I have described.)

In this respect, strategic planning and learning outcomes assessment should both have in common an emphasis on process, not product. Some of the assessment reports generated by schools as a result of regional accreditation are truly works of art, but what is being done with the information? That, to me, is the ultimate question of the value of both processes.

Cultural Competency as a Learning Outcome in Legal Writing

Eunice Park (Western State) has a short piece on SSRN, featured in the SSRN Legal Writing eJournal and published in the AALS Teaching Methods Newsletter, about assessing cultural competency in a legal writing appellate advocacy exercise. Cultural competency is listed in Interpretation 302-1 as an example of a “professional skill” that would satisfy Standard 302’s requirement that a school’s learning outcomes include “[o]ther professional skills needed for competent and ethical participation as a member of the legal profession.”

Professor Park writes:

Legal writing courses provide an ideal setting for raising awareness of the importance of sensitivity to diverse cultural mores. One way is by creating an assignment that demonstrates how viewing determinative facts from a strictly Western lens might lead to an unfair outcome.

In writing a recent appellate brief problem, I introduced cultural competence as a learning outcome by integrating culturally-sensitive legally significant facts into the assignment.

She goes on to describe the appellate brief problem and how it helped meet the goal of enhancing students’ cultural competency.

Publishing Learning Objectives in Course Syllabi

The new ABA standards are largely focused on programmatic assessment: measuring whether students, in fact, have learned the knowledge, skills, and values that we want them to achieve by the end of the J.D. degree. This requires a faculty to gather and analyze aggregated data across the curriculum. Nevertheless, the ABA standards also implicate individual courses and the faculty who teach them.

According to the ABA Managing Director’s guidance memo on learning outcomes assessment, “Learning outcomes for individual courses must be published in the course syllabi.”

The Bar Exam and Assessment

Recently, the Standards Review Committee of the ABA announced that it will be proposing a change to ABA Standard 316, which governs the minimum bar exam passage rate that a law school must show in order to remain accredited. As has been reported elsewhere, the current rule provides law schools with a number of ways to demonstrate compliance.  The proposed rule streamlines the path to compliance.  Under the proposal, 75% of a school’s graduates who took a bar exam must pass within two years of graduation.  First-time bar passage will still be reported on a school’s Standard 509 report.

It is important to put this proposal into a broader context:

  • A declining applicant pool has led some schools to lower admissions standards.
  • Critics have argued that some law schools are admitting students who have little to no hope of passing the bar exam, at least not on the first attempt.
  • There has been public bickering between some deans and the National Conference of Bar Examiners about who is to blame for recent declines in bar passage nationwide, with the former blaming the test, and the latter blaming law schools for lowering admissions standards.

I argue that, along the way, there has been a silent rethinking of how we view the bar exam. Many schools take great pride in their bar passage rates, as they reflect well on their programs of legal education. In the past, a school’s bar exam passage rate was thought to be the measure of knowledge acquired in law school. But there are a few reasons why the bar exam does not fully assess the learning that goes on in law school.

Links to ABA resources added

Under the Resources tab, I added a page with links to ABA resources, including the full text of the new ABA standards on outcomes and assessment, the Managing Director’s memo on the subject, and the “legislative history” behind Standards 301, 302, 314, and 315.

Why a Blog on Assessment in Legal Education?

When the American Bar Association first began discussing revision of its accreditation standards for the J.D. degree to include a full-blown assessment requirement, I was skeptical. I saw “assessment” as more higher ed-speak with no benefit to students. “We’re already assessing students – we give final exams, writing assignments, and projects, and we track bar passage and career outcomes, right?” Later, as I learned more about assessment—including the differences between course-level and programmatic assessment—I came to the conclusion that, stripped of its at-times burdensome lingo, it was a simple process with a worthy goal: improving student learning through data-driven analysis. The process, I learned, was rooted in a scholarly approach to learning: define outcomes, measure and analyze direct and indirect evidence of student learning, and then use the information learned to improve teaching and learning.

Legal education is one of the last disciplines to adopt an assessment philosophy. Looking at assessment reports from programs, such as pharmacy, that have used assessment for years can be daunting; those fields have come a long way in a relatively short period of time. There is a dearth of information about assessment in legal education, and hence this blog was born. My goal is to bring together resources on law school assessment in one place while also offering my observations and practical insights to help keep assessment from drowning in lingo and endless report writing. I hope readers find it valuable.