Moving to Best Practices in Online Learning

By now, all law schools are—or are about to be—fully online. My sense is that professors are mostly teaching synchronously (i.e., live) using Zoom, Webex, or some other platform. In this respect, faculty are trying their best to replicate the in-person law school classroom; in some cases, this even includes cold calling on students to recite cases. Class sizes remain as they were before the Coronavirus hit. That means that some professors are teaching in this format to 60, 80, or even 100 students.

To experts in online teaching and learning, this is not how any of this is supposed to work. Successfully moving a class online requires time, effort, and training, as this helpful post describes. And it typically requires smaller class sizes than we’re used to in legal education, although experts acknowledge that there is no magic number.

A common misconception about moving a course to an online format is that the professor merely delivers the same content in the same manner as a face-to-face classroom … just on video. Not so. Professors new to online learning often view their teaching—incorrectly—through the “lens” of face-to-face classes. In actuality, adapting a course for online learning requires starting anew with course objectives and working backwards from those outcomes the professor hopes to achieve. The steps in students’ development become “modules” (typically on a week-by-week schedule) that are thoughtfully designed to achieve smaller outcomes, like guideposts on a trail. The course is not a march through a casebook.

Each module uses the learning tools that are best designed to achieve that module’s objectives. A module may include a synchronous discussion (as Nina Kohn [Syracuse] persuasively notes) or not. Asynchronous activities, which can be completed at different times in the week, can also be effective. Pre-recorded video lectures (of no more than 15-20 minutes, since attention spans wander after that point), quizzes, discussion posts, writing assignments, readings, and creative projects (such as students creating their own narrated PowerPoints on a subject) are all arrows in the professor’s asynchronous instructional quiver. Many of these activities are themselves formative or summative assessments of students’ progress in meeting the outcomes of the modules and, in turn, the course. The key is to make deliberate, strategic choices about what mode of delivery (synchronous or asynchronous) works best for a particular module or course. Often the best courses will include a blend of the two approaches.

So let us be clear: what we’re doing right now is not best practices; it is a band-aid. It is an emergency response that allows us to continue instruction in some format so that our students do not lose the entire semester.

We have a responsibility, however, to do better in the weeks ahead, not just to satisfy accreditors but as part of our obligation to provide a quality education to our students. We should embrace a culture of continuous quality improvement. This is especially true if the current crisis continues for a significant period and requires law schools to remain fully online into the Fall semester.

The Short Term

There are things we can do in the short term to improve teaching and learning this semester. In the coming weeks, we should:

  • Encourage faculty who are just recording and posting lectures to move to synchronous delivery, so that students can ask questions in real time and otherwise participate actively.
  • Encourage faculty who are already meeting synchronously to adopt active, rather than passive, learning activities in their virtual classrooms. Most videoconference platforms have polling or quizzing features, for instance, which can engage all students, not just those participating in a discussion. Get out of the mindset of looking at your online course through a face-to-face lens.
  • Encourage small-group interactions to spur discussion and participation. Both Zoom and Webex have “breakout” rooms, and I have become a big fan of using them for think-pair-share exercises. Like polling, breakout sessions turn students into active, rather than passive, participants.
  • Encourage faculty to supplement synchronous sessions with optional asynchronous exercises, such as quizzes or discussion boards. I recommend keeping these optional at this point, since we do not want to overwhelm students, who are also adapting to new circumstances.
  • Make ourselves reasonably available for “office hours” through drop-in Zoom or Webex sessions. Not only does this foster personal connection during this time of isolation, but these meetings also give students an opportunity to clarify points they may have missed during pre-recorded lectures or live videoconferences.
  • Help each other. Having all been thrown into the deep end of the pool at the same time, we are all learning how to swim, some better than others. We can help each other stay afloat. Faculty should share tips, suggestions, and solutions to common problems with one another. As an example, several of us at my school have been struggling to play videos through our videoconference platform because of “lag” issues. Last night, an adjunct professor came up with an ingenious but low-tech solution: he pointed his laptop’s webcam at a secondary monitor and played the video on that monitor at full volume. It wasn’t perfect, but it allowed students to hear and see the video with decent quality.

I recognize that this is a difficult time for many law professors. Some may themselves be dealing with illness or adjusting to disruption at home. Some of my colleagues now have two full-time jobs: teaching and taking care of school-aged children. That is why, in my list above, I use words like “try,” “encourage,” and “reasonably available,” not “must” or “require.” The suggestions I have offered must be adapted to each individual faculty member’s circumstances and capabilities at this time.

Nevertheless, to the extent we are able, we should all be thinking about how to get better at teaching in this new format over the next few weeks. For some, that may mean exploring the many asynchronous tools at our disposal. For others, it may mean facilitating more discussion and less lecture. Some of the suggestions listed above require no additional time on the professor’s part. If we take these steps, the teaching taking place at the end of April should be better than what is being offered now.

The Long Term

While we are all hopeful that things will get back to normal soon, law schools must nevertheless plan for the possibility that online teaching will need to continue into the Fall (depending, of course, on how the virus progresses). What we can get away with doing now on an emergency basis is not going to be acceptable to either our accreditors or our students by the Fall. They can, and should, expect a higher quality of instruction from us given the lead time we have until the next semester. Here, time is on our side, but only if we begin planning now. The key, in my view, is training. Faculty need training on best practices for online teaching and learning. The Quality Matters Standards and Rubric are a great place to start in terms of where we need to be (eventually) with online learning. The summer months provide an opportunity for such training and development.

Of course, someone has to develop and deliver such training. Law schools that are attached to universities can take advantage of courses that likely already exist through their universities’ centers for teaching and learning. Typically those include courses on Blackboard/Canvas, online pedagogy, and similar topics. Standalone law schools should consider working together to develop such courses. Training law faculty, both full-time and adjunct, on a massive scale to deliver “best practices” online teaching will require a great deal of planning to prioritize and roll out development opportunities in a smart way. That planning should take place now, even if the medium-term future is uncertain. I’ll likely have more to say, in a future blog post, on what that planning should look like.

Something else to consider for the Fall is section size. In the academic research, 20 students is the recommended outer limit for class size, and that number is for undergraduate courses; graduate courses typically have smaller enrollments. Class size affects a professor’s ability to provide interactive experiences and feedback. The more interactive a course, the better the learning experience for students, and faculty can successfully provide an interactive experience to only a limited number of students. This may require reconsidering the class schedule, such as cutting back on electives so that additional, smaller sections of required or core courses can be offered while a law school is fully online.

Needless to say, these are difficult times for everyone. Adopting a continuous improvement mindset, though, will benefit all of our stakeholders, especially our students.

Assessment in a Time of Coronavirus and Closed Campuses

Today is March 15, 2020, and, by now, most law schools have either announced a transition to fully online teaching or set a date when they will begin doing so.  Although many schools have said that this situation is temporary and will last for no more than a few weeks, my personal prediction is that most schools will not resume face-to-face teaching this semester.  This post invites faculty and administrators to think now about the consequences for assessment during this challenging time.

Although my usual interest is in programmatic assessment, here I am writing specifically about course-based assessment.  On the one hand, the next six weeks or so may be an opportunity for faculty to provide more formative assessments to students, such as low-stakes quizzes, essays, and discussion posts. Such activities are a way to keep students engaged with the material.

However, there is a looming assessment issue that will require attention sooner rather than later: how to administer the typical end-of-semester summative assessments, such as final exams and, for skills classes, final activities. The questions that a law school must answer are several and complex.

Suskie: Course vs. Program Learning Goals

Linda Suskie has a new blog post up about the difference between course and program learning goals.  She begins by cutting through some of the jargon and vocabulary to summarize learning goals as:

Learning goals (or whatever you want to call them) describe what students will be able to do as a result of successful completion of a learning experience, be it a course, program or some other learning experience. So course learning goals describe what students will be able to do upon passing the course, and program learning goals describe what students will be able to do upon successfully completing the (degree or certificate) program.

I encourage readers to check out the full post from Ms. Suskie.

Dean Vikram Amar on Constructing Exams

Dean Vikram Amar (Illinois) has an excellent post on Above the Law about exam writing.  He offers four thoughts based on his experience as a professor, associate dean, and dean.  First, Dean Amar talks about the benefits of interim assessments:

Regardless of how much weight I attach to midterm performance in the final course grade, and even if I use question types that are faster to grade than traditional issue spotting/analyzing questions — e.g., short answer, multiple-choice questions, modified true/false questions in which I tell students that particular statements are false but ask them to explain in a few sentences precisely how so — the feedback I get, and the feedback the students get, is invaluable.

Second, Dean Amar articulates an argument in favor of closed-book exams.

Quick Resources on Self-Assessment

A colleague and I were just chatting about time-efficient ways to incorporate more assessment activities into our writing courses, and we began talking about the value of self-assessment in the writing process.  Here are some quick resources on the subject:

Publishing Learning Objectives in Course Syllabi

With the Fall semester about a month away (eek!), many faculty are turning their attention to refreshing their courses and preparing their syllabi. This is an opportune time to repost my thoughts on course-level student learning outcomes, which the ABA requires us to publish to our students. Much ink has been spilled on what verbs are proper to use in our learning outcomes; as I noted in August 2016, I hope that we in legal education can take a more holistic view.

Law School Assessment

The new ABA standards are largely focused on programmatic assessment: measuring whether students, in fact, have learned the knowledge, skills, and values that we want them to achieve by the end of the J.D. degree. This requires a faculty to gather and analyze aggregated data across the curriculum. Nevertheless, the ABA standards also implicate individual courses and the faculty who teach them.

According to the ABA Managing Director’s guidance memo on learning outcomes assessment, “Learning outcomes for individual courses must be published in the course syllabi.” 


NLJ: Feedback on Feedback

Karen Sloan of the National Law Journal reports on a symposium issue of the University of Detroit Mercy Law Review about formative assessment.  She compares two studies that seem to reach different conclusions on the subject.

First up is an article by a group of law professors at Ohio State, led by Ruth Colker, who conducted a study offering a voluntary practice test to students in Constitutional Law.  Those who opted for the voluntary test and mock grade did better on the final exam.  Those students also did better in their other subjects than non-participants.

The second article is by David Siegel of New England.  He examined whether individualized outreach to low-performing students would benefit their end-of-semester grades.  In his study, he sent e-mails to students in his course who scored low on quizzes and followed up with one-on-one meetings.  His control group consisted of students who scored slightly higher on the quizzes but did not receive any individualized feedback or have one-on-one meetings.  He found no statistically significant difference between the final grades of the two groups.

From this, Ms. Sloan concludes:

There’s enough research out there on the benefits of formative assessments to put stock in the conclusion the Ohio State professors reached, that more feedback on tests and performance helps. But I think Siegel’s study tells us that the manner and context of how that feedback is delivered makes a difference. It’s one thing to have a general conversation with low performing students. But issuing a grade on a practice exam—even if it doesn’t count toward their final grade—I suspect is a real wake-up call to students that they may need to step up and make some changes.

I agree 100% with Ms. Sloan’s takeaway.  One additional point: the two studies are really measuring two different things. Professor Colker’s was about formative assessment, while Professor Siegel’s was about the efficacy of early alerts. After all, all students in his class took the quiz and got the results. I also note that Professor Siegel’s “control group” wasn’t really one, since its members received higher grades on the first quiz, albeit only slightly higher ones. It may be that this group benefited simply from taking the quiz and seeing their scores.  An interesting way to re-run the study would be to do as Professor Colker and her colleagues did at Ohio State: invite students from all grade ranges to receive the extra feedback.  Of course, there is still the problem of cause-and-effect versus correlation.  It may be that the students in Professor Colker’s study were simply more motivated, and that motivation, not feedback, is the true driver of the improvement in grades.  Nevertheless, these are two important studies and additions to the conversation about assessment in legal education. (LC)


Minnesota Study: Formative Assessment in One First-Year Class Leads to Higher Grades in Other Classes

Over at TaxProf, Dean Caron reports on a University of Minnesota study finding that students randomly assigned to 1L sections in which one class included individualized, formative assessment performed better in their other courses than students in sections that did not.  Daniel Schwarcz and Dion Farganis authored the study, which appears in the Journal of Legal Education.

From the overview section of the study:

The natural experiment arises from the assignment of first-year law students to one of several “sections,” each of which is taught by a common slate of professors. A random subset of these professors provides students with individualized feedback other than their final grades. Meanwhile, students in two different sections are occasionally grouped together in a “double-section” first-year class. We find that in these double-section classes, students in sections that have previously or concurrently had a professor who provides individualized feedback consistently outperform students in sections that have not received any such feedback. The effect is both statistically significant and hardly trivial in magnitude, approaching about one-third of a grade increment after controlling for students’ LSAT scores, undergraduate GPA, gender, race, and country of birth. This effect corresponds to a 3.7-point increase in students’ LSAT scores in our model. Additionally, the positive impact of feedback is stronger among students whose combined LSAT score and undergraduate GPA fall below the median at the University of Minnesota Law School.

What’s particularly interesting is how this study came about. Minnesota’s use of “double sections” created a natural control group to compare students who previously had formative assessment with those who did not.
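For readers who want to see concretely what “controlling for” those variables means, here is a minimal sketch of the kind of regression the overview describes, run on synthetic data. To be clear, this is my own illustration: the column names, the coding of feedback exposure, and the simulated data are all assumptions, and the authors’ actual specification is surely more sophisticated than this.

    # Illustrative sketch only: the regression design described above, on
    # synthetic data. Column names and values are my assumptions, not the
    # actual Minnesota dataset.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 400  # hypothetical number of students in double-section classes

    df = pd.DataFrame({
        # 1 if the student's section previously or concurrently had a
        # professor who gave individualized feedback, else 0
        "feedback": rng.integers(0, 2, n),
        "lsat": rng.normal(160, 5, n),    # entering LSAT score
        "ugpa": rng.normal(3.4, 0.3, n),  # undergraduate GPA
        "gender": rng.choice(["F", "M"], n),
        "race": rng.choice(["A", "B", "C"], n),
        "us_born": rng.integers(0, 2, n),
    })

    # Synthetic outcome: a grade with a built-in ~0.33 bump for the
    # feedback group, echoing the one-third-of-a-grade-increment effect
    # size the study reports.
    df["grade"] = (2.7 + 0.33 * df["feedback"]
                   + 0.05 * (df["lsat"] - 160)
                   + 0.40 * (df["ugpa"] - 3.4)
                   + rng.normal(0, 0.5, n)).clip(0.0, 4.3)

    # OLS with controls: the coefficient on "feedback" estimates the
    # effect of feedback exposure holding the other covariates constant.
    model = smf.ols("grade ~ feedback + lsat + ugpa + C(gender)"
                    " + C(race) + us_born", data=df).fit()
    print(model.params["feedback"], model.pvalues["feedback"])

On data like this, the estimated “feedback” coefficient recovers roughly the 0.33 bump that was built in; in the real study, the analogous coefficient, estimated on actual student records, is the one-third-of-a-grade-increment effect the authors report.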

The results should come as no surprise. Intuitively, students who practice a new skill and get feedback on it should outperform students who do not. This study advances the literature by providing empirical evidence for this point in a law school context. The study is also significant because it shows that individualized, formative assessment in one class can benefit those students in their other classes.

There are policy implications from this study. Should associate deans assign professors who practice formative assessment evenly across 1L sections so that all students benefit? Should all classes be required to have individualized, formative assessments? What resources are needed to promote greater use of formative assessments—smaller sections and teaching assistants, for example?

What Would a Small, Assessment-Rich Core Course Look Like?

I just finished slogging through 85 final exams in my Evidence course, and it got me thinking about how I would teach the course if it were offered in a small format of, say, 20 students. Evidence at our school is a “core” course, one of five classes from which students must take at least four (the others are Administrative Law, Business Organizations, Tax, and Trusts and Estates). Naturally, therefore, it draws a big enrollment. I love teaching big classes because the discussions are much richer, but the format hampers my ability to give formative assessments. This semester, I experimented with giving out-of-class, multiple-choice quizzes after each unit. They served several purposes: they gave students practice with the material, and they allowed me to see students’ strengths and weaknesses. I was able to backtrack and go over concepts that students had particular difficulty mastering.

But having read 255 individual essays (85 exams times three essays each), I’m left convinced that students would benefit from additional feedback on their essay writing. In lieu of a final exam, I’d love to give students a series of writing assignments throughout the semester. They could even take the form of practice documents, like motions. But to be effective, this change requires a small class. So that got me thinking: how would I change my teaching if my Evidence course had 20 students instead of 85?