Suskie: How to Assess Anything Without Killing Yourself … Really!

Linda Suskie (former VP, Middle States Commission on Higher Education) has posted a great list of common-sense tips about assessment on her blog. They’re based on a book by Douglas Hubbard, How to Measure Anything: Finding the Value of “Intangibles” in Business. My favorites are:

1. We are (or should be) assessing because we want to make better decisions than we would make without assessment results. If assessment results don’t help us make better decisions, they’re a waste of time and money.

4. Don’t try to assess everything. Focus on goals that you really need to assess and on assessments that may lead you to change what you’re doing. In other words, assessments that only confirm the status quo should go on a back burner. (I suggest assessing them every three years or so, just to make sure results aren’t slipping.)

5. Before starting a new assessment, ask how much you already know, how confident you are in what you know, and why you’re confident or not confident. Information you already have on hand, however imperfect, may be good enough. How much do you really need this new assessment?

8. If you know almost nothing, almost anything will tell you something. Don’t let anxiety about what could go wrong with assessment keep you from just starting to do some organized assessment.

9. Assessment results have both cost (in time as well as dollars) and value. Compare the two and make sure they’re in appropriate balance.

10. Aim for just enough results. You probably need less data than you think, and an adequate amount of new data is probably more accessible than you first thought. Compare the expected value of perfect assessment results (which are unattainable anyway), imperfect assessment results, and sample assessment results. Is the value of sample results good enough to give you confidence in making decisions?

14. Assessment value is perishable. How quickly it perishes depends on how quickly our students, our curricula, and the needs of our students, employers, and region are changing.

15. Something we don’t ask often enough is whether a learning experience was worth the time students, faculty, and staff invested in it. Do students learn enough from a particular assignment or co-curricular experience to make it worth the time they spent on it? Do students learn enough from writing papers that take us 20 hours to grade to make our grading time worthwhile?


Assessment and Strategic Planning

Over at PrawfsBlawg, my friend Jennifer Bard, dean of Cincinnati Law School, has a post on “Learning Outcomes as the New Strategic Planning.” She points readers to Professors Shaw and VanZandt’s book, Student Learning Outcomes and Law School Assessment. The book is, to be sure, an excellent resource, although parts of it may be too advanced for schools that are just getting started with assessment.  Still, it’s a great book, one that sits on the corner of my desk and is consulted often.  (Dean Bard also gave a nice shoutout to my blog as a resource.)

Citing an article by Hanover Research, Dean Bard draws a key distinction between strategic planning activities of yesteryear and what’s required under the new ABA standards.

Traditionally, law school strategic plans focused on outcomes other than whether students were learning what schools had determined their students should be learning. These often included things like faculty scholarly production, diversity, student career placement, fundraising, and admissions inputs. Former ABA Standard 203 required a strategic planning process (albeit not a strategic plan per se) to improve all of the goals of a school:

In addition to the self study described in Standard 202, a law school shall demonstrate that it regularly identifies specific goals for improving the law school’s program, identifies means to achieve the established goals, assesses its success in realizing the established goals and periodically re-examines and appropriately revises its established goals.

The old standard used the term “assessment” in a broad sense, not just as to student learning. In contrast, new Standard 315 focuses on assessment of learning outcomes to improve the curriculum:

The dean and the faculty of a law school shall conduct ongoing evaluation of the law school’s program of legal education, learning outcomes, and assessment methods; and shall use the results of this evaluation to determine the degree of student attainment of competency in the learning outcomes and to make appropriate changes to improve the curriculum.

This is the “closing the loop” of the assessment process: using the results of programmatic outcomes assessment to improve student learning.

So, what to do with the “old” way of strategic planning? Certainly, a school should still engage in a strategic planning process that focuses on all of the important outcomes and goals of the school, of which assessment of student learning is just one piece. Paraphrasing a common expression, if you don’t measure it, it doesn’t get done. Indeed, one can interpret Standards 201 and 202 as still requiring a planning process of some kind, particularly to guide resource allocation.

Still, much of the way that some schools engage in strategic planning is wasteful and ineffective. Often, the planning cycle takes years and results in a beautiful, glossy brochure (complete with photos of happy students and faculty) that sits on the shelf. I’m much more a fan of quick-and-dirty strategic planning that involves efficiently setting goals and action items that can be accomplished over a relatively short time horizon. What matters is not the product (the glossy brochure) but the process: one that is nimble, updated often, used to guide the allocation of resources, and relied on as a self-accountability tool. (Here, I have to confess, my views on this have evolved since serving on the Strategic Priorities Review Team of our University. I now see much more value in the type of efficient planning I have described.)

In this respect, strategic planning and learning outcomes assessment should share an emphasis on process, not product. Some of the assessment reports generated by schools as a result of regional accreditation are truly works of art, but what is being done with the information? That, to me, is the ultimate question of the value of both processes.

What is the point of curriculum mapping?

Curriculum mapping is the process of identifying where in a school’s curriculum each of its learning outcomes is being taught and assessed. We recently posted our curriculum maps on our assessment webpage, including the survey instrument we used to collect data from faculty.

Curriculum mapping was a big discussion item at an assessment conference in Boston last spring, and understandably so. But, to be clear, curriculum mapping is not itself assessment. It is, rather, a tool to assist with the programmatic assessment process.  It also furthers curricular reform.

Mapping is not assessment in the programmatic sense because even the best of curriculum maps will not show whether, in fact, students are learning what we want them to learn. Curriculum mapping helps with assessment because it enables an assessment committee to identify where in the curriculum to look for particular evidence (“artifacts” in the lingo) of student learning.

It also helps with curricular reform in two ways:

  • by enabling a faculty to plug holes in the curriculum.  If an outcome has been identified as desirable but it is not being taught to all or most students, a new degree requirement could be created. Our school did this with negotiation. We had identified it as a valuable skill but realized, through a curriculum mapping exercise done several years ago, that it was not being taught to a sufficient number of students. We then created a 1L course specifically on negotiation and other interpersonal skills.
  • by restructuring degree requirements so that smarter sequencing occurs.  In theory, advanced instruction should build upon introductions.  A curriculum map will help show the building blocks for particular outcomes: from introduction to competency to advanced instruction.

Overall, I hope that schools put serious thought into curriculum mapping, while also recognizing that it is not the end of assessment … but instead the beginning.

Checklist for Getting Started with Assessment

I’m at a conference, Responding to the New ABA Standards: Best Practices in Outcomes Assessment, being put on by Boston University and the Institute for Law Teaching and Learning.  The conference is terrific, and I’ll have a number of posts based on what I’ve learned  today.

It strikes me that law schools are at varying stages of assessment.  Some schools—particularly those who have been dealing directly with regional accreditors—are fairly well along.  

But other schools are just getting started.  For those schools, I recommend keeping it simple and taking this step-by-step approach:

  1. Ask the dean to appoint an assessment committee, composed of faculty who have a particular interest in teaching and learning.
  2. Start keeping detailed records and notes of what follows.  Consider a shared collaboration space like OneDrive or Dropbox.  
  3. As a committee, develop a set of 5-10 proposed learning outcomes for the JD degree, using those in Standard 302 as a starting point.  (Alternatively, if you wish to start getting broader buy-in, ask another committee, such as a curriculum committee, to undertake this task.)  If your school has a particular mission or focus, make sure it is incorporated in one or more of the outcomes.
  4. Bring the learning outcomes to the full faculty for a vote.
  5. Map the curriculum.  Send a survey to faculty, asking them to identify which of the institutional outcomes are taught in their courses.  If you want to go further, survey faculty on the depth of teaching/learning (introduction, practice, mastery).  Compile a chart with the courses on the Y axis and the learning outcomes on the X axis, and check off the boxes indicating the courses in which each outcome is taught.  (Remember that the map shows only where outcomes are taught; the point of assessment is to determine whether students are actually learning them.)
  6. Identify one of the outcomes to assess and how you’ll do so: who will measure it, which assessment tools they’ll use, and what will be done with the results.
  7. Put your learning outcomes on your school’s website.
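
For those who like to see the mechanics, the chart in step 5 can be sketched in a few lines of code. This is a minimal illustration only; the course names, outcomes, and depth labels below are invented for the example and do not come from any actual curriculum map:

```python
# Hypothetical survey responses: course -> {outcome: depth of coverage}.
# All names and labels here are made up for illustration.
responses = {
    "Contracts":          {"Legal Analysis": "introduction", "Written Communication": "practice"},
    "Negotiation":        {"Interpersonal Skills": "introduction"},
    "Appellate Advocacy": {"Legal Analysis": "mastery", "Written Communication": "mastery"},
}

outcomes = ["Legal Analysis", "Written Communication", "Interpersonal Skills"]

# Build the map: rows are courses (Y axis), columns are outcomes (X axis).
header = ["Course"] + outcomes
rows = []
for course, coverage in responses.items():
    rows.append([course] + [coverage.get(o, "-") for o in outcomes])

# Print the grid.
for row in [header] + rows:
    print("  ".join(f"{cell:<22}" for cell in row))

# Flag outcomes taught in fewer than two courses -- candidates for a new
# degree requirement (the "plugging holes" use of the map).
for o in outcomes:
    taught_in = [c for c, cov in responses.items() if o in cov]
    if len(taught_in) < 2:
        print(f"Gap: '{o}' taught only in {taught_in}")
```

Even a simple spreadsheet does the same job; the point is that once the survey data is in a grid, gaps in coverage (and sequencing problems) become easy to spot.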

All of this can probably be done in 1-2 years.  It essentially completes the “design phase” of the assessment process.  Separately, I’ll post some ideas about what not to do in the early stages …