Thought You Were Closing the Loop? Think Again.

“The end of assessment is action.”
- Walvoord (2010)

Everyone knows that assessment is all about the use of the data. There’s a reason that “closing the loop” is one of the most used and most loved phrases in the assessment professional’s handbook – it’s always relevant. The assessment cycle (whatever version of it you use) is here to stay – it’s always a loop, and that final step is what matters.

At least, that’s what you tell us. Our latest user research survey of institutional effectiveness professionals shows that 80 percent of you rate “meaningful use of data to inform improvement” as extremely important, and 84 percent of you said that “successfully closing the loop on use of results” is also extremely important. And of course, accreditors make it crystal clear that this is what’s expected of us.

Well, I have some bad news to share: Despite your best efforts, for four out of five of you, your loop is still broken.

How do we know that? Data, of course!

We looked at the assessment documentation processes from 86 different institutions of varying sizes and types across the United States. Each of these institutions mapped the fields in their assessment documentation templates to the Plan, Do, Check, Act framework we use in our assessment software:

[Figure: Plan, Do, Check, Act cycle]

Here’s what that might look like. Let’s say your institution has three different assessment documentation forms – one for Academic Assessment, one for Strategic Planning, and one for Student Life Assessment. Each of those forms has a set of fields that your staff and faculty fill out, and those fields correspond to a phase in the assessment process. Here’s how one form might map to the phases of the cycle:

Plan

  • Unit Goal
  • Related Strategic Priority
  • Success Criteria

Do

  • Assessment Method
  • Description of Assessment Process
  • Date of Data Collection

Check

  • Summary of Results
  • Was Success Criteria met?
  • Interpretation and Reflection
  • Recommendations for Action
  • Resources Needed

Act

  • Actions Taken
  • Evidence of Impact
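
A mapping like this lends itself to a simple tally. As a minimal sketch – using the hypothetical field names from the example form above, not any institution's real template – the field-to-phase mapping can be represented as a lookup, and the per-phase counts fall out directly:

```python
from collections import Counter

# Hypothetical mapping of one form's fields to PDCA phases,
# mirroring the example form above.
FIELD_PHASE = {
    "Unit Goal": "Plan",
    "Related Strategic Priority": "Plan",
    "Success Criteria": "Plan",
    "Assessment Method": "Do",
    "Description of Assessment Process": "Do",
    "Date of Data Collection": "Do",
    "Summary of Results": "Check",
    "Was Success Criteria met?": "Check",
    "Interpretation and Reflection": "Check",
    "Recommendations for Action": "Check",
    "Resources Needed": "Check",
    "Actions Taken": "Act",
    "Evidence of Impact": "Act",
}

# Tally how many fields fall in each phase of the cycle.
phase_counts = Counter(FIELD_PHASE.values())
print(phase_counts)  # Counter({'Check': 5, 'Plan': 3, 'Do': 3, 'Act': 2})
```

Repeating that tally across every form at every institution is, in essence, how the aggregate numbers below were produced.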

So when we add up all of the assessment processes and forms from those 86 institutions, we have a set of 204 assessment cycles made up of 2,397 fields. When we looked closer at this data, we learned a few things:

We really, really, really like to plan. Almost half of all the information we document about our assessments is mapped to the PLAN phase.

[Figure: Fields by phase]

By comparison, only 20 percent of the fields were mapped to the ACT stage. And when we looked at the data institution by institution, we found that over 50 percent of the institutions had absolutely no fields mapped to the Act phase at all. (That means these institutions have yet to design or implement an assessment process that even intends to capture evidence of actions taken.)
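
That institution-level check is straightforward to sketch. The data below is made up for illustration (institution names and field counts are invented), but it has the same shape as the analysis described above – a list of phase mappings per institution:

```python
# Hypothetical per-institution lists of the phase each template field maps to.
# Names and counts are invented for illustration only.
institutions = {
    "College A": ["Plan", "Plan", "Do", "Check", "Act"],
    "College B": ["Plan", "Plan", "Plan", "Do", "Check"],  # no Act fields
    "College C": ["Plan", "Do", "Do", "Check", "Check"],   # no Act fields
}

# Share of all fields, across institutions, mapped to the Act phase.
all_phases = [p for phases in institutions.values() for p in phases]
act_share = all_phases.count("Act") / len(all_phases)

# Institutions whose templates capture nothing at all in the Act phase.
no_act = [name for name, phases in institutions.items() if "Act" not in phases]

print(f"{act_share:.0%} of fields map to Act")
print(f"{len(no_act)} of {len(institutions)} institutions have no Act fields")
```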

[Figure: Institutions and assessment cycles mapped to each phase of the assessment cycle]

As you can see above in our Plan, Do, Check, Act model, the ACT phase requires more than just saying we’ll make improvements or changes – it requires that the actions were actually taken. (Recommendations for action are cataloged in the CHECK phase, when results are being analyzed and interpreted.) When we looked even closer at the field names and descriptions mapped to the ACT stage, we found something that brought our number down even further: many fields asked about planned actions and never confirmed what actually happened.
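
One rough way to surface such fields is a keyword check on the field names. This is purely illustrative – the keyword list and sample field names below are invented, and a real review would still need a human read of each field:

```python
# Invented heuristic: keywords hinting that a field asks about intentions
# rather than completed actions. Illustrative, not exhaustive.
PLANNED_HINTS = ("planned", "proposed", "intended", "will", "recommend")

def asks_about_planned_action(field_name: str) -> bool:
    """True if a field name suggests future intent, not a completed action."""
    name = field_name.lower()
    return any(hint in name for hint in PLANNED_HINTS)

act_fields = ["Actions Taken", "Planned Improvements", "Changes We Will Make"]
flagged = [f for f in act_fields if asks_about_planned_action(f)]
print(flagged)  # ['Planned Improvements', 'Changes We Will Make']
```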

[Figure: One in five institutions]

All in all, only one in five of the institutions were collecting evidence of actions taken based on data.

Which, at first, sounds very disheartening. After all, isn’t the point of all of this to drive improvement and action? Schoepp and Benson (2015) discuss this dilemma in their work – the gap between deciding on closing-the-loop actions and actually taking meaningful action. (They also found that it takes an average of 1.68 years to implement one of those actions!)

The truth is, we spend a lot of meaningful time and effort guiding our faculty and staff to do good assessment – to start with an outcome, to pick an appropriate method, to stop and consider the data. So we’ve emphasized those steps heavily in our reporting and documentation. And, by and large, it looks like that emphasis has improved the quality of our assessment efforts – which is certainly an accomplishment and something we should celebrate as a win.

But continuous improvement doesn’t magically come by virtue of having data. How can you ensure that your hard-fought assessment efforts close that loop in a meaningful way? The first step can be to stop and take a closer look at your assessment documents to see if you can make some simple changes in existing processes. Then you can continue the momentum of the cycle all the way through, leveraging the success you’ve had in driving quality assessment in the early phases.

To help with this, we examined the templates from those campus success stories and identified specific techniques for incorporating the crucial “Act” stage documentation into your assessment process.

So, are you committed to putting your #DataInAction and fixing your broken assessment loop?

Four Ways to Document Data in Action
Download Four Ways to Document Data in Action


Annemieke Rice

Vice President of Campus Strategy | Campus Labs

Annemieke’s passion for the Campus Labs mission is what’s been motivating her at this company for more than a decade. Her insightful approach and insider’s candor are drawn from her previous experience as an employee of a member campus. Before joining the Campus Labs team in 2008, Annemieke worked for her alma mater, Northeastern University, where she served first as an academic advisor and then as a Fellow to the Senior Vice President of Enrollment Management and Student Affairs. In addition to leading the division’s assessment efforts, she participated in divisional and institutional strategic planning and initiatives for student retention. She also served on the self-study teams for Northeastern’s NEASC accreditation, as well as AACSB accreditation for the university’s D’Amore-McKim School of Business.

Annemieke earned a Bachelor of Arts in Behavioral Neuroscience and Journalism from Lehigh University and a Master of Science in Applied Educational Psychology from Northeastern University. A charismatic speaker, she has presented at more than 100 national and regional forums and consulted with more than 250 higher education institutions.