Friday is trash day at my house. Every Thursday evening, I break down the recycling, fill the cans, and roll them down to the end of my driveway. It’s easy because someone gave me the cans, put wheels on them, and promised to stop by every Friday morning and take it all away. But I don’t put the trash out just because someone made the process easy for me. I do it because I want a clean home. Easy is nice, but it’s not the same as useful or meaningful; knowing my trash is properly managed, on the other hand, is meaningful. The same holds true for assessment work. I know it’s a flawed metaphor (I certainly don’t equate assessment with trash), but I do feel meaningful assessment is more valuable than easy assessment.
In meeting after meeting with college administrators, I hear a common refrain about outcomes assessment: “We want this to be as easy as possible for faculty.” While I agree this is important, I would prefer to concentrate on whether the process is also meaningful. Easy is great, but it’s a natural side effect of any well-designed software that might be used to support a thoughtful process. Statements like “we can’t inconvenience faculty” can be interpreted as a tacit admission that you think assessment has little value but is necessary for accreditation purposes. That’s not a great leadership message to send, and it provides fodder for arguments against learning outcomes, like the one recently published in Inside Higher Ed. Thank goodness we have professionals like Linda Suskie who know better. I say, make the work meaningful and you won’t have to fret about ease.
Unfortunately, it’s common for campuses to see assessment as a necessary evil and lose sight of the benefits. Assessment work is inherent in our jobs and holds a tremendous amount of value in the process, the resulting data, and the conversations that occur before, during, and after. Higher education hasn’t yet reached a point where meaningful and useful assessment is typical. It’s noble to want assessment to be as easy as possible for those who will be executing the process, but it’s wise to ensure they get something out of it.
Students, faculty, staff, and administrators can all benefit from their assessment roles. Students should be willing and active participants as well as thoughtful consumers. Administrators are responsible for using the data their colleagues provide and being transparent about it. If there’s no evidence of learning, or if the evidence shows learning isn’t taking place, action should be taken. I am by no means suggesting punitive action. If data suggests courses aren’t accessible to students, make a change. If an instructor is struggling, help them out. If you see standout levels of success, use that resource to educate others! Highlighting concrete examples of how data can make things better goes a long way toward increasing participation from faculty and staff.
Faculty should also be willing participants and thoughtful consumers of information. It’s laudable to enter a profession for the scholarship and the desire to leave a mark on the discipline. It’s also true that hundreds of students are sitting in class, every day, ready to learn. They deserve a fair opportunity to do so. I can sadly recount the time a faculty member said to me, “I’ve never really considered it my responsibility that students learn.” With that attitude, no data in the world would make this person a thoughtful educator. So let’s instead focus on those who value information and insight. Let’s support them in their work and ensure they aren’t lost in a sea of overly standardized processes, where the information disappears into a meaningless black hole. Data tucked away in binders on a shelf or a desk does not lend itself to meaningful assessment. Meaningful assessment happens when data connected across an educational ecosystem is readily available to those who can benefit from it.
Throughout my years in the classroom, I was completely dedicated to teaching and was successful most of the time. I’m not ashamed to admit I did mess up occasionally, and while my mistakes may have been innocent, they were still mistakes. Knowing something was off and wanting to do a good job was an uncomfortable situation to be in. It was only when I was given access to information that allowed me to fix what was wrong that I found relief.
Gaining access to course-level learning assessment, for example, provided me with valuable insight to improve my effectiveness. Since well-written learning outcomes provide a clear agreement for what a course is about, how it fits into a program, and why students are participating, thoughtful assessment offers data that can reveal whether or not students are learning. In my case, we collected evidence of learning for a particular fundamental outcome through ePortfolios. Evidence of learning was measured using a collaboratively created rubric. Data showed students were struggling with this outcome in every course. Through faculty discussion, we realized we were all explaining the concept in different ways. We agreed to streamline our vocabulary when talking about this concept. And once we did, students started making connections. They learned! All it took was some data to identify the issue and a simple collegial conversation to create a solution.
Learning data helped me make changes to my approach in the classroom and have formative conversations with students. It also spurred meaningful collaboration with my fellow faculty. Assessment tools can provide meaningful data and spare us from tedious solitary work. What these tools can’t do is open the eyes and minds of those who misinterpret or undervalue assessment. It still takes human commitment to make the technology work. Administrators have to provide useful resources and clear expectations, and faculty have to stop fighting against their own best interests.
Too often, I’ve seen assessment plans thoughtfully created and earnestly executed with no clear, useful end product. It doesn’t matter how easy it is; the work has to have purpose for both the campus as a whole and the individuals doing the assessment. Institutions of higher education can (and should) glean the most value from learning data. Let’s stop wincing every time we say assessment and instead recognize how easy it is to make meaningful, positive change. Once we know exactly what and how our students are learning, we can truly begin to innovate.
Shannon LaCount, Ed.D.
Dr. Shannon LaCount is Vice President, Campus Strategy. Her career in higher education before joining Campus Labs includes eight years of teaching experience as a clinical and classroom professor in Communication Sciences and Disorders and five years as Director of Student Learning Assessment at the University of Minnesota Duluth (UMD), where she led a campus-wide assessment process for academic departments and student life programs. She has also participated in advising events and undergraduate research at UMD, as well as consultations and professional development events as a Teagle Assessment Scholar with the Wabash College Center of Inquiry. She has a master’s degree in speech-language pathology from the University at Buffalo and a doctorate in education from the University of Minnesota.