Creating an Engaged and Happy Faculty Through Course Evaluations

Mention course and instructor satisfaction surveys to faculty members, and you might elicit a frown, an eye-roll, or an outright dismissal of these seasonal feedback exercises as ineffective. But if you’re in search of a different response, look at what MGH Institute of Health Professions accomplished in just one year. A team of disengaged, frustrated faculty tackled evaluations head-on and made changes that produced measurable results.

A Year in the Making…

MGH’s graduate faculty were struggling to make sense of their course and instructor satisfaction data. That struggle led them to discount the very data they were trying to interpret, rendering the surveys useless.

Enter a determined Director of Institutional Effectiveness and an improvement team of faculty volunteers, armed with an engagement plan built on a modified Delphi method. The task force’s purpose was to build consensus and produce a new survey instrument, all within a year. Seem insurmountable? Not with the tactics used.

Through a year of change, MGH faculty answered questionnaires focused on the survey instrument. Delivered in rounds, these questionnaires reported how the project was progressing and which stages needed faculty input, and participants had explicit permission to change their opinions between rounds. Changes to the proposed new survey instrument were then made based on each prior round. Each questionnaire session was followed by a summary of all participants’ expert judgments to keep everyone engaged and in the know.

After the preliminary rounds, additional sessions of faculty-reviewed questionnaires produced consensus, or at least stability in the results, closing the loop on each round. These rounds took place at every faculty meeting throughout the year, and the resulting changes were presented back to faculty at the following meeting.

Survey Says!

Because of MGH’s proactive approach, analysis uncovered “inconsistencies and data that couldn’t be used based on how the questions were worded.” For example, when a question had multiple parts, the data was unusable because students didn’t know which part they were answering. Faculty members were making changes to courses for the new semester based on misleading feedback, and then receiving worse results. Another common belief was that only “mean students” answered the surveys.

The modified Delphi method played a pivotal role in MGH’s quest to re-engage faculty and move them to a place of curiosity about the data rather than dismissal of it. The year-long journey produced major improvements and refinements to the questions, along with shorter versions of both the course and instructor evaluation surveys. Now students respond with confidence to clear, specific questions, and faculty are empowered to control and change the questions based on their course and program needs.

Not only were the survey instruments altered for the greater good, but the overall approach to taking the surveys evolved. Students now have time in class to complete the online forms. Even better, they know the process and the questions weeks ahead of time, so there aren’t any surprises. Faculty have scripts for presenting the evaluation process to their classes. They’ve even established a mid-course “mini check-in” that models how faculty might use feedback from students. This practice enables faculty to respond to student feedback in real time and make mid-course corrections if needed. These corrections are often a clarification of students’ expectations for the course, which, if left unspoken, would have registered as dissatisfaction on the course and instructor evaluation.

Yes, you can have it all…

The impact of using a modified Delphi method has been tremendous. Here’s a snapshot of MGH’s results:

  • Response rates increased from 34% to 73%
  • Overall instructor mean scores increased over prior years for 62% of instructors
  • 96% of faculty teaching in-person classes provided in-class time
  • Actual completion time dropped from 30–40 minutes to 15, an improvement of roughly 55%
  • The total number of questions was cut by 50%, while narrative comments increased by 60%
  • 77% of faculty are satisfied with the new process, instrument, implementation, and response rates

Paulette Di Angi, PhD, Director of Institutional Effectiveness at MGH said it best, “Making the questions very clear, assisting the faculty in ‘owning’ both the process and the results can only improve the value of the data and possibly improve the quality of teaching by affording faculty data they can reflect on and build improvements upon.”



Brian Hopewell

VP Higher Education Sales | Campus Labs

Following twenty-five years in higher education administration, primarily in senior enrollment management positions, Brian pivoted to higher education technology sales and business development. He has not led the typical nomadic existence of ed-tech sales leaders: he has chosen to stay with one company, Academic Management Systems (AMS), now Campus Labs, for over a dozen years, enjoying several management roles and participating, on the receiving end, in four mergers and acquisitions.