When we think of traditional on-campus partnerships, a coordinated effort involving student services, enrollment management, and institutional research may not immediately come to mind. Yet, when aligned, the concepts of student success and institutional effectiveness provide a clearer view of how campuses can make the best use of all their resources to anticipate, identify, and fulfill the needs of students.
Tell me if this sounds familiar
Before joining Campus Labs, I served for many years as a senior administrator at a private, four-year liberal arts college. During my time on campus, I learned quickly that suggesting we limit funding to certain student success programs or support services was much like touching the third rail on the subway. The first time I broached this, it’s true our retention rates had increased for the first time in a decade, and our graduation rates were higher than they had been in seven years. But we had approximately 40 line items and no system for confirming which of our numerous initiatives were truly working. In addition, those who owned the initiatives claimed that by eliminating their particular piece of the student success puzzle, we’d be risking drastic reductions in retention and graduation rates. Without a holistic data set to help me determine the relative contribution of every program and policy, I couldn’t even begin to demonstrate otherwise. So each year we continued funding multiple—sometimes duplicative—efforts. We did this because we were motivated by an understandable mix of good intentions, blind faith, and let’s face it, fear of change.
Making change less daunting
While institutional decision-making should ideally be driven by a student success strategy, the scenario I just described suggests there is much room for improvement on many campuses. For one thing, today’s higher education marketplace elicits a degree of fear that’s perfectly understandable. Students vote with their feet—and their tuition dollars. Campuses want to ensure students are as well-positioned as possible to succeed, but the reality is, they don’t have a limitless supply of funding for every pet program. Only by bringing student success and institutional effectiveness together can colleges begin to understand what’s happening at their campus on a deeper level, including hidden aspects of the student journey.
Viewing student success as being at the heart of institutional effectiveness brings data and researchers to the forefront of decision-making on campus—and to the forefront of institutional progress. IE professionals can do so much more than compile IPEDS reports, train faculty on how to provide assessment data, or craft a regional accreditation certification narrative.
Asking different questions
The institutional researcher—the administrator behind all the metrics—can help campuses more closely focus on student success efforts by bringing new questions to the surface. The data needed to answer the most important questions, however, should be neither abstract nor theoretical. It should be practical and personal, because the insights gleaned from the information could have an immediate impact on the everyday lives of students.
Imagine, for instance, that your campus writing center has a check-in system. Tracking visits could enable your staff to see which students are using the center and when. Knowing that over a period of one week the center logged 613 total visits from 432 unique students tells only part of the story. To increase the potential for deeper insights, you should review and connect more of your available data.
Now suppose you continue tracking and connecting the different data points throughout an academic term or academic year. Armed with even more information, including final grades in writing-intensive courses, you could explore whether regular visits to the writing center correlate with achievement in specific outcomes. Which students aren’t visiting the center, including those who would benefit the most? The ability to answer a question like this using good data opens up the possibility for more questions—and greater strides toward meaningful improvement.
Progress is within reach
So what can administrative leaders do to begin asking—and answering—targeted questions? How can they ensure their focus on student success is fueling institutional effectiveness?
Step One: Strive for a Culture Where Silos Are Discouraged
Student success can—and should—be considered every person’s business on campus. Without students, after all, colleges and universities would have empty classrooms. For institutional effectiveness and student success to meaningfully partner, data silos need to be eliminated. If the goal is to assess how different combinations of services, programs, policies, and procedures are impacting things like academic success and retention, it’s essential to have a holistic database to work from. But beyond gathering the various data points, a campus needs to develop a culture in which data is shared openly across offices in order to inform decisions, allocate resources, and help a much greater number of students succeed.
Step Two: Track and Assess Whenever and Wherever Possible
Eliminating data silos will help bridge the information gap between the offices tasked with student success and institutional effectiveness. Meanwhile, numerous data points from other corners of the campus also need to be tapped into. A truly holistic view requires tracking variables such as the following:
- Noncognitive skills, including levels of confidence and resilience
- Assessments, such as course-based outcome measures or survey results
- Co-curricular involvement, including attendance at events and participation in specialized programs, clubs, and student organizations
- Use of academic resources, such as the campus tutoring center
- Program participation, including federally funded programs such as TRIO or a Title IV program
- Course enrollment behavior, including how early or how late a student tends to register for classes as well as what kind of courses the student prefers
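Once these variables are being tracked in separate systems, the holistic view comes from folding them into one record per student. The sketch below is a minimal illustration of that merge; the source names, field names, and values are assumptions for the example, not a real campus schema:

```python
from collections import defaultdict

# Hypothetical extracts from separate campus systems, each keyed by
# student ID. Field names and values are illustrative only.
sources = {
    "tutoring_visits": {"s01": 4, "s02": 0},
    "event_attendance": {"s01": 2, "s02": 6},
    "trio_participant": {"s01": True, "s02": False},
    "registration_day": {"s01": -30, "s02": 3},  # days before/after registration opens
}

# Fold the silos into one holistic record per student
records = defaultdict(dict)
for variable, per_student in sources.items():
    for student, value in per_student.items():
        records[student][variable] = value

print(records["s01"])
```

The design point is the shared key: as long as every office records the same student identifier, each new data source is one more column in the combined record rather than another silo.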
Successful tracking will likely require new technologies, a reallocation of data responsibilities, and a willingness among administrators to think outside their traditional roles.
Step Three: Use Data to Put Anecdotal Evidence into Perspective
A program owner may point to the positive impact of a specific program on a single student. Another might cite a policy that addressed a particular issue three decades ago. While these initiatives needn’t be discounted, they shouldn’t continue unquestioned either. Relying solely on personal anecdotes and historical knowledge—even if the results have been positive—will impede efforts to change the culture surrounding student success.
Given how personal services and programs can be to campus staff, it is essential that the data largely speak for itself, rather than rely on heavy framing from those overseeing the program or policy in question. Anyone working in student success will, one hopes, keep the interests of students in mind as they look for ways to maximize success on campus—even when doing so works against their own interests at times. From senior leadership down, the culture needs to let data lead the conversation, with those directly involved supplying only the context needed to interpret it.
It’s about more than celebrating a five-point increase in the retention of first-year students. It’s even more than boasting a high graduation or completion rate for four years in a row. A broader use of data analytics can help answer a variety of deeper questions tied to an institution’s strategic goals and priorities around retention and success. That is indeed the point at which student success and institutional effectiveness will naturally converge.
Will Miller, PhD, leverages data best practices to help campuses make strategic decisions. He joined the Campus Labs team in 2016, after serving as a faculty member and senior administrator at Flagler College in Florida. There, as Executive Director of Institutional Analytics, Effectiveness, and Planning, he helped transform the campus-wide outcomes assessment process. He also served as Accreditation Liaison to the Commission on Colleges of the Southern Association of Colleges and Schools (SACSCOC). Before joining Flagler, he held faculty positions at Southeast Missouri State University, Notre Dame College, and Ohio University. His courses have explored topics in political science, public policy, program evaluation, and organizational behavior. His scholarly pursuits focus on assessment, campaigns and elections, polling, political psychology, and the pedagogy of political science and public administration.