Many mainstream businesses are graduating from rearview-mirror reporting to predictive analytics, so it's no surprise that colleges and universities are making the same move.
The goal, of course, is to figure out what's coming. You know how many students enrolled last fall, but did you accurately forecast how many freshmen would drop out after the first semester? Can you anticipate what revenue will look like next fall? How many dorm rooms, classrooms and dining halls will your college need five years from now? These are just a few examples of the data-driven analyses taking place in higher ed.
The move to analytics comes as colleges and universities face declining enrollments. Nearly half of colleges and universities responding to a recent survey by Moody's Investors Service said they expect enrollment of full-time students to decline. As a result, a third of them expect tuition revenue to decline or to grow at less than the inflation rate.
Compounding the problem, many colleges and universities are experiencing higher drop-out rates. The University of Kentucky ventured into advanced analytics last year hoping to improve student retention. UK is working on a way to identify which students are having trouble so that the university can intervene before they drop out.
UK did a proof-of-concept project in March, using six years' worth of student data going back to 2006, to model the characteristics of students who did and did not drop out after their freshman year. Within two months, the university developed a baseline predictive model using high school GPA and college entrance exam scores.
In May, the university hired a statistician who advanced the hypothesis that UK could do a better job of predicting retention by measuring student engagement, so the team added variables from the university's Blackboard learning management system. That data included the number of times students log in to their class Web pages, check syllabuses, download homework assignments and collaborate online with classmates, and whether they turn in homework assignments on time. UK is using SAP's Predictive Analysis software, in large part because it uses SAP's ERP system and BusinessObjects BI software.
UK's risk models, based on regression analyses, have yet to be proved; that will require more testing and comparative analysis, says Adam Recktenwald, the university's enterprise architect. Nonetheless, Recktenwald says UK is taking a novel approach.
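The hypothesis the team is testing can be sketched in code. The snippet below is an illustrative stand-in only: it uses scikit-learn rather than SAP Predictive Analysis, the data is synthetic, and the specific feature names (logins, on-time homework rate) are assumptions drawn loosely from the Blackboard variables described above. It shows the shape of the comparison, not UK's actual model.

```python
# Hedged sketch: does adding LMS engagement variables improve a baseline
# drop-out model built on HS GPA and entrance exam scores?
# All data here is synthetic; scikit-learn stands in for SAP's tooling.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 2000
hs_gpa = rng.normal(3.0, 0.5, n)      # high school GPA
exam = rng.normal(24.0, 4.0, n)       # entrance exam score
logins = rng.poisson(30, n)           # LMS logins per term (assumed feature)
on_time = rng.random(n)               # share of homework turned in on time

# Synthetic drop-out outcome, loosely tied to GPA and engagement.
risk = -0.8 * (hs_gpa - 3.0) - 2.0 * (on_time - 0.5) - 0.02 * (logins - 30)
y = (risk + rng.normal(0.0, 0.5, n) > 0.6).astype(int)

baseline = np.column_stack([hs_gpa, exam])
augmented = np.column_stack([hs_gpa, exam, logins, on_time])

# Cross-validated accuracy for each feature set.
base_acc = cross_val_score(LogisticRegression(max_iter=1000), baseline, y, cv=5).mean()
aug_acc = cross_val_score(LogisticRegression(max_iter=1000), augmented, y, cv=5).mean()
```

If the engagement variables carry real signal, `aug_acc` should beat `base_acc`; that comparison is exactly the "more testing and comparative analysis" Recktenwald says the models still need.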
"Schools tend to look at students in large swaths, such as high-performing students and low-performing students," he says. "We want to bring this down to micro segments, so that we can better understand the needs of individual students."
Faith In Numbers
Taylor University, a private Christian university in Indiana with 1,900 undergraduates and 120 grad students, is also experimenting with retention analytics, using the RStat module from Information Builders, its incumbent business intelligence software vendor.
After spending two months compiling data, Taylor's director of institutional research and associate registrar, Edwin Welch, was disappointed to learn that he could come up with only about 6,300 complete and consistent digital student records, 460 of which pertained to students who dropped out after their freshman year -- the outcome Welch wanted to predict. That's not a large sample size, particularly when the software splits that data, randomly choosing 70% of the records to build models and holding aside 30% for testing accuracy.
Despite the thin data sample, Welch built a variety of models using logistic regression, neural networks and other techniques available in the software. The initial effort turned out models that predicted with 78% to 86% accuracy whether a given freshman would either leave the university or continue on to sophomore year.
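The split-and-test workflow described above is standard in predictive modeling, and can be sketched as follows. This is not Taylor's RStat setup; scikit-learn is used as an illustrative stand-in, and the records and features are synthetic placeholders matching the counts in the article.

```python
# Sketch of the 70/30 build-and-test split Welch's software performs.
# Synthetic stand-in for ~6,300 student records, ~460 of them drop-outs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 6300
X = rng.normal(size=(n, 3))               # placeholder features (e.g. GPA, test scores)
y = (rng.random(n) < 0.073).astype(int)   # ~460 of 6,300 records flagged as drop-outs

# 70% of records build the model; 30% are held aside to test accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.7, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = model.score(X_test, y_test)    # share of held-out records predicted correctly
```

The small-sample problem Welch ran into is visible here: after the split, only about 70% of the 460 drop-out records remain to teach the model what a drop-out looks like.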
After testing a number of variables, Welch determined that the best predictors of retention for Taylor boiled down to six measures: high school GPA, SAT and ACT scores, fall-term GPA, percentage of credit hours completed in the freshman fall term, number of midterm grades below a C- in the fall term, and the number of credit hours registered for in the spring term.
Not satisfied with any one model, Welch then combined the results of his six best models and reached 90% accuracy, measured by testing the model against 2011 data. In that year, 68 freshmen dropped out, yet only nine of those students were on academic alert lists (compiled based on faculty and staff observations rather than data analysis). Welch's model predicted that 25 of those 68 students wouldn't return.
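The article doesn't specify how Welch combined his six models; one common approach, sketched below with scikit-learn on synthetic data, is to fit several different model types and take a vote over their per-student drop-out flags. Treat this as a plausible illustration, not Taylor's actual method.

```python
# Hedged sketch: combine several models' predictions by vote count.
# Three model types stand in for Welch's six; data is synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=0)

models = [
    LogisticRegression(max_iter=1000),
    MLPClassifier(max_iter=2000, random_state=0),
    DecisionTreeClassifier(random_state=0),
]
for m in models:
    m.fit(X_tr, y_tr)

# Each student gets one vote per model that flags them as a drop-out risk.
votes = np.sum([m.predict(X_te) for m in models], axis=0)
combined = (votes > len(models) / 2).astype(int)  # majority vote
accuracy = (combined == y_te).mean()
```

Combining models this way often beats any single model because their errors only partly overlap, which is consistent with the accuracy jump Welch saw.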
Welch says he isn't satisfied with 90% accuracy, but until he can add more data to the model, he can't hope for better. Still, had Taylor used the model in 2011, it would have known that 16 more students were at risk than were identified by conventional means. Retaining even half of those students would have yielded a return on the university's analytics software investment.
Taylor is putting the predictive modeling to use. Students flagged by five or six of Welch's models will be contacted by the university's Academic Enrichment Center, which offers tutoring and seminars on time and stress management. Students flagged by three or four of those models will go on the alert lists of instructors, counselors and hall directors.
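The tiered rules above reduce to a simple mapping from flag count to intervention. The function and student names below are invented for illustration; only the thresholds come from the article.

```python
# Hypothetical sketch of Taylor's tiered intervention rules:
# 5-6 model flags -> Academic Enrichment Center outreach;
# 3-4 flags -> instructor/counselor/hall-director alert lists.
def route_student(flag_count: int) -> str:
    """Map the number of models flagging a student to an intervention tier."""
    if flag_count >= 5:
        return "academic-enrichment-center"  # tutoring, time/stress seminars
    if flag_count >= 3:
        return "alert-list"                  # instructors, counselors, hall directors
    return "no-action"

# Invented example data: flag counts per (hypothetical) student.
flags = {"student_a": 6, "student_b": 4, "student_c": 1}
routes = {s: route_student(n) for s, n in flags.items()}
```

Graduated thresholds like these match the intervention's cost to the predicted risk: intensive outreach only for the students most models agree on, lighter monitoring for borderline cases.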
The University of Kentucky is in the same boat; it won't have definitive evidence of retention success until the school year comes to a close. Meantime, UK is developing models to predict revenue, insight that will help it predict the number of classrooms, dorm rooms and even buses it will need. "We'll continue to refine these tools," Recktenwald says, "and we're designing methods to bake the findings directly into our resource management process."