Colleges and universities that use sophisticated data systems to analyze and guide students must guard against the ethical dilemmas that can arise, such as unfairly pigeonholing students or underestimating their abilities, according to a new report that analyzed common pitfalls of these systems.
These systems, known as predictive analytics, have the potential to reinforce disparities for disadvantaged students if used without proper training and consideration for how computers might shuffle students into categories, according to the report from New America, a Washington, D.C.-based think tank. The report was released this week at SXSWedu, a national education technology conference in Austin, Texas.
“We should always try to balance the potential of technology with its risks,” said Manuela Ekowo, an education policy analyst at New America, who co-authored the report with Iris Palmer, a senior policy analyst. “We shouldn’t let technology blindfold us to the ways it can do harm, particularly to students from underserved and under-represented backgrounds.”
Predictive analytics uses data to help inform instruction or academic guidance for students. It takes past performance, such as grades or other data points thought to be correlated with academic success, and attempts to distill that data into reports that can help ensure a student stays on track for graduation. It is a growing trend in higher education, but it's not ubiquitous. A report last fall from the same authors estimated that fewer than half of the higher education institutions surveyed use data in this way.
Some say these methods can help improve student outcomes. Georgia State University, for instance, uses this type of program to identify students from disadvantaged backgrounds, who have a greater tendency to drop out, and steer them toward success, according to the report. Administrators there use data to alert them to pressure points for students. For instance, the system posts an alert if a student flunks a course or does not complete a required class on time. And, significantly, the university hired 42 new advisors – human beings, not the computer version – and reported that in one year they used data to schedule 43,000 in-person meetings between students and advisors.
In other words, a key to the success story was human intervention, not a computer cure-all.
Even the best-intended programs can fail if educational institutions do not consider, and plan for, ethical dilemmas, according to the new report. Schools might use data to justify enrolling more affluent students and fewer poor students, for example, because people from poorer families tend to face more challenges that make them less likely to graduate on time.
“You have to make sure you are not baking in historic inequalities,” Palmer said.
The authors suggest five steps that schools should take as they roll out these programs. Among the suggestions: Train staff, and include students in the decision-making process, giving them access to their own data and training them to use it.
Often, students are not given access to their own data or the systems that predict outcomes. Some students say they worry that they will be unfairly sidelined if a computerized system says they are not cut out for college work.
Students are key to ensuring the systems in use are accurate. They must be empowered to challenge the system, Ekowo said. And faculty and staff need to know that the predictions are correlations, not causation, Palmer said. The data is a guide, not a map that must be followed to its conclusion.
“You need to train people who are using the data so they know these are not predictions – especially when we call it predictive analytics,” she said. “Really, it’s misnamed.”
The post When using data to predict outcomes, consider the ethical dilemmas, new report urges appeared first on The Hechinger Report.