Your AI-Based College Advisor Might Be Racist

February 7, 2021

More than 500 universities across the United States use Navigate, advising software made by EAB, an education research company. The program uses a predictive model to recommend students for classes and majors, estimating each student's likelihood of success from a range of variables.

But here’s the problem

Documents acquired by The Markup reveal that race is used as a predictor of student success in Navigate's model. As a result, there are large disparities in how the software treats students of different races. Black students are deemed "high risk" at quadruple the rate of their white peers, which means they're far less likely to be recommended for STEM classes and majors.

  • For instance: Black students made up less than 5 percent of UMass Amherst’s undergraduate student body, but they accounted for more than 14 percent of students deemed high risk for the fall 2020 semester.

Navigate’s racially influenced risk scores “reflect the underlying equity disparities that are already present on these campuses and have been for a long time,” says Ed Venit, who manages student success research for EAB.

So why is this software being used?

Don't act too surprised: it saves money. In fact, EAB has aggressively marketed its software as a "financial imperative." Student retention is a big concern for colleges — especially public universities. EAB prides itself on, for instance, its integration at Georgia State University: since adopting the program, Georgia State has increased degrees awarded by 83%.

A path forward

Here at "Hold The Code," we're averse to models that, as in this case, rely on explicitly biased inputs. But this doesn't mean AI can't be used productively to increase student retention. Eliminating factors like race, considering students holistically, and doing closer research on the actual causes of student dropouts can illuminate a meaningful application of AI.
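One concrete first step a university could take is auditing the tool's outputs: compare how often each racial group is flagged "high risk" and compute the ratio between groups, the same kind of comparison behind the 4x figure reported above. Below is a minimal, hypothetical sketch of such an audit — the function names and the toy data are our own illustration, not EAB's Navigate model or its actual data.

```python
# Hypothetical audit sketch (illustrative names and data, not Navigate's).
from collections import defaultdict

def high_risk_rates(records):
    """Share of students flagged high risk, per group.

    records: iterable of (group, is_high_risk) pairs.
    """
    flagged = defaultdict(int)
    total = defaultdict(int)
    for group, is_high_risk in records:
        total[group] += 1
        if is_high_risk:
            flagged[group] += 1
    return {g: flagged[g] / total[g] for g in total}

def disparity_ratio(rates, group_a, group_b):
    """How many times more often group_a is flagged than group_b."""
    return rates[group_a] / rates[group_b]

# Toy data mirroring the article's reported 4x disparity:
records = (
    [("Black", True)] * 4 + [("Black", False)] * 6 +   # 40% flagged
    [("white", True)] * 1 + [("white", False)] * 9     # 10% flagged
)
rates = high_risk_rates(records)
print(disparity_ratio(rates, "Black", "white"))  # → 4.0
```

A ratio far from 1.0 is a red flag worth investigating before the scores drive any advising decisions, regardless of whether race appears explicitly among the model's inputs.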