Many institutions in higher education have begun to rely on artificial intelligence systems that predict students' future success, using the results to guide decisions on admissions, budgeting, and student services. A new study published in AERA Open has uncovered racial bias in these algorithms that unfairly disadvantages Black and Hispanic students.
According to their analysis of more than 15,000 students, the authors found that these AI models incorrectly predict academic failure for Black students 19 percent of the time, compared with 12 percent for White students and 6 percent for Asian students. Conversely, the systems incorrectly predict academic success for Black students 33 percent of the time, compared with a staggering 65 percent for White students and 73 percent for Asian students. The study revealed similar racial biases with respect to Hispanic students.
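The two error rates reported here correspond to false negatives (the model predicts failure for a student who actually succeeds) and false positives (the model predicts success for a student who actually fails). A minimal sketch of how such group-wise rates can be computed is below; the `group_error_rates` helper and the toy data are invented for illustration and are not the study authors' code or data.

```python
# Illustrative sketch only (not the study's actual analysis code):
# computing false-negative and false-positive rates per group.
# False negative = predicted failure for a student who succeeded.
# False positive = predicted success for a student who failed.

def group_error_rates(records):
    """records: iterable of (group, predicted_success, actual_success) tuples."""
    counts = {}
    for group, pred, actual in records:
        s = counts.setdefault(group, {"fn": 0, "succ": 0, "fp": 0, "fail": 0})
        if actual:
            s["succ"] += 1          # student actually succeeded
            if not pred:
                s["fn"] += 1        # model wrongly predicted failure
        else:
            s["fail"] += 1          # student actually failed
            if pred:
                s["fp"] += 1        # model wrongly predicted success
    return {
        g: {
            "false_negative_rate": s["fn"] / s["succ"] if s["succ"] else None,
            "false_positive_rate": s["fp"] / s["fail"] if s["fail"] else None,
        }
        for g, s in counts.items()
    }

# Toy data, invented purely to exercise the function.
toy = [
    ("A", True, True), ("A", False, True), ("A", True, False),
    ("B", True, True), ("B", True, True), ("B", False, False),
]
print(group_error_rates(toy))
```

Disparities between groups in these two rates (rather than overall accuracy) are what the study's findings describe: a higher false-negative rate means a group's successful students are more often wrongly flagged as likely to fail.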
The authors state that their findings point to an urgent need for policymakers and institutions to carefully examine the algorithms that inform their decision-making, as these tools could be hurting racially underrepresented students' chances of a fair college admissions process and access to advanced educational opportunities.
The study was led by scholars at the University of Texas, the University of Illinois Chicago, and Northern Illinois University.