By: Todd Feathers
Early warning systems that use machine learning to predict student outcomes (such as dropping out, scoring well on the ACT, or attending college) are common in K-12 and higher education. Wisconsin public schools use an algorithmic model called the Dropout Early Warning System (DEWS) to predict how likely middle school students are to graduate from high school on time. Twice a year, schools receive a list of their enrolled students with DEWS’ color-coded prediction next to each name: green for low risk, yellow for moderate risk, or red for high risk of dropping out.
But what many people, including school administrators and teachers, don’t know is that the algorithm uses factors outside of a student’s control—like race and free or reduced lunch-price status—to generate these predictions. Our investigation found that after a decade of use and millions of predictions, DEWS may be negatively influencing how educators perceive students of color. And the state has known since 2021 that the predictions aren’t fair.
Three Key Findings
The Background: Wisconsin’s Racial Gaps
What Wisconsin Students and Administrators Told Us
Administrators aren’t trained on how to interpret DEWS.
Students we interviewed were surprised to learn DEWS existed and were concerned that an algorithm was using their race to label them and predict their future.
Wisconsin DPI spokesperson Abigail Swetz declined to answer questions about DEWS but provided a brief emailed statement:
“Is DEWS racist?” Swetz wrote. “No, the data analysis isn’t racist. It’s math that reflects our systems. The reality is that we live in a white supremacist society, and the education system is systemically racist. That is why the DPI needs tools like DEWS and is why we are committed to educational equity.”
In response to our findings and further questions, she added, “You have a fundamental misunderstanding of how this system works. We stand by our previous response.” She did not explain what that fundamental misunderstanding was.
How Does DEWS Work?
DEWS is an ensemble of machine learning algorithms that are trained on years of data about Wisconsin students—such as their test scores, attendance, disciplinary history, free or reduced-price lunch status, and race—to calculate the probability that each sixth through ninth grader in the state will graduate from high school on time. If DEWS predicts that a student has less than a 78.5 percent chance of graduating (including margin of error), it labels that student high risk.
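The state has not published DEWS’ code, but the labeling step described above can be sketched in a few lines. This is a hypothetical illustration, not the actual implementation: the 78.5 percent high-risk cutoff comes from the article, while the boundary between moderate and low risk is an invented placeholder.

```python
# Hypothetical sketch of DEWS-style risk labeling (not the actual DEWS code).
# The 0.785 graduation-probability cutoff for "high risk" is reported in
# this article; the moderate-risk cutoff below is an invented placeholder.

HIGH_RISK_CUTOFF = 0.785      # below this (margin of error included): red
MODERATE_RISK_CUTOFF = 0.90   # hypothetical yellow/green boundary

def risk_label(graduation_probability: float) -> str:
    """Map a predicted on-time graduation probability to a color code."""
    if graduation_probability < HIGH_RISK_CUTOFF:
        return "red"     # high risk of not graduating on time
    if graduation_probability < MODERATE_RISK_CUTOFF:
        return "yellow"  # moderate risk
    return "green"       # low risk

print(risk_label(0.60))  # red
print(risk_label(0.80))  # yellow
print(risk_label(0.95))  # green
```

The point of the sketch is that a continuous probability, itself computed from inputs including race and lunch-price status, is collapsed into a single color that educators see next to a student’s name.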
How We Reported This
To piece together how DEWS has affected the students it’s judged, The Markup examined unpublished research from the Wisconsin Department of Public Instruction, analyzed 10 years of district-level DEWS data, interviewed students and school officials, and informally surveyed and collected responses from 80 of the state’s more than 400 districts about their use of the predictions.
Our investigation shows that many Wisconsin districts use DEWS—38 percent of those that responded to our survey—and that the algorithms’ technical failings have been compounded by a lack of training for educators.
Fellow journalists, if you want to know how you can cover this subject in your area, subscribe to our newsletter so you’ll be the first to know when we release our story recipe for this investigation in the coming weeks.
This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.