Is credit recovery creating fraudulent graduation rates and manufacturing an illusion of learning? As credit-recovery programs have grown in popularity and import, a growing number of questions have emerged about their impact on participating students and the integrity of a high school diploma. Invaluable analyses, like AEI’s or Fordham’s, have helped move these questions to the front burner. In that light, I was more than a little interested when Jeremy Noonan, a doctoral student at the University of Leicester and former classroom teacher, reached out with some thoughts and research regarding online credit recovery. I thought his concerns and experiences well worth sharing, so here you go:
“Recent work by Nat Malkus has given us a detailed picture of the extent of schools’ reliance on credit-recovery programs, specifically of the online variety. “Credit recovery” generally refers to the opportunities schools provide for students to acquire credits that, whether due to course failure, prolonged illness, or other circumstances, they did not earn on time. Thus, it is a way for students to get back on track for graduation. While credit recovery has been practiced for decades (think summer school!), it is the rapid, widespread proliferation of online credit recovery (OCR), in place of traditional strategies like retaking a course, that has spurred questions about the validity of these credits. As Nat Malkus has framed it, are they real second chances or merely second rate?
A curious fact about the growth of OCR is the absence of any research base on the suitability of online courses for the kind of learners these programs largely serve: at-risk kids who struggled to learn the subject in the first place. Prior to 2019, there was just one academic study comparing the efficacy of OCR for student learning with credit recovery in a traditional classroom (one!), and it found that OCR students did worse on a standardized algebra exam.
That OCR has grown nationwide in the absence of any research demonstrating its efficacy for student achievement raises questions about schools’ commitment to “data-driven” decision-making. My own story suggests that schools may not know whether OCR is good for student learning because of willful neglect; that is, they do not want to know.
From my first day in an OCR classroom, I felt troubled that this was not an environment set up to support learning. And the more I saw, the more concerned I became. Eventually, I laid out my concerns in a letter to my superintendent, which catalyzed a districtwide meeting of everyone involved in our OCR program. The meeting agenda charged us to resolve the issues I had raised and to do so with a view toward raising students’ scores on Georgia’s End of Course exams.
But as the meeting drew to a close, the scores had not once been mentioned, so I raised my hand: “Wait a minute, we haven’t discussed raising their test scores. We have to know where we are starting from. Does anyone know their scores?” When no one could answer, I thought they were feigning ignorance, perhaps out of embarrassment. The administrator leading the meeting stifled further inquiry, saying, “Raising test scores is nice, but what’s most important is keeping our high graduation rates.” In other words, increased student learning, which would have required raising expectations, was seen as a threat to high graduation rates!
Later, I expressed my incredulity to my own assistant principal: How could no one know the test scores? She told me she had stumbled across the OCR students’ End of Course exam scores while exploring the district’s new data-management system. A folder labeled “Unknown” piqued her curiosity. In it, she found the files with the scores! No one in the district had ever bothered to look; they were tucked safely away.
The test scores were “Unknown” because no one cared to know. All that mattered was that the OCR program was a boon to the district’s graduation rates (which had increased by 13 points the previous year).
I soon discovered that there was more to this ignorance than apathy. After securing access from my principal to my own school’s scores, I was permitted to present them at the district’s next credit-recovery meeting. They were dismal: over 70 percent of students scored at the lowest level (which, I later learned, was consistent with statewide OCR scores). I naively expected a receptive audience; instead, I was shouted down by my colleagues (“What’s wrong with you, buddy? Do you have some kind of axe to grind?”), who left the meeting angrily. So much for the ideal of neutral, impersonal “data-driven” decision-making!
In an era in which this ideal of “data-driven,” “evidence-based” decision-making is regarded as a professional standard for school administrators, the absence of research evidence on OCR is startling. The implementation of OCR has far outpaced the research, which is only beginning to catch up. A study published in 2019 examined credit-recovery outcomes over a four-year period in a large urban school district. It found that while online coursetaking was positively associated with credits earned and high school graduation (predictably!), there was an increasingly negative association between the number of school years in which students took an online course and performance on math and reading exams. Specifically, students who took OCR courses in all four years of high school were set back in math and reading progress by the equivalent of a full year!
Such findings should prompt school leaders to re-evaluate their reliance on online courses for credit recovery, but will they? An abundance of research done in the early 2000s on online learning, years before the OCR explosion, found that online environments were generally ineffective for student learning, suggesting that OCR courses would NOT “work.” Yet that has stopped few from implementing OCR in their schools.
That’s because these decisions are driven by one data point: graduation rates. The efforts to boost rates via bogus online courses are depicted vividly in Slate’s investigation The Big Shortcut. Yet is the effect of higher graduation rates sufficient evidence to justify the use of these programs? The research suggests, as the administrator running that meeting believed, a trade-off between higher rates and student achievement. What if more students are graduating but are learning less and are ill-equipped for life after graduation? Are these trade-offs worth more students having diplomas? These are questions data alone cannot answer.”
Pressure is relentless. Pressure schools to raise graduation rates, and they’ll raise graduation rates . . . somehow or other. There’s good reason to fear that credit recovery has become part of that “or other.” Here’s hoping that advocates, researchers, parents, practitioners, and policymakers start asking what exactly is going on before, rather than after, credit recovery turns into one more case of “How the heck did we let THAT happen?!”