
Opinion Blog


Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute offers straight talk on matters of policy, politics, research, and reform. Read more from this blog.

Assessment Opinion

Treat NAEP as a Reality Check, Not an Advocacy Exercise

By Rick Hess | November 16, 2020 | 3 min read

If you're like me, the election, COVID spikes, and the rest mean that October's NAEP scores haven't exactly been top of mind. But, with the election behind us, I'm inclined to say a few words about the 2019 12th grade numbers, which showed math performance flat and reading declining noticeably since 2015.

Overall, just 37 percent of 12th graders were "proficient" in reading and 24 percent in math. And keep in mind that these aren't COVID-impacted scores. The tests were administered in fall 2019, six months before COVID reared its ugly head. As Haley Barbour, chair of the National Assessment Governing Board, put it, "These results demonstrate that far too many of our nation's high school seniors do not have sufficient math and reading skills for postsecondary endeavors."

So, not great. At the same time, let's keep in mind that NAEP results are a snapshot. They're useful for tracking big-picture student achievement but need to be handled with care. Unfortunately, too many appear disposed to disregard such cautions, treating NAEP less as an essential reality check than as a festival of agendas and dubious narratives. That's been especially true during this election season.

As is true with each new NAEP release, the teachers' unions and self-described "public school advocates" have a universal excuse for any crummy results: the "disinvestment" in public education. Never mind what the actual spending figures show, or that after-inflation per-pupil spending has increased steadily over the past two decades. In their eyes, the lesson of NAEP is always more spending.

Meanwhile, there's a segment of school choice advocates who eagerly greet any lousy new NAEP numbers by shouting, "Ah-ha, schools are failing!" For them, the lesson of NAEP is simple: more school choice. Of course, given that NAEP proficiency was purposefully set at an aspirational level from inception, aggregate proficiency results should be taken with more than a few grains of salt.

And then there are efforts to weave complex narratives to explain the results. For instance, some holdout Common Core enthusiasts have gone to great lengths to insist that we not blame the long-standing stagnation in NAEP on Common Core. Of course, this involves devising convoluted alternative explanations, such as attributing the results to the aftermath of the 2008 recession. This argument was made once again this year, with adherents implying that we should trace changes in reading and math scores to decade-old economic circumstances rather than a massive push to change reading and math instruction.

The truth is that there's no clear or defensible way to determine what's responsible for NAEP results, and all of the predictable spinning should be treated accordingly. It may be wishful thinking, but we'd all be better off if analysts restrained themselves from offering convenient, one-shot explanations for NAEP changes.

It's equally important that we safeguard NAEP from its purported friends. NAEP is our one reliable tool to measure academic progress over time. On that score, recent efforts to overhaul NAEP's reading framework are deeply troubling. In a major push to "update" NAEP's reading framework, the National Assessment Governing Board has now developed a massive draft framework that aims, at enormous cost, to transform reading assessment from a straightforward snapshot of reading performance into a complicated, amorphous gauge of 21st-century "literacy" as understood by the education school set. Specifically, the framework calls for the inclusion of multimedia texts, such as video clips, alongside (or instead of) textual passages. The result would compromise our ability to know how well students can actually read, introduce an array of potential distortions, and severely reduce our ability to compare future results with past performance. As Checker Finn, the National Assessment Governing Board's very first chair, has written, "One of the framework developers' key impulses is a truly worrying overreach for NAEP."

NAEP can play a useful role in providing a respected baseline for assessing whether states or the nation are making academic progress, grounding our sense of where we are and what we've done. But NAEP plays that role best when we recognize its limitations.


The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.