In an early glimpse of how much tougher state tests could be in the Common Core era, a new federal report released in July shows that early adopters of the controversial standards are testing their students at a far higher level of difficulty.
While this new report is unlikely to settle the battle between Common Core advocates and foes, it does indicate that one of the original purposes of the standards – challenging students in math and reading so they’ll be more prepared for the rigors of college and career – seems to be bearing fruit. But tougher tests aren’t contingent on adopting the Common Core: Texas, one of the few states that eschewed the standards, is among the small group of states testing its students in far more challenging ways.
Common Core adopters Kentucky, New York and North Carolina joined Texas in offering tough math and English tests to their fourth- and eighth-graders.
Though the report, released by the National Center for Education Statistics, contains many moving parts, its basic premise is that because each state sets its own bar for what counts as passing, those benchmarks, known as cut scores, should be compared to a common yardstick. The U.S. has such a benchmark: the National Assessment of Educational Progress, considered the gold standard in measuring how much students know and viewed by most education experts as a more exhaustive assessment than anything the states offer.
Similar to state assessments, NAEP assigns various levels of achievement, including below basic, basic, proficient and advanced. The federal report released this month aligns state benchmarks for proficiency with a common scale using those NAEP scores.
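In principle, the mapping works by matching percentages to percentiles: take the share of a state’s students who clear the state’s own proficient bar, then find the NAEP score that the same share of that state’s NAEP test-takers reach or exceed. That NAEP score serves as the state cut score’s equivalent on the common scale. The sketch below is a rough illustration of that idea with made-up numbers, not NCES’s actual procedure.

```python
# Rough illustration (not NCES code) of mapping a state "proficient" cut score
# onto the NAEP scale. All numbers here are invented for the example.
import numpy as np

def naep_equivalent(naep_scores, pct_proficient_on_state_test):
    """NAEP score that the same share of students reach or exceed as met
    the state's own proficiency bar on the state test."""
    # If 71% are proficient on the state test, the equivalent NAEP cut score
    # sits at roughly the 29th percentile of the state's NAEP distribution.
    percentile = 100 - pct_proficient_on_state_test
    return float(np.percentile(naep_scores, percentile))

# Hypothetical NAEP fourth-grade reading scores for one state's sample.
rng = np.random.default_rng(0)
sample = rng.normal(loc=220, scale=35, size=5000)

print(round(naep_equivalent(sample, 71)))  # the state's cut score on the NAEP scale
```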
While states have demonstrated some improvement over time in measuring student progress with higher-caliber test questions, they have a long way to go. The report shows that what’s considered proficient in 25 states translates to ‘below basic’ on the NAEP for fourth-grade reading. Only two states, New York and Wisconsin, set proficient levels in fourth-grade reading that matched NAEP’s definition of proficient. In eighth-grade reading, only New York could say the same.
Fourth-grade reading, 2013:
Eighth-grade reading, 2013:
During a press call with reporters, acting NCES commissioner Peggy Carr explained that the national board that sets the ‘proficient’ level on the NAEP considers that score an indication that students are college-ready. By that measure, only a handful of states call on their students to reach levels of academic prowess that line up with NAEP’s definition for college readiness.
Five states had fourth-grade math proficient levels that matched NAEP’s, while four states’ proficient levels fell below NAEP’s basic cut score. The charts below summarize the rest of the findings.
Fourth-grade math, 2013:
Eighth-grade math, 2013:
U.S. Secretary of Education Arne Duncan said in a press release that “coupled with the fact that more than 40 states are moving forward with new, higher academic standards that the states themselves developed, this is encouraging news for parents and students.”
To Gary Phillips, a vice president at the American Institutes for Research, the wide variation in how states define proficient signals that many are “living in a Lake Wobegon fantasy where they say the students are above average when they’re not.”
Phillips says that the range in proficiency levels is the equivalent of “about three to four grade levels in student performance. The rigor of the grade-four standards in the highest achieving states may be comparable to the rigor of the eighth-grade standards in the lowest achieving states.”
Because states largely set their own definitions of proficiency, identical proficiency rates in two states can mask sizable differences in achievement. Some 71 percent of students in both Arizona and Kentucky earned state assessment scores that met their states’ definition of proficient, but Kentucky students in that category would have an average NAEP score of 252, while in Arizona those students would have a comparable score of 243 – a statistically significant difference.
Since 2013, roughly half of all states have agreed to use Common Core-aligned assessments created by two consortia, Smarter Balanced and the Partnership for Assessment of Readiness for College and Careers. For those states and others that align their tests with the Common Core, tougher cut scores are likely on the way.
“I will predict that the consortia states will find that their standards are more rigorous than what they may have had previous to the consortia assessments,” said Louis Fabrizio, who heads data, research and federal policy at the North Carolina Department of Public Instruction.
He notes that though North Carolina didn’t roll out tests designed by the consortia, the state administered assessments based on the Common Core State Standards for the first time in the 2012-13 school year, just in time to be captured in this month’s federal report. As one of the few states at or near the NAEP proficiency standard, “the results speak for themselves,” Fabrizio said. The last time NCES issued such a report, evaluating 2011 state tests, it found that North Carolina had some of the weakest assessment standards in the country.
While Kentucky and New York also adopted Common Core-aligned assessments between 2011 and 2013, paving the way for their high marks in today’s report, Texas had no part in the Common Core. Still, the state transitioned to a new set of tests, called STAAR, that were touted as much more difficult than their predecessor and aligned with Texas’ own independent academic standards. A 2014 Dallas Morning News article noted that, on average, students hadn’t improved on the assessment since its debut in 2012. The NCES report helps explain why: Between 2011 and 2013, Texas’ proficient benchmark soared from near the bottom to among the top few in the country.
“We would encourage states to adopt Common Core,” said Scott Norton, who heads assessments and accountability for the Council of Chief State School Officers, in an interview with EWA. “But if they don’t want to do that and adopt some other set of college and career ready standards, that’s good, too. Texas is a great example of that. This report bears that out.”
Wisconsin, another state that made major strides in the rigor of its state exams between 2011 and 2013, toughened its cut scores in time for the 2012-13 state tests. John Johnson, the head of communication at the Wisconsin Department of Public Instruction, told EWA in an interview that state education leaders sent letters to parents explaining the changes and warning that student scores might drop because of the increased difficulty.
Still, no matter how much more work is poured into strengthening the tests, Norton cautions that tougher assessment benchmarks won’t on their own lift student scores. “The first part is adopting more challenging career-ready standards,” he said. “Then it’s important to put in a test that’s aligned to those standards with rigorous achievement [levels] … when that happens, students can begin to perform better, and that’s probably what we’ll see over time.”