Why Isn’t NYS Exploring Alternatives to Standardized Testing? And, Why Has It Taken Five Months to Release the Scores?

[Revised: According to reporters, State Ed says the release of the scores is slated for “later in September.”]

On Monday the members of the New York State Board of Regents will convene for the September meeting in Albany, and I expect we’ll see the rollout of the state grades 3-8 standardized test scores: a weighty slide deck, a presser and a contentious conversation.

The exams were given five months ago: why has it taken five months to grade them?

Susan Edelman in the NY Post speculates,

The state Education Department has delivered the 2018 student test scores to schools — but demanded the results remain top secret until late September.

Critics call the stalling manipulative and political, noting that the delayed release will come after Thursday’s Democratic primary pitting Gov. Andrew Cuomo vs. Cynthia Nixon.

The state could also be tinkering with the “cut scores” — where to set the lines between passing and failing — to shape the overall results.

Both theories might be right; the scores are usually released in early-to-mid August.

The April tests are partly machine-scored, with the extended responses scored by teachers. In June the state convenes about 100 teachers and supervisors to participate in the standards-setting process. Teams go through the questions; there are six tests in English and six tests in Math, one team for each grade in each subject. The questions are “rated,” level one through four, and the teams rotate to build a broad-based consensus and pass their results along to the commissioner. The commissioner and the psychometricians at state ed review the standards-setting process and set cut scores, the scores that determine the standing of individual students; a score of 3.0 determines “proficient.”
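Mechanically, cut scores are just thresholds that partition the scale-score range into the four performance levels. Here is a minimal sketch of that mapping; the cut-score values below are hypothetical placeholders (the real values vary by grade and year and are set by the commissioner), and the function names are my own.

```python
import bisect

# Hypothetical cut scores separating the four performance levels.
# Real cut scores differ by grade, subject and year.
CUT_SCORES = [300, 320, 340]  # below 300 = Level 1 ... 340 and up = Level 4

def performance_level(scale_score: int) -> int:
    """Map a scale score to a performance level 1-4 using the cut scores."""
    return bisect.bisect_right(CUT_SCORES, scale_score) + 1

def is_proficient(scale_score: int) -> bool:
    """Levels 3 and 4 count as 'proficient' under this scheme."""
    return performance_level(scale_score) >= 3
```

The point of the sketch is that moving any threshold in `CUT_SCORES` even slightly shifts students between levels, which is why “tinkering with the cut scores” can reshape the overall results without a single answer sheet changing.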

What does “proficient” mean? The term itself has varying definitions; the NAEP (National Assessment of Educational Progress) definition is not the same as the definition used in New York State. (See “What does it mean to be proficient” here).

The test used in New York State, developed by Questar, a national testing company, has come under attack from two testing experts.

A new study by the Benjamin Center at SUNY New Paltz looks at the curiously high percentages of students who received zeroes on certain types of ELA questions between 2013, when New York introduced tests aligned to the Common Core standards, and 2016.

The title of the study gives away the report’s conclusion: “Tests are Turning Our Kids Into Zeroes: A Focus on Failing.” 

The authors, Fred Smith and Robin Jacobowitz, argue that so many kids got zeros on certain questions, reflecting a complete inability to cope with the material, that the tests must have been flawed. “We conclude that testing instruments that put children in a virtual stupor cannot be defended as sound testing practice, nor as a way to raise standards or serve as a foundation for high-stakes decisions…”

Annual testing is required by federal law and “standardized” means that every student takes the same test.

We give tests to measure “progress”: student, school, school district and state progress from year to year. If you change the tests, for example by moving to the Common Core, shortening the number of days of testing, or moving to untimed tests, you have to create a new base year. The spring 2019 tests will use the revised New York State Standards (aka Common Core lite): yet another base year.

Whether we like it or not, the test results garner headlines: are New York State students and schools doing better or not? The consequences for leadership at every level can be dire.

Two states, Colorado and New York, have large and active opt out movements. In New York State about 20% of students are “opted out” by parents, and the opt out parents are well-organized and have become a political force, endorsing candidates and lobbying in Albany.

There are a number of states that are exploring alternatives to annual standardized testing and the movement is growing across the nation. Read “How to Measure Student Progress Without Standardized Testing” here and “8 Alternatives to Standardized Testing” here.

For a deep dive into how Virginia is moving to alternative assessments, read a scholarly article here.

The New York State Education Department has shown no enthusiasm for seeking alternatives: it did not apply for the competitive federal program (there were no funds attached), and aside from the forty schools that hold waivers from the Regents Examinations, waivers renewed since the 1990s, there has barely been any discussion. A few Regents members, not surprisingly those who have served as school district leaders, have raised the issue of alternative assessment pilots, only to be ignored by state ed leadership.

I suspect the legislature will begin to pursue the issue; the political activity of the opt outs may very well create an enthusiasm among legislators to force the commissioner to explore alternatives to annual testing.

2 responses to “Why Isn’t NYS Exploring Alternatives to Standardized Testing? And, Why Has It Taken Five Months to Release the Scores?”

  1. Proficient is, in my opinion, a novice who meets very minimal standards. I would suggest that the category immediately above proficient be termed sufficient. As for holding back test scores, one way to look at that is that the students as a whole did well and thus the state might have seen cutbacks in some of its federal monies. Another possibility is that the students did so poorly that it would be an embarrassment to those running for office.


  2. The flames of NYC’s opt out movement should be fanned by the Benjamin Center – SUNY report linked and cited above.

    The study was based on information finally obtained from the NYS Education Department after Pearson had its 5-year ($38 million) run from 2012 to 2016. When the data were made available it was too late to stop the core-aligned testing that had been criticized from the beginning by a handful of NYC parents. The findings in the report point to poorly designed tests, with dumbfounding questions that had a particularly harsh impact on 3rd and 4th graders, ELLs, students with disabilities and minority group youngsters. The report provides evidence that vindicates the views of the parents who opted out or voiced strong early warning objections to the instruments Pearson produced and SED shielded from scrutiny.

    Now we have Questar set up for its own 5-year run ($44 million) as the test vendor succeeding Pearson. We are two years–2017 and 2018–into this span. And SED has not released information that would allow parents, the general public and analysts to form insights into how Questar’s product has impacted the 440,000 NYC children who took the exams. Does anyone see a pattern here?

    It’s time for NYC parents to press for immediate release of data about how the questions on the 2017 and 2018 exams functioned, specifically known as item analysis statistics (or item-level data). No, it’s time to DEMAND IT!

    Don’t be fooled again. This is a safe, no-fear demand. All we want are the facts. Don’t be bamboozled by SED or Mayor de Blasio.

    I would say that failure to provide the data would make a persuasive case for opting out of the 2019 exams. It’s not too early to push back.

    Fred Smith

