This post was written by Fred Smith, who worked for many years as a testing expert in the New York City Board of Education. In recent years, he has advised anti-testing groups like Change the Stakes.
I’ve passed the point of exasperation. But, after 40 years in New York’s testing trenches, giving up now is a luxury I can’t afford. Call it over-investment.
And here today is this email I get from a friend sending me the following link from Education Dive (a new one to me), which gives a very short summary of a July report from the National Center for Education Statistics. It compares common core standards and core-aligned test results across the states. Headline: New York Tops List of States with Most Difficult Tests.
The study examines proficiency cutoff scores and equates statewide performance on reading and math exams with corresponding results on the National Assessment of Educational Progress (NAEP). It shows the relative standing of each state on the NAEP scale.
Given the high regard in which NCES and NAEP are held, their methodology and findings must be respected. Here is the link to the report:
Mapping State Proficiency Standards Onto NAEP Scales: Results From the 2013 NAEP Reading and Mathematics Assessments
I have two immediate reactions to this. First, making a test difficult in a statistical sense does not make it valid, nor does it make it rigorous, a measure of critical thinking, or more challenging, as proclaimed by proponents of core-aligned exams. More than 200,000 children, whose parents opted them out of the exams this year, loudly reject this proclamation.
Confusing, badly constructed items, inadequate time limits, and developmentally inappropriate content all make a test “more difficult.” So does taking exams in an unlighted classroom, or when you’re hungry, homeless, sleep-deprived, just learning English, or coping with special needs.
The study itself issued this caution about interpreting the results: “The analyses in this report do not address questions about the content, format, exclusion criteria, or conduct of state assessments in comparison with NAEP.”
Second, the fact that 2013 is the centerpiece of this report is important. According to the New York State Education Department (SED), 2013 was supposed to be the foundational year—the baseline, if you will, that would usher in the common core and assessments against which progress toward meeting the standards would be measured. NCES’s report merely shows that New York produced the most difficult tests. And…??
Finally, we come back to exasperation. Two recent events let us know that, despite our protests and the urgency of necessary reform, we the public remain where the politicians want us: on the outside looking in. The New York State legislature, in the shoddiest, last-minute way possible, just passed the weakest test-related bill imaginable. It does nothing to require truth in testing, which had been the focus of proposed legislation that was evidently abandoned. (A real truth-in-testing measure could have passed in this session. Instead, the bill that did pass was an extraneous item insignificantly glommed onto a much larger omnibus bill that satisfied the diverse interests of the legislators and was crafted to pass.)
And SED managed to award the next five-year testing contract to an outfit called Questar Assessments, which has worked closely with Pearson, Inc. on testing projects in the past. Before that contract was awarded, Pearson was quietly granted a one-year extension of its expiring five-year contract. It will work to assure a smooth hand-off to Questar, especially in the matter of field testing, which may extend Pearson’s involvement beyond 2016. We all know how sound Pearson’s test development expertise proved to be. Why not reward it with more of the same?
Will Pearsona non grata silently become the subcontractor, running the testing program from the shadows? Once again, SED has gone about its decision-making with no transparency, leaving us to find out the details too late.