A new blogger enters the education fray with timely questions about the validity, reliability, and fairness of the Smarter Balanced Assessment, the Common Core test funded by the U.S. Department of Education.
Dr. Roxana Marachi, Associate Professor in the Department of K-8 Teacher Education at San Jose State University, has launched a blog at http://eduresearcher.com/.
In this post, she raises important questions, such as:
Q1: How is standardization to be assumed when students are taking tests on different technological tools with vastly varying screen interfaces? Depending on the technology used (desktops, laptops, Chromebooks, and/or iPads), students would need different skills in typing, touch-screen navigation, and familiarity with the tool.
Q2: How are standardization and fairness to be assumed when students are responding to different sets of questions based on how they answer (or guess) on the adaptive sections of the assessments?
Q3: How is fairness to be assumed when large proportions of students do not have access at home to the technology tools that they are being tested on in schools? Furthermore, how can fairness be assumed when some school districts do not have the same technology resources as others for test administration?
Q4: How/why would assessments that had already been flagged with so many serious design flaws and user interface problems continue to be administered to millions of children without changes/improvements to the interface? (See report below)
Q5: How can test security be assumed when tests are being administered across a span of over two months and when login features allow for some students to view a problem, log off, go home (potentially research and develop an answer) and then come back and log in and take the same section? (This process was reported from a test proctor who observed the login, viewing and re-login process.)
Q6: Given the serious issues in accessibility and the fact that the assessments have yet to be independently validated, how/why would the SmarterBalanced Assessment Consortium solicit agreements from nearly 200 colleges and universities to use 2015 11th Grade SBAC data to determine student access to the regular curriculum or to “remedial” courses? http://blogs.edweek.org/edweek/curriculum/2015/04/sbac.html.
She includes a startling graph produced by SBAC, with projected failure rates on the 11th grade math tests for different subgroups:
67% of all students are expected to fail
83% of African-American students are expected to fail
80% of Latino students are expected to fail
93% of English language learners are expected to fail
“Evidence of Testing Barriers and Implementation Problems
The Board is encouraged to consider the following evidence documenting serious concerns regarding the validity, reliability, security, accessibility, and fairness of the SmarterBalanced Assessments.
SmarterBalanced Mathematics Tests Are Fatally Flawed and Should Not Be Used documents serious user-interface barriers and design flaws in the SmarterBalanced Mathematics assessments. According to the analyses, the tests:
“Violate the standards they are supposed to assess;
Cannot be adequately answered by students with the technology they are required to use;
Use confusing and hard-to-use interfaces; or
Are to be graded in such a way that incorrect answers are identified as correct and correct answers as incorrect.”
“The author notes that numerous design flaws and interface barriers had been brought to the attention of the SmarterBalanced Assessment Consortium during the Spring 2014 pilot test, and remained unresolved during the Spring 2015 test administration.”
The post includes comments by teachers and administrators about the problems with SBAC.
She closes her post with this reflection on the predicted failure rates:
“My letter to the Board is to encourage responsible, ethical, and legal communications about the assessment data that will apparently soon be disseminated to the public. Students’ beliefs about themselves as learners will be caught up in the tangle of any explanations surrounding the assessments, and as we know, decades of research demonstrate the power of student belief to be a factor impacting subsequent effort and persistence in learning.”