
For Immediate Release
March 6, 2016
Contact: Michael Molnar, Executive Director of Educational Services
Phone: 440-988-1981

Testing Method Impacted 2015 Report Card Ratings



After receiving seven As and one B in Value-Added categories in 2013 and 2014, the Amherst Schools
received three Fs and one D for 2015 according to the grades released by the Ohio Department of
Education. Knowing that teaching and instruction methods have only improved during that time period,
we began to suspect a problem with testing method reliability.

Harris Elementary School tested all students and subjects using the online PARCC assessments and
received four Fs. Nord Middle School tested all students and subjects using the online PARCC
assessments and received four Fs. In contrast, Amherst Junior High School tested all students in
Mathematics using the online PARCC assessments, but tested all students in English/Language Arts using
the paper-pencil PARCC assessments. The junior high received three As and a C (which was demoted
from a B due to a penalty in the special education participation rate because of opt-outs).

Preliminary research using local districts and similar districts as determined by the Ohio Department
of Education revealed a similar pattern. Despite being alerted to this concern, the department did not
express interest in researching the hypothesis that there is a correlation between a district's selection of
testing format for the 2015 PARCC assessments in grades 4-8 and its value-added report card grades.
Therefore, we conducted our own research.

In just three days, 428 school districts across the state responded with data regarding their selection of
testing format from last year. The highlights of the research findings are listed below. In addition, we
have chosen to make this information available to every board member, superintendent, teacher,
student, parent, and legislator in Ohio. The data can be accessed at https://goo.gl/Td7NJH and will be
continually updated as new districts provide information.




RESEARCH RESULTS


Of the 428 school districts that responded as of March 4, 2016, 89 districts utilized the PARCC paper-only
assessments for all grades 4-8 math and ELA testing last year. Of those districts, 85% received an A on
the Overall Value Added measure. In addition, 15 of the districts that received an F on the Overall Value
Added measure in 2014 improved to an A in 2015, while only 3 districts dropped from an A or B in 2014
to an F in 2015. Now compare this information to the online-only districts.

Of the 428 school districts that responded as of March 4, 2016, 260 districts utilized the PARCC online
assessments for all grades 4-8 math and ELA testing last year. Of those districts, 62%
received an F on the Overall Value Added measure. In contrast to the paper-only districts, only 3 of the
districts that received an F on the Overall Value Added measure in 2014 improved to an A in 2015 while
an astonishing 80 districts dropped from an A in 2014 to an F in 2015.



Of the 428 school districts that responded as of March 4, 2016, 79 districts utilized a combination of
online and paper PARCC assessments for all grades 4-8 math and ELA testing last year. Too many
factors and variations of paper and online testing exist between each district to establish any kind of
concrete trend. If you are interested in the various combinations, column N of my spreadsheet
(https://goo.gl/Td7NJH) contains each district's combination (if reported).



Dividing the 428 responding districts into quintiles based on 2013 median income data shows that
each quintile contained a similar mix of format selections, meaning there was no correlation between
the wealth of a community and its format selection. School districts seemed to base their test format on
educational factors and not district wealth.
CONCLUSION
The data shows that the differing formats of last year's PARCC testing greatly impacted the
value-added results. Paper-only districts performed far better than online-only districts. Many factors
could contribute to this disparity, but by comparing online-testing districts directly against paper-testing
districts, the Ohio Department of Education is not providing accurate and fair information to the public.
Clearly, the 2015 value-added grades released by the Ohio Department of Education are unreliable and
invalid. The goals of this research are to ensure that:

- Every Ohio district can make an informed decision about its choice of testing format for this
spring's testing.
- The Ohio Legislature can use these statistics to justify passing legislation to remove the 2015
report card data, information, analysis, and reports from the Ohio Department of Education
website and to eradicate all 2015 data from future analysis.
- The Ohio Legislature can also use these statistics to justify passing legislation to select one
testing format (online or paper) for all future testing to ensure that all school districts are being
compared and assessed equitably.
