New York State school districts have again used electronic scoring with OSC’s EASE™ System (Educational Assessments Scoring Environment) to expedite scoring of the state’s Grades 3-8 standardized assessments.
From May 10th through May 23rd, certified New York State teachers used electronic scoring to help score ELA, Math and Science Assessments for Grades 3-8. A management team from OSC oversaw the process, providing extensive support from OSC’s technical staff, which set up the hardware and software, managed the workflow and provided group training and individual instruction as required. Holistic instruction and scoring were administered by educational professionals from the schools or by OSC, which provided accredited New York State teachers with extensive scoring experience to supplement the team as required. The data was sent to BOCES immediately after the scores were processed. New to Long Island this year, and a first in the State, was the electronic scoring of the NYSESLAT (New York State English as a Second Language Achievement Test) for Grades K-12 and the Grades 4 & 8 NYS science test.
Citing the advantages of electronic scoring, several teachers commented on the ease of viewing the answers onscreen in an organized fashion instead of wading through hundreds of cumbersome answer booklets. As Michel Richez, Director of Technology and Information Services for the Long Beach Public Schools said, “We knew from our experience in 2009 and 2010 that OSC’s electronic scoring was the way to go, and, if anything, the scoring process in 2011 once again proved itself an invaluable tool.”
A comparison of hand scoring vs. electronic scoring verified that the EASE™ solution increased the number of tests scored per hour, per scorer. For Grade 3, the number of math tests scored per hour increased 218%, and the number of ELA tests scored per hour increased 531%. Even assessments from Grade 6, which historically require more time to score, showed dramatic increases in the number of tests scored per hour per scorer: 50% for Math and 49% for ELA. (Data based on 2009-2010 assessments; data for 2011 is not available at this time.)
EASE™ was developed by the Education Division of Optimum Solutions Corporation (www.oscworld.com), designers of the patented data capture technology that was used by the United States Census Bureau for the 2010 Census as well as by Young & Rubicam, Maritz Research, Scarborough Research, AXA Financial, the Bank of New York, Lockheed Martin, the Gallup Organization, Simmons, and other leading Fortune 100 and multi-national corporations.
“It’s important to understand,” explained Jeffrey Schneider, Optimum Solutions Corporation’s Vice President of Research & Development, “that while electronic scoring is relatively new, the technology that underlies EASE™ has been tested and refined over a 20-year period. In fact, it was the data capture technology used by the Lockheed Martin-led Decennial Response Integration System (DRIS) team for the 2000 and 2010 Census, and has been utilized consistently by several Fortune 500 market research companies to capture data for their most extensive paper surveys.”
Joan Flig, an Educational Assessment Consultant who has been involved in New York State test scoring for years, noted that, “OSC’s electronic scoring solution enables school districts to rapidly convert their traditional manual scoring systems into a completely automated system that functions faster and better, involving fewer teachers, while reducing costs. It is precisely the solution budget-strapped schools need.”
Electronic scoring enabled scorers to view constructed responses on laptops. Workflow was organized in batches, with students’ answers to a single question appearing on-screen one at a time. Scorers could easily adjust the image, scroll or link to another area to view the entire response onscreen, use electronic tools such as rulers and protractors to score math tests, and record the score in a designated box before proceeding to the next answer. The program’s built-in features blocked scorers from moving to the next answer if they marked no box or marked more than one box, preventing ‘No-Scores’ and ‘Double Scoring.’ Scorers who required assistance could consult the table leader and other support personnel. Additionally, a percentage of answers was automatically tagged for read-behinds to ensure the accuracy of the scorers’ marks. Administration tools enabled personnel to supervise and expedite the entire process in real time.
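The validation and read-behind rules described above can be sketched in a few lines of code. This is a minimal illustration only, not OSC’s actual implementation; the function names, the 10% read-behind rate, and the data representation are all assumptions made for the example:

```python
import random

# Assumed sample rate for illustration; the article does not state OSC's actual percentage.
READ_BEHIND_RATE = 0.10

def can_advance(marked_boxes):
    """A scorer may move to the next answer only if exactly one score box
    is marked, blocking both 'No-Scores' (zero boxes) and
    'Double Scoring' (more than one box)."""
    return len(marked_boxes) == 1

def tag_read_behinds(response_ids, rate=READ_BEHIND_RATE, seed=None):
    """Randomly tag a percentage of responses for a second, verifying read."""
    rng = random.Random(seed)
    k = max(1, round(len(response_ids) * rate))
    return set(rng.sample(response_ids, k))

# A scorer marks no box, then two boxes, then exactly one:
assert not can_advance([])        # 'No-Score' blocked
assert not can_advance([2, 3])    # 'Double Scoring' blocked
assert can_advance([3])           # valid single score
```

The same gating idea applies regardless of interface: the scoring box is validated before navigation is unlocked, so invalid states never enter the data stream.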