Automated scoring explored, explained

Changing the student assessment model from an annual snapshot to one that continuously captures academic performance is a bold challenge. As the education community considers how to develop, deliver, and score large volumes of these new assessments, automated scoring technologies will be instrumental in effecting this change.

Knowledge Technologies, a business within the Assessment & Information Group of Pearson, has more than 15 years of experience in automated scoring.

Today, the company released a new white paper, “Pearson’s Automated Scoring of Writing, Speaking, and Mathematics: A White Paper,” authored by Knowledge Technologies’ scientists Lynn Streeter, Ph.D.; Jared Bernstein, Ph.D.; Peter Foltz, Ph.D.; and Donald DeLand. The paper gives educators, policymakers, and test developers explanations of how automated scoring works, how well it works, and its potential for educational applications.

The document explores in depth how Knowledge Technologies automatically scores written, spoken and, to a lesser extent, mathematical responses, as well as specific applications of this scoring technology. Written text can be scored for language arts and for content domains such as science, social studies, and history. In addition to scoring writing for declarative knowledge, Knowledge Technologies assesses writing for language skills as reflected in its stylistic and mechanical aspects.
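To give a rough sense of what content scoring involves, the toy sketch below compares a student response to reference answers using cosine similarity over bag-of-words vectors. This is only an illustration: the systems described in the white paper use far richer representations (such as latent semantic analysis) and trained scoring models, and all names and data here are invented for the example.

```python
# Illustrative sketch only: content scoring by comparing a student response
# to pre-scored reference answers with cosine similarity over simple
# bag-of-words vectors. Real systems use much richer representations.
from collections import Counter
from math import sqrt

def vectorize(text):
    """Turn text into a bag-of-words frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def content_score(response, references):
    """Score a response by its best match against the reference answers."""
    rv = vectorize(response)
    return max(cosine(rv, vectorize(r)) for r in references)

references = ["photosynthesis converts light energy into chemical energy"]
score = content_score("plants use light energy to make chemical energy", references)
print(round(score, 2))
```

A response sharing more vocabulary with a high-scoring reference receives a higher similarity, which is the basic intuition behind content-based automated scoring.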

Spoken responses are scored for declarative knowledge and speech quality in tasks such as reading aloud to determine fluency and orally summarizing a reading. The most extensive applications of the spoken-response technology are in determining proficiency in speaking and understanding English as well as other languages. Written and spoken automated scoring have been combined to assess the traditional four language skills (reading, writing, speaking, and listening) for college admissions and employment decisions. Knowledge Technologies’ automated mathematics assessments are under development and will allow students to show and explain their work as they step through computations, derivations, and proofs using graphic and equation editors and text input.
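One common read-aloud fluency summary is words correct per minute (WCPM). The sketch below computes it from a hypothetical speech-recognizer transcript; the passage, transcript, and timing are invented for illustration, and real scoring systems also model pronunciation, pacing, and prosody.

```python
# Illustrative sketch only: words correct per minute (WCPM) from a
# hypothetical recognizer transcript of a read-aloud task.
def words_correct_per_minute(passage, transcript, seconds):
    """Count transcript words matched in passage order, scaled to one minute."""
    target = passage.lower().split()
    heard = transcript.lower().split()
    correct = 0
    i = 0
    for word in heard:
        try:
            j = target.index(word, i)   # allow the reader to skip passage words
        except ValueError:
            continue                    # word not found in the remaining passage
        correct += 1
        i = j + 1
    return correct * 60.0 / seconds

passage = "the quick brown fox jumps over the lazy dog"
transcript = "the quick brown fox jumps over the dog"   # "lazy" was skipped
print(words_correct_per_minute(passage, transcript, 12))   # 8 correct words in 12 s
```

With 8 correctly read words in 12 seconds, the measure scales to 40 WCPM, a simple rate statistic of the kind oral-fluency tasks report.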

The release of this white paper is part of Pearson’s overall collaboration with the greater education community on the development of next-generation assessments. Other components in this initiative include the launch of a special website, Next Generation Assessments, which features open, public access to numerous white papers and other resources to help states as they design and deliver new assessment systems. A new video series on the site features Pearson’s content development, editorial, research, and psychometric professionals discussing topics such as the power of technology to transform assessments and measuring student growth.