Buying an Assessment System: Five Considerations

No Child Left Behind has focused schools on tracking student progress like never before. But the high-stakes assessments administered by states and used to measure schools' "adequate yearly progress" provide data only once a year. As a result, schools and districts are increasingly demanding tools that supply more frequent information about student academic performance throughout the year. In response to this need, many companies have developed or repositioned products to serve as interim assessment systems: typically online tests that students complete on a monthly or quarterly basis, coupled with electronic reporting tools that educators can use to analyze student performance.

Interim assessment, if carried out comprehensively and systematically, can yield substantial improvement in student achievement. Teachers can use the data collected to adjust and differentiate their instruction to ensure all students are on track to reach end-of-grade academic standards. Likewise, interim measures of student progress enable administrators to make precise, timely adjustments to curriculum, instruction, and professional development, so that the school and the achievement of its students keep improving continuously.

A program of interim assessment can also fail miserably. Testing every student several times a year can be viewed by teachers, administrators, and students as a major burden, consuming already scarce instructional time and compounding the widely perceived problem of "over-testing." Analyzing assessment data and acting on it are not skills that teachers and administrators routinely possess; they have to learn these strategies and be prepared to devote time to doing so. An interim assessment program could easily collapse under its own weight, generating little meaningful change in instruction, lots of teacher and administrator backlash, and no significant gains for students.

For a school or district to realize the full potential of an assessment system, it needs to select a vendor partner with deep and successful experience in the field of diagnostic evaluation. Many companies and organizations provide assessments well matched to state standards and district curriculum. The hard part is ensuring that faculty and students use the system effectively. That requires choosing not only the right system, but also a vendor who knows how to make it really work.

Buyer's Guide

For the purposes of this article, we're focusing on software and Web-based assessment management packages that (1) are designed to diagnose students' progress; (2) connect assessment with daily instruction (as opposed to test prep); and (3) are designed to cover more than one subject area. In addition to the obvious question of cost (which can range anywhere from $5 to $45 per student annually), the following are five specific criteria to consider when selecting a partner.

Quality, Not Quantity: Many vendors emphasize the quantity of content rather than its quality. They will point out the number of questions available in their item banks (usually numbering in the tens of thousands) but provide little detail on the process and standards used to create those items. Remember, it's the quality of the questions that determines the quality of the data teachers receive and, ultimately, how much that data helps them improve student achievement.

A good assessment system accounts for both the state standards and the requirements of the state assessment itself. It starts with content that matches individual state assessments in look, format, and quality. For example, some questions on state tests require technically accurate maps, graphs, and other illustrations that can be read and interpreted by students at the appropriate grade level. Such questions should be available in the interim assessment system you select.

Content should not only match what is being tested, but also how frequently it is tested. In other words, questions should be weighted according to how often their content has historically appeared on state assessments. Content should also be mapped to the skills and strands of your state's learning standards. Ask whether the vendor writes specific content for each state it serves or merely matches existing items from its bank to state standards.
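
To make the idea of frequency weighting concrete, here is a minimal sketch, not any vendor's actual method, of how an item bank tagged by strand could be sampled in proportion to how often each strand has appeared on past state tests. The strand names, counts, and field names below are hypothetical.

    import random
    from collections import defaultdict

    # Hypothetical item bank: each item is tagged with the standards strand it measures.
    ITEM_BANK = [
        {"id": 101, "strand": "Number Sense"},
        {"id": 102, "strand": "Number Sense"},
        {"id": 103, "strand": "Geometry"},
        {"id": 104, "strand": "Data Analysis"},
    ]

    # Assumed counts of how often each strand has appeared on past state assessments.
    HISTORICAL_FREQUENCY = {"Number Sense": 18, "Geometry": 7, "Data Analysis": 5}

    def build_interim_test(num_items):
        """Select items so each strand's share of the test mirrors its share of past state tests."""
        total = sum(HISTORICAL_FREQUENCY.values())
        by_strand = defaultdict(list)
        for item in ITEM_BANK:
            by_strand[item["strand"]].append(item)

        test = []
        for strand, count in HISTORICAL_FREQUENCY.items():
            quota = round(num_items * count / total)
            pool = by_strand.get(strand, [])
            # A small sample bank may not fill every quota; a real bank would.
            test.extend(random.sample(pool, min(quota, len(pool))))
        return test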

Finally, be certain the publisher tests its content for reliability and validity. Reliable questions are ones that will generate similar results when used with similar populations. Valid questions accurately measure the standards they're intended to measure.
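
Publishers use formal psychometric methods for this. Purely as a simplified illustration of the reliability idea, the sketch below (with made-up scores) correlates the same students' results on two parallel forms of a test; a high correlation suggests the two forms rank students similarly.

    def pearson(xs, ys):
        """Plain Pearson correlation between two score lists."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
        sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
        return cov / (sd_x * sd_y)

    # Hypothetical scores for the same six students on two parallel forms.
    form_a = [12, 18, 9, 15, 20, 7]
    form_b = [13, 17, 10, 14, 19, 8]

    print(f"Parallel-forms reliability estimate: {pearson(form_a, form_b):.2f}")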

User Friendliness: An essential technical feature is a reliable, user-friendly interface for students, teachers, and administrators. The testing screen should have a simple, intuitive navigation system that indicates which questions have and haven't been answered. In addition, the interface should allow students to return to unanswered questions or change answers until they tell the system they have completed the assessment. In essence, the interface should look, feel, and act like the state assessment as much as possible.

Variations in monitor size and screen resolution should not affect the presentation of items. When a student begins an assessment, the system should automatically adjust text size to the computer screen, so all students see a comparable presentation regardless of hardware. At the same time, items should appear on screen appropriately shaped and sized for different grade levels.

Another critical factor is speed. For students to make the best use of the system, it must be free of delays caused by Internet limitations, incompatible hardware, or other bottlenecks. Imagine the impact on a student's concentration when forced to wait more than a second or two for an item to load. Web-based solutions that include no local caching, serving, or storage are particularly susceptible to the unpredictable traffic jams of the Internet.
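
One way systems avoid such delays is to prefetch items onto a machine or server inside the school before testing begins. Here is a minimal sketch of that pattern, assuming a hypothetical item URL and local cache folder; it illustrates the idea rather than any particular vendor's architecture.

    import json
    import urllib.request
    from pathlib import Path

    CACHE_DIR = Path("item_cache")  # local folder on a school server or lab machine
    BASE_URL = "https://assessment.example.com/items"  # hypothetical item service

    def prefetch_items(item_ids):
        """Download each test item once, before the session, so nothing waits on the Internet at test time."""
        CACHE_DIR.mkdir(exist_ok=True)
        for item_id in item_ids:
            target = CACHE_DIR / f"{item_id}.json"
            if target.exists():
                continue  # already cached locally
            with urllib.request.urlopen(f"{BASE_URL}/{item_id}") as response:
                target.write_bytes(response.read())

    def load_item(item_id):
        """Serve the item from the local cache with no network round trip."""
        return json.loads((CACHE_DIR / f"{item_id}.json").read_text())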

Also important are robust tools for synchronizing with your school's student information system. Student names and demographic information should flow automatically into the interim assessment system, simplifying data management and avoiding double entry.
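
As a rough sketch of what "flow automatically" can look like in practice, the example below imports a nightly CSV export from a student information system into the assessment system's own student table. The column names, file name, and SQLite database are all hypothetical.

    import csv
    import sqlite3

    def sync_students(csv_path="sis_export.csv", db_path="assessment.db"):
        """Upsert student records from an SIS export so re-running the sync never creates duplicates."""
        conn = sqlite3.connect(db_path)
        conn.execute(
            """CREATE TABLE IF NOT EXISTS students (
                   student_id TEXT PRIMARY KEY,
                   last_name  TEXT,
                   first_name TEXT,
                   grade      TEXT,
                   school     TEXT
               )"""
        )
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):  # assumes these column names exist in the export
                conn.execute(
                    "INSERT OR REPLACE INTO students VALUES (?, ?, ?, ?, ?)",
                    (row["student_id"], row["last_name"], row["first_name"],
                     row["grade"], row["school"]),
                )
        conn.commit()
        conn.close()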

Finally, the technical solution should include automated data backup in case of emergencies. The system should record each answer as students move through an assessment, so if students need to terminate a testing session before completion (for a fire drill, because of a hardware failure, or due to a power outage), they can resume the assessment exactly where they left off.
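
A minimal sketch of that record-as-you-go behavior, using a hypothetical SQLite table, looks like this: every answer is written immediately, and an interrupted session resumes at the first unanswered question.

    import sqlite3

    conn = sqlite3.connect("assessment.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS responses (
               session_id  TEXT,
               question_no INTEGER,
               answer      TEXT,
               PRIMARY KEY (session_id, question_no)
           )"""
    )

    def record_answer(session_id, question_no, answer):
        """Persist the answer the moment it is given; nothing is held only in memory."""
        conn.execute(
            "INSERT OR REPLACE INTO responses VALUES (?, ?, ?)",
            (session_id, question_no, answer),
        )
        conn.commit()

    def resume_point(session_id, total_questions):
        """Return the first unanswered question number, or None if the test is complete."""
        answered = {row[0] for row in conn.execute(
            "SELECT question_no FROM responses WHERE session_id = ?", (session_id,))}
        return next((q for q in range(1, total_questions + 1) if q not in answered), None)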

Easy Data Mining: A high-quality assessment solution is a fully automated online system that provides tests for students as well as scoring and reporting features for teachers and administrators. The system should provide fluid, real-time reporting of data in multiple categories so teachers can immediately view test results for individual students or the entire class. Drill-down reporting features should give teachers and administrators access to any level of specificity that fits their current reporting needs. Drill-down reporting is particularly valuable because it requires little technology literacy and allows users to quickly access the data they want.

A large variety of reports should be available, including analysis by strand, skill, grade, subject, teacher, student, subgroup, and custom group, as well as longitudinal analysis over time. For example, teachers should be able to pull reports that show the strand and skill that each question tests, the percentage of students who chose each answer option, and which option was correct. Longitudinal reports are important in this regard because they allow teachers and administrators to examine trends and make data-driven decisions from them.
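
To illustrate the kind of question-level report described above, here is a minimal sketch over a hypothetical set of raw responses: for each question it tallies the strand tested, the correct option, and the percentage of students choosing each option. The field names and data are invented for illustration.

    from collections import Counter

    # Hypothetical raw results: one record per student response on a class's interim test.
    responses = [
        {"question": 1, "strand": "Number Sense", "choice": "B", "correct": "B"},
        {"question": 1, "strand": "Number Sense", "choice": "C", "correct": "B"},
        {"question": 2, "strand": "Geometry",     "choice": "A", "correct": "A"},
        {"question": 2, "strand": "Geometry",     "choice": "A", "correct": "A"},
    ]

    def item_analysis(records):
        """Print, for each question, its strand, the correct answer, and the share of students per option."""
        by_question = {}
        for r in records:
            q = by_question.setdefault(
                r["question"],
                {"strand": r["strand"], "correct": r["correct"], "choices": Counter(), "total": 0},
            )
            q["choices"][r["choice"]] += 1
            q["total"] += 1

        for number, q in sorted(by_question.items()):
            print(f"Q{number} ({q['strand']}), correct answer {q['correct']}:")
            for option, count in sorted(q["choices"].items()):
                print(f"  {option}: {100 * count / q['total']:.0f}% of students")

    item_analysis(responses)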

Administrators should be able to monitor all levels of performance results, as well as month-to-month usage, test dates, tests started and completed, incomplete tests, and tests not started, at the district, cluster, and school levels. Other desirable reporting features include a query function that allows different data to be exported in CSV or Excel format and the ability to make online reports printer-friendly.
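
A query-and-export feature can be as simple as running a report query against the system's database and writing the rows to a CSV file that opens directly in Excel. The sketch below assumes a hypothetical test_sessions table and is only illustrative.

    import csv
    import sqlite3

    def export_usage_report(db_path="assessment.db", out_path="usage_report.csv"):
        """Export a per-school count of completed tests as a CSV file."""
        conn = sqlite3.connect(db_path)
        rows = conn.execute(
            """SELECT school, COUNT(*) AS tests_completed
                 FROM test_sessions
                WHERE status = 'completed'
                GROUP BY school"""
        )
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["school", "tests_completed"])
            writer.writerows(rows)
        conn.close()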

Teacher Support: Comprehensive support includes professional development, program support, and project management. Look for solutions that ensure teachers and administrators have the basic technical mastery necessary to effectively use the program, especially its reporting functions. The product should include staff development, not just as an option or a program add-on, but as an essential piece of the assessment solution.

Some vendors offer teaching notes for each test item. These notes provide support for both novice and veteran teachers: novice teachers get helpful hints that come with experience; veteran teachers get extensions of lessons they've routinely taught. Some systems include suggestions for open-response questions that can be administered and graded by the teacher.

As when purchasing any technology-based solution, it's important to select a partner with robust and prompt technical support. This means both rapid response to technical crises and a demonstrated track record of product improvement and responsiveness to user needs. You want to select a product that includes upgrades at no additional cost. As state tests evolve, you want a system that changes in response to them.

Proven Success: Don't take chances on an unknown quantity. Vendors should be able to prove that their products have been used to provide achievement gains across a variety of clients, including urban, suburban, and rural environments; large- and small-school environments; high- and low-performing schools; and elementary, middle, and high schools. Finally, select a partner committed to continuing its investment in research and development of both content and technology.

Todd McIntire is the vice president of achievement for Edison Schools.

Key Features

Here are seven attributes your assessment system should offer.

  • Users can choose predefined tests or create customized diagnostic tests.
  • Program customizes test to the student's achievement level.
  • Content developed to reflect individual state assessments in quality, look, and format.
  • Item bank aligned to state standards.
  • Students receive immediate feedback; administrators receive aggregated scores within hours.
  • Real-time, drill-down reporting provides user choices in how to analyze data.
  • Question-specific teaching notes link assessment to classroom planning and instruction.

Multiple Choices

The following is just a sampling of the many companies offering products in the burgeoning assessment management market. For a more complete list of resources, see www.techlearning.com/db_area/archives/TL/2002/01/accountb.html.

Classwell: www.classwell.com

Compass Learning: www.compasslearning.com

Co-nect: www.co-nect.net

EyeCues: www.eyecues.com/assessa

Lightspan: www.lightspan.com

(Editor's Note: Lightspan and PLATO Learning recently announced plans to merge.)

Merit Software: www.meritsoftware.com

Northwest Evaluation Association: www.nwea.org

Pearson Digital Learning: www.pearsondigital.com

PLATO Learning: www.plato.com

Homeroom (Princeton Review): www.homeroom.com

Renaissance Learning: www.renlearn.com

Riverdeep Learning: www.riverdeep.net

Riverside Publishing: www.riverpub.com

Scantron: www.scantron.com

Edison Affiliates: www.edisonaffiliates.com

Vantage Learning: www.vantage.com
