Is More Testing the Answer?

from Technology & Learning

What are experienced districts saying about the keys to successful differentiated instruction? The answers may surprise you.

Testing students once a year, with data reported many months later, is like using an autopsy to determine how to help a patient, says Cindy Ambrose, chief academic officer of the Horry County Schools in Myrtle Beach, South Carolina. "The state tests are administered at the end of the year for accountability purposes, and by the time we get the results the students have long ago moved on to their next educational endeavor." She claims if the goal is to gather data that will really help identify individual student needs, it's far better to use an approach that more closely resembles that of an effective hospital. "When a patient is in the hospital, the nursing staff periodically takes vital signs in order to perform a status check on the patient's health," says Ambrose. "In our district, we utilize a variety of measures to obtain what we refer to as 'vital signs' on the health of the system."

In Horry County and many other districts around the country, assessment that monitors these vital signs on an ongoing basis is having a positive impact on student achievement. While educators in other schools worry about the amount of time spent on standardized testing each spring, those in districts committed to "formative" assessment frequently clamor for more testing—the sort that is usable for differentiating instruction and improving district programs.

The Weld County School District 6 in Greeley-Evans, Colorado, is a case in point. At the start of the 2005–2006 school year the district was placed on "academic watch" by the Colorado Department of Education as the result of eight years of sub-standard, declining student achievement on the Colorado Student Assessment Program (CSAP) assessments. That same year, under entirely new leadership, the district set out to work on improvement.

New Tools and Alignment

In addition to realigning the district's content standards so they matched the statewide standards on which students were actually being tested, District 6 adopted a number of tools for gathering and analyzing data. According to Dr. Larry Kleiber, director of curriculum, instruction, and assessment for District 6, one key element was data analysis software that allowed teachers to get at critical data with ease. "The previous data analysis software was not very user-friendly and quickly got a 'bad rap' with the teachers," he explains, "so use was extremely limited." The new administration switched to Alpine Achievement Systems data analysis software (developed by Colorado educators for Colorado educators). The acceptance of the new system was immediate and widespread. "Teachers took advantage of in-house professional development opportunities to strengthen their capacity to use the system to isolate specific student needs, as gleaned from the data, and to build student-specific lesson plans to address those needs," he says.

Much of the information the teachers were analyzing was gathered using the Northwest Evaluation Association's (NWEA) Measures of Academic Progress (MAP) assessments, administered to all District 6 elementary students in the fall of the 2005–2006 school year, with follow-ups in the spring. The district also used the DIBELS (Dynamic Indicators of Basic Early Literacy Skills) assessment, developed at the University of Oregon, for monitoring early reading progress. With the new approach, District 6 has seen substantial improvement in little more than one year.

"This first round of state assessment results was tremendous, above and beyond our highest expectations," says Kleiber. "The success of aligning instruction to standards, focusing instruction on what is most important, and using frequent assessment to monitor progress caught the attention of the Colorado Department of Education. While still on academic watch, District 6 took an enormous step toward reestablishing itself as an accredited school district with no negative labels."

Positive Teacher Attitudes

Perhaps most dramatic is the change in teacher attitudes with the emergence of a new culture grounded in "information-based instructional practice." "The teachers were thrilled with the results," Kleiber says. "When they began to see the value and power of the MAP data, they started to ask if it would be possible to assess more frequently. One common comment I heard was 'It just doesn't make sense to use the fall data to build our plans and then not check our progress until spring.' Teachers were seeing the need for more frequent assessment to confirm progress toward their goals and to adjust their plans to most effectively support student achievement. Within the first two months of the 2005–2006 school year, a teacher-driven decision was made to open a winter MAP assessment window. Assessing in the winter window was not a district mandate, but rather a response to teachers seeing and expressing the need for timely data to inform their professional practice."

The middle and high schools began to take note of what was happening at the elementary schools, Kleiber reports, and by the end of that first school year, most of them were on board to use MAP assessments with their students as well.

Monitoring the System's Health

Using the sorts of assessment tools shown in the directory below allows educators to monitor vital signs in two key areas: the health of the overall system and the needs of individual students. In taking a big-picture look at data collected at several points during the school year, information-oriented districts ask questions such as these: How do individual classes or schools compare to one another? Which strands within a broader subject, such as math, are being taught most and least effectively? Which teachers can serve as resources to one another? What needs changing about our programs? Which interventions are having a positive impact?

Building on the medical analogy, Matthew Deevers, director of curriculum and instruction for the Berea City Schools in Ohio, poses the following questions to principals in his district: If you knew you were at very high risk of having a heart attack, what habits would you change today to influence the outcome? Would you change your diet? Alter your level of exercise? He then challenges the administrators to take a similar look at the school district's "early warning signs," considering what changes are necessary in order to improve outcomes.

These are the sorts of big-picture questions teachers and administrators in the Horry County Schools ask themselves all the time, according to CAO Ambrose. "We are constantly examining data and working in teams to ask 'What if...' questions," she says. "For example, we looked at data from students who were not performing well on the state end-of-course physical science assessment to determine what characteristics they had in common. Answering that question led us to the realization that students who are reading below a Lexile of 1100 and scoring below 235 on the mathematics portion of MAP are likely to have difficulty with the physical science assessment." A similar questioning process led them to conclude that the district's elementary math curriculum materials were working quite well with students on or above grade level but that students not performing on grade level "needed additional instruction." Both of these findings led to significant program modifications.
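
Ambrose's finding amounts to a simple screening rule a district data team could automate. The short Python sketch below is purely illustrative, not Horry County's actual system; the roster and field names are hypothetical, and only the two thresholds she cites, a Lexile below 1100 and a MAP mathematics score below 235, come from her description.

from dataclasses import dataclass

LEXILE_CUTOFF = 1100   # reading level cited by Ambrose
MAP_MATH_CUTOFF = 235  # MAP mathematics score cited by Ambrose

@dataclass
class Student:
    name: str
    lexile: int     # most recent reading Lexile measure
    map_math: int   # most recent MAP mathematics score

def likely_to_struggle(student: Student) -> bool:
    # Flag students who fall below both thresholds, per the district's analysis.
    return student.lexile < LEXILE_CUTOFF and student.map_math < MAP_MATH_CUTOFF

# Hypothetical roster used only to illustrate the flagging step.
roster = [
    Student("A. Rivera", lexile=1150, map_math=242),
    Student("B. Chen", lexile=1040, map_math=230),
    Student("C. Okafor", lexile=1120, map_math=229),
]

flagged = [s.name for s in roster if likely_to_struggle(s)]
print("Flag for targeted support before the physical science test:", flagged)

In practice such a rule would be run against exported MAP and Lexile data rather than hand-entered records, but the logic behind the "What if..." question is the same.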

Spartanburg School District 3 in Glendale, South Carolina, is another district that looks closely at formative assessment results in order to modify instructional approaches and programs. According to Mary Seamon, assistant superintendent for instruction and personnel, this involves drilling down to specific strands and subtopics in order to identify and correct curricular weaknesses. "For example, in math we noticed that measurement was a problem for many elementary-grade students," she says, "so we added more measurement activities to the curriculum using the Promethean whiteboards that are in each classroom."


The key to differentiated instruction is educator familiarity with the personal data of individual students. This report from McGraw-Hill's Acuity provides information to help teachers target lessons specifically to the needs of each child.

Differentiating Instruction

An equally important aspect of ongoing formative assessment is the way in which it allows teachers to zero in on and respond to the needs of individual students.

In Clayton County Public Schools in Georgia, CTB/McGraw-Hill's Acuity math benchmark assessment program is one tool that enables this sort of differentiated instruction. A total of five tests are administered across the school year: a baseline, or pretest, at the start of the year; three scaled benchmark tests between October and March; and an end-of-year summative assessment.

These frequent assessments are allowing teachers to group students in a flexible way for remediation or enrichment, according to Assistant Superintendent Dr. Cephus Jackson. The Acuity assessments are used to identify which students need help with which strands within the math curriculum. "Perhaps a student is doing well with measurement but struggling with the steps of long division," Jackson says. "Using this information, the teacher can group the student with others who need extra help in the same area. There is a lot of flexibility in the grouping process, based on what strands are being addressed and what new information is being gathered from assessments."

The district also groups students for reading instruction based on data from Wireless Generation's mClass DIBELS assessment, which allows teachers to use handhelds to assess a child's reading progress as he or she reads aloud. Each student is identified as being at one of three tiers, ranging from fluent to in need of intensive help. With a full 135 minutes set aside for reading each day, elementary-grade teachers have the time to personalize instruction. Some students receive 50 minutes of intensive reading instruction daily, while others participate in targeted "booster" sessions based on their strengths and needs.

Extra Time and Extra Help

The Spartanburg, Horry, and Berea school districts all take a similar approach to differentiated instruction in elementary reading and math. Spartanburg School District 3 added 25 minutes to the elementary school day in order to increase its ability to teach to each student's strengths and needs. The school year is divided into nine-week blocks, alternating between an extra focus on math and an extra focus on language arts. Using results from benchmark assessment tests administered three times a year, teachers determine which students need to work on which skill areas and group them accordingly. During much of the day students are grouped heterogeneously, but the extra time allows teachers to target problem areas, offer acceleration for those who need it, and so on. The assessment data are also used to give different homework assignments.

The Berea City Schools offer extra help for students during a 50- or 60-minute instructional period known as the "power hour" or the "nifty fifty." As with Spartanburg, the Berea students are in heterogeneous classes for much of the day, but during these extra time blocks they are grouped based on data regarding the concepts or skills on which they need extra work. The groupings cut across grade levels and change quite frequently as students master concepts or the topic focus changes.

"At the school that was the first to implement a power hour in mathematics here in Berea, 25 percent of the 5th grade students demonstrated a full year of academic growth between the fall and the winter assessment cycles," Deevers says. "We test in the fall and spring, but the testing window is also open in the winter for those buildings that wish to more closely monitor student or program progress."


Fluid, ongoing small groupings of students like this one in Horry County, South Carolina, allow targeted instruction on very specific skills for short periods of time.

Flexibility

In Horry County several elementary schools have divided their 90-minute mathematics and 150-minute language-arts instructional blocks into two time periods, with one period focused on teaching curriculum standards at grade level to meet state assessment requirements and a second period set aside for students to be grouped according to proficiency level. To maximize the number of levels that can be addressed during the personalized instructional period, the schools enlist the help of a wide range of teachers, including school coaches, administrators, special area teachers, and retirees who return at designated times to serve as "master teachers." "The regrouping of students requires teams of teachers to work collaboratively to assume responsibility for an entire group of students, rather than just the 25 or so who would typically be assigned to a teacher," says Ambrose. "The regrouping also requires analysis of student achievement data, constant communication, and careful planning to make sure that differentiation of instruction occurs."

All this emphasis on grouping may seem a bit like the "olden days" when students were tracked by ability, but Seamon sees it quite differently. "There isn't the stigma that was attached to being in the 'Blue Birds' or whatever reading group was known to be the 'slow' one," she asserts. "Much of the day is spent in heterogeneous groupings, with students helping one another, and the needs-based groupings are for a short time, focusing on discrete skills."

Deevers agrees. "This is not tracking, but flexible grouping," he says. "The idea is to teach students where they are academically and to move them forward. With that in mind, we have to be open to the idea that students may move into or out of groups based on current levels of achievement. Our focus is on cultivating every student's potential, rather than defining every student's limitations."

While the examples of differentiated groupings described above mostly take place in an elementary school setting, the Horry County Schools, like Colorado's District 6, use the MAP benchmark assessments for secondary school students as well. Algebra I students in Horry County, for example, are grouped flexibly based on the NWEA data. To do this, the district increased the number of sections of Algebra I offered simultaneously, allowing students to be regrouped frequently.

Teacher Support

As described by Kleiber earlier in this article, the keys to building a data-oriented school culture of the type taking hold in District 6 schools include ease of use, an approach that allows educators to see results, and effective professional development. This year District 6 instituted weekly data meetings at each elementary school for each grade level. The district also hired a new data coach and 10 new instructional coaches who work directly with teachers to deepen the capacity of staff to understand and use the data for information-based instructional practice.

With teachers in his district coming to understand the concept of academic growth, Kleiber says new questions emerge. "They begin to ask, 'What does reasonable growth look like?' or, 'Are we seeing growth in the right areas?' and, 'How can I best hold high expectations and communicate those expectations to each of my students?'" he says. "These are great questions that open the door to managing their instruction more efficiently."

In the Clayton County Public Schools, elementary school teachers meet three times a week, during release time, to analyze data and plan for students. Each school has a lead teacher for math and a literacy specialist to support reading instruction. And in some cases, the decision has been made to pair seasoned "Boomers" who have teaching expertise but limited experience using technology with younger, more technology-savvy "Gen X-ers" for mutual mentoring.

It is also common for the formative assessment results to be used to help identify teachers who have strengths they can share with others. As Seamon puts it, "We like to recognize outstanding teachers for their performance and identify specific areas where they have had particular success so others can learn from them. We give teachers the opportunity to visit classrooms to watch successful colleagues in action."


Districts experienced with data mining for increased student achievement have found that flexible grouping allows instructors to isolate students needing help in a particular skill. Reports like this one from Acuity help streamline the process.

Principal as Model

Administrators play a crucial role in these schools when it comes to supporting assessment and guiding teachers. "Principals are the true instructional leaders," says Seamon. "We have three training sessions a year focusing on what principals need to model for their teachers when it comes to interpreting data. The superintendent meets with the principals regularly to review data and determine how to best allocate resources to help students learn."

In Clayton County principal meetings take place once a month, with more intensive professional development occurring in the summer. As Jackson explains, "It's a real 'train the trainer' approach with our principals serving as models. Assistant principals are now being added to the trainings so they can play a more central role in supporting teachers. We used to hire consultants to help the teachers learn how to make sense of data, but now we have the expertise in-house."

"Data is not a magic bullet," Jackson concludes. "However, it certainly is a powerful tool to help teachers individualize instruction and help us all work smarter at improving student performance."

Judy Salpeter is a consultant and contributing editor at Technology & Learning and its former editor in chief.

Glossary of Formative Assessment Terms

Formative Assessment: Assessment that provides feedback for the purpose of improving instruction. In formative assessment one measures the degree to which students know or are able to do a given learning task and identifies the areas in which further teaching is needed. Since it is intended to inform and guide teachers, it occurs throughout a course or school year.

Summative Assessment: A culminating assessment, at the end of a unit or activity, which gives information on students' mastery of content, knowledge, or skills.

Norm-referenced Assessment: An assessment in which an individual or a group's performance is compared to that of a larger "norm" group.

Criterion-referenced Assessment: A test to measure a student's progress toward mastery of specific learning objectives or standards. The "criterion" is the standard of performance established as the passing score for the test.

Benchmarks: Targets for performance; measurements of group performance against an established standard at defined points along the way.

Adaptive Tests: On computer-based adaptive tests, the difficulty of questions increases or decreases based on student performance along the way. In this way, the test is customized for each test taker. Although the tests cover the same mix of content for all test takers, no two people are likely to receive the same set of questions.
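
For readers curious how that branching works, the Python sketch below is a bare-bones illustration of the idea, not any vendor's algorithm; the item bank, difficulty levels, and scoring function are hypothetical placeholders.

import random

# Hypothetical item bank: difficulty level mapped to question identifiers.
ITEM_BANK = {
    1: ["q1a", "q1b"],
    2: ["q2a", "q2b"],
    3: ["q3a", "q3b"],
    4: ["q4a", "q4b"],
    5: ["q5a", "q5b"],
}

def run_adaptive_test(answer_correctly, num_items=5, start_level=3):
    # Move up one difficulty level after a correct answer, down one after a miss.
    level = start_level
    path = []
    for _ in range(num_items):
        question = random.choice(ITEM_BANK[level])
        path.append((question, level))
        if answer_correctly(question):   # stand-in for scoring a real response
            level = min(level + 1, max(ITEM_BANK))
        else:
            level = max(level - 1, min(ITEM_BANK))
    return path

# Simulated test taker who answers correctly about 60 percent of the time.
print(run_adaptive_test(lambda question: random.random() < 0.6))

Because each test taker's sequence of correct and incorrect answers differs, the path through the item bank differs as well, which is why no two students are likely to see the same set of questions.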

—J.S.

FORMATIVE ASSESSMENT DIRECTORY

The Center for Data-Driven Reform in Education (CDDRE)

CTB/McGraw Hill

Datawise

DIBELS

Edison Schools

ETS

Harcourt Assessment

Liberty Source (Tango)

NWEA

Pearson Assessments

Performance Pathways

Plato Learning

Princeton Review

Renaissance Learning

Scantron

SchoolNet

STI

TetraData

Wireless Generation