DATA: Maximize Your Mining, Part One

Over the last decade, schools and districts have become increasingly sophisticated in their collection, storage, and analysis of data. And with the rise of NCLB, the focus of data analysis has been trained largely on ways to help schools achieve Adequate Yearly Progress (AYP). The greater challenge, however, remains finding ways to harness data over the long term to raise student achievement in a consistent, sustained manner.

Three Stages to Maturity

Typically, schools move through three stages as they learn to link data to higher student achievement. Stage one consists of initial efforts to contextualize the many data sources available, stage two focuses on using data to maximize educational efficiency, and stage three represents a fundamental reorganization aimed at ensuring sustained higher levels of performance. This article describes best practices associated with stages one and two, with the goal of helping schools and districts accelerate their progress toward stage three. Part Two of this series, which will appear in our June Leadership Guide, will examine stage three best practices.

Stage One — Analysis for its Own Sake

Leaders of schools in the early stages of data-driven decision making understand that data analysis is important and helpful for planning and monitoring but have not yet developed sufficient skills for filtering data sources according to their relative usefulness. The focus falls more on the process of analysis than on the action plans that should follow from it.

Two years ago, I observed a school in stage one. It had a history of low performance and, as part of a reform effort, had committed to using data to improve student achievement. The leadership team was reviewing the state's high-stakes assessment results with the aim of comparing the school's percentages in each category to the statewide averages. Plugging the numbers into a spreadsheet, they determined their "strengths and weaknesses," identifying the three categories with the greatest discrepancies from state averages. As an action plan, they made those three areas the themes for special "skill of the month" activities for the coming year.

In many respects, this school had done precisely what we expect of institutions that employ good practices of data-driven decision making. They collected and organized data; they analyzed the data and agreed on findings; and they developed an action plan in response to their findings. Unfortunately, their action plan did not result in substantive gains. Though zealous in analyzing the data, they lacked the experience to understand its limitations and context, and thus its usefulness for their needs.

It is important to note that stage one is a necessary step in the development of a data-driven culture. The school just described was merely taking its first tentative steps in data-driven decision making. While this stage cannot be avoided, the goal should be to make it as brief as possible.

The following best practices can dramatically shorten the length of time spent in stage one:

1. Understand the limitations

The school described here used the following guiding principle: if numbers are reported, then they must be useful and informative. While all data is informative, most single data points do not lead directly to a useful set of findings. In this case, the school was looking at summative data describing the performance of the previous cohort of eighth-graders, who had since moved on to high school. The data's applicability to the current cohort is unclear without additional information comparing the two cohorts. Further limiting the usefulness of this data set was its lack of specificity. The report included only the average performance of all students in broad categories. There was no indication of the spread of student scores or the extent to which various subgroups contributed to the overall average. Given these limitations, planning effective actions from this data alone is impractical.

Too often, leaders of stage one schools drop the data onto their teachers' desks and say "analyze it" without providing professional development on how to interpret the data and understand its limitations. For the data to be useful, teachers must be trained to gauge its relative value and to combine and compare it with other, more specific and informative sources.

2. Understand the context

The school that I observed established action priorities by calculating the differences between the school's performance and state averages in various categories. The differences varied modestly among the categories. However, lost in this effort was the broader context that the school's performance substantially lagged the state average in all areas. Minor differences in performance among the various categories are not an effective indicator of relative need. In fact, all areas of the curriculum required major attention. An action plan that focused only on a few "weaknesses" would not result in significant future achievement gains because it would miss the bigger picture.

When analyzing a data set, it is also important to ask questions about its context. What is the big picture? Have there been recent changes to the school, the method of collecting data, or the individuals and groups from whom the data is collected? What data was collected, how often, by whom, and using what measures? Answering these sorts of questions will provide an understanding of the applicability of the data to a specific situation.

In this case, questions that would further help establish context would include: How do the state averages compare to expected proficiency levels in various categories? What is the appropriate comparison, state standards or state averages? To what extent does the school's performance parallel state averages in each skill area? What are the relative weights of the different categories on the state assessment?

3. Understand the implications for further data collection

Academic journals frequently include a section entitled "further research" in which they list questions or issues that require additional study. School leaders would benefit from using a similar technique as part of their data analysis routine. Every round of analysis generates implications for further data collection and analysis.

The action plan of the school I have described was based on schoolwide data rather than individual or subgroup results. In addition, they used only summative assessments to draw their conclusions. To address its needs effectively, this school must gather data at multiple levels, at shorter time intervals, about very specific student skills (see the sidebar "Drilling Down: Four Tips").

Stage one is a first and necessary step in data-driven decision making, but staying there for an extended period will breed frustration because results are not readily produced. Effective professional development is key to helping educators understand the context, limits, and implications of the data they analyze, and to moving them to stage two, where results are the payoff.

Stage Two — Analysis for Improved Efficiency

Many school districts and state departments of education have made considerable investments in data warehousing technology. They use this technology to combine information from disparate databases and perform sophisticated analyses of longitudinal student performance, demographic factors, and financial information. The primary purpose of these efforts has been improving large-scale operational efficiency by evaluating the effectiveness of various resource investments such as personnel or curricular expenditures and identifying priorities for additional investment.

While large-scale data analysis projects continue to drive policy decisions related to operational efficiency, NCLB has pushed educational organizations to focus more on educational efficiency. NCLB's rigid standards of accountability — articulated in percentages of students required to score proficient or above on high-stakes assessments — have forced schools to develop methods of accurately tracking the performance of groups of students. Organizations in stage two are analyzing data to affect smaller-scale, shorter-term decisions at the school and classroom level.

The overriding goal of stage two schools is to find the most effective ways to achieve AYP. They use data to answer questions such as: How many students do we need to score proficient or above on the state high-stakes test? How many students are currently performing at this level? Which students are on the margin between performance levels? How many of them can reach the standard if given appropriate special attention? How many students does each teacher need to "contribute" to reach the total?
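To make the arithmetic behind these questions concrete, here is a minimal sketch in Python. Every figure in it (enrollment, current proficiency, the target percentage, the number of classes) is hypothetical and invented for illustration; actual targets come from a state's own AYP calculations.

```python
import math

# A minimal sketch of stage two AYP arithmetic. All figures below are
# hypothetical; real targets come from the state's AYP tables.

tested_students = 240        # students in the tested grades (assumed)
currently_proficient = 132   # proficient on the latest practice test (assumed)
ayp_target_pct = 0.65        # share required to score proficient (assumed)
num_classes = 8              # classes covering the tested grades (assumed)

# How many students must score proficient or above?
needed = math.ceil(tested_students * ayp_target_pct)

# How many additional students must reach proficiency?
gap = max(0, needed - currently_proficient)

# Roughly how many must each teacher "contribute"?
per_class = math.ceil(gap / num_classes)

print(f"Needed: {needed}  Gap: {gap}  Per class: {per_class}")
```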

Schools in stage two use data to squeeze the most out of their students' performance and thereby get as many as possible over the next bar. They do everything they can to maximize the performance of students on the margin of moving to the next performance level. Schools on the knife-edge of AYP must get results now, and therefore the focus is on short-term action plans. Which "bubble students" need help on this week's mathematics topics? What was the impact of last week's Saturday School lessons? How many in the ELL subgroup reached the threshold on this month's practice test? For three keys to successful stage two schools, see "Meeting AYP Goals."

The Role of Formative Assessment Systems

Perhaps the most effective means of using data analysis to improve educational efficiency is through formative assessments. Several companies — including Edusoft, Chancery, NWEA, and Tungsten Learning (see "Formative Assessment Providers" below for a more complete sample) — offer excellent formative assessment systems. The best of these provide assessments tightly aligned to state standards, with individual test items designed to emulate those on the state's high-stakes tests. These systems deliver assessments on a frequent basis (monthly or quarterly), letting teachers monitor the performance of the students they are currently teaching and make adjustments in response to that data.

These sophisticated formative assessments provide a "neutral" source of aligned content at the appropriate performance level. This is particularly important in schools where there is a mismatch between the level of classroom teaching and assessment and the level of the material assessed by the high-stakes test. In chronically low performing schools, it is common for students who get A's and B's on classroom assessments to perform at or below basic on state high-stakes assessments. The problem is that teachers spend most of their time teaching below "grade level" and below the level assessed by high-stakes state tests.

Providers of benchmark assessment systems help overcome this challenge by providing schools with statistical analysis of how well their tests predict student performance on high-stakes assessments. A typical correlation might state that students who consistently score above 78 percent on the benchmarking system are almost certain to score proficient or above on the high-stakes test, and, conversely, that those who score below 62 percent are very likely to score below proficient. Schools use student performance on the formative assessments to identify marginal students, adjust instruction, and diagnose problems with the level of teaching or classroom assessment.
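As a rough illustration of how such cut points might be applied, the sketch below sorts students into bands using the hypothetical 78 and 62 percent thresholds described above; the names and scores are invented for the example.

```python
# Sketch: triage students against the (hypothetical) benchmark cut points
# described above. Names and scores are invented for the example.

students = {"Alicia": 84, "Brandon": 71, "Carmen": 66, "Deshawn": 58}

def band(avg_pct: float) -> str:
    """Classify a student's average benchmark score against assumed cut points."""
    if avg_pct > 78:
        return "likely proficient"
    if avg_pct >= 62:
        return "bubble"  # within striking distance of proficient
    return "likely below proficient"

for name, score in students.items():
    print(f"{name}: {score} -> {band(score)}")
```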

Stage two schools are committed to using formative assessment systems and implement structures to ensure that their teachers and teams carefully analyze the data collected. The most effective schools train teachers to use a highly ordered approach to analysis — assessment by assessment and item by item. Structured analysis typically takes the form of a series of questions that teachers ask as they review the data. (See "Structured Analysis Method.")

Glass Ceiling for Stage Two

While the structured analysis method can help stage two schools dramatically improve their educational efficiency, there remains a threatening cloud on the horizon. What happens when efficiency has been maximized, when all of the students within striking distance of proficiency are performing at their very best? Inevitably, focusing most effort on the marginal students will cease to be a sufficient strategy as NCLB standards continue their inexorable climb to 100 percent proficiency by 2014. From this longer perspective, we can see that data analysis focused on improving efficiency works mostly at the edges of the problem. To move beyond stage two, schools must get to the core of the issue.

In the next article in this series we will explore how schools can move to stage three: analysis for sustained achievement. In this stage, schools use data analysis and organizational reform to help all students make sustained gains in achievement. It is about establishing structures and building systems that fundamentally change the mindset of the school from one where some or most students can meet the standards to one where all students can.

Todd McIntire is a vice president at Edison Schools, Inc. He is currently involved in developing Edison's new division in the United Kingdom.

Drilling Down: Four Tips

The following are specific data points necessary to create an effective action plan.

1. Understand the performance of the current cohort in comparison to the previous and the next cohorts.

2. Collect information about individual student performance.

3. Identify specific skill areas that teachers are failing to secure.

4. Know whether teachers are teaching below the level of the standards being assessed.

Data-driven decision making is an iterative process, with each round of findings and additional data collection moving you closer and closer to the core issues.

Meeting AYP Goals

Successful stage two schools apply the following methods to measurable subgroups as required by NCLB.

1. Move from percentages to numbers to names.

NCLB goals that are presented as percentages of students needed to meet various thresholds are unnecessarily abstract. Teachers need to understand the precise meaning of the goal for their classes and the individuals in their classrooms. Stage two schools convert their AYP goals from the percentage of students needed to reach proficiency, to the number of students in the tested grades who need to meet the target, to a list of the names of the students targeted to be in that number (as sketched below). Teachers know which students in each of their classes must perform to proficiency in order for the school to reach its goals.
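Here is a short sketch of this percentages-to-numbers-to-names conversion, under stated assumptions: the roster, the scores, the 65 percent AYP target, and the 78 percent proficiency cut point are all invented for the example.

```python
import math

# Sketch: turn a percentage goal into a count, then into named students.
# Roster, scores, target, and cut point are all hypothetical.

roster = {
    "Alicia": 84, "Brandon": 71, "Carmen": 66,
    "Deshawn": 58, "Elena": 73, "Felix": 49,
}
target_pct = 0.65      # assumed AYP threshold
cut_point = 78         # assumed score predicting proficiency

needed = math.ceil(len(roster) * target_pct)            # percentage -> number
secure = [n for n, s in roster.items() if s >= cut_point]

# Names: the students nearest the cut point who must also get there.
candidates = sorted(
    (n for n in roster if roster[n] < cut_point),
    key=roster.get, reverse=True,
)
targeted = candidates[: max(0, needed - len(secure))]

print(f"Need {needed} proficient; already secure: {secure}")
print(f"Target by name: {targeted}")
```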

2. Focus extra effort on marginal students.

When trying to improve efficiency, we typically start by adjusting the factors that provide the greatest impact with the least effort. In education, this means focusing on students who are on the verge of reaching the next proficiency level. Schools in stage two go to great lengths to identify these students and focus interventions on them. They use practice assessments to determine the performance level of each student and identify the "bubble students." Then they focus discretionary resources on moving these students securely into proficiency. Teachers know the names and performance levels of each of these "bubble" students and have classroom-level plans to get them to the next performance level. From an efficiency perspective, these students represent the slack in the system, and getting them to perform at their best will make the greatest impact on helping the school reach its NCLB goals.

3. Track student performance and adjust accordingly.

Knowing the target and identifying the students who will help the school reach it are important first steps, but to gain high levels of efficiency, teachers must understand the precise areas of need of their students. Stage two schools use formative assessments, from teacher observations to electronic benchmark testing systems, to regularly measure student performance against known thresholds. They track baseline performance, rates of progress, and longitudinal patterns. Teachers analyze the data from these assessments to diagnose the specific needs of each class and student, adjust the pace of instruction, and review and re-teach where required.
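One plausible way to track baselines and rates of progress is a simple straight-line projection, sketched below with invented scores; the linear model and the 78 percent cut point are assumptions of the example, not a method prescribed above.

```python
# Sketch: project a student's benchmark trend to the state test date.
# Scores, timing, and the cut point are invented; the straight-line
# projection is an illustrative assumption, not a recommended model.

scores = [55, 60, 64, 67]   # monthly benchmark percent-correct (assumed)
months_left = 4             # months remaining before the state test
cut_point = 78              # assumed proficiency threshold

baseline = scores[0]
monthly_gain = (scores[-1] - baseline) / (len(scores) - 1)
projected = scores[-1] + monthly_gain * months_left

status = "on track" if projected >= cut_point else "needs adjustment"
print(f"Baseline {baseline}, gain {monthly_gain:.1f}/month, "
      f"projected {projected:.0f} vs cut {cut_point}: {status}")
```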

Structured Analysis Method

The most effective schools train teachers to ask the following about student assessment data.

1. Have I taught the content assessed by this item?

a. If yes, go to question two.
b. If no, is there anything to be learned? If the students performed well, can I reduce the amount of time I will spend when we get to this topic? How can I use this data as a baseline? Go to question five.

2. Did the students perform as well as I expected?

a. If yes, what are my expectations for performance on this item? What other assessment data do I have to establish my expectations? Go to question five.
b. If no, go to question three.

3. What do their attempts at answers tell me?

a. Are the students guessing the answer?
b. Do the students have a misconception (as evidenced by selecting the same distracter) and are they solving the problem the wrong way? (See the distracter-analysis sketch following this list.)

4. What do I need to do to improve the students' performance?

a. Is a brief review sufficient?
b. Do I need to re-teach the concepts with different methods or from another perspective?
c. If the students have a misconception, how can I "un-teach" it and help them create proper understandings?

5. What actions will I take in the next day, the next week, and the next month to act on the findings of this analysis?

a. How will I reassess this content after I have made the adjustments suggested by this analysis?

Often teachers will answer these questions using a standard analysis form that can be shared with other members of a teaching team or used by managers to monitor compliance and the quality of analysis.
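To make question three concrete, one common check is a distracter analysis: if most wrong answers cluster on a single option, a shared misconception is likely; if they spread evenly, students may be guessing. The sketch below uses an invented response pattern, and the two-thirds clustering threshold is an assumption for illustration, not part of the method described above.

```python
from collections import Counter

# Sketch for question three: do wrong answers cluster on one distracter
# (a shared misconception) or spread out (guessing)? The responses and
# the two-thirds rule of thumb are invented for illustration.

responses = list("BBCBDBBABBCB")   # each student's chosen option (invented)
correct = "A"

wrong = Counter(r for r in responses if r != correct)
top_choice, top_count = wrong.most_common(1)[0]

if top_count / sum(wrong.values()) >= 2 / 3:   # assumed threshold
    print(f"Likely misconception: most wrong answers chose {top_choice}")
else:
    print("Wrong answers are spread out; students may be guessing")
```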

Formative Assessment Providers

CompassLearning Odyssey Manager
www.compasslearning.com
(800) 422-4339

Houghton Mifflin Edusoft
www.edusoft.com
(866) 4-EDUSOFT

Levings Learning PassPlan
www.levingsco.com
(877) 281-4561

McGraw-Hill
Yearly ProgressPro
www.mhdigitallearning.com

NWEA
Measures of Academic Progress (MAP)
www.nwea.org
(503) 624-1951

PLATO Learning eduTest Assessment
www.plato.com
(800) 44-PLATO

Princeton Review Homeroom.com
www.homeroom.com/faq/index.asp
(800) REVIEW-2

Renaissance Learning StandardsMaster
www.renlearn.com
(866) 492-6284

Scantron Classroom Wizard
www.scantron.com
(800) 228-3628

Tungsten Learning Benchmark Assessment System
www.tungstenlearning.com
(866) 801-7683