For some districts, the current obsession with data grows out of the need to comply with No Child Left Behind and other accountability mandates. For others, it dates back to well before the phrase "data-driven decision making" rolled so frequently off the tongues of educators. In either case, there is no denying that an integral part of the business of K-12 education today is to collect, manage, analyze, and learn from a wide array of data. In response, the past few years have witnessed an explosion of technology-based tools, consulting services, professional development opportunities, and other resources designed to help schools move beyond being data rich but information poor.
Robert Ewy, former director of planning for Community Consolidated School District 15 in Illinois, describes his district's situation before they began implementing a data warehousing solution three years ago. "Data was scattered across the district, mostly in paper form," he says. "It was difficult to find or access when it was needed, and usually required too much time and effort to analyze. If having the right data available at the right time to influence the decision-making process is a primary criterion, we failed miserably."
Before investing in a new solution, it was crucial for the district to understand how they were going to make use of the data. Asking the right questions starts with a district's mission and goals. For CCSD 15, this meant looking at organizational effectiveness, with guidance from the Baldrige Education Criteria for Performance Excellence, and focusing on student performance targets set by the board of education.
Ewy explains, "Each target was important to a student's future success and each was, in theory, measurable — except that we had not established a method for measuring them. This led us to identify 19 questions we wanted to answer through data analysis. Examples included questions about the characteristics of students making the most dramatic gains and losses on local, state, and national tests; how students who have been in attendance in the district for varying amounts of time perform academically; and how well second language and special education students achieve after exiting the program." CCSD 15 contracted IBM to build an educational data warehouse that would make it easy to answer these and many other significant questions.
A Closer Look at Student Achievement
Tracking student achievement clearly requires multiple sources of data gathered at multiple points in time. Administering standardized exams once a year, with results delivered months after the tests were taken, is far from sufficient. As Dr. Gregory Decker, principal of Lead Mine Elementary in Raleigh, N.C., puts it, "Receiving test data in July is like driving a school bus looking out the rearview mirror. I can see where my students have been but I cannot see where we are going."
Instead, he insists, "We need to assess student learning and collect real-time achievement data on a continuum — quarterly, monthly, weekly, and even daily." At Lead Mine and many of the other schools represented by the education leaders interviewed for this article, assessments and benchmarks are being used several times during the year to measure how students are doing.
A number of districts are administering homegrown assessments using pencil and paper, and then digitizing and incorporating them into achievement-oriented databases. Others are using computer-delivered adaptive tests such as those from the Northwest Evaluation Association, which not only give a more accurate reading by adjusting in difficulty based on a student's responses, but also offer immediate results to teacher and student. State test scores, grades, comments from school report cards, rubric scores of student performance, and records from computer-managed courseware all round out the picture.
With such a wealth of performance and achievement data and the right digital tools to access them, it is possible to drill down to information about the exact skills individuals have mastered or need help with. Tools for disaggregating and grouping also make it possible to identify groups of students who need special interventions, draw conclusions about curriculum areas that are particularly strong or weak, and generally shape instructional programs based on achievement data.
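The drilling down and disaggregating described above amounts, at its core, to grouping assessment records by subgroup and skill and flagging weak areas. As a minimal sketch of the idea (the records, column names, and 75-point mastery threshold here are all invented, and none of the commercial tools named in this article necessarily work this way):

```python
from collections import defaultdict

# Hypothetical records: one row per student per skill assessed.
records = [
    {"student": "A", "group": "EL",     "skill": "synonyms",  "score": 62},
    {"student": "B", "group": "EL",     "skill": "synonyms",  "score": 70},
    {"student": "C", "group": "non-EL", "skill": "synonyms",  "score": 88},
    {"student": "A", "group": "EL",     "skill": "exponents", "score": 91},
    {"student": "C", "group": "non-EL", "skill": "exponents", "score": 85},
]

def disaggregate(records, by, threshold=75):
    """Average score per (subgroup, skill); flag averages below threshold."""
    totals = defaultdict(list)
    for r in records:
        totals[(r[by], r["skill"])].append(r["score"])
    return {
        key: {
            "mean": sum(scores) / len(scores),
            "needs_intervention": sum(scores) / len(scores) < threshold,
        }
        for key, scores in totals.items()
    }

report = disaggregate(records, by="group")
```

The same grouping function can be pointed at any demographic field, which is what makes disaggregation by gender, language status, or years in the district a one-line change rather than a new report.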
The Title I program at Rowland Unified School District in California has shifted focus with help from technology. High school intervention coordinator Sallie Paul uses the Edusoft data management tool, among others, to examine performance for all students down to substandards, and design targeted programs. Students who need help are assigned to short, intensive interventions focusing on specific skills such as synonyms or exponents. In addition, school-wide strategies have been developed for areas where many students are having difficulty. The targeted interventions are evidently paying off; over a two-year period, the high school has seen more than a 20 percent increase in student pass rates for each subject on the California High School Exit Exam.
Looking at Data Over Time
A key aspect of data-driven decision making involves looking at information over an extended period of time. This longitudinal approach allows schools to monitor trends, track the effectiveness of interventions, or determine what happens to groups of students as they move from one setting to another.
A longitudinal perspective also serves the needs of individual students. According to Michael Greenfield, director of instructional technology for Harrison Central School District in New York, teachers frequently work in isolation with little knowledge about the history of the students they teach or what happens to them after they leave their classroom. "While teachers use disaggregated exam results to focus on individual student needs, they are unlikely to look at trends across classes, buildings, or the region. An apt aphorism is that they can't see the forest for the trees," he says.
Greenfield was impressed during a visit to Pearl River, N.Y., a district that, like Illinois' CCSD 15 described earlier, has been honored as an education winner of the Malcolm Baldrige National Quality Award. There he observed a group of elementary school teachers from several different grades sharing information about fourth-grade students. "Often," Greenfield says, "the exam-related pressure has fallen unfairly on the fourth-grade teachers because their students have historically been the ones required to take the New York state assessment test. But it is actually a formative test, a benchmark to measure progress toward graduation. It makes sense that third- and fifth-grade teachers should also view each of these students as 'theirs.'"
As data management tools mature and districts amass integrated information from a growing number of years, this sort of longitudinal analysis will become increasingly easy to conduct — yielding important results for shaping school programs.
In Mamaroneck, N.Y., an analysis of math assessment data, student placement in middle school math classes, and quarterly grades helped educators identify students who should have been placed in more rigorous courses of study as they transitioned from fifth grade to middle school. This analysis resulted in a change of placement criteria and process for the district and the establishment of a committee of fifth- and sixth-grade math teachers who meet monthly to discuss students and transition issues.
A Multidimensional Perspective
The newest data management tools allow information from many different sources, including ones that have not traditionally been viewed as achievement-related, to be accessed at the same time, facilitating multidimensional analysis. The most obvious example of this is the integration of achievement data with demographic information from a school's student information system, making it possible to sort results by gender, race, languages spoken at home, school lunch status, numbers of years in the system, and much more.
Another information source that is growing in importance as a tool for data-driven decision making is the survey. Dr. Jim Angermeyr, director of research and evaluation for the Bloomington Public Schools in Minnesota, explains, "We collect survey information from parents and students annually on a variety of satisfaction measures. Schools use this information to determine whether they are successfully meeting parent and student expectations, providing a safe and orderly environment, as well as a place where students feel they are connecting with teachers."
Additional sources of data include special education files, personnel and professional development records, disciplinary reports, library records, and financial data. Such records make it possible to draw a variety of conclusions about individual programs, as well as to explore correlations: for example, between a program's cost and its effectiveness, or between teacher participation in a particular professional development program and student performance in the targeted curriculum area.
At Paxton Keeley Elementary School in Columbia, Mo., data from the district's TetraData warehouse showed a strong correlation between students who had been in the district less than two years and low reading scores. "Based on the finding," says principal Elaine Hassemer, "we decided it was important to put in place more support for new students." The school also uses the SWIS II database to keep track of discipline records, including time and types of referrals. After observing a spike of aggression-related referrals from the playground one month, the school decided it was time to reteach equipment routines. The result was a dramatic drop-off in the number of problems.
Clean and Accurate Data
For technology leaders committed to supporting data-driven decision making in their districts and schools, there are a variety of technical challenges to overcome, many of them involving availability and reliability of data. Some examples include the need to digitize large quantities of information, including, in some states, test results; difficulties cleaning up data from multiple sources so that they are compatible and nonredundant; and various entry and accuracy errors that can lead to incorrect conclusions.
Katie Lovett, chief information officer for Fulton County Schools in Atlanta, Ga., asserts that it is difficult but essential to develop validation processes, procedures, and definitions "to deliver reliable data that users trust. After all," she says, "we seldom get a second chance for user buy-in." The need for accuracy and for viewing all data with a critical eye became clear when Adequate Yearly Progress results were released in Lovett's state last year. "Fulton County Schools had the data available to quickly refute the results," she explains, "and some schools were subsequently removed from the Needs Improvement list."
There is also the more subtle question of curriculum alignment, or ways of establishing whether the assessments actually measure what we think they do.
The thorough curriculum mapping project undertaken by Lead Mine Elementary before adopting their technology solutions involved detailed questions such as where students need to be at the end of the second quarter of third grade in order to succeed on the end-of-grade test. To measure progress at this level of detail, the school developed internal benchmarks and adopted managed software solutions such as Pearson's SuccessMaker. Principal Decker explains that his school provided Pearson with student results on the end-of-grade test for 2000 as well as data on their performance in the courseware, and the company performed an on-target analysis showing the statistical relationship between the two. "Now," says Decker, "we can forecast when a student needs to reach a specific courseware level and see the relationship of that level to achievement on the end-of-grade test. We incorporate this information in our quarterly benchmarks, and use it to drive our day-to-day decision making."
Building an Information Culture
The most important element of an effective data-driven program is not the data, the analytic tools, or even the curriculum framework on which data analysis is based; rather, it is the school culture in which the data inquiry takes place.
For starters, it takes a strong district-wide commitment, or as Pearl River's director of quality and community relations Sandra Cokeley Pedersen puts it, "Insisting upon it. If people don't have the data to back up their plan or evaluation, make them come back with it. Nothing happens in Pearl River without data analysis first."
Mary Tribbey, data coordinator for the Butte County Office of Education in Oroville, Calif., says that it is crucial for administrators to be on board, and to make time and resources available to the teachers, even in cash-strapped times.
Professional development for teachers, led by staff who are deeply grounded in curriculum, is another essential element. According to Jim Hirsch, associate superintendent for technology for the Plano Independent School District in Texas, "Teachers and other staff members need to understand the power of collecting, organizing, and using information to help a student's achievement. Unless the staff is convinced the information received is worth their extra effort, data warehousing will not pay off."
Joy Rose, principal of Westerville South High School in Westerville, Ohio, describes the goal as a "data mind-set." "We are continually asking what data we have to support this hunch and what data we need to determine a decision," she says.
Rose and several other administrators interviewed for this article emphasize the importance of creating a positive atmosphere in which data is used to support, not to punish. "So much depends on the tone set by the building principal," says Jim Angermeyr. "I would never advocate using test data and school improvement results in a punitive way or as part of a merit pay plan. When the principal approaches the whole data-driven process as a professional activity in which everyone is intent on improving teaching and learning practices, I have seen strong positive results."
As Robert Ewy puts it, "When educators are challenged to meet high academic expectations, they will want to use data to help them better understand the dynamics of student performance in their schools. Data without a purpose will do nothing. Data with a purpose can create miracles!"
At the beginning of each school year, the principals in the Norwalk-La Mirada Unified School District in California come together for a retreat during which goal-setting based on data analysis is a key component. During the school year, they meet twice a month to continue their work around data-driven decision making. Weekly staff development sessions for teachers, as well as workshops at other points during the year, focus extensively on data and its use. There is a strong emphasis on allowing principals and teachers to discover collaboratively the benefits of using data, and on encouraging teachers and schools to share their successes with one another.
In researching this article, we interviewed 20 school administrators and technology leaders who are deeply involved in data-driven decision making. A special thank-you to all of them.
Judy Salpeter, former editor-in-chief of T&L, now serves as a consultant, freelance editor and program chair for Technology & Learning Events.
Read other articles from the March Issue
Elements of a Data-Driven System
This illustration is based on two figures from Making Sense of the Data by Eduventures, Inc. For more information regarding Eduventures' research and advisory services, visit www.eduventures.com.
A comprehensive system for data management, analysis and reporting might consist of the following:
Source Databases

As illustrated, these typically include student information systems, human resource records, financial databases, and assessment data from sources such as state tests, benchmark assessments, and instructional management software. In addition, specialized databases with information on individual education plans for special education, disciplinary referrals, professional development and teacher certification, technology support help line calls, community survey results, and library circulation can all play an important role in data-driven decision making.
Consultants and Service Providers

Districts investing in data-driven decision making will often supplement internal resources with consultants and other service providers who can offer help with professional development, needs assessment, data analysis, and system planning. In addition to the companies in this arena, many nonprofit associations and consortia also offer benchmarks, advice, downloadable tools, and other data-related resources.
Data Warehouse

One of the newest and highest-profile data management tools in the education world, the warehouse is a central storage area for data pulled from the various databases. An extraction, transformation, and loading (ETL) tool is typically used to bring validated and consistent data into the warehouse. In some systems, the data warehouse will be used to spin off smaller subject-specific databases, called data marts, for reporting purposes.
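At its simplest, the extraction, transformation, and loading step pulls rows from source systems, normalizes and validates them, and writes the survivors into the warehouse. A toy sketch of that pipeline, with invented student records and SQLite standing in for a real warehouse (commercial ETL tools handle far more, but the shape is the same):

```python
import sqlite3

# Toy "source system" rows with inconsistent formats, a common ETL headache.
source_rows = [
    {"id": "001", "name": "Smith, Pat ", "grade": "4"},
    {"id": "001", "name": "Smith, Pat",  "grade": "4"},    # duplicate record
    {"id": "002", "name": "Lee, Sam",    "grade": "five"}, # fails validation
]

def transform(rows):
    """Normalize, validate, and de-duplicate rows before loading."""
    seen, clean = set(), []
    for r in rows:
        if not r["grade"].isdigit():
            continue  # reject rows that fail the validation rules
        if r["id"] in seen:
            continue  # drop duplicates on the student id
        seen.add(r["id"])
        clean.append((r["id"], r["name"].strip(), int(r["grade"])))
    return clean

# Load: only validated, consistent rows reach the warehouse table.
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE students (id TEXT PRIMARY KEY, name TEXT, grade INTEGER)"
)
warehouse.executemany("INSERT INTO students VALUES (?, ?, ?)", transform(source_rows))
warehouse.commit()
```

The point of doing this work up front, rather than in each report, is that every downstream query sees one cleaned, agreed-upon version of the data.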
Decision Support Tools
These newer, more specialized utilities, designed to play a prescriptive role, are described in Making Sense of the Data as "provid[ing] recommendations, real-time alerts, and automatic actions for administrators, teachers, and staff."
Data Analysis Tools
This category includes a wide array of technology-based tools for statistical analysis, forecasting, graphing, and highlighting trends. In addition to special-purpose tools, spreadsheets and other common applications already in use in the district can play an important role in data analysis.
Reporting Tools

The ability to create customized, formatted reports of various sorts is built into most of the data tools, including the data warehouse itself.
It is important to note that it is very possible to engage in data-driven decision making without assembling all of the tools described here. Many districts employ homegrown data solutions that use Access, Excel, or other common tools to pull directly from the source databases, skipping the warehouse altogether.
Data Management Solutions Tailored to Education
EDmin Virtual EDucation (www.edmin.com)
IBM Insight at School (www-1.ibm.com/industries/education)
Otis Education Systems (www.otised.com)
Public Consulting Group (www.pcgus.com)
Sagebrush Analytics (www.sagebrushcorp.com)
SAS Education Performance Management (www.sas.com/govedu/education/edperfman.html)
Scholar Inc. (www.scholarinc.com)
Turnleaf AMS Enterprise v3 (www.turnleaf.com)
Business-Oriented Data Solutions Used by K-12 Customers
Computer Associates (www.ca.com)
Confluent Technologies (www.confluenttech.com)
SPSS Inc. (www.spss.com)
Data-Related Professional Development and Support for Schools
Baldrige National Quality Program (www.baldrige.nist.gov)
Co-nect Dataflow (www.co-nect.net/services/dataflow.shtml)
Consortium for School Networking (3d2know.cosn.org)
Education for the Future (eff.csuchico.edu)
New American Schools (www.naschools.com)
Pulliam Group (www.pulliamgroup.com)
Other Pieces of the Picture
Although this directory does not list individual companies and organizations in the following areas, products of these types can play a crucial part in data-driven decision making.
- Tools for Data-Driven Decision Making: Student Information Systems (www.techlearning.com/db_area/archives/TL/2003/06/tools.html)
- Instructional Management and Assessment Programs (www.techlearning.com/story/showArticle.jhtml?articleID=16000696)
- Additional tools for administering surveys, scanning test results, tracking special education plans, monitoring discipline referrals, and more.
More School/District Examples
Here are some additional examples of the ways data-driven decision making is being implemented in schools and districts around the country.
In Prince William County, Virginia, the district math supervisor took a closer look at results from several elementary schools that were, on the whole, performing well on the state math test and found a surprising number of fifth graders who were failing certain portions of the test. She requested money to provide an upper elementary math intervention program and used the data that she pulled from the data warehouse to support her grant application. A similar approach has been taken by school principals who have used the data to identify students who belong in reading intervention programs based on their performance on state reading tests.
At Lead Mine Elementary, Principal Decker creates spreadsheets several times a year with student results from internal benchmarks, SuccessMaker lessons, and other sources. Teachers use these to determine which students need extra challenges, who is on target, and who needs remediation. The school's improvement team meets three times a year to examine the data and set priorities. Two years ago, the team concluded that interventions were lacking for fourth and fifth graders who were reading below grade level. An upper-elementary reading intervention teacher was hired, and students who needed the extra help have improved dramatically.
In Bloomington, Minnesota, and Florence County, South Carolina, disaggregated data helped educators focus on the fact that far too many black male students were scoring below grade level. Dr. Jim Angermeyr, Bloomington's director of research and evaluation, explains, "We discovered that students entered our schools in kindergarten with this learning gap and that while students of color were evidently making clear progress from year to year on growth measures, they were not gaining enough ground to close the gap." In both districts, interventions and teacher education programs have been undertaken to address the problem head-on. In Florence County District Three, for example, a number of teachers have been sent to training sessions with Ruby Payne, an expert on addressing the needs of students in poverty. Preliminary results in both districts are promising.
"It wasn't until our district hired a director of assessment seven years ago that we began to focus not just on how students compare to norms but on their growth over time," says Mark Schneider, assessment specialist for the Norwalk-La Mirada Unified School District in California. Using NWEA assessments, administered several times a year, the district now develops annual class assessment and grade level profiles to help principals and teachers measure progress. LaVaun Dennett, administrator of curriculum, assessment, and instruction, says, "It seems particularly important to me for teachers to be able to look at the progress students have made in their particular class. When we give them class data at the beginning of the year and they see each student's growth at the end of the year, they have to take responsibility for what happened." The profiles are also used to transition from one year to the next. Teachers meet each fall in grade-alike groups to decide how to group students and allocate resources, based on data, to increase student achievement during the coming year.
In Chappaqua, New York, a few years ago, then-director of technology Michael Greenfield headed up an action research study focusing on student performance on the mandated fourth grade New York State exams. "This was a grassroots way of developing our skills at working with data," he says, "while addressing teacher concerns about student performance." A group of fourth grade teachers and elementary administrators met a number of times to discuss the performance of a group of students who were not scoring well on the tests. The goal was to find patterns and trends. It became clear that, although many of these same students had been in early intervention programs when they were in kindergarten and first grade, inadequate data had been gathered about their progress after they stopped receiving the special services. This made it difficult to determine how the early intervention program informed fourth grade performance or to track other factors that might have contributed to progress in the intervening years. The data analysis experience turned out to be a powerful professional development opportunity, Greenfield says, and helped inform the development of a data warehouse project that would improve the ability to do longitudinal assessment in the future.
With five years of trend data in their Educational Data Warehouse (EDW), Illinois' CCSD 15 is now able to track cohorts to determine what educational value certain programs and approaches add over time. "More importantly," Robert Ewy explains, "we can identify subgroups within each cohort that are not as academically successful and focus scarce resources in very specific areas to create academic success for various subgroups of students." Ewy reports that, because of interventions resulting from this information, the district's special education and second language learner achievement rates are much higher than state and national averages.
Tracking Other Types of Data
In the San Jose Unified School District in California, data concerning student demographics have played an important role in shaping intervention programs. As part of a three-year plan focusing on the achievement of high standards in reading for every student—and, specifically, those students for whom achieving high standards was elusive—Castillero Middle School took a close look at a number of data elements, including where students were born. They found that Hispanic students born in Mexico and Hispanic students born in the United States required different levels of academic support services. This allowed the school staff to target resources and support more effectively for the students. "All facets of the school program, including staff assignments, resource allocations, governance structures, best teaching practices and school policies, were realigned to support these reform efforts," says continuous improvement programs supervisor Marcy Lauck of Castillero's reading-improvement efforts. "Teams of teachers, students and parents continue to engage in innovative action-research, ongoing data analysis and training to make their vision a reality."
Joanne Good, IT Manager for the Charles County Public Schools in Maryland, looks forward to the completion of a new district-wide data warehouse that will house a broad range of information in a single, linked location with an easy-to-use interface. In addition to information on student achievement, schedules and demographics, the data warehouse (which is being built with help from Oracle and IBM) will track suspensions, teacher qualifications and financial statistics, allowing educators to look for an array of correlations.
According to Jim Hirsch, associate superintendent for technology in the Plano ISD in Texas, tracking a broad range of data allows for many important correlations to be made. He offers some examples: "Attendance data has been correlated to student achievement. Help Desk records showing availability of technology resources have allowed us to identify areas of greatest need for updates and related support. Data relating to actual use of technology resources has enabled us to provide support to those teachers who have had more challenges in using those resources with their students. And data about the experience level of teachers in various schools has allowed us to focus on providing high quality teachers to those student populations that need it most."
Two of the nation's largest school districts, the Montgomery County Public Schools in Maryland and the Chicago Public Schools in Illinois, have taken the lead in using data management tools to monitor staff development. Both districts have built databases to keep track of teacher qualifications and professional development activities. According to Montgomery County CIO John Porter, MCPS's new professional development module will provide valuable input to the staff development office while helping district administrators track progress on government mandates related to teacher qualification and certification. Chicago Public Schools CIO Robert Runcie explains how his district's new Web-based Educator Qualification System will manage the process of tracking the qualifications of about 33,000 teaching professionals. "Teachers can log in and see what's in their records and request corrections. The record then moves on to the principal, who can accept the corrections and view other information on teacher progress and qualifications." The database will be used district-wide to track progress toward NCLB requirements and generate letters to parents whose children are being taught by teachers who are not yet meeting the requirements.
Accuracy of Data
In California's Rowland Unified School District, much energy has gone into ensuring the reliability of homegrown exams. The questions district assessment leaders ask themselves are as specific as "Does this benchmark, administered by this teacher in the third month of this school year, accurately predict how that student will do on end-of-the-year exams?" Director of assessment Tony Wold says that the district has worked closely with Edusoft programmers to create a more advanced item analysis tool that gives Kuder-Richardson (KR-20) reliability scores for each question asked. The reliability analysis looks not only at the overall question but at each of the answer choices to determine if they are asking precisely what needs to be asked. Using this tool, district leaders have analyzed every internally developed benchmark for reliability and commissioned task forces to amend those that needed to be adjusted or rewritten. Wold casts an equally critical eye on statewide assessments. In some recent cases, he has found compelling evidence that the data being reported were inaccurate. When this happens, he says, "We have to make a case for ignoring the state data until we can have confidence in its accuracy."
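The statistic behind this kind of item analysis, Kuder-Richardson formula 20, measures the internal consistency of a test made up of right/wrong items. As a rough illustration of the formula itself (with entirely made-up 0/1 response data, and no claim about how Edusoft's tool is actually implemented):

```python
def kr20(item_matrix):
    """Kuder-Richardson formula 20 for dichotomous (0/1) item responses.

    item_matrix: one list per student, with a 0 or 1 entry per test item.
    Returns a reliability coefficient; values near 1.0 indicate items
    that consistently measure the same underlying skill.
    """
    n = len(item_matrix)     # number of students
    k = len(item_matrix[0])  # number of items
    totals = [sum(row) for row in item_matrix]
    mean_total = sum(totals) / n
    # Population variance of the total scores.
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    # Sum of p*q over items, where p is the proportion answering correctly.
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in item_matrix) / n
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

# Four students, three items that agree perfectly with one another.
reliability = kr20([[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]])
```

Running the same computation per item (dropping one item at a time and seeing how the coefficient moves) is one common way to find the specific questions and distractors that need rewriting.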
Katie Lovett, chief information officer for Fulton County Schools in Atlanta, Georgia, says: "I have been heavily involved in building a data warehouse in two of the largest school systems in the state. In both districts, I have seen the challenges and frustration around lack of data quality." She advocates setting clear data validation processes and procedures; agreeing upon and documenting data definitions; and establishing business rules regarding the conditions of the data being used at a specific point in time.
According to Robert Ewy, CCSD 15's use of SPC software allows them to determine the capability and stability of systems and processes. "For example," he says, "we use certain control charts to analyze our mathematics and reading programs to determine if they are capable of achieving our student performance targets and if they are stable over time. This analysis allows the department of instruction to focus strategy development and resources in very specific ways. This has been a significantly new and important addition to our data-based decision making process."
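The control charts Ewy describes come from statistical process control. One of the most common is the individuals (XmR) chart, which flags measurements falling outside limits placed roughly three standard deviations from the mean, with the spread estimated from the average moving range. A minimal sketch, using invented weekly percent-proficient readings rather than anything from CCSD 15's actual software:

```python
def individuals_chart(values):
    """XmR (individuals) control chart limits, a basic SPC tool.

    Limits are the mean plus/minus 2.66 times the average moving range;
    2.66 is the standard SPC constant 3/d2, with d2 = 1.128 for
    moving ranges of two consecutive points.
    """
    mean = sum(values) / len(values)
    moving_ranges = [abs(a - b) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = mean + 2.66 * mr_bar  # upper control limit
    lcl = mean - 2.66 * mr_bar  # lower control limit
    # Indices of points outside the limits: signals worth investigating.
    signals = [i for i, v in enumerate(values) if not lcl <= v <= ucl]
    return {"mean": mean, "ucl": ucl, "lcl": lcl, "signals": signals}

# Hypothetical weekly percent-proficient readings, with one unusual week.
weekly = [72, 74, 73, 75, 74, 73, 45, 74]
chart = individuals_chart(weekly)
```

Points inside the limits are treated as ordinary variation in a stable process; a point outside them, like the seventh week here, is the kind of signal that prompts a closer look before resources are redirected.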
Professional Development and Learning From One Another
According to Mary Tribbey, data coordinator for the Butte County Office of Education in California, "Linking student performance results to teacher professional development has helped many of our schools. For example, when student results point to a specific need, we examine who on staff has knowledge and skills in that instruction or that strategy." The district provides professional development for a number of different aspects of data-driven decision making, including data analysis, designing effective assessments, and developing subject area strategies and coaching to improve instruction. Tribbey says that training teachers in the use of the data management software is essential, but takes relatively little time compared to the other crucial aspects of data-related professional development. For help infusing data-driven decision making into the local school culture, the Butte County Schools have worked closely with professional experts such as Robert Marzano, Richard Stiggins, Dennis Fox, Victoria Bernhardt, and Thomas Guskey.
Speaking about his school's emphasis on data-driven decision making, principal Greg Decker explains: "When we interview prospective teachers, we tell them, 'This is what Lead Mine Elementary does and this is what will be expected of you. This is not just a job decision; it is a career decision.' As a result, we get new teachers who are philosophically aligned with our vision and mission." When working with his teachers — veterans and newcomers alike — Decker focuses on keeping the approach positive and constructive. On occasion, he says, it will be clear that a teacher needs to change what he or she is doing, but he feels that it is important to look at the data in a way that is neither punitive nor intimidating. "It's about the children," he says. "Teachers understand that. If a teacher says, 'I taught that four times!' but then sees from the data that the students didn't learn it, it's time to figure out what needs changing." A data bank of effective teaching strategies, along with colleagues to call upon when help is needed with a particular skill, provides an added resource for grassroots professional development.
"Teachers have generally not had a lot of experience with data, or the experiences they have had are negative," says Joy Rose, principal of Westerville South High School in Ohio. With help from organizations such as the Baldrige Initiative (in which the entire state participates) and Victoria Bernhardt's Education for the Future, all that is changing in Westerville. Teachers and administrators have attended many professional development sessions over the past several years, with a focus on using the Baldrige and Education for the Future tools to develop a "data mindset." In the process, teachers have learned to develop class mission statements and ground rules, chart progress, help students keep data notebooks, use affinity diagrams to brainstorm ideas and "fishbones" to uncover cause and effect, and much more. They have also participated in teacher forums where they can share their experiences regarding successes and opportunities for improvement. ("Failure," Rose points out, is not a Baldrige word.) When asked whether teachers are intimidated by the magnifying glass under which their progress is examined, Rose responds, "I think when only a few people have access to data, the impression is that things are 'secretive' and therefore it is threatening. When all teachers have access to their data, play a part in analyzing the data, and then are given real opportunities to USE the results to develop and implement plans to improve instruction, the 'threatening' aspect disappears."
In the Bloomington Public Schools in Minnesota, data management tools are used not only to track student improvement but also to share resources between schools. Each school team builds its own continuous improvement plan using the district's online tool. Plans have three components: a goal (the measurable outcome the team wants to achieve), a strategy (the broad methodology the team believes will help reach the goal), and an activity (the specific steps to accomplish the strategy, including who, what, when, and at what cost). At the activity level, the team also records how the success of the activity will be evaluated. The results of these evaluations are later stored online where other schools can access them. "In this way," says Jim Angermeyr, director of research and evaluation, "every school can view the database of activities that have been tried and adopt those that have been the most successful."
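The goal/strategy/activity structure described above can be sketched as a simple data model. The class and field names below are assumptions for illustration only; the article does not describe the actual schema of Bloomington's online tool.

```python
# Hypothetical sketch of a continuous improvement plan record:
# a goal, a strategy, and activities that each record who, when,
# cost, how success will be evaluated, and (later) the result
# that other schools can browse and adopt.
from dataclasses import dataclass, field

@dataclass
class Activity:
    steps: str          # specific steps to accomplish the strategy
    who: str            # person or team responsible
    when: str           # timeline
    cost: float         # estimated cost
    evaluation: str     # how success of the activity will be evaluated
    result: str = ""    # outcome, recorded after the fact

@dataclass
class ImprovementPlan:
    goal: str                                   # measurable outcome
    strategy: str                               # broad methodology
    activities: list = field(default_factory=list)

    def completed_activities(self):
        """Activities with a recorded result, shareable with other schools."""
        return [a for a in self.activities if a.result]
```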
Read other articles from the March Issue
A special thank you to the following school administrators and technology leaders who contributed to this article:
Dr. Jim Angermeyr, director of research and evaluation, Bloomington Public Schools, Minn.
Sandra Cokeley Pedersen, APR, director of quality and community relations, Pearl River School District, N.Y.
Dr. Gregory S. Decker, principal, Lead Mine Elementary School, Raleigh, N.C.
LaVaun Dennett, administrator of curriculum, assessment, and instruction, and Mark Schneider, assessment specialist, Norwalk-La Mirada Unified School District, Calif.
Robert Ewy, consultant and former director of planning for Community Consolidated School District 15, Ill.
Steve George, chief information officer, Prince William County Schools, Va.
Joanne Good, information technology manager and project manager for the data warehouse project, Charles County Public Schools, Md.
Michael Greenfield, director of instructional technology, Harrison Central School District, N.Y.
Dr. Lynda P. Hawkins, senior director for accountability, Florence County School District 3, Lake City, S.C.
Elaine Hassemer, principal, Paxton Keeley Elementary School, Columbia Public Schools, Columbia, Mo.
Jim Hirsch, associate superintendent for technology, Plano Independent School District, Texas.
Dr. Sherry P. King, superintendent of schools, Mamaroneck Union Free District, N.Y.
Marcy Lauck, supervisor, Continuous Improvement Programs, San Jose Unified School District, Calif.
Katie Lovett, chief information officer, Fulton County Schools, Atlanta, Ga.
John Q. Porter, chief information officer and associate superintendent, Montgomery County Public Schools, Rockville, Md.
Joy Rose, principal, Westerville South High School, Westerville, Ohio.
Robert W. Runcie, chief information officer, Chicago Public Schools, Ill.
Mary Tribbey, data coordinator, Butte County Office of Education, Oroville, Calif.
Tony Wold, director of assessment, Rowland Unified School District, Los Angeles, Calif.