Driven by Data

Question: What is data-driven professional development?

  1. A mouthful
  2. What you get when you apply data-driven decision making to staff training
  3. A technique for planning professional development based on research instead of seat-of-the-pants guesswork
  4. All of the above

The answer, of course, is d, and although the definition may be cumbersome, the concept of data-driven professional development is both straightforward and sensible. Implementing this approach is another story, however, which is why many administrators are turning to sophisticated tools to help manage data collection and analysis. These tools allow educators to assess and correlate student outcomes, instructional methods, and state standards — information that can then be used to select appropriate professional development opportunities based on teachers' real needs.

We spoke with three district administrators who are using data-driven methods and technologies to plan professional development programs, improve instruction, and augment student achievement. Here are their stories.

Targeting Success

Three years ago, technology training specialist Robbie Grimes wanted to change the way teachers in Brownsburg, Indiana, planned professional development for technology, moving from the whimsical "Hey, what do you want to learn?" approach to a data-based methodology.

Using MyTarget, an online self-assessment tool from iAssessment (www.iassessment.com) that's based on the National Educational Technology Standards, he and his colleague Susan Smith asked district teachers to spend two hours evaluating their tech skills. Using rubrics, teachers assessed their comfort levels with everything from right-clicking with a mouse to video-based distance learning. MyTarget sorted through the responses and generated lists of training opportunities customized to each teacher's needs. Later, after teachers had attended the recommended conference, taken the suggested course, or availed themselves of supplementary Web links and resources, they could return to MyTarget, reassess themselves, and view their progress on a chart.
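
The workflow Grimes describes (score each skill area against a rubric, recommend training for the weakest areas, then reassess and chart growth) can be sketched in a few lines of Python. The skill names, rubric scale, and training catalog below are illustrative assumptions, not details of the actual MyTarget product.

```python
# Illustrative sketch of a rubric-based self-assessment that recommends
# training for low-scoring skill areas. Not the actual MyTarget logic;
# skills, scale, and catalog are invented for the example.

# Hypothetical catalog mapping skill areas to training opportunities.
TRAINING_CATALOG = {
    "basic_mouse_skills": "Hands-on workshop: navigating with the mouse",
    "presentation_software": "Course: building classroom presentations",
    "distance_learning": "Conference session: video-based distance learning",
    "technology_ethics": "Seminar: social and ethical issues of technology",
}

COMFORT_THRESHOLD = 3  # assumed rubric scale: 1 (novice) to 5 (expert)


def recommend_training(scores):
    """Suggest training for every skill rated below the comfort threshold."""
    weakest_first = sorted(scores.items(), key=lambda item: item[1])
    return [TRAINING_CATALOG[skill] for skill, score in weakest_first
            if score < COMFORT_THRESHOLD and skill in TRAINING_CATALOG]


def progress_chart(before, after):
    """Print a simple per-skill before/after comparison, like MyTarget's chart."""
    for skill, old in before.items():
        print(f"{skill:25s} {old} -> {after.get(skill, old)}")


if __name__ == "__main__":
    fall = {"basic_mouse_skills": 4, "presentation_software": 2,
            "distance_learning": 1, "technology_ethics": 2}
    print(recommend_training(fall))

    spring = {"basic_mouse_skills": 4, "presentation_software": 4,
              "distance_learning": 3, "technology_ethics": 3}
    progress_chart(fall, spring)
```

The before-and-after view is the part that matters most here: the same scores collected twice make growth visible without any statistics beyond a simple comparison.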

"Once teachers took the assessment," says Grimes, "we were able to look at a chart for any one building in our district and say this particular area needs to be focused on in this particular building," says Grimes. The resulting professional development opportunities addressed teachers' self-reported areas of deficiency — for example, social and ethical issues of technology. The district also surveyed students on what technological tools they used in class and understood. This helped corroborate the teachers' data. For example, when a teacher reported using PowerPoint in class, the students' data backed that up.


"We were able to prove that there was an increase in knowledge," says Grimes. "Teachers felt they weren't so successful before and were more successful afterwards."

In addition to providing useful data for Indiana's building improvement plans, the assessment information came in handy at school board meetings. Board members could see that the money spent on equipment and professional development was paying off in improved skill levels.

One unexpected hitch: because the tool is time-consuming (it takes teachers two hours or more to complete), many of the Brownsburg principals have decided not to use it anymore. To reverse this trend, Grimes plans to have participating teachers take the assessment over a longer period, in smaller chunks rather than all at once. The next step after that is correlating student achievement with the professional development the district offers.

Editor's note: For tips on how to design your own online data gathering tool, see our Online Survey Checklist.

Brownsburg's Lessons Learned

  • Use more than one assessment tool. Multiple data sources strengthen the credibility of results.
  • Create a well-developed plan for data collection and a reliable assessment strategy to determine how effective the process is.
  • Don't ask teachers to complete all the MyTarget segments in one sitting.

Making Connections

In Texas, Julie Guarajardo is guiding her school's professional development with teacher effectiveness data. The principal of Love Elementary, a pre-K-5 school in Houston with 500 students, Guarajardo relies on assessment tools from Co-nect (www.co-nect.org), a full-time facilitator, and occasional consultation services for the school's staff development efforts.

Guarajardo starts with student performance assessments. She and her facilitator look at data from standardized tests taken by third- through fifth-graders and perform an item analysis for each content area, grade level, objective, and response. That allows them to determine how their students are doing and what their target areas should be. "We can see, for example, if students have full knowledge of certain items under an objective," explains Guarajardo, "if they're still going for the distracters, or if they're just all-out guessing." That tells the administration if they've taught a concept well, if they're getting close with just a glitch or two to iron out, or if there's a problem area that needs to be re-addressed.
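
Co-nect's tools handle the item analysis for Love Elementary, but the arithmetic Guarajardo describes (for each objective, tallying how many students chose the correct answer versus each distracter) is simple to illustrate. The sketch below uses made-up response data and cutoff percentages, so treat the thresholds as assumptions rather than Co-nect's actual rules.

```python
# Illustrative item analysis: tally answer choices for one objective and flag
# whether students have mastered it, are falling for a distracter, or are
# guessing. Response data and cutoffs are invented for the example.
from collections import Counter


def analyze_objective(responses, correct):
    """Classify one objective from a list of student answer choices."""
    counts = Counter(responses)
    total = len(responses)
    correct_pct = counts[correct] / total

    # The most popular wrong answer shows whether a distracter is doing the damage.
    wrong = {choice: n for choice, n in counts.items() if choice != correct}
    top_distracter_pct = max(wrong.values(), default=0) / total

    if correct_pct >= 0.80:
        return "full knowledge"
    if top_distracter_pct >= 0.40:
        return "going for the distracter: reteach the misconception"
    if correct_pct <= 0.35:
        return "all-out guessing: re-address the concept"
    return "close: a glitch or two to iron out"


if __name__ == "__main__":
    # One fourth-grade objective, choices A-D, with B correct (made-up data).
    answers = ["B", "B", "C", "B", "C", "C", "B", "C", "B", "C"]
    print(analyze_objective(answers, correct="B"))
```

Running the same classification per content area and grade level is what turns a stack of test scores into the target areas Guarajardo plans professional development around.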

Love Elementary also gathers teacher instructional data, culled from both administrative walk-throughs and peer observation. They use rubrics for the classroom visits — "Evidence of Quality Teaching" and "Evidence of Quality Learning" — to standardize the data. Then they administer Co-nect's Instructional Practices survey, asking teachers what approaches they're using and what support they need.

Using all of this data, Guarajardo zeros in on an appropriate professional development plan. To accomplish all this, she relies on a grant that funds the Co-nect assessment software tools and, perhaps more important, a full-time Co-nect facilitator who translates data from item-analysis breakdowns and assessments into targeted professional development offerings.

One benefit of this approach is an increased feeling of ownership. Because there's solid data to back up the assessments and the plans, teachers and students see where they need to go and the path to get there. The biggest hurdle is finding the time. Training, completing the observations — it all cuts into the school day. "When we go into each other's classes, that requires subs and scheduling dates and times that don't interfere with lunch or other programs," says Guarajardo. Despite the challenges, this data-driven approach seems to be paying off. "Even since August we're seeing a real difference in instructional practices. And it's based on rubrics and solid research, not just stuff I'm pulling out of my hat."


"We walk around and determine where we are, not so much individually, but as a whole," says Guarajardo.

Love's Lessons Learned

  • Strive for what Jim Collins calls "autopsy without blame." Take a clean look at student performance data without focusing on individual instructors.
  • Provide adequate resources. Everyone wants to do their best, but it takes financial and professional resources to get the tools and key people who can help you achieve your goals.
  • Personalize the assessment approach according to the culture and needs of your schools.

Data-Driven Epiphany

Michael Prada is principal of School of the Epiphany, a San Francisco K-8 institution with 650 students. Like most administrators, Prada and two assistant principals make informal walk-through observations of the school's staff. Their goal: 45 walk-through observations a week, per administrator.

On a typical day, Prada pops unannounced into a teacher's classroom, stays for two to three minutes, and then documents his observations on his Palm handheld using a template created with eWalk software from Media-X Systems (www.media-x.com). "We look for everything. We observe questioning techniques, types of questions, and time on task, and we look at teacher instructional behaviors, cognitive behaviors, and pedagogical issues," he says. Once the observation is complete, Prada e-mails the results directly to the teacher and copies himself and the other two administrators so everyone's in the loop. Later, he uploads all the observations to the server.

Epiphany's administrators made a deliberate decision to use the data for professional purposes only — to have individual goal-setting conversations, for example — not for evaluations or contracts. "I don't want people to be afraid of it. It really is about growth," says Prada.

To that end, anonymous aggregate reports are posted every two weeks so faculty can spot trends and view their collective progress. "We can see, for example," explains Prada, "that of the 273 walk-through reports we've created to date, there were 72 PA announcement interruptions in class, so we've learned we need to minimize that. Or we notice that out of 273 walk-through visits, only five times did we see technology being used. Or of all the questioning techniques that we saw being used, 95 percent were of the lower order thinking skills."
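
The figures Prada cites are simple counts and percentages rolled up from the individual walk-through records. A minimal sketch of that aggregation, assuming each observation is stored as a small record with the fields shown (eWalk's actual export format will differ), might look like this:

```python
# Minimal roll-up of walk-through observations into the kind of aggregate
# report Prada describes. Field names and sample records are assumptions
# for illustration; eWalk's real data format will differ.

def aggregate(observations):
    """Summarize a list of walk-through records into school-wide figures."""
    total = len(observations)
    pa_interruptions = sum(obs["pa_interruptions"] for obs in observations)
    tech_in_use = sum(1 for obs in observations if obs["technology_used"])
    question_levels = [obs["question_level"] for obs in observations if obs["question_level"]]
    lower_order = sum(1 for level in question_levels if level == "lower")
    return {
        "walk_throughs": total,
        "pa_interruptions": pa_interruptions,
        "visits_with_technology": f"{tech_in_use} of {total}",
        "lower_order_question_pct": (round(100 * lower_order / len(question_levels))
                                     if question_levels else None),
    }


if __name__ == "__main__":
    sample = [
        {"pa_interruptions": 1, "technology_used": False, "question_level": "lower"},
        {"pa_interruptions": 0, "technology_used": True, "question_level": "lower"},
        {"pa_interruptions": 0, "technology_used": False, "question_level": "higher"},
    ]
    print(aggregate(sample))
```

Because the report is anonymous and aggregated, no individual teacher is singled out; the same records that drive goal-setting conversations become school-wide trend data.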

The hardest part was creating the walk-through template, "because if you don't ask the right questions, you won't get the right data," says Prada. To help with this they used formulas from Dataworks (www.dataworks2.com), a company that focuses on curriculum calibration, effective direct instruction, and time on task. Sample template questions include: "Are teachers echoing back correct answers?" "Do they provide sufficient think time?"

To be sure, many staff members question the validity of data gathered from three-minute snapshot visits. But the method's strength is "showing, in a concrete way, the pulse of what's happening on a regular basis," says Prada.

Epiphany's Lessons Learned

  • Don't think of the data as threatening; it's neither good nor bad.
  • Be sure to ask the right questions.
  • Don't tie the results to formal evaluations, especially in the beginning, or you won't get cooperation or buy-in.

Stephanie Gold is a San Francisco-based freelancer.
