Q&A: Gerald Crisci

If anybody in the K–12 environment knows about assessing the value of an IT program, it’s Gerald Crisci. As director of technology for Scarsdale Public Schools in Scarsdale, N.Y., Crisci recently oversaw a thorough inquiry into his district’s approach to IT, exploring the gains the district makes through its technology expenditures.

Q: How do you prove a technology investment has made an impact on student learning?

A: It’s easy to show technology’s impact on someone’s teaching, but it’s difficult to show a direct correlation between IT and student learning. In the studies I’ve seen, people talk about things like improving student attendance and test scores. At the end of the day, that’s not really a compelling argument for the value you place on technology. How does IT change what kids can do? How does it change how they learn? Those are the key questions.

Q: How do you answer them?

A: Very carefully. We recently did a study called the “Tri-State Evaluation of Technology.” A team of sixteen people from other districts came here in November and evaluated our tech program with a focus on research. They used a model that examines each area of the district through the lens of 15 different indicators, spanning the gamut from professional development to equity, opportunity to learn, and innovation. The evaluators interviewed members of our teaching staff, the community, and students. It was really comprehensive.

Q: Why focus on research?

A: If you’re going to do any study of technology, you have to focus on one area; it’s too broad otherwise. We chose research because it was easy for us to document, and because we felt we needed to do more work in that area, particularly around collaboration. Why pick something we think we’re already excelling at? If you’re going to do any kind of evaluation, the whole idea is to get some bang for the buck.

Q: Sounds interesting. How did you prepare for this?

A: Our preparation for the visit was significant. We spent a year gathering evidence of our use of technology. Teacher lessons, student work, supporting material: you name it, we looked at it. The model looks at things in three different ways: approach, implementation, and result. After this research, we made recommendations. (See www.scarsdaleschools.org/technology/Recommendations.html)

Q: And what did you learn?

A: We did this because we felt it would give us an opportunity to rally around something we really needed to work on. That was the first lesson. The second lesson we learned is that you should really focus on one area if you’re going to measure the value of your district’s program. And any good evaluation should foster conversations that continue beyond the evaluation itself. We presented 370 different pieces of evidence. We’ve since reorganized it by grade level, and now we’ll have department chairs and other interested parties who were not part of the evaluation come in and help foster conversations within the district.

Q: How much did this whole process cost?

A: It didn’t cost much at all. The Tri-State operates under a “group of friends” model, which means you’re evaluated by peers and colleagues from other districts. The only money we put in was before the visit, to fund professional development time for computer teachers and librarians so they could gather the evidence. Realistically, they did a lot of work on their own.

Matt Villano is contributing editor of School CIO.