Creating a Technology Climate Where the Self-directed Learner is Nurtured

Introduction

I have been working on research-based presentation projects with my seventh- and eighth-grade students for the last three or four years. As my school’s technology coordinator, I have worked hard to develop these lessons collaboratively with our seventh- and eighth-grade social studies teachers. We have built the lessons carefully, designing them to help students develop research, presentation, and technology skills, as well as qualities consistent with self-directed learning. We took this same kind of care in the development and revision of our assessment rubrics, which are included in the original packets of information that we supply to our students at the outset of each project. The expectations are clearly stated, and we take great pains to explain them fully to our students and to suggest ways that the rubric expectations can be used as self-assessment tools. As my students worked on the first of these projects and presented them to their classmates, it became clear that they needed to use the rubric to self-monitor, self-evaluate, and self-modify at certain critical points during the project.

We used two strategies to foster student evaluation and improvement when we readied the eighth-grade project for presentation. The first, a pilot activity called “Critical Friends,” was designed to help students use the evaluation criteria from the project rubric to review and revise both their verbal and PowerPoint presentations before giving their final presentation. The three students involved in this activity improved their final presentations after they completed the “Critical Friends” rehearsal activity. The second strategy involved the student audience, who used the project rubric and a rating sheet as they viewed their classmates’ presentations. We solicited evaluation comments, limited to the rubric expectations, from the student audience after each presentation. Students themselves began to verbally self-evaluate their presentations. We heard comments such as: “the sound effects I used got in the way of my oral presentation; I wish I had tried harder to find more pictures for my presentation; I could have used spell check; I think my pacing was too fast; my animation got in the way of my presentation.” It became evident to me, as well as to the eighth-grade social studies teacher, that we needed to engage our students in a more formal, ongoing self-evaluation process. This led to the development of the action research problem and the strategies we could use to gather data during the second of the two projects, which we completed with our seventh-grade students.

Statement of Problem

How can middle school students become more self-directed performers and producers of computer technology projects through the use of self-assessment rubrics in the social studies classroom? It became clear that there really were two distinct sides to the question. One aspect of the question was: have we created an atmosphere of self-directedness? Have students’ previous experiences in our computer lab, working on computer projects, given them the background to tackle a complex, multi-faceted technology project in a self-directed manner? The other segment of the problem was: do students understand all parts of the assigned computer projects? Before students can understand the project’s assessment rubric and be able to use it as an instructional tool, they must really understand what tasks we are asking them to complete.

This was a larger, more complex problem than I anticipated. Therefore, I decided to limit my action research to one side of the problem. It is this aspect that met the criteria for problem selection: it is in my sphere of influence, it is significant to me, and I am passionate about it. Let me explain further: I am the school’s technology coordinator, and another teacher is in charge of computer lab instruction. I work with subject-area and classroom teachers to develop computer-based projects for students to complete in the computer lab. When I develop a project in this way, I am the lead teacher in the lab for the duration of the project. During these projects I design and supervise instruction, assessment rubrics, and any other support materials necessary for the project’s successful completion. I believe that my students are not as successful as they can be on these projects, and I would like to find ways they can improve their practice to produce a better final project, one that will allow them to achieve at a higher level of the assessment rubric. To this end, I would like to have my students use the assessment rubrics as instructional tools.

The four sequential questions that meet the conditions of the two-step test are:

  • Do students understand all parts of computer-based projects?
  • Do students understand all aspects of the projects’ assessments?
  • Are the assessment rubrics the proper instructional tools that will lead to better student understanding?
  • How can students use the rubrics as instructional tools to improve both their practices and products?

If students do not understand all parts of the project, how can they complete the project successfully? This is a significant factor. I can’t be sure that students understand all parts of the project, no matter how well I explain it, unless we can develop an assessment that will test their understanding. This factor passes the two-step test. If students do not understand all parts of the project, it follows that they probably do not understand its assessment and cannot use the assessment rubric as an instructional tool to help them comprehend the project’s expectations. Again, these factors are significant, and I am unsure of my students’ understanding. Logically, the next question would be: how can students use the rubric to improve both their practice and their product? I am unsure of this, although I have some hypotheses. I feel this is a very significant factor, since it will yield the practical steps and the concrete ways in which students can use the assessment rubric as an instructional tool to improve both their practice and their product. This will be the first step in the process of creating an atmosphere of student self-directedness in our computer lab. Only then will I be able to develop further instructional strategies, techniques, and plans to address the other side of the issue, which is the creation of an atmosphere of self-directedness for our students in the computer lab. This process has raised many intriguing questions and has helped me to a greater understanding of my instructional practices, my motivation as a teacher, my students’ thoughts, feelings, abilities, motivations, and understanding, as well as the larger questions of our school community’s learning atmosphere, our assessment rationales, our grading practices, and our educational philosophies.

Methodology

Data Collection Matrix

  1. Do students understand all parts of computer-based projects?
    Data Source #1: Student artifacts (index cards, scripts, and storyboards)
    Data Source #2: Teacher observations recorded on data collection form2
    Data Source #3: Student surveys3 and self-assessments1

  2. Do students understand all aspects of the projects’ assessments?
    Data Source #1: Student artifacts (PowerPoint presentations, Works Cited resource pages, and videos of student presentations)
    Data Source #2: Post-presentation peer comments based on criteria rubric1
    Data Source #3: Instructor’s grading record of criteria rubric1

  3. Are the assessment rubrics the instructional tools that will lead to better student understanding?
    Data Source #1: Student surveys and comments3
    Data Source #2: Student self-evaluation rubrics1 and group evaluations4
    Data Source #3: Instructor’s grading record of criteria rubric1 & 2

  4. How can students use the rubrics as instructional tools to improve both their practice and products?
    Data Source #1: Student reflections3
    Data Source #2: Class brainstorming discussion
    Data Source #3: Instructor analysis2 of all student artifacts, with comparisons to the overall subject grade, to grades on two other projects, and to similar eighth-grade project assessments

List of Support Documents (these documents are already uploaded to my Digital Drop Box):

1 = RMoroney Civil War Project Rubric.doc
2 = RMoroney Data Collection Civil War Project 7-1.doc & Data Collection Civil War Project 7-1 #2.doc
3 = RMoroney Project Reflection 7.doc
4 = RMoroney Civil War Project Student Evaluation rubric.doc

By using a variety of collaboratively developed and tested data collection methods, as well as multiple independent sources (triangulation), I have worked to ensure the validity and reliability of this action research project: the tools were valid, measuring what I intended them to measure, and they did so reliably, with accuracy and precision. These tools included existing sources such as teacher records and student work. As part of my analysis, I have compared the grades from this project to four other sources: the students’ second-trimester social studies grades, their grades on two other seventh-grade computer-based social studies projects completed earlier in the year, and a similar research and presentation project that our eighth-grade social studies students completed but which did not use the criteria rubric as an instructional tool. In this way, evidence can be found that the use of the rubric as an instructional tool has improved student practices and products (see data source #3 for question 4 on the data matrix). Student work also offered a rich source of data to lend validity, reliability, and authenticity to the project grades. In this way we can present real evidence of individual improvement.

We also used tools for capturing everyday life, such as classroom observations, teacher journals, and videotapes, to collect supporting data. These documents recorded student behavior, interaction, and performance in two ways: as a narrative and as a concrete measurement of student self-directedness. Observations like these lent support, offered understanding, and showed evidence of student practices and products to help us place our other data in context. If students were doing all they needed to do in a self-directed manner in their practice, it stands to reason that their final product would show an improvement in quality. Instructors and students have been able to look back on videotapes of the presentations for more careful, in-depth analysis.

I have also used a number of tools and methods for questioning. These research tools included student evaluations and reflections, the project’s criteria rubric, peer comments, and class discussions. It is always good to get feedback from a number of points of view. Students and their classmates offered support and/or counterpoints to teacher observations and opinions, thus expanding the knowledge base and opening a dialogue for later class discussions. As for the project’s criteria rubric, this was the core document for both the project’s instruction and analysis, on which most of the other evaluation tools are based. As Dr. Sagor points out on page 101 of Guiding School Improvement with Action Research (2000), “When rating scales or rubrics are well developed and used properly, students can effectively use them to assess their own work. Students who assess what they produce can then provide their teacher (as well as themselves) valuable data regarding their perceptions of their work, their learning, and their future plans.” The multipurpose way that the rubric has been used for both instruction and evaluation by teachers, students, student work groups, and peers offered a rich source of data for comparison and understanding, from both a research and an instructional perspective, for all those involved.
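
To make this use of the rubric concrete, the sketch below shows one way rubric ratings could be represented and averaged so that self-, peer, and instructor assessments of the same presentation can be compared. It is a minimal illustration only: the criterion names and the ratings are hypothetical placeholders, assuming a four-point rating scale like the one used in this project; they are not the contents of the actual Civil War project rubric.

```python
# Minimal sketch of rubric-based self-assessment, assuming a four-point
# rating scale. The criterion names and ratings below are hypothetical
# placeholders, not the actual Civil War project rubric.

CRITERIA = ["information", "images", "sound and music",
            "works cited", "oral presentation"]

def average_score(ratings):
    """Average a set of 1-4 ratings across all rubric criteria."""
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)

# Hypothetical ratings of the same group presentation from three perspectives.
self_rating = {"information": 3, "images": 4, "sound and music": 2,
               "works cited": 3, "oral presentation": 3}
peer_rating = {"information": 3, "images": 3, "sound and music": 2,
               "works cited": 4, "oral presentation": 3}
instructor_rating = {"information": 4, "images": 3, "sound and music": 3,
                     "works cited": 3, "oral presentation": 3}

for label, ratings in [("self", self_rating), ("peer", peer_rating),
                       ("instructor", instructor_rating)]:
    print(f"{label:10s} average: {average_score(ratings):.2f}")
```

Comparing the three averages side by side is one simple way the rubric can serve as both an instructional and an evaluation tool, since gaps between a group’s self-rating and the instructor’s rating point to expectations the students have not yet internalized.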

Below is a timeline of our seventh-grade Civil War project, which includes the instructional agenda, the data collection plan, and the schedule:

Week 1 — Introduce the Civil War research and presentation project using a PowerPoint presentation that also serves as a model for the presentations students will create. Each student receives a project packet, which contains:

  1. letter to parents
  2. possible topic choices
  3. criteria rubric
  4. storyboard and script forms
  5. project deadlines
  6. project instructions
  7. works cited information and format

End of Week 1 — Each group submits a list of its members and its chosen topic to the seventh-grade social studies teacher for approval.

Week 2 — Use the computer lab to discuss methods of conducting Internet research and to give an introductory lesson on the use of PowerPoint. Distribute worksheets containing search methodologies and strategies, a list of Civil War gateway sites, and instructions on image searching and saving. In the library/media center, show students how to find and notate resources.

Collect and record observational data on data collection form2.

Week 3 — In the computer lab students continue Internet research, using the project’s criteria rubric as an instructional tool. Students create a storyboard and a script for their presentation, using their index card notes as a guide.

Collect and record observational data2.

Week 4 — The student groups submit their script and storyboard to the instructors.

Grade script and storyboard according to the criteria rubric1. Record grades on the data collection form2. Continue to use observational records to record student progress in the computer lab2. Keep copies of sample artifacts (index cards, script, and storyboard) of student work.

Week 5 — In the computer lab give a PowerPoint demonstration lesson. In the library/media center, the librarian instructs students in the correct MLA citation format.

Continue to collect and record observational data on the data collection forms2.

Week 6 — In the computer lab students finish their PowerPoint presentations, add animation and sound, fine-tune, and rehearse.

Finish collecting observational data.

Week 7 — Group PowerPoint presentations are due to the instructors.

The instructors review and record group grades on the appropriate sections of the criteria rubric1 and on the data collection forms2. The researcher collects artifacts (PowerPoint presentations) of student work.

Week 8 — During social studies class students give PowerPoint presentations accompanied by oral narrations. The Works Cited pages are due to the instructors for grading.

Record video artifacts of student presentations for later review. The instructors grade and record the grades for the Works Cited resource data pages on both the project rubrics1 and the data collection forms2. The action researcher also saves sample artifacts (Works Cited pages) of student work. Peer comments1, teacher comments1, self-evaluations1, project evaluations4 by students, and student3 and teacher reflections are submitted for evaluation. Instructors collect and record this data, noting selective comments on the collection forms2. The action researcher and colleagues analyze the data by comparison to overall social studies second-trimester grades, two other project grades, and the assessment of a similar eighth-grade project (please note that students did not use the criteria rubric as an instructional tool on this project).

Findings

During the data collection phase I was able to gather rich sources of both quantitative and qualitative data. These fell into four major categories: data collected from teachers’ journals and reflections, student reflective quotes and insights, student project artifacts, and quantitative data collected from student surveys, incident reports, and grades. These major categories could be further divided into a number of sub-categories. For example, student reflections fell into five sub-categories: the presentation and research project, group interactions, use of the criteria rubric, observations about self-directed learning, and negative feedback.

The teacher observations included data noting work habits during the project and comparisons of seventh- and eighth-grade students’ self-directedness, work, and artifacts (see Appendix A, Artifacts of Student Work and Photographs of Student Presentations). We observed the following trends:

  • emphasized information first and foremost.
  • found more and better images.
  • had better sound effects and music, which enhanced rather than got in the way of their presentations.
  • listed their information correctly.
  • formatted each citation correctly.
  • placed citations in alphabetical order.

Student reflections included comments about the project’s research component and the use of technology: specifically, using the Internet as a research resource, learning to use the software program (Microsoft PowerPoint), and the importance of topic choice, organization, information, graphics, correct spelling and grammar, and creative production. Many made personal observations and comments about how they enjoyed the project. They observed their group’s dynamics, how they divided up their work, how they used their time on tasks, and how their individual strengths balanced each other. They noted whether they used the rubric and cited the ways in which they used the criteria to help them evaluate their own work at different stages of the project and as a final assessment. They noted how they grew in self-directedness from this project, articulating their efforts, planning, preparations, mastery of challenges, accuracy and precise use of language, time and commitment to the project’s many tasks, success in meeting deadlines, learning about both history and technology, and final presentations. The negative comments were few, concerning the use of the technology, the lack of computer time, deadlines that were felt to be too short, and the lack of precise steps to follow for the project. (See Appendix B for Representative Student Reflections.)

The following trends were noted from the quantitative analysis of data taken from student surveys3 and grade comparisons2:

  1. The total number of seventh grade students using the rubric as an instructional tool on the social studies project was fifty-six, including thirty-four girls (sixty-one percent), and twenty-two boys (thirty-nine percent). This accounts for sixty percent of the students (Appendix C Data Analysis Charts, *See original Excel document RMoroney Data Analysis.xls — Figure 1 & 1A).
  2. Thirty-eight seventh grade students did not use the criteria rubric as an instructional tool. This includes nineteen girls and nineteen boys, or fifty percent each, evenly divided by gender. That is forty percent of the students (Figure 1 & 1B).
  3. This was further broken down by class and compared to each class’ average grade. The classes in which more students used the rubric also showed evidence of slightly higher overall grades.
  4. Seventh-grade students who were taught to use the rubric as an instructional tool had a higher average score than eighth-grade students, who generally did not use the rubric in the initial stages of the project but instead used it as a final assessment. The overall average for all seventh-grade students, based on a four-point rating scale, was 2.92, compared to an average of only 2.62 for all eighth-grade students, or .3 points higher. These averages were also broken down by individual class (Figure 2).
  5. We compared seventh-grade averages for the project to seventh-grade social studies grade averages for the second trimester. The average grade for the project was 2.92, compared to an average grade of 2.55 for the second trimester, or .37 of a point higher (Figure 3; these differences are recomputed in the sketch following this list).
  6. There were a total of sixty-nine notations of negative behavior incidents, a relatively small number among a population of ninety-seven students on a project that stretched over the course of eight weeks. It should be noted that many of the same students were responsible for multiple incidents.
  7. Students listed their project’s strengths as follows (Figure 4):
    Forty-eight said it was their information
    Nineteen said it was their images
    Eight said it was their group’s teamwork
    Five said it was the presentation itself
    Four said it was the sound effects and music they used
    Nine gave miscellaneous answers
  8. Students noted what was most important to them about the project as follows (Figure 5):
    Forty-nine said it was their information
    Fourteen said it was their group’s teamwork
    Twelve said it was meeting the project deadlines
    Ten said it was the presentation itself
    Seven said it was their grade on the project
    Three had other answers
  9. The following are difficulties students perceived in the course of the project (Figure 6):
    Twenty-one struggled with some aspect of the technology
    Nineteen struggled to find information
    Thirteen struggled to meet deadlines
    Nine struggled to find images for their presentations
    Five had trouble finding appropriate sounds and/or music for their presentation
    Three had trouble organizing their material
    Three struggled to compile their Works Cited page correctly
    Seven had other struggles
  10. The following are components students would add if they had more time to work on the project (Figure 7):
    Twenty-seven would add more sounds and/or music
    Twenty-two would add more images
    Twenty would add more information
    Eight would make changes or corrections to their Works Cited page
    Three would improve their use of the technology aspects of the project
    Three had other improvement concerns
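
The percentages and point differences reported above can be recomputed directly from the counts and averages stated in items 1, 2, 4, and 5. The short Python sketch below does exactly that; it introduces no data beyond the figures already given in this section and is included only to make the arithmetic explicit.

```python
# Recompute the summary figures reported above, using only the counts and
# averages stated in items 1, 2, 4, and 5 of this list.

rubric_users, girls, boys = 56, 34, 22   # item 1: seventh graders who used the rubric
non_users = 38                           # item 2: seventh graders who did not
total = rubric_users + non_users         # 94 students in the comparison

print(f"girls among rubric users:  {girls / rubric_users:.0%}")    # ~61%
print(f"boys among rubric users:   {boys / rubric_users:.0%}")     # ~39%
print(f"rubric users overall:      {rubric_users / total:.0%}")    # ~60%
print(f"non-users overall:         {non_users / total:.0%}")       # ~40%

seventh_avg, eighth_avg, trimester_avg = 2.92, 2.62, 2.55          # items 4 and 5
print(f"project vs. eighth grade:     +{seventh_avg - eighth_avg:.2f} points")
print(f"project vs. second trimester: +{seventh_avg - trimester_avg:.2f} points")
```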

The data collected for analysis during the research phase offers many insights, as well as further opportunities for study and for the development of an action plan of effective strategies and techniques to help students use criteria rubrics as instructional tools for improvement and self-directed learning.

Rickey Moroney

Appendix A. Illustrations of Artifacts of Student Work and Photographs of Student Presentations.

Appendix B. Representative Student Reflections.

Related to the Research Project and Presentation:

“It was different because it was a use of newer technology, the other projects we just typed and drew pictures.”
Seventh Grader Molly F.

“The strengths of our presentation are the facts and the creativity. What was particularly important to me as I worked on this project was that it was correct, accurate, and successful.”
Seventh Grader Megan D.

“I think this presentation is different because it was presented in a more up to date way. I also thought this project was fair and fun to learn how to do a PowerPoint.”
Seventh Grader Alex B.

“The way everything was laid out, such as the background, where pictures were placed to add emotion, and how each slide was in context, how they impacted people.”
Seventh Grader Brian D.

“The year’s projects were all good and required thinking. They all made us use our imagination and brain.”
Seventh Grader Mike L.

“Overall, I had fun with the projects out of all the years, these projects were the most fun and the most involved yet.”
Seventh Grader Natalie L.

“Our parents were shocked at how good it was. They liked it because it gave a lot of detail and it was short and to the point.”
Seventh Grader Joe C.

“We normally don’t do PowerPoint presentations so it made it more interesting. It also made it fun because we could learn how to use the Microsoft PowerPoint program, which will come in handy when we are older and we have a project to do we can use PowerPoint.”
Seventh Grader Liz E.

“This project was similar to the Civil War PowerPoint presentation that we completed at the end of seventh grade. These were both PowerPoint presentations that required us to research information, organized it, and present it. The Decades project was different from the Civil War project because many of the topics in the Decades project had not been covered in class. We had gone over the Civil War before doing the project in seventh grade.”
Eighth Grader Jacqueline R.

On Group Interactions:

“The group had to work together. We all had to do our parts and be responsible. We also made many compromises on how we should do things.”
Seventh Grader Beth B.

Use of the Criteria Rubric:

“If I did use it, it probably would’ve helped.”
Eighth Grader Tim M.

Observations about Self-Directed Learning:

“It was important to us that all our information was correct and understandable.”
Seventh Grader Lauren T.

“Our strengths in the presentation was (sic) our ideas of planning out and the people in the group. Alex is smart on all the facts, Steve is very organized, and I can get what we need on the Internet in a few minutes.”
Seventh Grader Michael M.

“Some strengths (sic) in our presentation were we were organized and responsible. It all came together very nicely.”
Seventh Grader Meg K.

“Made sure our Works Cited was done right. Tried to be independent.”
Seventh Grader Lauren B.

Negative Feedback:

“This project is very difficult. I didn’t like having it on the computer because it was hard to find the time to go to the computer lab. Being able to work on projects at home was what made the other projects fun. I think more time in computer should be given. My suggestion is that you should keep the project the same, but let us use oak tag or regular paper instead of slides on the computer.”
Seventh Grader Kaitlyn D.

Appendix C. Data Analysis Charts


References

Sagor, R. D. (2000). Guiding school improvement with action research. Alexandria, VA: ASCD.