Do They Know What They Think They Know? - Tech Learning


Purpose of the Study

When informally observing students perform various computer operations and applications, there appears to be a discrepancy between their ability to perform specific applications and their personal assessment of their computer proficiency. In case after case, the presenters have observed students performing at a much higher level of competency on specific computer functions and applications, such as word processing, than the level for which these same students give themselves credit. A formal assessment comparing undergraduate students' perceptions of their overall level of computer efficacy with their perceived ability to perform specific functions and applications would provide useful information for classroom instruction. Therefore, the purpose of this study was to examine the congruence between undergraduate students' perceptions of their overall levels of competency in using computers and their ability to perform specific computer applications.

Theoretical Framework

Students who enroll in teacher preparation programs within the State University System of Florida are required to complete at least one course in educational computing technology. In addition, the teacher preparation program infuses educational computing technology into the various required teaching methods courses. As future classroom teachers, students are expected to acquire the competencies required to use computers and related technology when teaching and performing a variety of functions, including word processing, spreadsheets, presentations, Internet research, and email.

Florida pre-professional teachers are expected to be able to:

  1. Use technology as available at the school site and as appropriate to the learner.
  2. Provide students with opportunities to actively use technology and facilitate access to electronic resources.
  3. Use technology to manage, evaluate, and improve instruction. (Florida Department of Education, Education Standards Commission)

While the national focus has long been on computer technology itself, in the past few years it has shifted to integrating that technology into the curriculum (Albion, 1999). Because teacher education programs need to prepare graduates with both computer technology skills and the ability to integrate that technology into teaching, it is important to know just how well prepared students are to use computer technology within educational contexts.

Many teacher education programs require graduates to possess certain technical skills, such as the operation of hardware or software and knowledge of specific computer technology functions. Yet research has shown that graduating beginning teachers with computer skills does not necessarily translate into the integration of that technology into teaching (Oliver, 1993). There is a growing body of research indicating that teachers' self-efficacy in their capacity to work with technology can have a significant influence in determining classroom use (Honey & Moeler, 1990; Marcinkiewicz, 1994; Albion, 1999).

Numerous research studies have shown that teachers (and, one would assume, pre-service teachers as well) have positive attitudes about computer technology, yet they do not consider themselves qualified to teach with it. If these teachers are to integrate technology into the curriculum, they must feel efficacious using it (Delcourt & Kinzie, 1993; Ertmer, Evenbeck, Cennamo, & Lehman, 1994; OTA, 1995). Bandura (1986) describes self-efficacy as being "concerned not with the skills one has but with the judgments of what one can do with whatever skills one possesses" (p. 391). Because there is a high correlation between efficacy judgments and subsequent performance, it is important that self-efficacy of computer technology use be measured and improved (Delcourt & Kinzie, 1993; Ertmer et al., 1994; Murphy, Coover & Owen, 1989).

Method of Inquiry

This descriptive study used survey methodology.

Data Sources

A total of 169 undergraduate students (90% female, primarily second-year elementary education majors, between the ages of 19 and 25) who were enrolled in the course "Introduction to Educational Technology" at a large urban public university in Southeast Florida participated in this study. The students were asked to complete two surveys after the second week of classes. The first survey, "Learning Technology Self Assessment," consists of ten high-level items covering specific computer application competencies. The ten competencies include areas such as word processing and desktop publishing, presentations, spreadsheets, databases, web page design, Internet-related functions, etc. The students were asked to rate their perceived current level of computer technology usage/understanding on each of the ten items. The ratings used the following scale: No Access/Don't Use; Novice; OK; Comfortable; and Expert. Following their initial rating, the students were instructed to convert each rating into a corresponding numerical rating ranging from 0 to 4. The ten numerical scores were then summed and divided by 10 to arrive at their "Personal Technology Use Index." The following index interpretation guidelines were provided by the authors of the instrument: 0-1 = The computer is the gray box sitting next to that thing that looks like a TV!; 1-2 = There's hope for you!; 2-3 = There's a few things you have yet to learn!; 3-4 = Maybe you should teach the course!; and 4 = You're in the wrong class!
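
The index computation described above can be sketched in Python. This is a minimal illustration only; the rating labels, numerical conversions, and interpretation cutoffs come from the survey description, while the function and variable names are ours:

```python
# Map each of the ten self-ratings to its numerical value (0-4),
# then average to arrive at the Personal Technology Use Index.
RATING_VALUES = {
    "No Access/Don't Use": 0,
    "Novice": 1,
    "OK": 2,
    "Comfortable": 3,
    "Expert": 4,
}

def personal_technology_use_index(ratings):
    """ratings: a list of ten rating labels, one per competency area."""
    if len(ratings) != 10:
        raise ValueError("Expected ten ratings, one per competency area")
    return sum(RATING_VALUES[r] for r in ratings) / 10

# Hypothetical student: comfortable in three areas, novice in the rest.
ratings = ["Comfortable"] * 3 + ["Novice"] * 7
print(personal_technology_use_index(ratings))  # 1.6 -> "There's hope for you!"
```

A student answering "Comfortable" on three items and "Novice" on seven would score (3 × 3 + 7 × 1) / 10 = 1.6, which falls in the 1-2 band of the interpretation guidelines.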

While the first survey measured the students' overall level of competency in using educational computing technology, the second instrument, "Technology Experience Level," measured their level of competency on 160 specific computer functions and applications. The 160 items can be directly subsumed under the ten items making up the "Learning Technology Self Assessment" survey. The students were asked to rate their experience level on each of the 160 items using the following scale: 1 = "I don't know how to do this;" 2 = "I may have done this, but I don't know it well;" 3 = "I know how to do this;" 4 = "I know how to do this and I understand it well;" and 5 = "I'm an expert at doing this."

The two surveys were matched according to a unique code created by the students; the code was used to ensure the students' anonymity. A total of 116 usable surveys were matched and used for the data analysis.

Results

The data analysis of the two surveys is primarily descriptive. The results of the students' ratings on the ten items making up the Learning Technology Self Assessment survey were reviewed for their overall level of computer technology usage/understanding. In general, the students rated their current level of competency as follows:

Function / Application     General Rating
Word Processing            Comfortable
Desktop Publishing         Novice
Databases                  Novice
Spreadsheets               Novice
Presentation Software      Novice
Instructional Software     Novice
Web Page Design            Novice
Internet for Research      Comfortable
Email                      Comfortable
Live Group Discussion      OK

In general, the modal level of computer technology usage and understanding was "Novice." The students were "Comfortable" with word processing, Internet research, and email. Live group discussion fell in between, rated "OK." As a group, the students did not rate themselves at the "Expert" level on any of the functions. With regard to the Personal Technology Use Index, the 116 students' mean index was computed at 1.915; according to the instrument's interpretation guidelines, a score between 1 and 2 equates to "There's hope for you!"

Next, the students' responses on the "Technology Experience Levels" survey were analyzed. The majority of the 160 specific items were grouped by computer function in order to create scaled scores corresponding to the items making up the Learning Technology Self Assessment survey. Some of the 160 items did not directly correspond to those functions. As a result, a total of seven scaled scores were created, representing the following functions found within the Learning Technology Self Assessment survey: Word Processing (20 items); Databases (16 items); Spreadsheets (14 items); Presentation Software (20 items); Web Page Design (20 items); Internet for Research (17 items); and Email (28 items).
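
The scaled-score construction can be sketched as follows. This is a minimal illustration under our reading of the procedure: a function's scaled score is the sum of a student's 1-5 ratings on the specific items grouped under that function. The example ratings are hypothetical; the article does not reproduce the 160-item groupings:

```python
# A scaled score for one function is the sum of a student's 1-5
# experience ratings on the items grouped under that function.
def scaled_score(item_ratings):
    """item_ratings: list of 1-5 experience ratings for one function."""
    if not all(1 <= r <= 5 for r in item_ratings):
        raise ValueError("Ratings must be on the 1-5 scale")
    return sum(item_ratings)

# Hypothetical ratings on the 16 database items.
database_ratings = [2, 1, 1, 3, 2, 1, 1, 2, 1, 1, 2, 1, 1, 2, 1, 2]
print(scaled_score(database_ratings))  # 24
```

Under this reading, a function with more items can yield a higher scaled score (e.g., the 28 email items versus the 14 spreadsheet items), which is consistent with the pattern of group means reported below.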

The "Technology Experience Levels" survey did not contain items that could be used to create scaled scores corresponding to the functions of Desktop Publishing, Instructional Software, or Live Group Discussion. Examination of the scaled scores from the Technology Experience Levels survey resulted in four competency levels based upon quartile percentages: Poor, Fair, Good, and Excellent. As shown below, the results reveal that the students tended to rate themselves as "Good" across the specific activities making up all seven functions.

Function / Application     Scaled Score    Percentile Group
Word Processing            70.69           Good
Databases                  24.78           Good
Spreadsheets               33.71           Good
Presentation Software      56.03           Good
Web Page Design            32.19           Good
Internet for Research      38.48           Good
Email                      91.25           Good

Results of the Study

The results on the Learning Technology Self Assessment survey reveal that the majority of the undergraduate students who participated in this study tended to rate themselves at the "Novice" level of computer technology usage and understanding. On the other hand, these same students tended to rate themselves as "Good" on their ability to perform the specific functions and applications described in the Technology Experience Levels survey. There appears to be a discrepancy between undergraduate students' perceptions of their overall levels of competency in using computers and their ability to perform specific computer applications. More specifically, students believe they are capable of performing specific computer functions and applications at a higher level than the level at which they rate themselves overall. Knowledge of this discrepancy is beneficial to instructional planning.

Educational Significance

When planning the curriculum and related course syllabus, faculty should select learning activities and environments that recognize the full range of their students' capabilities. The resulting learning experiences should be designed to take into account the students' prior knowledge and experience levels. Underestimating the students' capabilities will dilute the effectiveness of the learning experience; it may even lead to boredom and disinterest on the students' part. Recognizing that a discrepancy exists between students' perceptions of their overall level of computer usage and understanding and their ratings of how well they can perform specific applications should enable faculty to design a more appropriate and challenging syllabus to meet the course goals and objectives.

Robert Vos, Ed.D.

Paul A. Rendulic, Ed.D.

References

Albion, P.R. (1999). Self-efficacy beliefs as an indicator of teachers' preparedness for teaching with technology. Association for the Advancement of Computing in Education.

Bandura, A. (1986). Social foundations of thought and action. Englewood Cliffs, N.J.: Prentice Hall.

Delcourt, M.A.B., & Kinzie, M.B. (1993). Computer technologies in teacher education: The measurement of attitudes and self-efficacy. Journal of Research and Development in Education, 27(1), 35-41.

Ertmer, P.A., Evenbeck, E., Cennamo, K.S., & Lehman, J.D. (1994). Enhancing self-efficacy for computer technologies through the use of positive classroom experiences. Educational Technology Research and Development, 42(3), 45-62.

Honey, M., & Moeler, B. (1990). Teachers' beliefs and technology integration: Different values, different understandings (Technical Report 6). Center for Technology in Education.

Marcinkiewicz, H.R. (1994). Computers and teachers: Factors influencing computer use in the classroom. Journal of Research on Computing in Education, 26(2), 220-237.

Murphy, C.A., Coover, D., & Owen, S.V. (1989). Development and validation of the Computer Self-Efficacy Scale. Educational and Psychological Measurement, 49, 893-899.

Oliver, R. (1993). The influence of training on beginning teachers' use of computers. Australian Educational Computing (July), 189-196.

Sheffield, C.J. (1994). Are your students like mine? Preservice students' entering technology skills. In J. Willis, B. Robin, & D. Willis (Eds.), Technology and teacher education (pp. 67-71). Charlottesville, VA: Association for the Advancement of Computing in Education.

U.S. Congress, Office of Technology Assessment. (1995). Teachers and technology: Making the connection (OTA-HER-616). Washington, DC: U.S. Government Printing Office.
