Our Ethical Obligation

from Technology & Learning

Students are unimpressed by tech-phobic professors.

"The first generation of students that has grown up with digital technologies is hitting our campuses. They have deeply held expectations about how those technologies should be used, and we [college professors] are wholly unprepared for them."

These are the prophetic words of Harvard's Dr. Chris Dede, delivered in a speech at the University of Minnesota five years ago. A half-decade later, Dede's statement rings truer than ever.

Perhaps even more so than in K-12, core instruction at the university level looks much the same as it did 50 years ago. Classroom pedagogy typically consists of a professor at the front of the room, lecturing or facilitating student discussion. Some professors have substituted a projector and PowerPoint for the traditional chalkboards and overheads, but use of more sophisticated, collaborative, and empowering technology tools remains a rarity.

There are obvious exceptions, of course, but campus technology infrastructures, though quite robust, remain primarily in the service of administrative efficiency and student recruitment.

A number of root issues have prevented digital technology from filtering into the day-to-day teaching roles of most professors. Above all, faculty members prize their academic freedom, so it's nearly impossible to require professors to use technology in their instruction or to take advantage of university-provided training opportunities. Additionally, few universities have incentive structures in place that reward technology-using faculty. In fact, at many institutions faculty members' technology integration can actually be a detriment to gaining tenure. For example, few deans give release time for the extra demands of online courses. Finally, there are raging debates in academia about whether faculty should even cater to students' technology interests. Some professors go so far as to request "kill switches" for their classrooms so they can shut off students' wireless access during instruction.

This technology-free approach to learning can't make much sense to the average 20-something college student, who may be deeply skeptical about her professors' general relevance in this digital era. An adult learner who is otherwise immersed in learning and communication technologies must wonder why her extremely smart professors either can't learn this stuff or simply choose not to. It's a fairly damning indictment of postsecondary teaching when students can see the possibilities of digital video, podcasts, blogs, wikis, and other tools while their instructors can't.

Universities are complex, bureaucratic organizations with teaching paradigms that stretch back centuries. I often tell educators that if they think change is slow in K-12, they should visit higher academia. But the possibilities for richer instruction, powerful engagement, and deeper learning are extremely ripe for the picking—if we can just get more faculty members into the digital orchard.

Ultimately, students will migrate to universities that "get it." The explosion in online and distance education programs shows adults are willing to walk away from the traditional university experience. As students' impatience with their instructors' technological obtuseness increases, colleges and universities need to have some honest and open discussions about their ethical obligations to meet technology-related instructional needs and interests. Otherwise, as educator and blogger Clarence Fisher notes, they'll be left to cry in their books while students take their learning, and their tuition, elsewhere.

Scott McLeod is director of the UCEA Center for the Advanced Study of Technology Leadership in Education (CASTLE) at Iowa State University. He can be reached at www.scottmcleod.net or mcleod@iastate.edu.