When it comes to ChatGPT and similar AI tools, Jane Rosenzweig has a simple question: To what problem is ChatGPT the solution?
That question is the focus of a course that Rosenzweig, the director of the Harvard College Writing Center, is teaching this semester. “I imagine there will be many wonderful uses for AI in the future and some tutoring interfaces will be created that will have great value,” she says.
However, she adds, “Given all we know about the limitations of generative AI that we have right now, it's hard for me to see that the tutoring models that I'm seeing are actually solving a problem that has been identified.”
While many educators have touted the tutoring potential of GPT technology and encouraged students to explore ways in which ChatGPT can provide writing advice, Rosenzweig is not convinced, at least not yet. She sees limitations in this technology for teaching writing and literature, though she stresses these are her personal opinions and do not reflect any policy or position from Harvard.
Is AI A Reliable Tutor?
Proponents of AI tutors point out the potential to provide more individualized learning and immersive experiences for students. One such AI tutor is Khanmigo, a GPT-4-powered interactive tutor released to select schools by Khan Academy. In a TED Talk about Khanmigo, Khan Academy’s founder Sal Khan, who I have interviewed several times this year, shared an example of how Khanmigo can allow a student to interview a historical figure such as Albert Einstein or even a character from literature. In the example, a student asks an AI Jay Gatsby why he stares at the green light. AI Gatsby answers: “I gaze at it longingly as it represents my yearning for the past and my hope to reunite with Daisy, the love of my life.”
The problem, Rosenzweig wrote in a post on X (formerly Twitter) sharing the exchange, is: “That, of course, does not sound like Gatsby. It sounds like an AP exam answer written in the first person.” She added, “It would be great to meet and talk to historical and literary figures! But this seems more like having a puppet read Wikipedia to us. What problems are solved by offering students the chance to talk to an Einstein puppet?”
Getting flashcard-type answers from digital animatronics is not how Rosenzweig wants her students to “interact with literature,” she says. But more important than her opinion on the tool is that she wants educators to think critically about this technology and its uses. “Maybe there's a great answer for why I want my student to get their answers from Jay Gatsby instead of from a class discussion,” she says.
ChatGPT’s Editing Advice
Many AI advocates say ChatGPT and similar tools can serve as an editor of student work, offering helpful tips the same way a human might. However, when Rosenzweig put a piece of her writing with intentional errors into ChatGPT, the tool not only failed to fix them, it sometimes made suggestions that made things worse.
“Sometimes the feedback wasn't very good, and I knew that because I'm a writer and I teach writing,” she says. “But if my students, or anyone's students, were to go through that process, have they learned the things that they would need to know in order to assess that feedback?”
She notes that proponents of AI might counter that if she received poor feedback, her prompts weren’t good enough. “I'm not convinced that's the case, but that raises the same question, right? How would our students know what kind of editing advice to ask for?”
Ultimately, Rosenzweig says, educators should avoid using AI simply because it is there and should instead base its use on that question at the heart of her thinking about ChatGPT and AI: To what teaching problems is ChatGPT the solution?
“'What are our learning goals? And how do we want to get there?' should be applied to thinking about these new technologies as carefully and thoughtfully as we make those decisions in other contexts,” she says. She adds that if you are using AI in the classroom, it should be because you have a pedagogical reason to do so, not because it is the “shiny new thing.”