The Right AI Can Help Students With Assigned Readings, Suggests New Research
A recent study of college students using AI to help understand assigned readings found that many read AI-generated summaries instead of the text itself. It doesn’t have to be this way.
Researchers from various universities recently set out to understand how students are using AI to help them understand assigned readings. The study, “Self-Regulated Reading with AI Support: An Eight-Week Study with Students,” looked at how 15 undergraduate students used AI to help them understand various texts over an eight-week period.
Though a small study that is currently in preprint form (meaning it has not been peer reviewed), it sheds compelling light on the ways in which AI use can both support and hinder student learning.
Chris Yue Fu, the study’s lead author and a doctoral candidate at the University of Washington, recently spoke with me via Zoom and a follow-up email to explain the key takeaways from this research.
The Study’s Key Findings
Two things in particular stood out about these results to Fu, the first of which is the attention behavior gap.
“Students could clearly articulate what good AI engagement looked like as they told us things like, ‘the better your questions are, the more helpful AI can be,’” Fu says.
But even when they were taught good prompting strategies, only 4.3% of their prompts actually employed these strategies, Fu says. “They knew better but didn't do better, and that gap didn't close over eight weeks, even though the course was literally about AI.”
The second major takeaway was what the researchers termed “reading through AI” rather than “with it.”
“Students weren't just using AI to help them understand a text. They were using AI-generated summaries as the primary thing they read, and then selectively dipping back into the original text,” Fu says. “The AI output became the text, and the actual reading became a background resource. That was genuinely unexpected.”
Hints At Ways AI Could Be Used More Effectively
Despite those negative-sounding outcomes, the research also identifies ways in which work with AI tools can be helpful.
“For lower-level cognitive tasks, AI is already quite effective. When a student encounters an unfamiliar term or needs a section summarized, AI handles that well,” Fu says.
More effective AI could nudge students toward higher-order thinking, particularly the kind that is often necessary to understand and engage with college-level texts.
For example, after providing a summary of a given text, Fu would like to see AI follow up with something such as, “‘Now that you have the main argument, what assumptions is the author making? Do you agree with them?’ Or after a student asks about methodology, the system might prompt: ‘How would the findings change if they had used a different sample?’”
Fu says that since this study concluded, more AI models have been designed to attempt this kind of productive questioning.
“The technology is starting to move in the direction our research suggests it should, toward sustaining the conversation rather than ending it after a single answer,” Fu says. “We actually saw evidence that students naturally progress toward this kind of thinking.”
Teacher-In-The-Loop Design
In addition to continuing the conversation and going deeper with questions, Fu and his fellow researchers also called for giving teachers more control over how AI chatbots and tutors work in the classroom.
“We envision a system in which the instructor can set learning goals and requirements for specific readings in advance,” Fu says. “For example, an instructor assigning a dense methodology paper might tell the system: ‘Make sure students engage with the limitations section,’ or ‘Prompt students to compare this framework to last week's reading.’”
Right now general-purpose chatbots are “designed to answer questions,” Fu says. “They optimize for task completion. When a student asks for a summary and gets one, the system has done its job, but the student hasn't done theirs.”
Fu adds that there is an opportunity for designing education-specific AI tools “that gently push students toward the deeper engagement we know they're capable of.”
The research Fu led suggests this could be significant in its impact with students. “We saw that capability in the data: students naturally moved toward reasoning when they kept going,” he says. “The challenge is designing systems that support that progression instead of letting it get cut short.”
Erik Ofgang is a Tech & Learning contributor. A journalist, author and educator, his work has appeared in The New York Times, the Washington Post, the Smithsonian, The Atlantic, and the Associated Press. He currently teaches at Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine, he won a Society of Professional Journalists award for his education reporting. He is interested in how humans learn and how technology can make that more effective.

