5 Things I Learned About AI This Year
2025 was a big year for AI. New models were released, and the technology's impact on the classroom and on society grew.
This was a big year for AI and teaching, probably the biggest since ChatGPT launched at the end of November 2022. Just about every week, there was a new advancement for one major AI model or another, and it seems as if school districts and universities got more serious about figuring out ways to harness AI's potential for student success. My own AI use changed and increased in a number of ways as the technology became more advanced and helpful.
But it wasn’t all good. AI use in my classes went from common in 2024 to seemingly ever-present in 2025, and AI’s influence and frequent use have made social media even worse (which I didn’t think was possible).
Here’s what I learned about AI in education in 2025, and some of what I predict, hope, and fear is coming in 2026.
1. “Ask Chat” Is The New “Google It”
Let’s start with something positive: This was the year that Google searches started to feel positively old-fashioned, and on the way to flip phone territory.
I still don’t trust AI summaries, but I do find ChatGPT and other AI tools to be quicker ways to search for links to specific things.
For instance, you can say something such as, “Search for all recent studies comparing the efficacy of flipped classrooms to traditional classrooms.” Or, “Can you find good examples of grammar handouts for an introductory college course?”
In both cases, in my experience, better results come up than if you merely Google these queries. But be careful: I find AI summaries are often simplistic, at best as good as a Wikipedia entry and often not even reaching that low bar. Ultimately, I use these tools to direct me to other sources.
2. AI Writing Isn’t Just A Problem For Teachers
AI writing and other AI creations masquerading, often poorly, as genuine work have been a problem for educators since ChatGPT was released. In 2025, that problem jumped out of the classroom, and now AI writing and other forms of AI slop seem to be everywhere, from lifeless social media descriptions to deepfakes and bogus viral videos. It’s also in private communication of all kinds. I know multiple people who use it for email and even messages on dating apps.
All this has taught me that separating AI-generated work from actual work is not just an educational priority but a societal necessity. I predict (ok, hope) that in 2026 we see more societal pushback against rampant AI use. Algorithms should start penalizing AI-generated posts or links, and it should be considered a major faux pas to use AI text in personal communications.
Ideally, any such changes will trickle down to AI use in schools.
3. Everybody Is Wrong About AI Detection Tools
To better detect AI writing, we need better AI detection tools, but these tools are divisive. Many institutions advise against using any AI detection platform in any way.
This is a mistake.
But it’s an equally big mistake for educators to use these tools blindly and with complete faith: that kind of use leads to horror stories of students being wrongly failed or subjected to disciplinary action.
First, we need to realize that not all AI detection tools are created equal. As I wrote recently, some careful research has found some to be terrible and others quite good but not perfect.
The education field needs to figure out how to use detection tools fairly and effectively and provide transparent guidance for teachers and students on which ones will be used and how.
4. AI Tutors Could Soon Be Ready For Prime Time
In 2025, ChatGPT, Gemini, and Claude all released new or updated education modes for their AI tools. These modes or features are designed to avoid giving students answers and will instead engage them in the Socratic method with educational best practices in mind.
Such AI learning platforms are getting better and better. In 2026 the education field can and should do more to utilize these. Of course, these AI tutors still need to be studied and compared to traditional methods. Just because a tool should, in theory, be helpful doesn’t mean it actually will be in reality.
AI chatbots primed specifically for teaching are a kind of educational dream come true, with enough potential upside that it’s worth getting serious about using them more broadly going forward. I’ve found them personally helpful for learning new concepts in areas where I am not an expert.
And, for better or worse, students love using AI, so these AI tutors should appeal to them naturally.
5. Education Is Still Playing Catch-Up
Perhaps what surprised me most in 2025 is not how fast AI is still advancing (that’s to be expected) but how the field of education, particularly higher ed, is still largely playing catch-up in an AI world.
It’s been more than three years since ChatGPT launched. The education field has done a nice job embracing the tech’s learning potential, but we’re still flat-footed when it comes to combating inappropriate AI use. Worse, three years later, there's still not a real sense of urgency around fixing this problem.
I hope that changes in 2026 because inappropriate AI use remains the number one concern of college educators I speak with, and it’s an issue that only gets worse each year. It’s time to accept that embracing new technology is separate from preventing cheating, and that inappropriate AI use is a problem that’s not going away on its own.
In the coming year, we need an all-hands-on-deck approach and sense of urgency. The stakes are higher than academic integrity. We need to figure AI out, or, as dramatic as it sounds, authentic human writing, creations, and expression will increasingly take a backseat to AI slop.
Erik Ofgang is a Tech & Learning contributor. A journalist, author and educator, his work has appeared in The New York Times, the Washington Post, Smithsonian, The Atlantic, and the Associated Press. He currently teaches at Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine he won a Society of Professional Journalists award for his education reporting. He is interested in how humans learn and how technology can make that more effective.

