Has AI As A Writing Partner Been Oversold?
Many students and educators have been convinced AI can act as a writing partner, but if that’s true, why is classroom writing getting worse?
Since the early days of ChatGPT, educators have talked about how AI can be used as a writing tutor or coach that doesn’t write for students but helps them organize their ideas and clean up their efforts.
I liked this idea when I first heard it. I’d love to live in a world where every student has access to a high-quality writing instructor. But over the past few years, I haven’t seen my students' writing improve; if anything, the opposite has happened. The grammar has gotten much better, but the content has suffered.
Today, in my classes, on social media, and through many emails I receive, I see writing with fewer technical errors, but this comes at a terrible cost. Generally, our writing is more boring, less individualized, and conveys far less meaning than it did a few years ago. The cause is AI, of course, but it’s more than that.
I think many of us have uncritically bought into the idea that AI can be an effective writing tutor. I’m no longer convinced it can be, and here’s why.
AI Is Not Actually Good At Writing
Even though people are taking writing advice en masse from ChatGPT and other AIs, we all acknowledge, subconsciously at least, that AI is terrible at writing anything original. That’s why we call AI-generated content AI slop, and when we read something and say it sounds as if AI wrote it, we are not complimenting it. Why then are we asking it to give us extensive notes, revisions, and suggestions on our writing?
AI doesn't transform our writing into William Shakespeare; when it takes on the role of a writing tutor, it’s still that same unimaginative writer. The fact that so many people are taking advice from a robot writer whose writing they hate is just another example of the cognitive dissonance of our times, and I'm genuinely puzzled by it.
AI Doesn’t Partner With You, It Takes The Wheel
A common refrain is that AI can help a student or educator brainstorm or help clean up their ideas. In this way, it can work with you like a good writing tutor, providing instant and unlimited feedback on your work.
This is another concept that sounds fine in theory but doesn’t hold up in practice. Unlike a good writing tutor, AI has a tendency to take the wheel of your writing. Even models specifically tuned for teaching tend to ask only a question or two before offering detailed examples of what your answer could look like. Often, these suggestions are bland and lifeless, but even if they were more creative, this hijacks the writing process.
AI Gives Too Much Feedback
Even if the problems mentioned above could be programmed out of future AI models, I’m increasingly convinced that having an on-demand writing coach on your shoulder isn’t a good thing.
Throughout my career as a writer, I’ve benefited from feedback from editors and mentors who have read my work, then altered and strengthened it: productive struggle. Seeing how they changed and improved my writing helped me get better as a writer, and continues to help my work. For example, this article will be better than the one I initially submitted, thanks to changes made by my editor.
Some thought AI could mimic this process, but I think that’s a false comparison. If I asked my editor what I should write after each sentence, and he sat next to me and rewrote my work as I was writing it, I wouldn’t be deserving of the byline associated with this story, and I probably wouldn’t earn many more writing assignments. (Editor's note: You would not!)
Even If AI Were Better At Writing, It Still Wouldn't Make Our Writing More Engaging
The fantasy author Brandon Sanderson once explained the problem with clichéd writing in a way that really stuck with me. I’m paraphrasing here, but the idea is that clichés are not inherently bad; in fact, the first time someone hears one, it is often a great way of describing something, which is why it resonates. The problem is that with time and overuse, it becomes stripped of its power and meaning. For example, the first time I heard the phrase "X is like Y on steroids,” it was descriptive and funny. By the ten thousandth time I heard it, I started to groan.
The same thing is now happening with any writing technique AI uses regularly. Even a technique that is unobjectionable at first becomes intolerable once it spreads everywhere and everyone is doing the same thing.
Right now, AI writing is lifeless and stale to begin with, but even if its writing voice improves, without more variation in style and tone, any of its techniques are destined to become overused almost instantly.
I do think AI’s writing will continue to get better, but I’m skeptical it can create the kind of variation necessary to uniquely power the voices of a majority of writers on the planet. And without unique writing voices, I’m not sure why we’re writing in the first place.
Erik Ofgang is a Tech & Learning contributor. A journalist, author and educator, his work has appeared in The New York Times, the Washington Post, the Smithsonian, The Atlantic, and Associated Press. He currently teaches at Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine he won a Society of Professional Journalism Award for his education reporting. He is interested in how humans learn and how technology can make that more effective.

