AI Writing Programs Are Getting Better. Is That a Good Thing?

Last summer, Liam Porr, then a computer science undergraduate at UC Berkeley, came across a blog post about the potential of the AI writing program GPT-3, created by OpenAI, a San Francisco-based AI research laboratory co-founded by Elon Musk. 

The post’s ending held a twist worthy of The Twilight Zone: it hadn’t been written by the blog’s regular carbon-based author, Manuel Araoz; it had been written by the GPT-3 AI program itself. 

“Everybody realized at the end that the writing wasn't actually human writing, but they believed that it was human writing up until that point,” says Porr, who has since graduated. “That was the ‘wow’ factor for that article.” 

But Porr wanted to take things a step further. 

“I wondered if it was possible to pass off writing as being human-written and have people like it? Could you build an audience based off of AI-written content?” he says. 

Through a friend in the tech industry, he got access to GPT-3 and started a self-help blog consisting of posts written by the AI from instructions Porr provided. He chose self-help as a genre because it played to GPT-3’s writing strengths. 

“It complemented the strengths of the AI well, which is just being able to write pretty, motivational language,” he says. “It's not super great at logic and reasoning.” 

For each article, the AI would generate about eight different versions, and Porr would choose the best one. After light editing, comparable to what an editor would do for a human-written blog post, he’d publish GPT-3’s creation. 
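That human-in-the-loop workflow is easy to picture in code. The sketch below is a minimal illustration of a “generate several drafts, let a person pick the winner” loop against the OpenAI completions API of the GPT-3 era (the pre-1.0 Python client); the engine name, prompt, and sampling settings are illustrative assumptions, not Porr’s actual setup.

```python
# Illustrative sketch only: generate several candidate drafts with GPT-3 and
# leave the judging to a human editor. Uses the pre-1.0 OpenAI Python client;
# the prompt, engine, and parameters are assumptions, not Porr's real setup.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

PROMPT = (
    "Write an upbeat self-help blog post about overcoming procrastination. "
    "Use a personal, motivational tone and end with practical advice.\n\n"
)

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 base engine
    prompt=PROMPT,
    max_tokens=600,     # roughly the length of a short blog post
    temperature=0.8,    # higher temperature yields more varied drafts
    n=8,                # ask for about eight candidate versions
)

# Print each draft so a human can choose the strongest one and lightly
# edit it before publishing -- the model never publishes on its own.
for i, choice in enumerate(response.choices, start=1):
    print(f"--- Draft {i} ---\n{choice.text.strip()}\n")
```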

These AI writings attracted more than 26,000 human readers, and 70 subscribed to receive future posts via email. Porr revealed the results of his experiment in an op-ed for The Guardian, which ran with the intentionally startling headline: “A robot wrote this entire article. Are you scared yet, human?”

GPT-3’s writing capabilities are far beyond those of other AI platforms, but AI tutoring technology and even grading tools are becoming increasingly common in education. How these developing technologies are used will determine whether or not that is a good thing, educators say. 

Potential Ethical Problems with AI Writing and Grading Tools

Allison Parrish is a poet, programmer, and faculty member at New York University’s Interactive Telecommunications Program. She has written programs to help create poetry, but she’s very clear that the program is not the author and has no poetry skills of its own. 

“It literally isn't in any sense good at creating poetry, because it can't do that,” she says. “This is where I draw a very clear line in the sand that maybe some of my peers in the field don't, but I don't attribute agency to the programs that I write or the machine-learning models that I work with at all. Where it becomes poetry is in the moment that the output of the model or of the program or whatever we're talking about is connected to a communicative act and intention on the part of the poet.” 

Parrish is also wary of AI writing programs’ potential to discriminate. 

“As an educator, part of my role is to help educate about these technologies and how they work, about their affordances and about the hidden structures of power behind them,” Parrish says. “Your iPhone’s autocomplete keyboard is controlled by Apple right? The autocomplete in Google Docs is controlled by Google. And these are normative technologies, they aren't just showing you how you could write, they're showing you how you should write. So that I think is really the thing that's on the cusp of being important in education about these issues is that autocomplete, spellcheck -- these are not neutral technologies. They're forcing a point of view when you're writing, and they need to be viewed with skepticism.” 

Columbia University professor Paulo Blikstein is also concerned about the potential ethical implications of AI writing and grammar programs.

“We all use spell checkers and grammar checkers and all this software that helps you write better, and many of them are powered by AI,” he says. “They make you a better writer, they make you make fewer mistakes, but in that kind of use case, they don't replace the writer.” 

He adds, “In some districts, some companies are offering services for grading essays using AI tools and things like that. So that's a very different way of using AI, which is not to empower humans to do what they do best, but is to kind of replace humans with machines, and to replace humans with AI. And in that case, I think it brings in a lot of ethical problems.” 

Blikstein wonders what happens if a student is given an inaccurate grade by a machine that is assumed to be infallible. “What happens to the 1 percent of students who are wrongly graded and maybe will not go to college because they got graded by a machine and the machine got it wrong?” he says.

Like Parrish, Blikstein sees potential for AIs to enforce conformity in a way that could potentially stifle written expression. “When you start to overly standardize things, and say, ‘Oh, this is an essay, you have to use these kinds of words,’ I think you start to make writing poorer and less interesting and less flexible,” he says. 

Software can help correct basic writing mistakes, but we don’t want to lose the flexibility of human understanding, he says. For example, a piece may not be a traditional four-paragraph essay, yet it may present an argument in a highly creative way, opening with an anecdote or story, imagining a dystopian future, and then returning to the scientific argument. 

“We shouldn't disincentivize students to be creative in their writing just because they know, ‘My essay was graded by machine, so let me kind of game the system by doing exactly what the machine expects from me,’” Blikstein says.

How AI Can Be Used in Writing Class  

Porr is more excited by the potential of the technology than worried about its potential pitfalls. He recently presented a workshop to the Polish Japanese Institute of Technology on using GPT-3 for creative writing. 

Anyone can access some functions of GPT-3 through the game aidungeon.com, which allows users to create AI-generated science fiction and fantasy-themed stories. While far from a completed narrative, what the game creates based on user inputs is sometimes stunning, and it’s easy to see how a creative writer, especially one new to the field, could use it for inspiration. Porr had workshop participants create story ideas, generate characters, and even write poems using the tool. 

Ultimately, he believes AI programs such as GPT-3 can be used to overcome writer's block and as an idea-generation tool. “It’s not perfect enough to be able to write stuff for us without having a lot of human input and adjustment. But for getting started, it's super helpful,” he says. 

The Future of AI Writing  

In education, Blikstein believes AI tutors and other programs for writing can be helpful. “I think if we use it in the right way, it can be a great thing. I think we need to have humans in control, humans designing the pedagogy, designing the how,” he says. “I want AI in education to be used to help teachers to do their job better, and help students to do things better, instead of replacing teachers or replacing human contact, or replacing tasks that are intrinsically complex and should be done by humans.” 

Porr is optimistic about the potential of the technology. 

“It’s going to make our lives easier, and should automate some of the stuff that we don't want to do,” he says. “In the short term, that means for content-based websites, using it to help create content faster. For regular businesses, that means using AI to generate your copywriting and your sales writing. For regular writers, it means using it for inspiration and for trying to get out of writer's block and ruts and things like that.” 

But he doesn’t see AI-penned works taking over the classroom or the bestseller list anytime soon. “I think the realm of literature and poetry will remain in the human realm,” he says. “For now.”

Erik Ofgang

Erik Ofgang is Tech & Learning's senior staff writer. A journalist, author and educator, his work has appeared in the Washington Post, The Atlantic, and Associated Press. He currently teaches in Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine, he won a Society of Professional Journalists award for his education reporting. He is interested in how humans learn and how technology can make that more effective.