You Probably Already Saw AI Slop Today. What Educators Need To Know About This Fast-Growing and Harmful Trend


At any given moment when you’re scrolling through social media and the internet, you are up to your hips in AI slop – and may not even realize it.

AI slop is the low-quality, often fake content, such as text, images, or videos, that is generated by AI. It is currently overwhelming social media and the internet, having initially gained momentum through use in political campaigns. It also generates big revenue for its creators on platforms such as Facebook, YouTube, and TikTok.

The concern for education is that AI has improved to the point that it can be difficult to distinguish reality from this new wave of fictitious or inaccurate AI-generated content. AI slop is also often liked, favorited, and shared at high rates by AI bots deployed by its creators, boosting the fake content to the top of Google search results. The result is a vicious cycle: the AI slop is “validated” by false means, then picked up by others and accepted as “real” because it increasingly appears in Google AI Overviews, spreading the misinformation to even more sources.

Determining what is real, which sources can be trusted, and how to find the truth is obviously a critical need in any classroom.

No Stop To The Slop

If this is the first time you’re hearing the term, this video from Kurzgesagt is a good primer on AI slop and its potential for harm. Last Week Tonight with John Oliver also dedicated an extended segment to AI slop.

If you ask Google about the topic, its AI overview seems to take offense. (Bots of a feather stick together?):

"AI slop" is a derogatory term for low-quality, low-effort content generated by artificial intelligence tools, often found on social media and other online platforms. It includes a wide range of media like videos, images, audio, and text, and can be characterized by a lack of substance, accuracy, or creativity, sometimes appearing bizarre, misleading, or overly formulaic.

This erosion of traditional and trusted online sources and media is happening at a staggering pace. AI slop now comprises 50% of all internet articles, according to a recent report from Axios.

“Social media and everything that has changed with our attention-related economy is the flywheel that's driving the slop in so many of these cases,” says Adam Nemeroff, Assistant Provost for Innovations in Learning, Teaching and Technology at Quinnipiac University, who has recently written at length about AI slop. “Just because something looks real doesn't mean it is.”

Nemeroff says that while it can be entertaining to watch AI-generated videos such as cat soap operas on TikTok that ping our dopamine centers, the reach of AI slop is now undermining the most popular information sources that students and teachers use.

“AI slop is taking down Wikipedia communities entirely,” he says. “It’s making Wikipedia–which had such a great trajectory of going from being this thing that nobody trusted to saying ‘Wow it's amazing this thing is created as a human artifact’–to now saying, ‘Well, now there's machines and bots in the loop and the community of moderators can't actually control it.’”

So is there hope in keeping up with computer-powered intelligence? Actually, there is, and it starts with our own brains.

Natural Intelligence Vs. Artificial Intelligence?

When it comes to any online content, educators and students now need to be aware and proceed with skepticism, says Cory Coburn, Educational Technology Support Manager for Austin Community College District in Texas, and recent winner of Tech & Learning’s Innovative Leader Award.

“It's a problem with students and faculty and staff because they have not been taught to evaluate information in a critical thinking format,” Coburn says. “We are coming from a society where you used to watch the TV and what's on the news was the gospel truth. You didn't question it. And now everybody can make that media on the computer or on the TV.”

Coburn says that natural intelligence is a good place to start in combating the artificial version. “You have to be able to put what you’re looking at through a critical thinking process, ask questions, and find the source and firsthand information about what you're trying to understand,” she says.

“It's really important for educators and students alike that those information literacy and critical thinking skills that you have are all the more important now,” agrees Nemeroff.

Both Coburn and Nemeroff suggest that librarians, media specialists, and those at your school who teach media literacy need to be on the front lines in the battle against AI slop.

“Those are the leaders in making sure that people have media literacy and AI literacy,” says Coburn. “They are the ones who work with the staff and the students to help them master these skills of being able to analyze. ‘Is this good information? Is this good research? Is it real? Is it something somebody created in three seconds using AI, generative AI?’ And actually reading what the AI is providing.”

Nemeroff echoes that. “Higher ed and K-12 leaders need to understand that just because there's a smaller physical collection of books in the library, it doesn't mean that the information tasks happening there are less important,” he says. “There's all these ways that librarians are helping the information research and learning ecosystems in learning environments.”

What Else Can Educators Do?

Nemeroff suggests that when it comes to AI slop, educators also need to be self-aware of it in the form of “work slop,” or the blind trust in AI to handle administrative tasks without oversight. “I think we're seeing it more in our school environments,” he says. “We need to make sure that we're not passing along AI slop to students as grading feedback or in our lesson designs and other assessments and learning experiences.”

Coburn also recommends turning to high-quality research tools such as Scite, an AI-powered tool that provides quality authentic academic sources. “It gets you started, but it's still on you to do the critical thinking and analyzing,” she says.

As with so many other challenges, awareness is ultimately the first step in trying to overcome AI slop.

“Don't just take whatever's in front of you as, ‘This is true’ and don't question it because it sounds good,” Coburn says. “You need to know: who's the creator, where's this information coming from, where's their primary source of information? And people aren't thinking about that. It's, ‘Let's get it done as quickly as possible.’ Quick and easy is always what people want. That has to change.”

3 Things You Can Do To Ensure AI Content Is Accurate

  1. Check Sources - Just because content has a lot of views or pops up at the top of Google rankings doesn’t mean it is real. “Companies are paying Google to be at the top of the search, whether they're presenting factual information or not,” says Coburn. Users need to dig down to see who is presenting the information and if it’s from a legitimate source or not.
  2. Rely On Your Critical Thinking Skills - If content seems too good or fantastic to be true, it probably is. Says Nemeroff: “It’s important for people to keep in mind that you have to be skeptical about everything you're seeing, especially in the social media spaces that aren't moderated and are increasingly being less moderated.”
  3. Look For Real People - Platforms that feature actual humans and human-generated comments and videos, such as Reddit, can generally be more trustworthy information ecosystems. “If you go to any of these AI tools, such as Google, and ask, ‘How do I fix my washing machine?’ it takes you to some Reddit subreddit of people who are mechanically inclined, have gone through the process, know and have developed the expertise, and have an answer that has been upvoted and validated.”


Ray Bendici is the Managing Editor of Tech & Learning and Tech & Learning University. He is an award-winning journalist/editor, with more than 20 years of experience, including a specific focus on education.