Shadow AI: What Schools Need To Know And Can Do About It

Shadow AI
(Image credit: Getty Images)

Although artificial intelligence has created many opportunities in education, such as supporting individualized learning, providing tutoring, and increasing teacher efficiency, it also poses risks for school districts. From AI slop and pink slime to harmful deepfakes and scams, district leaders and technology professionals have their hands full trying to provide students and teachers with safe AI.

Adding to those challenges now is the threat of shadow AI, which certainly sounds, well, shadowy. In reality, it’s a bit less nefarious, although it still can cause havoc for IT departments as well as school staff and students.

What Is Shadow AI?

Quite simply, shadow AI is the unmonitored use of any unauthorized AI tool or platform by teachers, staff, or students, such as generative AI chatbots or data visualization tools. Unvetted AI programs can expose schools to data breaches and other cybersecurity vulnerabilities.

Typically, the unsanctioned use of these AI tools is unintentional, says Maurie Beasley, network/system administrator for Llano Independent School District in Texas and author of Teachers Have Bigger Fish to Fry: Tales from K12 Education: How AI Might Help...or Maybe Not.

“It's really often ignorance,” she says, noting that when she informally asks educators in her district about their AI usage, many don’t even realize they’re accessing AI as it’s been baked into so many tools now. “They're not hiding in the closet and trying to do things they shouldn't be doing. It's just the utilization of something that they have easy access to that makes their world or their life easier that they're using without permission and not thinking about the consequences.”

“An IT person literally calls it shadow because ‘If I can't shine a light on it, it's in the shadows,’” says Jim Beasley, Llano ISD’s Director of Technology (and Maurie’s husband). “And if you're using it without my permission, that's an IT perspective.”

Jim can envision a worst-case scenario of shadow AI, such as using an open-source AI agent tool like OpenClaw that, when installed on a local machine, has unrestricted administrative-level access, allowing it to manipulate files and send emails. Such a tool on a network, especially if it exposes API keys, would present "a really big problem."

The Risk For Schools

One of the biggest risks of shadow AI for schools is the potential for data breaches, especially as AI's rapid evolution continues to create new threats and concerns.

“Right now, if there's a data leak, you stop the leak, maybe you can pull the data back in, maybe you can't, but that data, you can draw a bubble around it and say ‘This is what got out,’” says Jim. “Once a data leak gets into AI, you can't remove it. I mean, OpenAI doesn't have a way to pull that data back out. So if you leak a social security number and it gets pulled into some training data for a large language model, it's there forever potentially.”

Of course, any unmonitored access to a school or district network can create opportunities for hackers and other bad actors, as well as open the door to more sophisticated cybersecurity attacks.

“Being compromised before meant somebody gets access to someone’s email, for example, and starts sending emails out on their behalf,” says Jim. “If some kind of AI gets onto my network, or somebody plugs something in that's got AI access, what does that even mean? How is that going to interact on my network? What does ransomware look like that has intelligence attached to it? My biggest goal is to retire before I have to figure that out because I don't know if I'm smart enough to figure that out.”

What Schools Can Do About Shadow AI

Obviously, as with any potential threat, the first step to protecting against shadow AI is awareness and education.

“If you don't start with education, the technology things are useless,” says Jim. “At the end of the day, it's 90% education and 10% technology, or maybe 10% technology with 5% of that being the processes that you put in place as part of that technology. And if people aren't educated, they are going to do things you cannot anticipate in ways you did not anticipate.”

Maurie encourages skepticism, even joking that she pushes her colleagues to the point of paranoia. “It's very enticing to let AI take the cognitive load,” she says. “Yeah, let it do the easy stuff and let it help you, but when it comes to the cognitive part, make sure that's still the human that's making those decisions.”

If they haven’t already, IT departments should also consider implementing guardrails around AI use, monitoring usage, and, once again, making users aware of the risks. Jim suggests sticking to IT 101 principles, such as ensuring employees have the least necessary permissions and tightening previously lax policies, especially concerning Chrome extensions, which are now being enhanced with AI. He also advocates spending more time checking for suspicious logins and proactively securing networks.

With edtech evolving rapidly and AI being added to so many platforms, school technology leaders need to be extra vigilant, and should even consider re-vetting software and tools that were previously approved.

‘It’s Another IT Project’

Ultimately, dealing with shadow AI is the same as dealing with any cybersecurity threat.

“If you're in an IT department, focus on what you know how to do, have the same disaster recovery plans you've always put in place, treat AI that way, but don't ignore it, either,” says Jim. “If you ignore it and you just decide somehow it's going to take care of itself because Google's going to provide it just like they've done everything else, you're making a mistake in multi-dimensions because, A, Google can't plan for everything. And B, Google's AI doesn't do everything, so people are going to go off and use a different AI.”

“Artificial intelligence, when you look at it from an IT perspective, is the same process you've gone through with every new technology that's ever happened,” says Maurie. “You need to have a security plan in place. You need to make sure you have backups ready in case you need to restore something. It's literally just another IT project.”

Ray Bendici is the Managing Editor of Tech & Learning and Tech & Learning University. He is an award-winning journalist/editor with more than 20 years of experience, including a specific focus on education.