4 Lessons From California’s Teaching with AI Guidance


California was one of the first two states, along with Oregon, to release guidelines for teaching and learning with AI. Though the guidelines do not set official school policy in the state, they give teachers and school leaders a vetted blueprint of best practices for using AI in the classroom. 

Katherine Goyette, Computer Science Coordinator for the California Department of Education and co-author of The Complete EdTech Coach: An Organic Approach to Supporting Digital Learning, helped write this guidance, titled Learning With AI, Learning About AI. 

Goyette shares highlights from the AI guidance as well as tips for school leaders on crafting their own policies. 

1. Prioritize Student Safety  

When using AI, as with any technology in a school setting, the first priority is student safety. One area where AI can be a particular problem is data privacy. 

“There are laws for student privacy,” Goyette says. “We are reminding educators that they have a moral and legal responsibility to attend to those privacy laws.”  

Unfortunately, many generative AI systems were not developed for schools and don’t have appropriate privacy settings. So, with some tools, Goyette says, “We are urging educators to be vigilant and wait until safe, legally compliant ways of actually integrating this for student use into classrooms is possible.”  

2. Teach AI Literacy 

Regardless of what teachers and schools do about AI in class, students will use AI tools at home. That’s why it’s vital that educators provide AI literacy training for students as an extension of digital literacy programs, and talk with them about how AI systems work, Goyette says. 

When it comes to students who want to use AI outside of class in ways that aren’t cheating – for example, helping with research rather than writing a paper – Goyette says she approaches it like she would social media. “As an educator, I'm not going to ask all my students to get on social media for our school project,” she says. However, if students show their learning via a TikTok or Snapchat video, and that project was created outside of school time, Goyette wouldn’t prevent it. 

“That is them taking what is in their world and using that to integrate their academic learning, which is a win, and it also is an opportunity for me as an educator to have conversations about how those technologies work,” she says. 

3. Recognize AI Bias and Potential Social Impact 

The California guidance suggests teaching AI literacy in the context of its social impact and its bias. Students and teachers should approach AI “with an understanding that these AI systems are not infallible, but they are created by humans and they could potentially exacerbate bias and so we need to be aware of that,” Goyette says. 

This bias might have larger impacts on society, influencing everything from hiring to school admissions, and students need to be cognizant of this as modern citizens. “We want them to have this kind of base foundation of understanding of how these things work so that when they're making our laws and when they're deciding as business owners if they're going to use AI systems for certain purposes, they do so ethically and responsibly,” Goyette says. 

4. Incorporate the Community and Update Tech Procedures 

When drafting AI guidelines, districts and schools should not do so in a vacuum. “It's important to value community voice,” Goyette says. That includes students as well as their parents. 

She adds that drafting school or district AI policies can also be a good opportunity to update technology policies in general, particularly around student privacy. “It's possible that these guidelines were created a decade ago and haven't been looked at in a while, there may be things that are outdated,” she says. 

On the other hand, these existing technology policies can be a great foundation for building AI policies. Goyette says: “Look at what is currently in place and say, ‘How can we modify it in order to meet today's technologies?’” 

Erik Ofgang

Erik Ofgang is Tech & Learning's senior staff writer. A journalist, author, and educator, he has written for the Washington Post, The Atlantic, and the Associated Press. He currently teaches in Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine, he won a Society of Professional Journalists award for his education reporting. He is interested in how humans learn and how technology can make that more effective.