Ohio Is The First State To Mandate School Districts Have A Formal AI Policy. Here’s What Other Districts Can Learn From It
What to consider when developing a formal AI policy, from an Ohio CIO who has been leading the effort to draft his district’s policy.
In August 2025, Ohio became the first state in the U.S. to require traditional public school districts, community schools, and STEM schools to adopt an official policy on the use of artificial intelligence. The policy must be in place by July 1.
The Ohio Department of Education and Workforce has shared a model AI policy, which is a good starting template for districts, schools, and institutions in other states as it features a purpose statement and AI definitions in addition to guidelines around literacy, ethical use, data privacy, and other concerns.
“The Ohio Department of Education and Workforce has been very forward in AI for the last couple of years, including providing some good resources,” says Christopher Lockhart, Chief Information Officer for Columbus City Schools, who is leading the effort to comply in the state’s largest district (46,000 students). “There's just never really been that push to say, ‘Okay, now schools, we need you to do something. Either say you're going to do it or say you're not going to do it, but you have to do something.’”
With AI continuing its exponential growth, it is most likely only a matter of time before every state legislates such a requirement, Lockhart says.
“Any school district that does not yet have a policy should take a hard look at developing one,” suggests Lockhart. “And I would take that work on now just so it's not a strain to get it done if a mandate comes down. It's going to benefit your staff and your students so they have that clarity of how to operate within the AI framework at the district.”
Gathering All The Voices In The Room
Lockhart’s district already had an AI working group in place when the impending mandate was announced last year. It is one of the first steps recommended by the state, which Lockhart jokes doesn’t seem like a coincidence. “I was like, ‘Well hey, we already have one,’ and then I realized I had talked at several functions around the state about the Columbus model and the fact that we had an AI work group,” he says. “So now I'm thinking, wait a minute, maybe they saw what we were doing and thought, ‘Oh, that's probably a good idea. Let's put that in there also!’"
Lockhart’s AI implementation working group provides a good model for other districts, with representatives from across grade levels and departments, including IT staff, teachers, administrators, and other support personnel. He also includes AI experts who understand the legal and practical ramifications, such as data security and privacy.
“I wanted to have a lot of voices at the table to make sure that everybody understands what this is,” he says. “We already know we're going to have some resistance, but at least the loudest voices were also in the room while we were drafting the policy, being heard and saying, ‘I don't like it, but can we at least do this and add this to it?’”
Lockhart’s group is also meeting with students to make sure their input is considered in any policy, including recommendations around teaching AI literacy and ethical use.
“The reality of it is what I've argued for as a proponent of AI in schools is that if we're not teaching it to the students in school, they're going to use it outside of school,” he says. “And if we're not teaching them the proper ethical safe way to use it, they're going to just be out there on their own.”
Lockhart also says any policy should consider equity as some students may not have access to high-speed internet or computing devices after school hours.
“We should find that balance for what we can give them as a safe tool here in the district to learn to get comfortable with AI in all its different iterations beyond just the chatbots and the generative AIs,” he says.
3 Considerations When Developing A Formal AI Policy
Other practical considerations for any district drafting formal AI policies:
1. Secure district-wide leadership support beyond the IT department, especially getting buy-in from the superintendent.
“If it's seen as an IT initiative, then it doesn't go very far,” Lockhart says. “But when it's seen as a district leadership initiative, you get a lot more buy-in.”
2. Do not make the policy overly restrictive. “You really don't want to put that type of restrictive language in a policy for something that's changing so fast,” Lockhart says. “You certainly don't want to put in an application name. You don't want to specifically have ChatGPT or Claude or Gemini in your policy, because this all is going to keep changing.”
Lockhart says keeping the policy general also eliminates having to go in front of the school board every time a change is needed. “You can get more specific in what we use in schools with the administrative guidelines,” he says. “Once the school board sets policy, they then direct the superintendent to then build out the details of what that looks like in the administrative guidelines.”
3. Ensure the work continues beyond the initial drafting to include ongoing professional development and curriculum development.
“As AI keeps changing, its prominence also keeps changing. The policy is going to have to change,” he says. “The administrative guidelines, or practices as some districts call it, will also have to change to keep up with the technology.”
AI Is Not 'Fly-By-Night'
Although AI seems like a new technology to many, Lockhart notes that’s not the case. When he does presentations on the history of the technology, for example, he starts in 1956 with the first work on AI at Dartmouth, then traces the various iterations over the subsequent decades, from IBM’s Deep Blue playing chess to ChatGPT today. And he suggests the technology is not going away any time soon.
“So I tell people that AI is not new; it is a mature technology,” Lockhart says. “It is not something that just popped up overnight; it's just new in that the public now has access to it. And I think it actually makes them a little more comfortable with saying, ‘Oh, okay, you know, we get it. We don't understand it, but we have a little more insight now that this is not something that's going to be fly-by-night.’”
Ultimately, forging that basic understanding around AI can be helpful for everyone involved in implementation and policy.
“Overall, this policy has been a good thing,” Lockhart says. “I think for a lot of school districts, they've struggled with the idea of whether or not they even wanted to have an AI policy, or to just bury their heads in the sand and pretend like, ‘If we don't talk about it, we don't have to deal with it’. So, I think this gets everybody a good footing of how we're going to deal with it.”
Ray Bendici is the Managing Editor of Tech & Learning and Tech & Learning University. He is an award-winning journalist/editor, with more than 20 years of experience, including a specific focus on education.
