There is good reason many people – parents, teachers and other community members – are cautious about using generative AI in classrooms. That’s largely because our overall use of AI is an experiment and we don’t know what the long-term outcome for society and learning will be.
We do know that AI makes it easier for students to cheat and shortcut their way through school, even when schools and teachers put restrictions on its use. There are other questions, including: How do we know which AI content to trust? Will AI diminish or destroy the human element in music, art, videos and other creative fields? Will AI make future generations dumber?
‘Iolani School’s Gabriel Yanagihara says the biggest issue with AI in schools is the lack of clarity about when, or even whether, students can use AI tools in individual classes. The Hawai‘i DOE has general guidelines about ethical and transparent use of appropriate AI tools, as well as data privacy. Its Digital Design Team outlines these guidelines for teachers and students on its website; go to tinyurl.com/HIDOEai.
“Teachers aren’t sure what’s safe, what’s allowed, or what tools are worth their time,” he says. “They are worked so much already, so even though AI can help lower their workloads and help with work-life balance, it’s a big ask to ask them to learn a whole new system on top of all we already ask of our educators.”
Yanagihara helps with ‘Iolani’s AI strategy and teacher support. “We were able to move quickly because we had the flexibility to build an internal task force, test tools and implement support systems in-house.”
However, schools vary greatly in resources, he says. Some lack devices or reliable internet access, which makes hands-on training harder. Others worry about ethics, student misuse, and a lack of understanding among teachers and parents.
Ethical and appropriate AI use
Brian Grantham of Mid-Pacific Institute is working with teachers and departments on appropriate AI tools for each subject area. The goal is to use AI for deeper learning, not just to copy, paste and turn assignments in.
Teachers’ knowledge of AI is limited, and they are concerned about students cheating. His response is to have students and teachers work together, “co-creating classroom expectations.”
When kids are invited into the AI discussion, they show ways to use AI tools from their own perspective and reveal how each of them learns, Grantham says.
Teachers can create assignments that make sense to them, but students might see those assignments differently, particularly if they have a learning difference such as ADHD, he says. That means a teacher’s instructions can seem overwhelming or ambiguous.
Mid-Pacific’s syllabi include clear guidelines on use of AI, including when students must ask permission to use it. Last year teachers had varying policies about AI use, which confused students. So, this year AI policies are consistent within each subject.
Mid-Pacific high school students can directly use ChatGPT, Gemini, Copilot, Adobe and Apple Intelligence. Elementary and middle school students can use the SchoolAI platform.
Mid-Pacific also has an AI Certification Course that prepares students for the workplace of the future, providing them with skills that are in high demand across industries, Grantham says.
AI literacy training and ethical standards
High schools and colleges are concerned that AI use is rampant, without clear guardrails to keep students safe.
AI’s effects are “terrible,” says an English teacher from Farrington High School. She says she’s returning to pencil and paper assignments so she can better assess her students’ writing. In fact, many English teachers agree on the importance of having students do more in-class writing without access to the internet.
Both independent and public schools in Hawai‘i are addressing issues of ethical AI use through AI literacy training and by having guardrails on AI tools used at school. The idea is to encourage AI use in ways that enhance critical thinking while keeping student information safe.
That training on the ethical use of AI is crucial because schools cannot monitor and protect students when they use AI and social media at home.
AI detectors don’t work
Educators interviewed for this story unanimously said that AI detection tools don’t work reliably and can be wrong. A teacher might end up punishing a student unfairly, especially when the guidelines for AI use aren’t clear. Teachers who get to know their students can often recognize when a student has turned in AI-assisted work without disclosing how the tool was used – but with dozens of new students each semester, knowing each of them well can be a challenge.
Mid-Pacific Institute has done extensive testing with AI detectors and found they hallucinate worse than AI itself does. Plus, a detector only reports the percentage of content that might have been generated by AI, which can be misleading.
“How do you take a 70% accusation and then leverage that against a kid when you may or may not be right?” Grantham says. Once you accuse a student of cheating, the student might be suspended or expelled, and that can hurt their future.
He suggests that early in the semester, teachers assign several writing pieces on paper in class, “so you can start to capture how your kids speak.”
When teachers see a student’s work later, they will know whether the writing is too polished or above that student’s level.
Left: Mike Latham of Punahou. Center: Mike Sarmiento of Purple Mai‘a. Right: Michael Ida of Kalani High School.
AI ethics and trust
In 2021, Hawai‘i’s Legislature passed Act 158 to improve digital literacy among young people. It required all K-12 public schools to offer computer science courses or computer science content by the 2024-2025 school year. And, according to the state Department of Education’s Miki Cacace, AI will be integrated into updated computer science standards next summer. (See guidelines on computer science education from a national consortium at reimaginingcs.org.)
“The mandate from Act 158 provides crucial support needed for expanding professional development,” Cacace says.
This legislation has driven a significant increase in the number of computer science instructors, growing from 1,237 in 2022-2023 to 3,815 in 2024-2025, she says. The DOE has created AI guidance and training through its Office of Curriculum and Instructional Design. In fact, DOE administrators are ahead of many school districts in the U.S. They’ve drawn up guidelines for ethical use of AI in their schools (tinyurl.com/Hawaiiai), encouraged computer science and AI literacy training, and are developing approaches for holding students accountable for ethical use of AI. Find the DOE’s AI guidance for teachers and resources at tinyurl.com/Hawaiiai2, and guidelines for students at bit.ly/hidoe-ai-students.
Public school teachers who want to learn more about computer science/AI learning opportunities can email cs@k12.hi.us. The Magic School AI pilot is only available for HIDOE K-12 public schools.
The challenge faced by the DOE in increasing AI literacy training and education lies in the size and complexity of the public school system. Reaching about 152,000 students and 13,000 teachers, librarians and counselors takes time and patience. However, there are schools with pockets of early AI adopter teachers who are motivated to experiment and lead others.
Winston Sakurai, executive assistant and chief of staff at the DOE’s Office of Curriculum and Instructional Design, says schools should complete a self-assessment and build AI literacy by identifying key personnel, finding areas for growth, and setting a vision for how they want to use AI while meeting DOE goals for graduation.
Each school will approach this differently, just as they do with curriculum and instruction, because communities have unique needs. The overall goal is the same: to use AI responsibly and effectively while also legally protecting students’ private data and information that they provide to an AI tool or chatbot. Such protections are required by federal laws such as COPPA, or the Children’s Online Privacy Protection Act.
Safeguarding student data
Some of Hawai‘i’s private schools have moved quickly with teacher training on AI, including syllabi with ethical standards and guidelines on when to use AI tools.
Mid-Pacific’s Grantham says educators use AI tools at school that have guardrails to protect student data. Mid-Pacific’s elementary school uses SchoolAI, a “wrapper” program that adds guardrails around other AI programs to protect student data. It limits AI responses to those appropriate for the child’s age.
Just as with Magic School AI, students do not have their own SchoolAI accounts. They piggyback on the teacher’s account, letting the teacher set the boundaries.
Grantham says the process of doing an assignment can be more important than the final product. Some teachers focus on individual steps rather than the final exam, paper or project. That makes cheating less likely too because a final paper can often be generated by AI.
“It’s about, ‘What did you do to get to that product and the thinking?’” Grantham says.
Yanagihara of ‘Iolani calls this process “scaffolding,” or “steps” in the learning process.
Michael Ida’s students in computer science and math at Kalani High School collaborate on whiteboards on stands in the classroom and also use Chromebooks at school that have internet access for AI use. Ida says AI won’t replace teachers but is a valuable tool to save teachers time on tasks like generating sets of problems with unique answers, which takes hours to create manually.
“If I try to be creative, maybe I can use it to supercharge my teaching in a way that I wasn’t able to do before,” Ida says.
You can’t “unring the bell”
Punahou School’s approach is focused on AI “as a kind of critical literacy,” according to Punahou President Michael Latham. “We want our students to have a clear understanding of how the technology works, of actually what’s involved in the technology,” he says.
“We want them to be aware of and able to use it in ways that provide them with advantages, especially in personalized learning. But we also want them to be aware of the pitfalls, the ethical problems” and risks, he says.
“I think the biggest challenge for us, and really for probably any school, is to figure out how to use these tools in ways that amplify or enhance the kind of teaching and learning that we do, but not undermine our core learning objectives,” Latham says.
“You can’t unring a bell; you can’t put the toothpaste back in the tube. These tools are out here,” Latham notes, adding that it would be a disservice to students to ignore AI.
Latham says that “we plan really carefully about where AI is best deployed in the curriculum and the pedagogy and where it should be avoided.” One of those areas where AI should be limited, he says, is English instruction.
A lot of ideas are communicated in writing, and that involves critical thinking skills, Latham says. “I think students need to go through the cognitive work of framing an idea, figuring out how to express it, thinking about the evidence they’re going to use, how they’re going to structure an argument,” Latham explains.
ChatGPT can create an essay, but that deprives the student of the struggle involved in framing ideas, which in turn erodes critical thinking skills. An experimental study by MIT researchers supports the idea that cognitive skills are compromised when using AI, but the study also notes in its preliminary findings that “rewriting an essay using AI tools (after prior AI-free writing) engaged more extensive brain network interactions.” (See the study at tinyurl.com/568tjud3.)
Punahou’s Candace Cheever adds that “more teachers are doing in-class writing where they can monitor students on school computers with lockdown browsers so they cannot access the internet or AI.”
Social and emotional learning
Teachers are coping with AI’s arrival at the same time they are dealing with the lingering effects of the Covid lockdowns, which on average left students less inclined to interact with each other and to speak up in class.
“I feel like there’s more hesitancy for kids to collaborate, communicate in person. So we always try to emphasize a human element in education,” says Kalani High’s Ida.
Now, collaboration is an essential skill. “When we were younger, the stereotype of a programmer was a lone person in their basement just hacking out code,” he says. But now projects require cooperation and working in teams.
Many other teachers mentioned the same issue and say that’s why they are focusing more on social and emotional learning, or SEL. That means more in-class discussions and collaborative learning to encourage students to speak up in class – and they say AI tools can support these verbal presentations and class discussions in many ways.
“The teacher is no longer the central figure or authoritarian of content because kids have access to content far more than what we know at this point,” Grantham says. “So, the teacher’s job is going to be much more focused on SEL.”
Teachers will check in on how individual kids are doing. “How’s your group dynamics in your class?” he says. “Do your kids feel comfortable talking? Do you have a safe environment where everybody is contributing?”
At a recent event at ‘Iolani School, a student said he’d be comfortable being interviewed for a job by an avatar or chatbot. But Grantham says some people fear that children will become less social if they have companion bots. “Because now I can just go talk to this friend who’s always going to be nice to me, and I don’t have to worry about dealing with the actual humans in the world, but that’s why human-in-the-middle matters.” Keeping children engaged socially matters more than ever, he says.
On the other hand, the DOE’s Sakurai says AI can help students who have anxiety or social struggles. Conversational tools, including voice-to-text, let them practice safely, which can lead to real-world conversations. Therapists now use AI to help students manage anxiety and learning challenges in ways we never imagined, Sakurai says.
Punahou is also “leaning into discussion skills,” Cheever says. The school brought in educational leaders from the University of Wisconsin-Madison to teach instructors how to engage students in authentic discussions. Substitute teachers were brought in and regular teachers were given two days off to attend a retreat to learn how to conduct effective discussions in their classes.
Grounding tech in Hawaiian culture
Purple Mai‘a is a nonprofit whose mission is to build the next generation of culturally grounded, community-serving technology makers and problem solvers. Its website says it empowers communities through Indigenous innovation, technology and computer science.
Mike Sarmiento, VP of educational design for Purple Mai‘a, shares an important ‘ōlelo no‘eau: I ka wā ma hope, ka wā ma mua – we move forward looking back. The future is found in the past.
“The first thing we do is that we understand our ‘ike, our knowledge base, is rich,” Sarmiento said during a panel discussion at Honolulu Tech Week in September.
“When I think about AI, we start with the framing of ancestral intelligence, about ancestral knowledge.” The common thread or knowledge base has always been ‘āina and mo‘olelo (our stories).
(Hawaii Business Magazine is working on a report that more fully examines how people in Hawai‘i are integrating AI with Hawaiian culture, local values and the specific needs of Hawai‘i. Look for that report in 2026.)
Cacace wonders whether those running AI companies will embrace diversity and cultural identity so AI produces accurate responses. “Do they have our best interests at heart?”
Hawai‘i’s schools are in the early stages of a transformation. AI already supports personalized practice, reduces teacher workload on repetitive tasks, and enables creative projects that were previously impractical. At the same time, it raises difficult questions about access, authenticity, ethics and cultural relevance. The journey has just begun, but Punahou’s Latham is correct: You can’t unring this AI bell.
What parents need to know
Here are questions parents and guardians should ask their child’s school about its use of artificial intelligence.
How is AI being used at the school? In which classes?
What are the guidelines for ethical AI use at school?
Which AI tools can my child use for homework?
How are my child’s privacy rights being protected?
Please give me examples of how AI tools will enhance learning.
Will my child still learn basic skills like how to write and multiply?