Student AI policy to include ample access to digital tools

At the Jan. 29 board meeting, the Bainbridge Island School District took another step to update its code of conduct for students using technology for schoolwork, establishing best practices for the use of generative artificial intelligence, “digital citizenship,” and media literacy.

The policy’s updated language incorporates new recommendations from the Washington State School Directors’ Association (WSSDA), the statewide school board group. BISD technology director Kiyo Toma also presented the district’s first Student AI Code of Conduct as part of the policy, a document that will outline how students may interact with generative AI technology in the classroom.

“The substantive changes to the policy include the expanded definition of media literacy, details of student instruction, and inclusion of artificial intelligence and social media, and with the recent edits after the first reading, including references to (Office of Superintendent of Public Instruction), and, I’ll state for the record, a phrase of age appropriateness,” said Toma.

If the policy is adopted as-is, teachers will be able to choose among three tiers of “AI permissibility” for assignments or classwork: an “AI Recommended” tier, in which students are encouraged to use generative AI to “enhance their work” on an assignment or project; an “AI Permitted” tier, in which students may choose to use gen-AI but are not required to; and an “AI Restricted” tier, in which students must complete their work using “only their own knowledge and skills.”

All student projects that include AI-generated material require citations that identify the source of the tool or technique used.

BISD staff have not drafted an equivalent “AI Code of Conduct” document for teachers or building administrators, but a district policy adopted in October noted that “any use of Artificial Intelligence that does not align with expectations outlined by a classroom instructor or building administrator” is considered inappropriate.

The use of gen-AI models like ChatGPT, Claude, and Gemini in academic spaces has erupted over the last four years, reshaping the bedrock of the American school system. From classroom instruction to college admissions to test-taking, AI has changed how students interact with learning material and disrupted the typical means of assessing students’ skills.

It’s not just classrooms; the world at large is still reckoning with the full extent of AI’s impacts. Studies from the National Institutes of Health and the Brookings Institution identify positives and negatives in the normalization of student AI use. The tools can “offer benefits such as personalized learning, mental health support, and improved communication efficiency,” per the NIH study, but the costs may outweigh the benefits.

“This is largely because the risks of AI differ in nature from its benefits—that is, these risks undermine children’s foundational development—and may prevent the benefits from being realized,” wrote Brookings researchers.

WSSDA has not issued specific guidance on the use of AI in classrooms, but OSPI has issued some guidance for an ethical, “human-centered” approach to AI use.

“The rapid development of AI tools has created opportunities for educators to rethink the way they approach student learning. As our state embraces these changes, it is important to remember that human reflection and understanding are key to AI generation. This ‘human-AI-human’ approach to AI puts our students and educators at the beginning and end of all interactions with AI,” OSPI officials wrote on the department’s website.

Personal tutoring programs, virtual assistants, and lesson plans are all valid examples of “AI-assisted learning,” per the OSPI website.

“It is important to note that educators and students must remain at the center of instruction and learning in these cases,” OSPI added.

Tim Satre, director of Say Tree Productions, spoke at the meeting to call for stronger language in the district’s digital citizenship policy.

“Generally speaking, the policy statement from WSSDA is pretty broad; it’s full of truisms, and that’s fine. I generally support that we need to prepare our students to be digital citizens when they are ready,” said Satre. “And we need to be very specific about when they are ready, so they don’t end up like me, constantly staring at my phone, sucked into the dopamine reward system of social media — which has been proven, study after study, to be detrimental. As we all know, Australia has banned it for anyone age 16 and under.”