Microsoft Copilot is Coming to All VSB Classrooms. Why Is There No Set Policy?
Photo Credit: Adobe Stock via Forbes
Emails, book reports, course outlines, tests, university applications — all tasks that once required brainpower and a decent amount of motivation. But now, these difficult, time-consuming chores can be easily completed with a quick AI prompt, tempting students to forgo their homework, and ultimately their learning, in favour of a quick solution.
This school year, the Vancouver School Board (VSB) is gearing up to give students across the district access to Microsoft Copilot, a generative artificial intelligence (AI) chatbot similar to ChatGPT. Students will be given access to a specialized education version of Copilot developed for ages 13+.
According to VSB Communications, further details about the timing and rollout of Copilot “are still being finalized.” Hamber principal, Mr. J. Lauzon (Administration), predicts that Copilot will be released to students sometime this November, but he is unsure about the VSB’s specific implementation timeline.
As this platform is introduced into students’ classrooms, questions emerge about the future of AI in education and the VSB’s role in promoting AI literacy.
How is AI already affecting learning at Hamber?
ChatGPT, one of the most prominent and popular AI platforms, was only released at the end of 2022. Similarly, Microsoft Copilot was released in February 2023. This rapid emergence of AI technology – accessible to teachers and students alike – has dramatically impacted the structure of many secondary school classes.
In schools, humanities-centred subjects like English and Social Studies have been disproportionately affected by the rising prominence of AI. These classes tend to assess students qualitatively based on creativity, research skills, and writing samples — a stark contrast to quantifiable tests and quizzes found in STEM subjects. With no detailed policy regarding AI usage at Hamber, expectations can look different in every classroom, and especially across subjects.
For instance, Ms. K. Hewett (Social Studies) follows a “no-AI” policy, restricting AI platforms of any kind from entering her classroom. One of her main reasons for this is to promote equity, as some students may not have the same access to the technology as others, putting them at a disadvantage compared to their classmates.
Ms. Hewett shared with The Nest that “Since [the technology] became a lot more accessible, I have effectively stopped assigning anything that's written for homework to be handed in for marks.”
As a longtime Social Studies teacher, she emphasized the importance of the process of writing and researching a longer piece of work, like an essay, to expose students to integral critical thinking concepts.
However, in STEM-based subjects, it’s harder for teachers to draw the line at where AI can be used, and even harder for students to grasp where that line falls.
Many students find AI platforms like Copilot and ChatGPT very helpful as a resource to receive step-by-step explanations for complicated questions, namely in subjects like math and physics.
“I do AP Calculus, and I personally think it’s really hard,” Brittany Lau (12) shared. “ChatGPT helped me understand the questions that I didn’t understand before.”
Meanwhile, Emmett Calliston (10) told The Nest, “I use AI sometimes to teach me concepts I don’t get or ask smaller questions when teachers aren’t available.”
Mr. Lauzon has praised AI as an out-of-class learning tool, believing it to be a great replacement for tutors, which some students cannot afford.
“When I started working over in the West Side, I found that more students had tutors, while more students on the East Side would have jobs,” said Mr. Lauzon. He found that this “was a huge equity problem,” as students with tutors were more likely to be successful in school. However, with the launch of AI, all students have access to free, online “tutoring” that allows them to take control of their own learning with a simple prompt.
On the other hand, some students also shared that the prominence of AI, especially in humanities-focused classes, is a concern. “For English, if you’re using AI to write your essays, you’re not really thinking about the literature that you’re supposed to be studying,” Ada Chan (10) shared. “I believe that you should think through creative subjects with your head.”
Mr. N. Despotakis (Administration) believes that the use of AI in education is inevitable. “I think teachers need to understand that they should expect students to use AI,” he said. He suggested that teachers conduct their assessments in person, because they can no longer rely on students not to take shortcuts when completing their work after school hours.
“Assessed homework is dead,” Mr. Lauzon added. “I think teachers should expect and encourage students to use AI in an appropriate, responsible manner.” He added that AI literacy education should be added to all teachers’ professional development plans.
“Years ago, the internet was seen as a threat,” Mr. Lauzon said. “I think that as educators, we need to embrace it and teach it.”
How are educators at Hamber using AI?
The increased usage of AI is not just limited to students – some teachers and administrators also use it as a tool to support their work.
“AI is a game changer,” Mr. Lauzon said, emphasizing that it should be used “not to do our work but maybe to enhance our work.” Often, he asks AI to write an initial draft of something, and then reviews the draft with other administrators.
Mr. Lauzon also shared that he uses AI to synthesize long email threads from parents or students. “I don't know if the parent concern [or student concern] is being addressed,” he shared. “Rather than look back through it, I would say [to AI], ‘what action items do I need to do to address the problem for the student or for the parent?’”
Mr. Lauzon believes that “for assessment evaluations, it’s helpful for teachers to be using AI, and we support teachers to use [it] in an ethical and positive way.”
However, as teachers adapt to the modern AI landscape by incorporating more in-class assignments, some students have found themselves at a disadvantage.
Chan shared, “I’m the kind of person that feels really pressured in school, so I don’t actually end up writing well in in-class English essays.”
She explained that because she must write nearly everything under strict time constraints, she no longer has time to review her mistakes or think about the arrangement of her paragraphs, which hinders her ability to improve her writing and lowers her grades.
Regarding the implementation of Microsoft Copilot at Hamber, Ms. Hewett is waiting on more information before jumping into a change in her class structure. “This is such a different, unusual step forward for all of us,” she said. “I find it hard just to [...] go forward in a way that is compliant with the curriculum we have to teach.”
She specified that from her perspective as a teacher who has dealt with very little AI technology in her previous work, “we need to get a handle on what we can think through as human beings, because [...] if we're relying on AI, I think that's a dark alley.”
How is AI affecting the VSB’s curriculum?
In the 2024-2025 school year, BC’s first AI course was taught by Chiang Hsu at Sir Charles Tupper Secondary. The provincially approved elective course is meant “to expose students to the transformative technological shift that is AI,” Hsu told The Nest.
Hannah O’Keene, a grade 12 student who was enrolled in Hsu’s AI course, found the class very useful. “I’m able to think more critically about specific AI-centric issues given the background knowledge and hands-on experience I acquired through the course,” she shared.
Hsu believes that AI education should be mandatory for all students. “The best thing we can do is not to reject it, but to learn as much as we can about it in order to use it in meaningful ways.”
However, the AI course is not currently running at Tupper for the 2025-2026 school year. Rather, the school is currently looking for ways to integrate the course’s curriculum into their computer science courses.
Zhi Su, the VSB District Principal of Learning Information and Technology, shared with The Nest that there are no plans to implement the AI course across the Vancouver district. However, he added that “AI concepts and development can be meaningfully integrated into existing courses such as Computer Programming, Information Technology, and locally developed Board/Authority Authorized (BAA) courses.” He commented that the VSB’s goal is to integrate AI literacy by “ensuring students develop the critical thinking, ethical reasoning, and digital fluency needed to engage with AI in any context.”
As AI models become more prominent in education, The Nest is also hopeful that there will be a level of instruction provided for incoming students unfamiliar with the technology. Ms. N. Sandhu (Counselling) shared that after attending a seminar regarding university proposals on AI literacy, she could see something similar coming to Hamber in the future. “[Maybe] in grade eight, one of your rotations is AI literacy,” she said. “I think that would be a very interesting class.”
While Microsoft Copilot is not necessarily going to be integrated into the curriculum, Su shared that it is a “natural choice for introducing AI in a safe and secure way. It is also rolled into our licensing for staff, and free for our students.”
Su commented that this implementation is meant to support learning and not hinder it in any way. On Copilot, students can translate languages, learn about new topics, summarize information, and generate ideas for inspiration.
The implementation of Microsoft Copilot is “a move in a positive direction,” according to Hsu, “[because] it's better to have purposeful exposure to AI in our schools as opposed to students downloading random AI chatbots on their devices or at home.”
As a student who will be using the technology, O’Keene is hopeful for this new addition and believes that “Copilot is a competent model with no glaring or alarming issues.” She believes that it provides users with “accurate and relatively unbiased responses” while also helping their learning.
While some other districts have begun to roll out the technology, Su shared with The Nest that the VSB is taking more time to reshape the current learning experience to better suit the future of technology.
“There is so much to learn and yet to be known or seen, so we are taking a little more time to release to our students with the hopes of learning from other districts and leapfrogging the obstacles and challenges they encountered,” Su said. “Sometimes, you need to go slow to go fast.”
How is AI integrated in independent schools?
As AI policies are not uniform across schools, students elsewhere in Vancouver are affected by different regulations and use AI in different forms.
Hailey Mah, a grade 12 student from York House School, an independent school for girls, shared with The Nest that AI has become deeply integrated into the programs she uses for research on school projects. York House students use a platform called JSTOR to find journals and articles, which has introduced AI summaries as a main feature. Mah was encouraged by her teacher to use the AI summaries to find quotes and themes in longer articles.
In general, students at York House are not supposed to use ChatGPT or other AI services, but AI has become so embedded in their resources that the rules have become unclear.
“You’re definitely not allowed to copy and paste out of ChatGPT, but nothing really stops people from using it and paraphrasing work,” Mah explained. She also expressed that everyone is using AI and that the rules need to be less about whether students are using it or not, and more about how they are using it.
The situation is much stricter at Little Flower Academy (LFA), a private Catholic school in Vancouver. Natalie Chow, a grade 11 student, explained that students are not allowed to wear smart watches or have their phones outside of their lockers without explicit permission, much less use AI for schoolwork. Students at LFA receive a Chromebook at the beginning of the year with numerous websites and games blocked off.
The use of AI at LFA is highly discouraged or even banned in most classrooms, except for a select few science or math classes that allow students to use AI to break down difficult problems. Chow explained that even if you are allowed to use AI to understand questions, “any work handed in cannot be AI written, and if it is found to be, most likely you will be given a zero.”
Mah stressed that the policies around AI at York House are “in theory helpful and are meant to help students actually use their brains and build critical thinking skills. But when it comes around to it, AI makes students’ lives easier.”
What are the AI policies of school districts across BC?
At Hamber, policies on cheating and plagiarism are clearly outlined for students and teachers to refer to, providing necessary expectations and consequences. Yet, there are no district or school-specific policies that govern the use of AI for educational purposes.
The Nest reached out to the VSB for a comment regarding whether there are plans to draft a standardized AI policy as Copilot enters schools. VSB Communications shared that “The District does not have a standalone AI policy as AI is seen as a tool in the digital space. AI tools are governed under existing policies, including the District Code of Conduct and Acceptable Use of Technology guidelines.”
Additionally, they shared that a cross-functional Digital Literacy Working Group of educators, IT, and privacy staff is currently evaluating “best practices and emerging guidance to ensure responsible and effective use of AI in classrooms.”
“The work of this group is ongoing and the District will communicate updates to staff, students, and families as planning advances,” they said.
Educators at Hamber have been given limited guidance on how to approach AI problems in the past, and are uncertain if there will be future regulations.
“You would hope that once [the VSB] implements AI into schools, they would give [teachers] some precise guidelines to follow,” Ms. Hewett said. “That would be helpful.”
Mr. Lauzon and Mr. Despotakis shared with The Nest that they are unsure whether or not the VSB has any concrete AI policies for students, or if the district is currently developing any before rolling out Copilot.
On the other hand, other districts across the province have already implemented their own AI policies.
The Central Okanagan School District, which serves Kelowna and the surrounding area, released a 17-page document guiding educators and students in the responsible use of AI.
“In the context of the educational environment, the use of [AI] technology should be meaningful and enhance, but not replace, the valuable human connection and social nature of learning,” the district says. It says that teachers who choose to embrace AI will “equip students with the skills needed to thrive in a future integrated with AI.”
However, the district cautions that “AI output is a great starting point but should never be the final product.” As well, the district advises teachers not to use AI writing detection tools, because it says the tools are inaccurate, damage trust between students and teachers, create privacy concerns, and are more likely to create false positives for English Language Learner (ELL) students.
Meanwhile, Surrey School District Superintendent Mark Pearmain has hosted multiple AI info sessions for staff and families in his district. Surrey Schools has also released videos from experts covering AI applications in the workforce and post-secondary, AI’s implications for student safety, and the views of the district’s AI Student Council.
In Langley, the school district only allows students in grades 9–12 to use the special, age 13+ version of Microsoft Copilot, while all staff are allowed to use a standard version of Copilot. The district also requires students to include a statement of AI use when submitting work containing AI content.
In its guidelines for AI use, the Langley district allows staff to use AI to generate teaching materials and help remove barriers for diverse learners, but cautions that staff should also consider the risks of AI use and prioritize considerations about the privacy of students’ data.
What does The Nest think about Copilot?
As student journalists who spend hundreds of hours fact-checking, crafting eloquent sentences, and curating the perfect headlines, the emergence of AI technologies in schools poses major concerns for us. To those who prioritize and promote integrity in their work, Microsoft Copilot feels like a cheat code.
Confusion around the proper usage of AI has become prevalent at The Nest as well, so much so that this year’s updated version of our Reporter’s Manual & Editorial Policy Guide outlines a policy for reporters and editors regarding the usage of AI for newswriting.
We believe that information published by journalists should flow directly from sources to reporters, bypassing unnecessary “third parties” like AI platforms. Therefore, The Nest believes that all student work should be completely written and produced by students. The value of distinct human character, style, and prose will always outweigh the support provided by AI.
Nevertheless, as the lines blur between what is ethical and what is prohibited, The Nest calls on the VSB to be transparent with educators and students about its expectations regarding the appropriate usage of AI, and the consequences of unauthorized AI use. This includes the creation of standalone AI policies, whether district-wide or school-specific.
Boundaries should be outlined regarding the proper use of AI, specifically Copilot, by students and educators alike.
Finally, policies should establish universal standards for how teachers are expected to address students who have used AI in an unauthorized way.
Currently, teachers react differently to the misuse of AI in their respective classes: one student could receive a light reprimand for generating an entire essay, while another might face serious consequences that could impact their academic career.
The Nest believes that AI policies should be clear and concise, like those related to cheating, leaving no room for interpretation. With students’ learning, integrity, and ethics on the line, policies surrounding AI usage have never been more crucial than they are now for our education system.