AI Has Entered the Classroom. Are We Ready?

Generative AI has made its way into K-12 schools whether we like it or not. Here's how teachers and students are using the technology.

Deborah Waldron’s first encounter with ChatGPT was nearly two years ago, when one of her students demonstrated how well it could write code. Intrigued, the Yorktown High School physics teacher began experimenting with ways to incorporate the AI chatbot into her classroom. But she quickly discovered that the artificial intelligence got things wrong nearly as often as it got things right. 

So she decided to use the generative AI program’s inherent flaws to her and her students’ advantage. She asked ChatGPT to create a 10-question multiple-choice quiz for her 11th-grade physics students about the relationship between mass and acceleration when the force is held constant. 

The resulting quiz questions were fine, but Waldron noticed that more than half the answers provided were incorrect. Rather than ditching the whole exercise, she asked her students to determine which answers ChatGPT whiffed on and correct them. She also challenged students to identify the underlying misconception that had led the ChatGPT “brain” to spit out the wrong answers.
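
Waldron ran her experiment through the ChatGPT web interface, but the same exercise is easy to reproduce programmatically. Here is a minimal sketch using OpenAI’s Python library; the prompt wording and model choice are illustrative, not what Waldron used, and it assumes you have an API key set in the OPENAI_API_KEY environment variable:

```python
# A minimal sketch of Waldron's quiz experiment, run through OpenAI's API
# rather than the ChatGPT web interface. Assumes: pip install openai, plus
# an OPENAI_API_KEY environment variable. Prompt wording is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Create a 10-question multiple-choice quiz for 11th-grade physics "
    "students about the relationship between mass and acceleration when "
    "force is held constant. Include an answer key."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any current chat model would do here
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
# The classroom twist: hand students the raw output and ask them to find
# the wrong answers in the key, rather than trusting it as-is.
```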

“This is an unbelievable untapped tool that in five years could completely change so much of what we do,” Waldron says. “We’d be damaging the kids long term [if we ignore generative AI]. I would say to teachers, ‘Your job isn’t going to be replaced by ChatGPT, but your job might be replaced by someone who knows this skill set.’ ” 

♦♦♦♦♦

Released in November 2022, ChatGPT, arguably the best-known generative AI program, reached 1 million users within just five days. By the summer of 2024, the program was logging about 600 million monthly visits, according to its maker, OpenAI. But ChatGPT is just one star in a rapidly expanding AI universe.

Generative AI—a type of artificial intelligence that uses existing data to create writing, imagery, music or other new content—is showing up in our online searches, social media feeds, political campaigns, design and entertainment apps, and much more.

Not surprisingly, it’s showing up in our schools, too. Student use of ChatGPT and other generative AI for research, writing and other purposes is growing rapidly, leaving administrators and teachers struggling to keep pace. Because the field is evolving so quickly, teachers have largely been left to do their own research and experimentation with AI. Today’s tech-savvy kids are often miles ahead of them.

To test out ChatGPT for myself, I asked the program to write an opening paragraph for this article. (As a journalist, I know that passing off someone else’s writing as my own is verboten, but this was research.) In about two seconds, the program wrote this: 

“In Arlington’s vibrant educational landscape, generative AI is poised to revolutionize K-12 learning experiences. As schools embrace cutting-edge technologies, generative AI stands out for its ability to personalize education, foster creativity and enhance student engagement.” 

A little vague, and maybe a little overblown, but not terrible. I wasn’t surprised that a ChatGPT response would emphasize the positives of its own existence. 

Educators are more wary of AI’s sudden ubiquity. Last fall, a Pew Research Center survey of K-12 teachers nationwide found that a quarter of respondents believe the use of generative AI in K-12 education does more harm than good. Nearly a third (32%) said AI tools offer an equal mix of harm and benefit, while only 6% saw the technology as a net positive. The remaining 35% said they simply weren’t sure—reflecting a potentially concerning knowledge gap about a technology that is evolving at breakneck speed.

Parents are leery, too. “Generative models are unreliable, unregulated and unchecked for accuracy,” one Arlington mother of two elementary-age students recently told me. “They are dangerous to the entirety of society.” 

With this sentiment in mind, I then asked ChatGPT to summarize the downsides of using generative AI in primary and secondary education. Submitting a prompt to ChatGPT is as easy as sending a text, and I couldn’t help but picture a little chatbot scratching its head before responding. 

“The use of generative AI in K-12 education presents several downsides,” the program wrote, “primarily revolving around issues of dependency, equity and data privacy. Overreliance on AI tools for tasks such as writing, problem-solving and studying can inhibit students’ development of critical thinking and creativity.” 

That pretty much nailed it. These AI-generated paragraphs were accurate and coherent, but that was about the extent of it. ChatGPT wasn’t going to help me write about how artificial intelligence is being used in my neighborhood school specifically—and how it could be used in the future—because there just isn’t enough publicly available data for it to draw on yet. And it couldn’t replicate my own literary voice—that indefinable thing that makes someone’s writing unique to them. 

Nevertheless, the speed with which the program produced usable content was compelling. I could see how a stressed-out high schooler with an essay due on Romeo & Juliet would be tempted to use it. 

♦♦♦♦♦

Many students have already given in to that temptation. Last fall, Pew also conducted a nationwide survey of U.S. teens to gauge their experiences with ChatGPT. It found that 19% had used ChatGPT to help with schoolwork, with usage higher among upperclassmen (juniors and seniors) than among younger students. On the ethics, 69% of teens felt it was acceptable to use the tool for research, 39% said it was acceptable for solving math problems and 20% considered it acceptable for writing essays. 

Anecdotally, local students seem to feel similarly. I polled 14 Arlington public high school students—all sophomores, juniors or seniors from Wakefield and Washington-Liberty—promising anonymity so they could speak candidly. Although it’s an admittedly small sample size, 11 out of the 14 said they had used ChatGPT in one or more classes. Most used it for English or history. 

“I used it to help me come up with a thesis statement in English,” one student said. 

“I’ve used it to summarize chapters of books as well as give me ideas for projects,” shared another. “It helped summarize stuff in exactly the way I needed.”

A third student noted: “It was helpful because it allowed me to either get started on the work by giving me a seed idea, or allowed me to check my answers and ensure I was right.”

Yet another student confessed: “I’ve used it for all my classes because I didn’t feel like doing the work.” 

At the same time, these local students had concerns. Chief among them: that using a tool like ChatGPT would be considered cheating even if they were only using it for research or ideas; that it would inhibit their learning; that the material it generated would be the same as for other users; and that the information would not be correct. 

“I can see how it is being used maliciously, but when calculators were invented, they weren’t banned for giving an unfair advantage,” one student reasoned. “If someone can write a good essay with generative AI, what is the issue? I have never been good at using ChatGPT, and in its current state I don’t think it’s very useful. I just don’t understand why this is [potentially viewed] as cheating when we treat spell-check and other writing assists as tools.”

A nationwide survey by the RAND Corp.’s education and labor unit, published earlier this year, reported that only about one in five K-12 teachers (18%) had used AI tools in the classroom. But the vast majority of those users (73%) expected their use of AI programs to grow—especially virtual and adaptive learning platforms such as Khan Academy’s Khanmigo app, as well as chatbots such as ChatGPT or Google Gemini. As of last fall, only 5% of school districts surveyed had adopted AI-specific policies for students.

Local school districts are developing policies around AI, but the rollout has understandably been slow. Frank Bellavia, a spokesman for Arlington Public Schools, says APS has provided guidance to staff on the ethical and responsible use of AI in the classroom, and that administrators will begin working on an official AI policy this fall. The Arlington County Advisory Council on Teaching & Learning (ACTL) is keeping close tabs on the process as well. 

“The ACTL Council and the ACTL educational technology subcommittee have touched on this topic in the past year and will delve more deeply in the coming year in conjunction with APS’ work to develop guidance for teachers and students,” ACTL Chair Jenny Rizzo shared in an email. “With how rapidly the field of AI is moving, school systems will need to be nimble and willing to adjust frequently to keep pace. ACTL looks forward to supporting APS in doing so.”

As an International Baccalaureate Continuum school division (and in keeping with the IB program’s stance on AI), Falls Church City Public Schools (FCCPS) has decided not to ban the use of AI software. “We recognize that artificial intelligence has the potential to become as integral to our daily lives as traditional tools like spellcheckers, translation software and calculators,” FCCPS spokesman John Brett said in an email. “Our goal is to equip our students with the skills to use AI responsibly and effectively, upholding our core values of academic integrity and ethical conduct.”

Fairfax County school officials have contracted with the nonprofit International Society for Technology in Education to test best practices around AI and innovation in learning. The Virginia Department of Education, for its part, has also made resources available to educators about generative AI, but so far the technology hasn’t been referenced in protocols for the Virginia Standards of Learning or other official policy. 

The RAND survey also noted that, as of last fall, only 23% of school districts nationwide had provided teachers with training on the use of generative AI, although 37% said they planned to offer such training during the 2023-24 academic year. Presumably, those numbers will continue to grow this school year. 

The survey also underscored that, as with much digital technology, historically disadvantaged school districts have less access to generative AI tools. The study found that 27% of districts serving mostly White students had provided some teacher training by fall 2023, compared with 11% of districts serving mostly students of color. “Faster take-up of AI in historically advantaged settings,” the authors concluded, “will only widen already large disparities in students’ opportunities to learn.”

♦♦♦♦♦

This leaves most teachers navigating the brave new digital world on their own. To share what she’s learned, Yorktown’s Waldron has given presentations to other teachers, both in person and online, at Arlington education forums. 

“The core issue [about AI] that scares teachers is that it is easy for kids to take the easy way out,” Waldron says. (Like all the teachers quoted in this article, she is speaking only of her own experience and not on behalf of APS officially.) “It’s going to be our job as educators to teach them how to use AI to be smarter, more creative thinkers. We saw similar things back in the ’90s when graphing calculators came out. We didn’t know half of what they could do, but the kids quickly learned how to cheat with them. We’re in the very same spot with AI, and there are easy ways to use it poorly. We have to find ways to show kids that ‘Hey, you can’t get away with cheating, and here are ways to use it wisely.’ ”

Waldron notes that free generative AI tools for educators can be quite handy and save busy teachers invaluable time. Gamma and Canva use AI to help build presentations; Eduaide and Magic School can help generate lesson plans; and QuestionWell, Twee and Blooket all create quizzes and assessments.  

Jennifer Kirschbaum, a French teacher and the world languages chair at Bishop O’Connell High School, reports that the local Catholic diocese is working on an official AI education policy. Digital assists are already part of world language instruction anyway—think Google Translate or Duolingo—so it’s a natural space for AI experimentation and learning. 

At the same time, the easy availability of ChatGPT has prompted a return to old-school pen and paper. “In my classes, I had students writing a short story [longhand] to show off their understanding of past tenses,” Kirschbaum says. “If they have a personal connection to the topic, they want to tell it in their words. You can’t just say, ‘Don’t use AI.’ You can create situations where AI is not useful. It’s about giving students the opportunity to want to tell something they care about because they don’t have to rely on tech.”

Megan Lordos, an English Learner teacher at Wakefield High School, has gone back to pen and paper with her students, too. “I know we’re not moving back to word processors, but I’ve redesigned some prompts so they are micro-specific to the school, so I know they can’t use AI,” she explains. “I’ll ask them something like, ‘Do you think it’s unfair that the B hallway has more windows?’ Sometimes it’s fun that it’s a shared experience.”

Teachers are always facing new technologies, apps and programs, Lordos says, but with AI, the pace is accelerated. “Teachers have become accustomed to playing catch-up with technology, but the catching up never happens. There’s always something new in technology that you have to master, and we have to do it quickly, because the students are already there.”

The AI boom makes social-emotional learning even more important, Lordos adds: “You have to build trust in the classroom so you can have open discussions about AI. Sometimes we’re going to have to scrap all these other lessons because we have to set expectations about honesty and trust.”

♦♦♦♦♦

Where do elementary-age students fit into the picture? Even though they might be too young to use ChatGPT for schoolwork, chances are that AI is part of their lives already, some way, somehow. 

Jennifer Burgin, a special projects teacher who works with K-5 students at Hoffman-Boston Elementary in Arlington, contends it’s important for younger students to have controlled exposure to AI so they begin to understand that it’s a tool, albeit an imperfect one. AI’s elasticity, she asserts, is quite useful in helping young learners develop a growth mindset and a willingness to experiment with different ideas.

AI also has the potential to make classrooms more equitable and adaptable for students spanning a wide range of reading levels and learning abilities. Take the example of a class in which a majority of students are reading at a sixth-grade level. “An English learner can drop an article on the Roman Empire into AI and make it [understandable to them] on a fourth-grade reading level,” Burgin offers. “But the headline and concepts are the same as what the rest of the class is reading [on a higher level].” 
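
Burgin is describing a prompt-level workflow rather than any particular product. As a rough, hypothetical sketch of what that looks like under the hood, here is a simplify() helper using OpenAI’s Python library (same setup assumptions as the earlier sketch; the prompt wording, model and function name are all illustrative):

```python
# A toy sketch of the reading-level adaptation Burgin describes: feed an
# article to a chat model and ask for a rewrite at a target grade level.
# Assumes: pip install openai, plus an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

def simplify(article: str, grade: int = 4) -> str:
    """Rewrite `article` at roughly the given grade reading level,
    keeping the headline and key concepts intact."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{
            "role": "user",
            "content": (
                f"Rewrite the following article at a grade-{grade} reading "
                "level. Keep the headline and the main concepts the same.\n\n"
                + article
            ),
        }],
    )
    return response.choices[0].message.content

# e.g., simplify(roman_empire_article, grade=4)
```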

For a recent AI-based art challenge, she asked students to make original art based on a series of prompts, while she fed the same prompts into generative AI programs like NightCafe. Students then compared their own creations with the sometimes beautiful, sometimes ridiculous art that the AI created. 

One prompt asked students to draw pictures of white-tailed deer under falling autumn leaves—an exercise aligned with Virginia social studies and science Standards of Learning. Student images were highly varied, including a Pokémon-looking deer with wide eyes, a 3D paper deer in autumnal hues and a deer in the shape of an ice cream cone (my personal favorite). The AI images were generally less imaginative—usually a stock image of one or two deer standing in a fall forest—but many were odd in some way, with misshapen faces and extra limbs. In one case, an eyeball appeared to be floating by itself. 

“I told my students that AI was a program where people were using algorithms to teach it to think for itself in a computer kind of way,” Burgin says. “We talked about how you can’t always expect AI to be perfect, and the children would comment on how crazy it looked.”

There was an unanticipated upside: Burgin says students benefit when they see that they don’t have to be perfect either. “One of the things I loved most about showing the children the AI artwork is that it was hit-or-miss, just like humans,” she says. “High-performing children are often afraid to take risks because the result might not be perfect. They don’t know that I’ve had to grow as an artist and as a writer. We don’t often show the process of failing attempts and making mistakes.” 

♦♦♦♦♦

On a Saturday afternoon in June, I stopped by the Columbia Pike Blues Festival—not just to listen to music, but to visit a vendor booth touting Ello, a new, AI-driven literacy app that helps young children to read. 

Jackie Neumann, the company’s “head of people and partnerships,” demonstrated the app while a costumed mascot named Ello the Elephant waved and took photos with kids walking by. The app uses speech recognition software: an on-screen Ello “listens” as the child reads the story aloud, offering encouragement and gently correcting words the child struggles with. 
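
Ello hasn’t published its internals, but the core loop of any read-aloud tutor is easy to sketch: transcribe the child’s speech, align the transcript with the target text and flag the mismatches. Here is a toy, hypothetical version using only Python’s standard library (a real system would use streaming speech recognition rather than a ready-made transcript):

```python
# A toy sketch of a read-aloud tutor's core step: align a speech-to-text
# transcript of the child's reading against the target text and flag the
# words that differ. Ello's actual implementation is not public; this is
# illustrative only, with the transcript stubbed in as a plain string.
import difflib

def flag_misreads(target: str, transcript: str) -> list[str]:
    """Return the target words the reader appears to have missed."""
    want = target.lower().split()
    got = transcript.lower().split()
    missed = []
    matcher = difflib.SequenceMatcher(a=want, b=got)
    for op, i1, i2, _, _ in matcher.get_opcodes():
        if op != "equal":          # substituted, skipped or garbled words
            missed.extend(want[i1:i2])
    return missed

page = "The little elephant walked slowly across the bridge"
heard = "The little elephant walk slowly across the fridge"
print(flag_misreads(page, heard))  # ['walked', 'bridge']
```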

According to Elizabeth Adams, the Falls Church-based co-founder and chief experience officer at Ello (she’s also a clinical psychologist), ed tech products tend to be either highly engaging and pedagogically questionable, or pedagogically sound but dry and boring. 

Ello, she says, is an attempt to strike the perfect balance between the two. 

“The idea of Ello was to re-create the experience of a child sitting with a reading tutor,” she says, “but instead of a person, it’s this elephant. We’re leaning into engagement.”

The company maintains an astonishing 700-book library, most of it written with the help of generative AI, that users can read either online or in hard copy. One Ello book called We Are Engineers tells a simple story about two friends who build a model of a wooded hillside and a bridge that a ball can roll down without falling off the edge. 

So far, Ello has focused on selling its product directly to families, but the company hopes to one day have reading stations in libraries. “We’ve reached out to local libraries, including Arlington, asking them if they want an Ello station,” Adams says. “Our mission is to teach any child to read a book from start to finish, which means eventually we go into the schools. That’s on the road map.” 

In May, OpenAI launched ChatGPT Edu, a version of ChatGPT designed specifically for universities. It can assist with a range of tasks, including personalized tutoring for students, helping researchers write grant applications and assisting with grading and student feedback. The company says ChatGPT Edu is more sophisticated than the standard program, while ensuring high levels of security and data protection. 

Could a ChatGPT designed expressly for K-12 schools be next? And if so, will we be ready for it?

Kim O’Connell is a writer in Aurora Highlands and the mother of two teenagers. She also writes the magazine’s Back Story column on local history.
