Game design professor Chris Barney was used to scrutinizing his students’ assignment submissions. He used Canvas’ built-in plagiarism checker and occasionally checked manually for telltale signs of copied work. But in the past year and a half, Barney has had to inspect assignments even more closely given everyone’s new classmate: artificial intelligence.
On Nov. 30, 2022, OpenAI launched the Chat Generative Pre-trained Transformer, or ChatGPT. Since its launch, the use of artificial intelligence, or AI, in education has been the subject of heated and ongoing debate.
Originally, the company marketed ChatGPT as a tool for coding and generating basic paragraphs in response to prompts. However, concerns that students would use the technology to cheat on class assignments quickly took hold, alongside a nationwide discourse about the rapidly evolving AI landscape.
Northeastern’s Institute for Experiential AI, or EAI, held a seminar shortly after OpenAI launched ChatGPT to address how the tool could shape education, said Kevin Lim, a business development specialist at EAI.
“The entire higher education industry was taken by storm of, ‘This is going to end education as we know it,’” Lim said. “Our goal during that event was to demystify some of the things that were swirling around at the time [and] really bring home the technology and what it can and can’t do.”
Questions about the new technology quickly focused on how, and how often, it could be used. Ultimately, Northeastern left professors to decide the role of AI in their classrooms.
“As an institution, Northeastern aims to preserve faculty autonomy and agency in deciding what constitutes appropriate use of Generative Artificial Intelligence (AI) in their disciplines and courses,” a position statement published by the Office of the Provost in August 2023 reads.
To media innovation and technology professor John Wihbey, AI is an opportunity for his students to gain experience with technology used in professional journalism, namely an image-generation AI dubbed “Midjourney.”
“We’re using [Midjourney] as part of an exploratory unit to think about how we can become better prompt engineers, how we can evaluate the strengths and weaknesses of AI-generated visual media,” Wihbey said. “I think reframing [AI] as a potential learning tool, something that can be harnessed to enhance and amplify learning, is a great thing.”
Kelly Garneau, a teaching professor and director of Northeastern’s First-Year Writing Program, chaired the program’s ChatGPT and Generative AI Working Group. She said the group started last spring as “a way for [faculty] to think through what the new technologies were going to mean for [their] pedagogy.”
“Writing has always been shaped by the tools we use to do it since the beginning. I think there are some ways in which, especially for students at the brainstorming stage when [they] are playing with ideas, [AI] can create this conversational back-and-forth discussion where it can be generative of ideas,” she said. “[But] you don’t want to use it to replace the uncomfortable space that you might be in coming up with things on your own. Students tend to want to outsource what they don’t view as valuable. So thinking about the value of the work that we do and making that value clear, I think, is vital.”
Though some professors have found ways to incorporate AI into their coursework, the accessibility of tools such as ChatGPT raises a clear issue: They give students shortcuts on class assignments. Sarah Popeck, a second-year data science and economics combined major and a teaching assistant in a data science class, said she catches many students who appear to have used AI.
“[AI] can be a really great resource,” Popeck said. “The problem is when people are using it as a substitute for learning because then they’re just falling behind when they use it on the first assignment. Then, the second assignment. And then they’re gonna say, ‘Oh, I’m going to actually try and do this for myself,’ but AI and ChatGPT have done their entire homework from there on out.”
Last semester, Barney reported 10 cases involving 18 students to the Office of Student Conduct and Conflict Resolution, or OSCCR, for cheating, he said. In prior semesters, Barney said he typically reported no cases at all, encountering roughly one incident every couple of years.
“I understand the argument, ‘Well, it’s like having a calculator. You expect me to do my long division by hand? There’s a calculator, why shouldn’t I use it?’” Barney said. “And the answer is, I’m trying to teach you something, and you’re not learning it. If you aren’t doing the reading, you aren’t learning it if you aren’t doing the writing yourself. Whether or not it would be acceptable to do that in industry is irrelevant if it’s preventing you from learning what I want you to learn.”
Barney’s new process for detecting cheating involves running assignments through five AI-detection tools.
Another layer to this debate is the extent to which professors should incorporate AI into their courses, which would ultimately depend on their department and the subject they teach, Garneau said. The Writing Program’s AI working group developed a statement saying they “do not support instructor use of AI content generators in the assessment of student learning,” adding that they believe “active engagement” with students’ writing is at the core of communities built in classrooms.
“Is it no longer necessary for students to understand a textbook if, when they need to exhibit the knowledge in that textbook, they can just ask an AI to do it?” Barney said. “I feel like having an internal comprehension of material is always going to matter. And having artists and writers with the ability to produce prose and art that is meaningful matters. … [We need to] have rules around what is a reasonable way to use AI, and currently, we don’t. It’s a real concern for me both as a human and as a professor. And I don’t think we have good answers at this point.”
Editor’s Note: Sarah Popeck currently serves as a staff writer for The News.