Over the past 10 years, artificial intelligence has gone from a distant, futuristic concept to a tool used in everyday life.
In 2014, Generative Adversarial Networks were invented, enabling “realistic-looking AI-generated images and videos.” Two years later, Sophia the robot was activated, and in 2018, OpenAI introduced GPT-1, which could answer questions. In 2020, AlphaFold accurately predicted protein structures, and in 2022, generative AI tools became available to the public.
The rapid advancement of AI suggests that students will inevitably encounter it in college and during their careers. As a result, Northeastern has introduced an AI concentration for computer science majors in the Khoury College of Computer Sciences. Students who pursue the concentration study topics ranging from building and training models to AI ethics.
Additionally, Northeastern has established a partnership with Anthropic’s Claude for Education aimed at changing how students utilize and approach AI. Through the partnership, Northeastern will offer the platform to all students and staff, hoping that “properly designed AI can benefit the entire academic ecosystem.”
President Joseph E. Aoun argues in his book “Robot-Proof: Higher Education in the Age of Artificial Intelligence” that education must evolve to prepare students for a future fueled by artificial intelligence rather than teach students to compete against it.
The university’s mission focuses on changing the status quo of higher education and providing a space where students can gain valuable experience through innovative teaching and unconventional degree paths. Aoun’s approach to AI and Northeastern’s willingness to embrace these systems are not surprising given the university’s core values and outlook on education.
“[AI] is here now; I think that knowing what to expect and kind of using it as a tool and not as the enemy is helpful,” said Sophia Goulopoulos, a second-year computer science and business administration combined major.
Although computer science offers the most robust AI education at Northeastern, more students may begin to explore AI as an area of study directly related to their own programs. One study showed that in 2022, 19% of American workers were in jobs whose tasks could be “either replaced or assisted by AI.”
“AI is only going to continue to grow in its prevalence as well as its efficacy, so it is important for all majors, regardless if they are [computer science] or not, to get familiar with it and learn more about it and how to use it,” said Jalen Wu, a fourth-year computer science major who recently completed a concentration in AI.
Northeastern has expanded AI education to various programs of study outside of computer science. For instance, the criminology and criminal justice program requires students to take “Computer Science and Its Applications,” which utilizes AI as an assistant when learning the basics of coding.
Similarly, the communications curriculum offers “Communication in a Digital Age,” which examines the historical and technical aspects of communication and the societal impact of digital communication, and “Online Communities,” which focuses on how online communities form and the challenges they face.
While these courses explore AI as a coding assistant or examine its societal impact, many students feel that the ethical implications of AI need more attention within Northeastern’s coursework.
Isabella Peña, a second-year criminology and criminal justice major, took “Computer Science and Its Applications” to complete the digital skills requirement in her program of study but was disappointed with the lack of instruction on the dangers of AI.
“We did not learn anything about AI implications at all,” she said of the class. “I would suggest [Northeastern offers] an intro to AI ethical implications.”
The AI concentration for computer science majors covers the ethical concerns regarding AI, including its environmental impact.
“[Systems] like ChatGPT use up many computing resources, they consume a lot of water, they consume a lot of energy,” Wu said.
On average, one Google search uses 0.0003 kWh, or 0.3 Wh, of energy, while a ChatGPT query uses an estimated 2.9 Wh per prompt, nearly 10 times as much. Furthermore, ChatGPT’s environmental footprint began before the platform launched; estimates suggest that training GPT-3 “consumed 1,287 megawatt hours of electricity and generated 552 tons of carbon dioxide equivalent” before the model was introduced to consumers.
Northeastern’s AI courses also emphasize the social implications of artificial intelligence, including its effect on marginalized communities when algorithms reinforce socioeconomic inequities. And while non-computer science students feel that AI’s ethical implications are a weakness within the general curriculum, Wu said the university has a good platform for AI education.
However, he suggests that Northeastern expand its AI courses to the general computer science curriculum.
“I think that AI is not sufficiently taught to the general [computer science] major. If you are not specializing in AI, I don’t think there are any courses that are part of the [required] curriculum related to AI,” Wu said. “If I had taken a software concentration, I would have never gained exposure to the AI education and AI tools.”
Some Northeastern students say they appreciate keeping AI out of the classroom unless it is being studied as a subject.
“My bottom line is that AI hinders our learning in the classroom,” Wu said. He believes AI can “[encourage] lazy coding” and lead students to “not pay attention in class, copy answers and not understand topics on a basic level.”
Goulopoulos shared this sentiment.
“If you are using AI for something that you do not know how to do yourself, that is where the problem lies,” she said.
Despite these concerns, some classes at Northeastern are beginning to incorporate AI into assignments. Computer science professor Karl Lieberherr teaches “Computer Science and Its Applications,” in which artificial intelligence is a focal point. Assignments frequently require students to ask ChatGPT questions and use it to aid their coding, but students are not permitted to use AI resources during tests to ensure they understand the concepts and methods being studied.
Lieberherr said that he believes AI enhances learning in the classroom under specific conditions.
“Courses that teach how to create a synergy between humans and AI are crucial,” he said. “When you get output from ChatGPT, you have to be alert to find what might be wrong with it … they don’t solve everything, but [it] gives ideas [on] how to approach problems.”
While the discourse surrounding the use of artificial intelligence remains highly contested, especially in the context of higher education, Northeastern continues to approach the subject with a future-oriented mindset.
“[I have] been here for 40 years, and [Northeastern is] always using useful new technologies appropriately,” Lieberherr said.
Still, Peña said universities must be careful when incorporating this new technology and be mindful of how students feel regarding the subject.
“The whole point of college is to break out of a bubble, learn new perspectives and if you are being taught by AI, you no longer get that because you lack those perspectives,” Peña said. “You learn more about yourself and your world when you interact with people.”