Khoury News
Khoury College faculty on when — and when not to — use AI in the classroom
On the question of AI's place in the computer science curriculum, Khoury College's instructors are taking a balanced, forward-thinking approach — grasp and master the fundamentals first, then leverage AI to accelerate tasks.
The exponential growth of generative AI has sent shock waves through the computing field that created it. With AI now coding at the level of most junior developers, many software engineers are “vibe coding” — focusing on high-level prompting and design while AI generates most of the actual code.
For colleges of computer science, this developing industry practice raises pressing questions. What is the best way to prepare students for this evolving industry? When and how should AI be folded into the curriculum?
“We’re AI-forward in our education,” says Khoury College Dean Elizabeth Mynatt. “We’re embracing our students learning the fundamentals of the field but also understanding how AI tools are transforming the types of software we can create.”
Mynatt characterizes the approach as “crutch to coach to colleague” — a strong belief that while AI cannot replace a rigorous grounding in computer science fundamentals, it will nonetheless be a valuable and ubiquitous coding tool going forward.
“Crutch is when you come in, and you’ve been using AI to fill things in for you, which we know isn’t the right way to learn coding fundamentals,” she explains. “We will expose you to ways that AI can become a coach so you can become a better developer. By the time you graduate, AI will be a colleague you’d work with just as you’d work with other software developers.”
AI use depends on the course
According to Christo Wilson, professor and associate dean of undergraduate programs, each Khoury instructor decides whether AI is forbidden, acceptable, suggested, or required for each assignment or project. However, these decisions are guided by overarching principles.
“Khoury College encourages less — and potentially zero — AI use in introductory courses,” Wilson says. “That’s because research has shown that unrestricted AI use can hinder learning for beginners.”
READ: Can newbie coders use ChatGPT instead of learning to write code?
“Students who are learning to code,” adds Associate Teaching Professor John Rachlin, “need to struggle and experience firsthand the process of line-by-line coding, testing, and debugging. Then they can level up to eventually use AI effectively and productively.”
Following this pattern, introductory courses like CS 2000 and CS 2100 will emphasize learning and practicing the fundamentals of program design and implementation, with only a brief introduction to the types of tasks AI can do. Iterative assignments, in which students continuously build and improve a project in response to feedback, will hammer these fundamentals home. Only after students grasp these core concepts will they learn to use AI to accelerate coding tasks.
“Effective usage of AI programming assistants requires humans with the ability to specify programs, validate that they do the right thing, and provide constructive feedback to improve them,” says Associate Professor Jon Bell.
In higher-level courses, faculty will incorporate AI in a variety of novel ways. They could explain how to use AI to gather and analyze software requirements, to generate documentation, or to design, implement, and test programs. Some use a “Stump Claude” activity, in which students interrogate the Claude chatbot on course topics to discover where it makes mistakes. Other faculty use AI tools to inject bugs into programming assignments for students to detect.

Along the way, faculty will evaluate the efficacy of these approaches by comparing them to established teaching methods.
“This curriculum is also informed by discussions with employers,” Bell adds. “We draw on a significant internal research project at Google that surfaced the tasks that their most productive software engineers use AI for, and the skills and knowledge that they report needing for those tasks.”
By phasing in AI assistance at higher levels, where students have already grasped the fundamentals, and by tracking developments in industry, Khoury College aims to ensure that students are prepared for the workplace they will enter.
“Due to our co-op program and our collaborations with industry partners, our classrooms are becoming less and less traditional,” Mynatt notes. “Instead of imagining an invisible wall between university and industry, we’re blending the two together. This collaboration helps us keep pace with industry expectations around AI usage because it’s being constantly integrated into our classes.”
Ethics and responsibility
This rapid change also requires a keen eye toward ethics and accountability to ensure that learning remains paramount.
“An ethical approach to AI in the classroom means treating technology not as a replacement for teachers, students, and human effort, but as a supportive tool — a partner that expands what is possible in the classroom and opens new opportunities for learning, reflection, and growth,” says Saiph Savage, assistant professor and director of the Civic AI Lab. “This commitment means choosing AI tools carefully and being transparent about their limitations.
“The rapid development of AI can create fear and urgency,” Savage adds. “Education should not be about adopting every new tool as quickly as possible. The focus should remain on pedagogy and ensuring that technology supports learning goals.”
This learning-first obligation falls on faculty and students alike.

“It is inappropriate to use AI to circumvent the learning outcomes of an assignment,” Rachlin says. “If a student allows AI to do their thinking for them, they are likely relying on AI as a crutch with the narrow aim of getting the program to run correctly. They are ignoring the broader design objectives that are equally important for long-term career success.”
Faculty and TAs are increasingly using in-class, closed-book exams, as well as code reviews that require students to show they understand the work they’ve submitted. Unreadable or poorly designed AI-generated code will likely receive poor marks, and egregious or repeated violations may be referred to Northeastern’s Office of Student Conduct and Conflict Resolution.
By adhering to the spirit of the law, Rachlin argues, students will make themselves indispensable to industry.
“Software design and architectural best practices aim to create code that not only meets specifications, but is also reusable, extensible, efficient, modular, readable, explainable, and secure,” he says. “As of today, generative AI is far less effective at addressing these higher-level design principles.”
For students graduating into a tech sector that’s shifting under their feet, understanding these principles and wielding them to collaborate with AI is the only way forward.
“When other major revolutions arrived in computer science — be they mobile devices, cloud computing, or anything else — we thought of them as new capabilities that were relevant for students, and that we needed to integrate into the fundamentals of what it means to be a computer scientist,” Mynatt says. “We’re thinking about AI in the same way.”