How can professional education equip students for a workforce being rapidly reshaped by artificial intelligence? That question took center stage as the Columbia University School of Professional Studies (SPS) convened its inaugural Faculty Summit on AI Teaching and Learning, a daylong gathering of educators, industry experts, and technologists committed to understanding AI’s opportunities, risks, and implications for the classroom.
The event—the first in a planned series—featured opening remarks from Dean Troy Eggers, Senior Vice Dean Steve Cohen, and Senior Associate Dean for Academic Affairs Erik Nelson, followed by a cross-school faculty panel, roundtable discussions, and an interactive workshop.
“AI is redrawing the boundaries of professional practice,” Dean Eggers said in opening remarks. “Our students are watching the transformation in real time. The question they’re asking us, often before we can ask ourselves, is: What will my education mean in an age of intelligence systems?”
Cohen, who also directs the M.S. in Sustainability Management program, noted how AI has already shifted his own teaching practice, including one assignment that evolved from a manual memo-writing exercise into a project testing students’ AI-querying skills.
“We’re going to be judged on our ability to take it to the next level—to be creative and analytic and take our life experiences and the diverse life experiences of the people we work with,” he said. “I’m not worried about it. The use of new technologies has improved human life since the beginning of agriculture.”
Industry Perspectives on AI in the Workplace
The summit’s panel discussion—moderated by Travis Bradford, president of the Prometheus Institute for Sustainable Development and lecturer in the Sustainability Management program—focused on how AI is reshaping work across industries. Acknowledging that some students may already possess more advanced AI skills than their instructors, Bradford urged faculty to begin with a foundational question: “What’s the target we’re trying to hit in terms of professional development?”
Chris Reitz, senior director of artificial intelligence at Elevance Health, emphasized that generative AI should not be conflated with automation. “AI is changing how work is organized, period,” he said. Rather than eliminating jobs, he argued, the opportunity for generative AI is to be the “connective tissue” that streamlines information flow and coordinates tasks behind the scenes. He added that AI may also inevitably challenge entrenched bureaucratic norms within organizations.
Ranjit Kumble, who leads data science and AI teams at Pfizer, highlighted AI’s transformative potential in the life sciences: accelerating medical discoveries, optimizing internal processes, and supporting individual employees’ work. He compared today’s transition to the rise of personal computers, noting that although some roles disappeared, most evolved—and entirely new fields emerged.
Ben Kinsella, formerly with OpenAI’s human data team and now an advisor with the nonprofit HumaneIntelligence, addressed concerns about entry-level hiring. While AI may change how employers evaluate résumés and cover letters, he emphasized that candidates will still need to demonstrate real competence through coding interviews and knowledge assessments. His deeper concern centered on autonomy: how much of a student’s authentic thinking remains present in AI-assisted work.
From Users to Designers
One of the biggest shifts noted by the panel was that AI is making it easier for students to move from being users of technology to designers of it. Reitz noted that anyone can now create custom software solutions for specific problems, from a parent building a personalized chore app to students reorganizing their syllabi into academic-success tools such as Due Gooder. Although we may see an "explosion of software solutions," Reitz said he finds this customization hopeful: it is proof that people can exercise agency within the AI landscape.
This democratization of software creation, however, requires new skills. Kumble emphasized that success depends less on technical training than on cultivating curiosity and critical thinking. “If people are coming in from whatever background with that kind of mindset, they’re perfectly positioned to do incredible things without needing to do technical courses,” he said. “With that said, I think the more technical knowledge, working knowledge people are able to acquire, the better—simply because it helps them understand what’s going on under the hood and therefore whether or not to accept an output as being appropriate.”
Technical understanding also remains valuable for evaluating AI outputs. Kinsella expressed concern about students' ability to assess model outputs for accuracy, hallucinations, and ethical implications. "I'm very surprised when it comes to questions around evaluations," he noted. "There's little critical thinking when it comes to evaluating the output."
Kinsella acknowledged these concerns about overreliance, emphasizing the importance of modeling intrinsic motivation and the "sanctity of learning" for students. "It's very easy to lose that intrinsic motivation when the chatbot can come up with a shiny answer," he reflected. But it helps, he noted, when students are able to learn from one another.
“One of the beautiful things I love about SPS” is that people come from many different backgrounds, Kinsella said. He doesn’t always have the answer, “but I know that it makes me appreciate human collaboration more and more.”
Travis Bradford, president of the Prometheus Institute for Sustainable Development and lecturer in the Sustainability Management program, speaks to attendees at the inaugural Faculty Summit on AI Teaching and Learning.
Integrating AI into the School of Professional Studies
Erik Nelson, senior associate dean for Academic Affairs, framed the summit as the beginning of an ongoing conversation. SPS has established an AI Faculty Committee to provide governance and guidance as the school charts its course forward, with a second summit planned for March 27.
Nelson emphasized that the goal is not to outsource human intelligence to AI but to ensure students use these tools effectively, ethically, and productively—in ways that elevate rather than replace critical thinking. “We want to focus on what we’re doing in the classroom and how we prepare our students to be leaders in a world that’s being disrupted by AI,” he said.
Programs across the School have already adapted their curricula—hiring new faculty and introducing courses like AI and Ethics in the M.S. in Bioethics program and Artificial Intelligence and Machine Learning for Technology Leaders in the M.S. in Technology Management program—and updated specific assignments to allow for the use of AI, boosting AI literacy and skills. As each program has different needs and is affected by AI in different ways, there is no one-size-fits-all approach to integration.
For example, according to Steve Safier, director of the M.S. in Human Capital Management program, “Human resources teams must develop AI expertise to enhance the delivery and impact of HR programs.” He cited examples of HR professionals leveraging AI-backed strategies: organizational and team design, talent acquisition, coaching learners on career path options, assisting with pay equity audits within total rewards, and surveying employee sentiment toward organizational objectives.
“Generative AI is quickly changing negotiation and conflict resolution practices in fundamental ways across the many fields where our students work, from peacebuilding to organizational consulting,” said Peter Dixon, director of the M.S. in Negotiation and Conflict Resolution (NECR) program. “It can enhance preparations for complex negotiations, identify mediated solutions across diverse stakeholders, and power realistic conflict simulations, to name a few.” But he noted the field-specific risks of hallucinations and human rights concerns—issues that lead NECR courses to approach AI through a critical and human lens.
The question facing educators is not whether to integrate AI into professional education but how to do so in ways that amplify human capabilities, preserve what makes learning meaningful, and prepare students to shape rather than simply respond to technological transformation.
As Dean Eggers urged in his opening remarks: “Don’t frame this as the end of an era of teaching. Frame it as the beginning of one, where every instructor has the power of a thousand assistants and every student has access to the best tutor in the world. Let us use today to imagine not how we protect education from AI but how we rebuild education with it.”
(l to r): Rich Lauria, associate director of the M.S. in Enterprise Risk Management program, and Dr. Viorel Popescu, director of the M.S. in Biodiversity Data Analytics program.