AI and the Future of Higher Education: Preparing Students for “the Good Life” and “Good Livelihoods” in the Age of AI
Historic Jesus College at the University of Cambridge
This morning at the café, I was speaking with a friend and neighbor, an engineer at a Paris-based AI start-up. We were discussing the world-changing realities of Anthropic's new Mythos model and how Gen AI and AI agents are already transforming the way he works as a software engineer.
In our conversation, we kept circling around the reality that human capabilities (e.g., vision, taste, judgment, practical wisdom, relational skills, leadership skills, moral sensibilities) are becoming more important as machine capabilities advance at breakneck speed. Working at the bleeding edge of AI, my friend is seeing what the rest of the economy will experience in dramatic ways in the coming months and years.
Last week, I had a similar conversation but with a very different group of leaders: deans, provosts, professors, higher education funders, economists, and research scientists from frontier AI labs sharing their insights about the latest AI models and their use in the learning experience.
We reached the same unsettling conclusion as my engineer friend. AI is forcing higher education to reckon with what it has long neglected: forming human beings, not just producing capable workers. AI isn't just creating new problems in higher education. It is exposing and accelerating the ones we've been avoiding: the lack of human connection, meaning, and formation at the heart of how we've designed our institutions.
Serving as a bridge between the two worlds of "deep tech" and "deep humanity," Noēsis Collaborative co-hosted a roundtable at the University of Cambridge, facilitated by Sandy Speicher, former CEO of IDEO. Our partners included Ian Marcus Corbin, Director of the Public Culture Project at Harvard; Julian Huppert, Director of the Intellectual Forum at Jesus College, University of Cambridge; and our sponsor Terri Taylor, Strategy Director for Postsecondary Education at Lumina Foundation.
On the majestic grounds of Jesus College, founded in 1496, we wrestled with a single question: how might we design higher education for human flourishing in the age of AI?
What follows are insights from the roundtable, shared under Chatham House Rule:
Noēsis workshop at Jesus College; Photo Credit: Rob Hill, March 2026
The Cognitive Risks
Use of AI introduces the risk of “cognitive surrender” for many students because it shortcuts the cognitive processes that turn information into understanding, memory, and durable capability.
No one would take a forklift into a gym to lift weights and expect to build muscle. Yet students are doing exactly this every day, going to AI for answers instead of struggling to find them themselves, with negative impacts on cognitive development. Currently, the Generative AI models most students use are not designed for learning: they lack friction and simply provide answers. That’s what the market wants. But these models could be designed in a more Socratic way, for dialogue and “intellectual weight lifting.”
Still, designing AI for learning is significantly harder than designing an AI to solve our hardest math or physics problems, because a human being is not an equation to be solved but a person to be formed for deeper, more complex purposes: meaning, relationships, mental health, character, civic participation, and other human goals.
The Human Advantage
Strong metacognition, the ability to think about one’s own thinking, seems to be the best predictor of positive student outcomes when using AI. In some ways, this revelation is a gift for those who have been stewarding the arts and the humanities through an age that has long undervalued them. Now, more than ever, we need professors who can both model human intellectual virtues and transmit wise thinking about human purposes and values. History, art, philosophy, literature, and religion are all deep sources of wisdom for students’ most profound questions. Wrestling with these questions sometimes matters more than the answers.
Wisdom and character are caught, sought, and taught. Our core human capabilities (e.g., social, emotional, behavioral) are developed through practice and experiential models, not simply through instruction. Teaching and learning involve modeling a way of being: students are drawn toward how someone thinks, acts, and engages with the world, and a good teacher can embody this.
The Systemic Problem
At the very moment we need wise, relational humans, we are forming humans as if they were machines. Because of our extractive economic models, workplaces and higher education institutions have been turned into knowledge factories, trading a north star of flourishing for one of economic utility. We prioritize measurement, optimization, and treating people as resources, encouraging an instrumental, individualized orientation rather than one grounded in care, relationships, and shared purpose. This degrades the human capabilities of both the individual worker and the larger organization. Paradoxically, workplaces that buck this trend and prioritize more humane cultures are seeing better economic performance.
Business models and market incentives shape what is possible. As we’ve shared before, flourishing is fundamentally relational. Students need high-quality relationships with each other and with their teachers. As Julia Freeland Fisher has warned in other venues, AI “disrupts help” and the very social connections that help students experience the “good life” and get access to jobs for “good livelihoods.” Students are using Gen AI instead of the study group, and that has social and economic consequences for their futures. This didn’t begin with AI. The business model of higher education isn’t designed for relationality, and what students need is not the same as what students demand. Relationships matter more than students, teachers, and administrators value them, and they aren’t valued in current business models either. But they could be.
The Inequality Stakes
What’s at risk for our society? AI can be used either to support or to harm learning depending on a student’s judgment and foundational capabilities, with stronger students benefiting more and weaker students at risk of offloading cognitive effort in ways that undermine long-term learning. At the elementary level, there seems to be a trend of poorer students spending hours on screens talking to AI chatbots while richer students spend their time in screen-free Montessori or Waldorf schools with human teachers. Ironically, the very funders of ed-tech for poor schools often send their own kids to the screen-free schools.
Rich, elite universities could use their resources to attract students with stronger capabilities and advance a more humanistic, higher-cost approach to learning, with small groups, personalized engagement, and advanced access to AI models specially designed for learning.
Poor universities could continue with existing, mechanistic approaches that are no longer fit for the age of AI, leaving their students to free or cheap versions of Gen AI models that are not designed for learning. A widening gap in both human capabilities and experiences of the “good life” and “good livelihoods” could accelerate even further.
Over the last two decades, we’ve seen how widening wealth and income gaps have eroded trust in one another and in our shared institutions, especially higher education institutions. This is socially unsustainable, and ironically, it is also a drag on economic growth.
What We Are Working on Together
With every Noēsis event, we aim for both shared understanding and a clear path for collective action. This moment demands both. Here's what we believe needs to change.
Noēsis Collaborative Event on AI & the Future of Higher Education; Photo Credit: Rob Hill, March 2026
To help all students pursue the “good life” in the age of AI, we need to redesign the first-year student experience to include space and resources to think about ENDS (meaningful activities that are intrinsically valuable, that make life worth living), not just MEANS (such as instrumental skills and credentialing). This could include rediscovering the tutorial, a membership-based model of small-group learning in dialogue with a human professor, focused on difficult texts in a low-risk, ungraded context. The costs of this more humane approach could be offset by the use of AI in other parts of the institution. The goal of first-year experiences and small-group learning is to form students as human beings before credentialing them as workers.
To help all students achieve good livelihoods in the age of AI, we need to design curricula around complex, real-world problems (climate change, poverty, etc.) that develop students’ metacognitive and ethical muscles in tandem with specially designed AI usage and experimentation that challenges rather than shortcuts students’ critical thinking. Within this problem-solving approach, we need to provide opportunities for experiential, work-integrated learning that develops practical wisdom and ethical prowess in the context of using AI in real-world scenarios. We can use these opportunities to build the leadership and relational skills necessary for more humane workplace cultures. Work-integrated learning may also offer a pathway to reimagine the business model of higher education itself: students could gain practical wisdom and learn the practices of their disciplines, while institutions gain sustainable funding through meaningful paid partnerships with employers.
Many of these ideas are not new; some are ancient. But whether these ideas are new or not isn’t the question. The question is whether universities will use this moment to finally do what they were always meant to do: form human beings, not just produce capable workers.
My engineer friend in Paris is already living the realities of AI’s transformation of human work. At the bleeding edge of AI, he is rediscovering the importance of vision, judgment, and the irreducibly human qualities that no AI model can replicate. Higher education has a choice: wait until AI forces that reckoning upon its graduates, or lead it. The good life and good livelihoods of an entire generation depend on which path we choose.