They’re Not Just Students. They’re the Class of AGI.
AI isn't killing critical thinking, but the way we talk about it might be. Here's what to do instead.
Over the past few weeks, headlines have once again reignited fears about students using AI to cheat. These narratives are not new, but they reveal a deeper and more persistent concern, one rooted not in technology itself but in our assumptions about young people, learning, and trust.
Beneath the headlines claiming that “AI is killing critical thinking,” “AI is destroying human connection,” and “AI is replacing teachers,” what we are truly struggling to confront is the erosion of trust and the breakdown of our relationships with one another.
There is a growing tendency to interpret students’ use of AI tools as a sign of moral failure rather than as an indication that they are navigating a rapidly changing world without the guidance they need. Few schools are offering meaningful instruction on how to engage with AI responsibly. Fewer still are inviting students to help shape what responsible use might look like.
While I was writing this post, this New York Times article by Aneesh Raman, Chief Economic Opportunity Officer at LinkedIn, “I’m a LinkedIn Executive and I See the Bottom Rung of the Career Ladder Breaking,” came up on my feed.
The surveys are not wrong when they report that anxiety about the future is a top concern for young people:
The unemployment rate for college grads has risen 30 percent since September 2022, compared with about 18 percent for all workers. And while LinkedIn’s Workforce Confidence Index, a measure of job and career confidence across nearly 500,000 professionals, is hitting new lows amid general uncertainty, members of Generation Z are more pessimistic about their futures than any other age group out there.
However, there is one line in here that should motivate and inspire us all:
In our survey, executives on LinkedIn still believe that entry-level employees bring fresh ideas and new thinking that is valuable to their businesses.
I keep wondering what that moment will be…the moment when we finally realize that everything has already changed.
When we stop talking about cheating and start talking about creative confidence.
When we stop policing the use of AI and start preparing students to use it with intention.
When we stop asking how to preserve the past and start designing for the future.
As I often share, this moment is not simply a technology challenge, one where we just have to upskill everyone. This moment is a human challenge, where we have to decide if we have the courage to lead with trust, not fear.
So maybe it’s time to stop asking if students are doing it right, and start asking if we are.
What Students Are Telling Us About AI
This year’s graduating seniors will be the third class to enter a workforce where AI is embedded in nearly every industry, from healthcare and finance to education and design. And yet, many schools remain focused on preserving outdated practices, struggling to articulate a clear vision for what learning could look like in an AI-integrated world. Rather than preparing students for what lies ahead, we risk holding them back with systems designed for a world that no longer exists.
One of my favorite activities recently has been partnering with schools to lead what I call an AGI Learning Walk. It’s a simple but powerful experience designed to create cultures of innovation that begin with empathy.
Rather than starting with tools or training, we begin by stepping back and asking a foundational question:
Are we designing for what’s next, or maintaining what’s always been?
During an AGI Learning Walk, educators observe just a few classrooms, not to evaluate, but to notice. What skills are students practicing? How are they thinking, collaborating, using technology, or navigating ambiguity?
At Antioch Community High School, here is what we saw as soon as we walked in: beautiful reminders and quotes about the importance of responsibility and trying new things. These are all signals we can use, language we can build on as we help young people make ethical choices that balance innovation and integrity.


Then, through a single empathy interview with a student or a teacher, you start to uncover the signals beneath the surface. You’re not looking for perfection. You’re looking for patterns, gaps, and opportunities.
Because clarity doesn’t come from guessing. It comes from paying attention. It’s in moments like these that the work you have done in years prior gives you a foundation to build on. Community District High School 117 has an incredible learner profile, one with all the language they need to design AI guidance and a strategy to navigate what’s next with their community.
If you’d like to try it with your team, I’ve created a free guide you can use to get started here.
How do we move our faculty into the 21st century?
As I get ready to work with a new high school on designing their AI strategy, I’m reminded of a moment in 2014 when I was Director of Innovative Learning at USC. At the time, I was hired to answer this familiar business question: How do we move our faculty into the 21st century?
It made sense on paper. But like many business questions, it led us toward pockets of excellence, not systemic change.
One of the reasons I love design thinking is that it begins with empathy. Taking this approach led us to ask human questions, not business questions. One day while observing a class, I noticed students downloading PowerPoints from the LMS and taking notes in the presenter section, listening for what was not on the slide. A terrible way to take notes! So I asked: “Has no one ever taught you how to take notes on your devices?” The response was unanimous: no.
That one conversation quickly led us to another discovery: the faculty were not the ones resisting change; the students were. But they were not resisting the technology because they didn’t like it. They were overwhelmed. They didn’t want to learn new content and new tools at the same time. That single conversation led us to create a new orientation before school began. As a faculty, we decided on a core set of tools that would be used across all classes, and we taught students how to use them before classes started. At the end of the orientation, one student summed it up best:
“I used to think that with all the technology available, more people would know how to use it in the classroom. I was surprised to see they did not, even if they were fluent with it in other areas of their lives.”
That experience taught me the power of reframing.
When we moved from “How do we implement?” to “How might we make this meaningful?”, everything changed.
Not only did this make students happier, it led to remarkable results.
And that’s exactly where we are again with AI.
As I read through the surveys for my workshop this week, the patterns are strikingly similar. Students are experimenting with tools like ChatGPT and Gemini on their own, often without any guidance or context. Most said their teachers hadn’t provided instruction on responsible use. Some weren’t even sure what counted as cheating. Many wanted to learn more, but didn’t know where to start.
They asked thoughtful, grounded questions:
How does AI actually work?
How can it help me without doing the work for me?
What does “responsible use” really look like?
And once again, the message was clear:
They’re figuring it out alone.
We cannot keep mistaking personal familiarity with technology for intentional, supported learning. If we keep asking business questions (“How do we get teachers on board?” “How do we stop students from cheating?”), we will keep designing systems of control.
But if we ask human questions (“How might we help young people build agency, confidence, and curiosity in a world shaped by AI?”), we can design systems of possibility.
Designing with, not for, the Class of AGI
The temptation with any new technology, especially one as powerful and fast-moving as AI, is to move straight to answers. Policies. Guardrails. Guidelines.
But if there’s one thing design thinking teaches us, it’s that meaningful solutions begin with empathy. Not assumptions. Not panic. Not pressure from headlines.
That’s why, when I work with schools on their AI strategy, we begin not with tools, but with people. We ask:
What are students actually doing with AI?
How do they feel about it?
Where do they feel confident and where do they feel left out?
This week, I sat with a group of students and asked them to walk me through their real experiences. What emerged wasn’t a story about cheating. It was a story about curiosity, confusion, and a deep desire to be prepared for the future.
One student said,
“I use it sometimes to rephrase sentences or help me organize my ideas. But I don’t know if I’m allowed to.”
Another shared,
“It helps with studying, but I still have questions. I don’t want to get in trouble, so I don’t talk about it.”
Again and again, students described a kind of quiet experimentation done in isolation, without context, without conversation.
That’s the risk. Not that AI will replace critical thinking, but that our silence will.
Empathy interviews are one of the most powerful tools we have to break that silence. When we invite students into the process, not just as users but as co-designers, we uncover insights that policies alone can’t reach.
Because if we want to build an AI strategy that works, we have to design it with the people it’s meant to serve.
And if we want to prepare the Class of AGI, we can’t do that by handing them rules. We have to hand them agency. Download the guide and do an AGI Learning Walk on your campus with your team.