What Executives Need to Unlearn Before They Can Lead with AI
Upskilling is getting all the attention. But the harder and more important work for senior leaders is unlearning — letting go of the mental models that no longer serve.
Every conversation about executive AI development focuses on what leaders need to learn: new tools, new frameworks, new vocabularies. The upskilling industry is booming.
But the more important — and significantly harder — work is unlearning.
Unlearning is not forgetting. It's the deliberate process of recognizing that a mental model, habit, or assumption that served you well in one environment is creating drag in a new one. It requires honesty about what you're holding onto and why.
Here are the seven most consequential things senior leaders need to unlearn to lead effectively in the age of AI.
1. "Success Is Scaling What Works"
The traditional leadership playbook is built on finding what works and expanding it. Identify best practices, standardize processes, and scale the winning formula across the organization.
AI breaks this model in two ways. First, "what works" changes faster than scaling cycles. A workflow that's optimal today may be obsolete in six months. Second, AI enables a fundamentally different success model: rapid iteration and learning at speed, rather than patient optimization of a known approach.
Unlearn toward: "Success is the speed at which we learn when things break."
2. "Humans Tell Machines What to Do"
For most of computing history, technology was a tool that executed human instructions precisely. You defined the input, the machine produced the output. Control was absolute.
Agentic AI systems don't work this way. They reason, make choices, and produce outputs that their operators didn't specifically instruct. Leading in this environment requires shifting from scripting to context-setting — defining goals, constraints, and values rather than step-by-step procedures.
Unlearn toward: "Humans and AI are intelligence partners — each contributing what the other cannot."
3. "IT Owns AI"
Treating AI as a technology initiative — something that belongs in the IT department, with a technical budget and a technical roadmap — is one of the most expensive mistakes senior leaders are making.
AI is a core leadership capability. The CEO who delegates their organization's AI thesis to a CTO and disengages personally will find, in two years, that they can no longer evaluate what they've built or direct where it should go.
Unlearn toward: "AI strategy is owned at the CEO level. Technical implementation is owned by IT."
4. "Data Is an Output"
Most leaders think about data as something their organization produces — reports, dashboards, analytics that inform decisions. This is true, but it's the smaller part of data's role in an AI-enabled organization.
Data is increasingly the infrastructure that powers AI agents. The quality, structure, and accessibility of your organization's data determine the ceiling on what AI can do for you. Leaders who don't understand this will consistently underestimate the gap between their AI ambitions and their actual capability.
Unlearn toward: "Data is the foundational infrastructure. Context engineering — making our knowledge AI-ready — is a strategic priority."
5. "Culture Is the Soft Stuff"
Culture has traditionally been treated as important but secondary — the values wall and the team offsite, the "how we do things around here" that makes the company pleasant but isn't where the real work happens.
In an AI-enabled organization, culture is your training model. The behaviors that leaders reward, the questions they encourage, the failures they normalize — all of these directly shape how your organization learns and adapts. An organization with a culture that punishes failure will not effectively adopt AI, because AI adoption requires rapid experimentation, which requires tolerance for things not working the first time.
Unlearn toward: "Culture that rewards learning over predictability is a competitive advantage, not a soft benefit."
6. "Transformation Is About Tools"
Most enterprise AI initiatives are designed around tool deployment: choose the platforms, roll them out, measure adoption. This is necessary but insufficient.
Research consistently shows that 70% of the value from AI transformation comes from changes in how people work — their processes, mental models, and collaboration patterns — not from the tools themselves. Organizations that buy tools and call it transformation consistently underperform those that invest equally in the human side of the change.
Unlearn toward: "Technology is 30% of transformation. People and workflow redesign are the other 70%."
7. "Expertise Means Having Answers"
The highest-value leaders used to be the ones who knew the most. Their expertise was demonstrated by the quality and speed of their answers.
In the age of AI, the highest-value leaders are the ones who ask the best questions. The ability to frame a problem precisely, to identify what information matters and what doesn't, to recognize when an AI-generated answer is missing something important — these are the skills that create genuine competitive advantage. They're also, not coincidentally, the skills that decades of senior leadership experience are uniquely positioned to develop.
Unlearn toward: "Expertise is demonstrated by the quality of inquiry, not the speed of answers."
The common thread through all seven of these is a shift from control and certainty toward stewardship and judgment. The leaders who make this shift will find that AI amplifies their contribution rather than diminishing it. The ones who don't will find themselves increasingly sidelined by organizations that have moved on.
More on Unlearning to Lead
- From 'Smartest Person in the Room' to 'Best Question Asker in the Room'
For decades, executive authority was built on having the best answers. AI changes the game. Now it's built on asking the questions that reveal what the AI missed.
- The Control Trap: Why the Skills That Made You a Great Leader Are Now Holding You Back
The habits that built your career — decisive action, clear direction, tight control — can become liabilities in an AI-enabled organization. Here's what to unlearn.
- Why AI Adoption Starts with Leadership Behavior, Not Technology
Most AI adoption programs focus on tools, platforms, and training curricula. The ones that work focus on what leadership does differently first.
Work with CoCreate on executive AI leadership
Workshops, advisory, and facilitation for leadership teams — built on the same methods we use with design orgs at enterprise scale.