Five muscles every leader needs to build in the age of AI

There's no shortage of AI rollout plans in most organisations right now. There are tool selections, platform decisions, use-case maps, and governance frameworks. What there is a shortage of is the human capability to actually lead through it.

The gap in most AI strategies isn't the technology. It's leaders. And before I get some angry emails, let me clarify that I am not saying leaders aren't smart enough or committed enough. What I am saying is that this era demands a different set of muscles, distinct from the ones most of us were trained on. We were trained to optimise, control, and scale repeatable systems. AI doesn't respond to any of that. It responds to judgment, curiosity, and the ability to hold uncertainty without flinching.

I've been in the room with leadership teams navigating this next wave, which is, quite frankly, the next industrial revolution. The one none of you signed up for, yet here you are. What I keep seeing is the same gap, and it's not the tech. Below are the five muscles that separate leaders who are navigating AI well from those who are quietly falling behind. None of them are about the technology. All of them are about what you bring to it.

1. Systems Thinking

You can't fix what you can't see.

Most organisations are rolling AI into processes without stepping back to look at how those processes connect to everything else. The result is localised efficiency gains that create bottlenecks somewhere downstream in another team, another system, or in the people who now have to absorb the knock-on effects of a change nobody mapped.

Every decision in an AI rollout has a ripple effect across people, systems, and processes. Automate one step and you change the workload, the data flow, and the expectations on every step around it. Leaders who act in silos deploying AI into their function without understanding the wider system aren't solving problems. They're just moving work around or relocating problems.

The muscle to build here is seeing connections, not silos. Mapping ripple effects before acting. Asking "what happens to the rest of the system when we change this part?" before optimising anything. AI amplifies whatever it's pointed at, including broken processes. If you're automating a broken workflow, you're just creating a faster mess.

2. Critical Thinking

AI isn't replacing thinking (well, at least we hope not). It's actually demanding better thinking.

This is the muscle most leaders assume they already have. But working with AI requires a specific kind of critical thinking that many organisations haven't developed. The ability to interrogate outputs that sound confident, polished, and authoritative, but may be wrong, biased, or incomplete.

The default response to AI in most teams is acceptance. The output looks good, it reads well, and it arrived in seconds. Questioning it feels slow and counterproductive. But the research is clear: when people don't challenge AI outputs, performance drops, bias transfers, and the quality of decisions degrades, often without anyone noticing.

Leaders need to develop this muscle in themselves first. That means learning to spot bias and lazy logic in data and decisions. It means habitually asking "what's missing?" and "what's next?" when reviewing AI-assisted work. And then it means coaching their teams to do the same so that you are building a culture where challenging outputs is the norm, not the exception.

The leaders who will always be a step ahead are the ones who catch the "oops" before it becomes a problem. That makes critical thinking a daily practice.

3. Adaptive Learning

Change won't wait for a training calendar, your LMS to play catch-up, or a set of polished slides.

The pace of AI development means that by the time most organisations have built a formal training programme, the tools and capabilities have already moved on. The old model of "design, deliver, assess, repeat" assumes a stable technology environment. AI isn't stable. It's constantly shifting.

The muscle here isn't about training people faster. It's about building an environment where people can learn, test, and adjust on repeat. Where the default isn't "wait for the course" but "try it, learn from it, share what you found." That requires psychological safety so that people know they can experiment with AI without being judged for getting it wrong.

Leaders who build this muscle focus on three things. Teaching people how to learn and evolve alongside technology that's always changing. Creating space for curiosity over fear, making it safe to explore, ask questions, and admit when something doesn't make sense. And turning small wins into trust currency. The teams that adopt AI well aren't the ones with slick training modules, they're actually the ones where early experiments went well and people told each other about it.

4. Digital Curiosity

Innovation isn't about chasing shiny things. It's about solving real problems with purpose and intention.

There's a version of digital curiosity that leaders need to resist: the breathless, tool-of-the-week, "have you tried this new AI?" energy that creates noise without direction. Let's be clear, this is not curiosity. It's distraction and noise on a hype curve with a steep cliff we are about to fall off.

The muscle to build is purposeful curiosity. Asking "why this?" before "how fast?" Testing with intention and keeping line of sight to the whole system, not just the task in front of you. You don't need to experiment with everything. You just need to explore what actually moves the needle and stay on task with the problem you are actually trying to solve. Don't be fooled by Chatty's sparkly "would you like me to create a…"

This means teaching curiosity and a growth mindset across your team as a deliberate capability, not a personality trait. Reducing the mistrust and fear that makes people avoid AI rather than engage with it. And making space for the kind of exploration that starts with a problem worth solving, not a tool looking for one.

The best AI use cases in most organisations weren't discovered in a strategy document. They were found by curious people who had permission to explore, fail fast and learn without fear.

5. Anchor Leadership

In every wave of change, there's one system that keeps it all together. It's the human operating system. And the person responsible for it is the leader.

This is the muscle that underpins all the others. Systems thinking, critical thinking, adaptive learning, and digital curiosity are capabilities a leader builds in their team. Anchor leadership is the capability a leader builds in themselves.

Leaders who forget the human operating system end up with shiny tech and broken trust. Teams that don't feel seen, heard, or supported through a change this fundamental won't adopt it, no matter how good the tools and comms are. They'll comply on the surface and disengage underneath. And disengagement in an AI-augmented environment doesn't just reduce productivity, it creates higher risk.

Your role as a leader right now isn't to keep control. It's to keep connection. That means listening before leading. Translating change into human language, not jargon, not corporate comms, but honest acknowledgment of what's hard and what's uncertain. And balancing performance with empathy and vulnerability, so that the people being asked to adopt this technology are supported through the psychological weight of it.

Connection is your number one currency. Everything else, the strategy, the tools, the roadmap, runs on it.

The Real Investment

Organisations are spending heavily on AI platforms, infrastructure, and use-case development. Most of that investment will underperform. Not because the technology doesn't work, but because the leaders deploying it haven't built the muscles to lead through it for the long game.

These five muscles aren't a leadership model for the distant future. They're the operating system for right now. The technology is the easy part. The messy human side, the thinking, the judgment, the trust, the connection: that's actually what your strategy should be about.

And it all starts with the leader.
