The AI Paradox: Why Your Smartest Investment Might Be Breaking Your People

I keep having the same conversation. A Head of HR sits across from me, tells me about the new AI tools rolling out across the organisation, how leadership is buzzing about efficiency gains and productivity dashboards. Then their voice drops. "But Annie, the team is not okay."

They're right.

We were sold a story that AI would free knowledge workers from the mundane. Admin, scheduling, first drafts, data crunching. And in some ways it has. But zoom out from the dashboards and look at the actual humans. There's something off. A jittery energy. Forced optimism covering over something nobody wants to name.

Our tools are getting smarter and our people are getting more fragile. That's the bit nobody in leadership wants to sit with. So let's sit with it.

We've been here before. Sort of.

Every generation panics about new technology. When electricity was installed in the White House in 1891, President Benjamin Harrison refused to touch the light switches. People thought telephones would invite spirits into their homes. The term "artificial intelligence" was coined in 1956, and by the early 1960s machine-learning programs were beating skilled human players at checkers.

The panic is familiar. What's different this time is the speed, and the fact that it's landing squarely on knowledge workers. The people who built their careers on thinking, analysing, advising. The ones who assumed the robots would come for the factory floor first.

They didn't. And unlike previous waves of automation, this one doesn't just change what people do. It changes how people feel about whether they're still needed. That shift in identity is what makes this so destabilising, and it's showing up clearly in the data.

What's actually happening in your building

The APA's 2023 Work in America Survey found 38% of workers worry AI will make their job duties obsolete. If you manage a team of ten, that's roughly four people sitting in meetings wondering whether their role will exist next year. They're not telling you. They're just quietly disengaging.

And the worry isn't just in their heads. Of those anxious about AI, 64% feel tense or stressed during the workday, compared to 38% of those who aren't worried. 37% report emotional exhaustion. 51% say work is negatively impacting their mental health, nearly double the rate of people who aren't carrying this weight. These aren't people who are struggling because they can't keep up with the tech. They're struggling because no one has told them what their role looks like on the other side of it.

Then there's the monitoring piece, which makes all of this worse. Half of all workers know their employer uses surveillance technology. Among those being watched, 56% feel stressed and 39% are emotionally exhausted. They're also almost twice as likely to be job hunting. Think about what that means in practice. We rolled out the AI to make work better, then bolted on tracking tools that make people feel like they're under performance review every hour of the day. And we're surprised they want to leave.

Gallup's 2025 data tells the bigger story. Global employee engagement has dropped to 21%, the first fall since the pandemic. Two-thirds of workers report at least one burnout symptom in the past month. And from the APA data, workers worried about AI are nearly twice as likely to want out within the year: 46% compared to 25%.

This isn't a blip. It's an emerging pattern, and we need to take notice of it.

Why this keeps going wrong

Here's where most organisations misread the situation. They see the stress, the disengagement, the turnover, and they assume it's a skills gap. So they commission an upskilling programme. Maybe a lunch-and-learn. Maybe a Slack channel called #ai-tips. And then they're confused when nothing changes.

It doesn't change because the problem isn't technical. It's actually a biological challenge, one that we see time and time again during significant change.

When someone feels threatened by AI or change, their brain does what it's done for 2.5 million years. It defaults to survival mode. The limbic system takes over. The frontal lobe, the part that handles reasoning, perspective, complex thinking, goes quiet. That's how human brains are wired. And if your organisation's response to AI adoption is "here's the new tool, get on with it," you are triggering exactly that response in your people. You're asking them to learn while their brain is in fight-or-flight.

The research on neuroplasticity tells us brains can rewire at any age. A 55-year-old with curiosity and openness will outlearn a 30-year-old who's rigid and defensive, every time. The biggest predictor of whether someone adapts to AI isn't their age or their technical ability. It's personality. Which means the real job isn't to train people on the tools. It's to create the conditions where their brains are actually capable of learning, and that starts with psychological safety, not a login to the latest platform.

The AI Goo problem

There's another layer to this that nobody in leadership is naming: the quality of the work is getting worse.

I hear more and more stories about people submitting AI output they can't explain. Pages and pages that look polished until someone has to actually engage with the content. Reports that sound authoritative but say nothing specific. Someone's "efficient" AI use quietly becoming someone else's extra work, because now a manager has to read through it, figure out what's real, and redo the thinking the person skipped.

We keep talking about productivity gains, but in reality we are wading through AI goo, and it has significant implications for both individual work and a team's relationships.

The organisations getting this right have figured out the difference between using AI as a thinking aid and using it as a thinking replacement. It's the difference between a dishwasher and a microwave dinner. One frees up your time by handling a chore so you can spend more of it cooking something better. The other replaces the cooking entirely: it's a fast meal, and everyone can taste it.

AI should be handling the drudgery so humans have more capacity for judgement, context, and the work that actually requires a brain. When it starts replacing the thinking instead of supporting it, you start to see a quality problem emerging. For some organisations this hasn't shown up in the metrics yet.

What actually needs to change

I use AI every day. I'm not anti-technology. But I am going to say what a lot of HR professionals are thinking: we are rolling out tools that are measurably increasing stress, anxiety, and turnover, and the organisational response so far has been an upskilling programme and a poster about wellbeing in the kitchen.

Here's what I'd want any HR leader reading this to sit with. The data is telling us that AI adoption, done badly, is a mental health issue. Not a future risk. A current one. And "done badly" doesn't mean your tools are wrong. It means your people don't know what their job is anymore, nobody has created the space for them to say that, and the organisation is measuring output without looking at what it's costing the humans producing it.

If you want this to go differently, three things need to happen. First, role clarity. People need to know what their job looks like now that AI is part of it. Not "your role is evolving" corporate speak. Actual clarity about what's expected, what's changed, and what still requires their brain. Second, learning that meets people where they are. Not a one-size-fits-all programme, but permission to be slow, to ask questions, to not know. That means leaders who model that behaviour too, who show up with vulnerability and the curiosity to learn and explore. Third, honest measurement. If you're tracking productivity gains from AI without also tracking stress, engagement, and turnover in the same teams, you're only reading half the data. The half that tells you what you want to hear.

The efficiency gains leadership is celebrating have a cost. It's sitting in your engagement surveys, in the undiagnosed burnout, and in the quiet resignation of people who used to care about their work.

This is my rally cry. Stop treating this as a technology project. Start treating it as what it actually is: a leadership capability shift. And if there's one profession that knows how to lead people through uncomfortable change, it's ours. This is the moment HR was built for. We just need to step into it.

Sources:

  • APA (2023), Work in America Survey: Artificial Intelligence, Monitoring, and Worker Well-being (2,515 employed adults)

  • Gallup (2025), State of the Global Workplace Report (global engagement at 21%, burnout indicators)
