We’ve all heard the buzz: artificial intelligence is revolutionizing how we work, make decisions, and drive results. But in the rush to automate and optimize, one critical factor often gets overlooked—the human side of this change.
In our latest episode of Reroute Reflections, we explore this head-on. How do we bring AI into our organizations without losing accountability, empathy, or the competitive advantage that made our business successful in the first place? More importantly, how do we use it to support and empower our people—instead of focusing on how it can displace them?
AI as a Thought Partner
AI can synthesize data, generate options, and support decision-making at scale—but it can’t understand the nuances that each individual in a room or organization brings with them as they navigate change. In other words, AI can very easily handle complicated problems—not the complex ones.
I’ve watched similar missteps firsthand. During the digital transformation era, the pattern was identical: different technology, same mistake. Organizations underestimated the role people play in future-state strategy and paid the price. Failed implementations eroded morale, damaged reputations, and ultimately cost more than they saved.
To be clear, AI can replace some jobs, and we’ve seen it done successfully. Walmart’s deployment of autonomous forklifts is a prime example: in 2024, Walmart began replacing human-operated forklifts with AI-powered models across its distribution centers. The result was a clear win for the firm: measurable productivity gains, lower labor costs, faster operations, fewer physical demands on employees, and better throughput overall.
But the difference lies in the type of work. Applying the same framework Walmart used to automate pallet unloading to knowledge work—like software development or customer service—is flawed. Knowledge work is fundamentally different.
In warehouse environments, workflows are linear and tasks are clearly defined—making them ideal for automation. But knowledge work is messy, interpretive, and deeply contextual. It requires emotional intelligence, creative problem-solving, and human judgment.
When we treat these different work types the same, we repeat the same mistakes. The result? Poor adoption, resistance, morale hits, and costly re-hiring efforts.
In early 2025, major tech companies accelerated layoffs citing AI-driven efficiency gains. But as this NBC News article highlights, these decisions sparked widespread employee uncertainty and raised serious questions about long-term sustainability versus short-term optics.
Similarly, Klarna made headlines when it reversed course on AI customer service, quietly rehiring human agents after its AI pivot damaged customer experience and internal morale. It’s a cautionary tale: without empathy, context, and trust-building, even the best AI tools can backfire.
What do these examples have in common? They’re missing a key piece to the puzzle: humans eat frameworks for breakfast. When you’re dealing with increasingly complex change, you need an equally complex solution.
The Flawed Math of AI-Driven Cuts
And another thing—how are we coming up with these reduction figures?
In the episode, we called out a recurring but deeply flawed practice: the belief that if AI can eliminate, let’s say, 5% of a task—and 100 people perform that task—then 5 jobs can be eliminated.
Let’s pause here.
In case you need to hear it: if you take away 5% of someone’s job through automation, AI, or any other tool—and 100 people are affected—that does not mean you can cut 5 people. That’s like saying if you take a pinky toe from 100 people, you can build 5 complete human beings. It sounds ridiculous because it is.
And yet, this is the kind of math far too many organizations are using to justify layoffs. Not only is it reductive, it’s reckless.
Where are these numbers coming from? Often, they originate from vague benchmarking or surface-level time-tracking exercises. The problem is, they assume all tasks are standalone and all capacity is interchangeable. They overlook context, collaboration, and interdependencies—and they ignore the time, training, and cultural alignment required to shift work meaningfully.
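The gap between that spreadsheet logic and reality can be sketched in a few lines. The figures below are the hypothetical 5%-of-a-task, 100-person example from above, not data from any real organization:

```python
# A sketch of the naive "fractional savings = whole jobs" math,
# contrasted with what the savings actually look like on the ground.
# All numbers are illustrative, taken from the 5% / 100-person example.

HOURS_PER_WEEK = 40

def naive_headcount_cut(people: int, task_share_automated: float) -> int:
    """The flawed spreadsheet logic: sum fractional savings into whole roles."""
    return round(people * task_share_automated)

def freed_hours_per_person(task_share_automated: float) -> float:
    """What each individual actually gets back: a sliver of their own week."""
    return HOURS_PER_WEEK * task_share_automated

people = 100
share = 0.05  # AI automates 5% of one task each person performs

print(f"Jobs 'eliminated' on paper: {naive_headcount_cut(people, share)}")
print(f"Hours freed per person:     {freed_hours_per_person(share):.1f}/week")
```

The first number says five roles can go; the second shows the savings are really about two hours a week scattered across a hundred different calendars. Those slivers cannot be consolidated into five full-time roles without the deliberate role redesign the rest of this piece argues for.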
These aren’t just bad calculations. They’re lazy shortcuts that devalue human contribution.
If you truly want to realize the benefits of efficiency, it requires intentional role redesign, a shift in team dynamics, clear change support, and investment in upskilling.
I get it: it makes sense on paper. But how often does what makes sense on paper actually work in practice? This case is no different.
Why? Because that 5% is usually embedded in complex, cross-functional roles. You can’t extract and consolidate that work cleanly without significantly restructuring jobs, responsibilities, and even team dynamics. Yet organizations often skip that step, opting to cut headcount prematurely and hope the math holds.
People in organizations form a system. Remove one part of a machine, however small, and the system starts to falter. It might not break immediately, but it slowly deteriorates until something important fails. That pinky toe? It was connected to muscle memory, balance, and coordination. That “5% task”? It was holding together a client experience, a compliance process, or a team rhythm. Pull it out without a plan and you’ll be left dealing with the fallout.
Real transformation requires more than an Excel formula. It demands a rethinking of how work gets done, who does it, and what support and upskilling are needed to sustain that shift.
When organizations fail to do this—and instead treat AI like a plug-and-play app—they’re not just making a tactical error. They’re making a careless one. It reveals a mindset that views people as interchangeable units, rather than integral parts of a system. The fallout isn’t theoretical—it’s real, and we’re already seeing it play out in botched implementations, reputation hits, and employee disengagement.
As we’ve seen with companies like Klarna, the reputational and operational costs of moving too fast can be substantial. And the worst part? These outcomes were predictable. Because they ignored the most basic truth in organizational design: if you change one part of the system, you change the whole thing.

