Overview
AI Is Everywhere Except in the Results
A recent NBER study surveyed nearly 6,000 executives across four countries. 69% of their firms are actively using AI. Over 80% report no measurable impact on productivity, and over 90% report no impact on employment over the past three years.
And yet, your news feed is full of confident predictions that all white-collar work will be unrecognizable within a year.
Those two things aren't at odds. Both are true.
AI works at the task level, but it isn't moving the organizational needle. Understanding why is the difference between a deployment that transforms and one that just burns timelines and budget.
Solow's Paradox, Round Two
In 1987, Nobel economist Robert Solow wrote: "You can see the computer age everywhere but in the productivity statistics."
By that point, companies had spent two decades investing heavily in mainframes, PCs, and networking equipment. The expected productivity gains were invisible in the aggregate data. U.S. productivity growth had actually declined, from 2.9% before 1973 to 1.1% after, with the steepest drops in the sectors investing most aggressively in the new technology.
The productivity gains from the computer era did eventually materialize, but not until the mid-1990s. That's nearly twenty years of heavy capital investment before the macro numbers moved.
The delay had a specific cause. Companies that put a PC on every desk and kept doing the same work the same way saw nothing. Companies that redesigned their processes, restructured their teams, and rethought how decisions got made saw massive returns. The technology made the gains possible; the organizational redesign is what delivered them.
That is precisely the dynamic playing out with AI today.
Optimizing the Wrong Step
In 1984, Eli Goldratt published a business novel called The Goal that introduced what he called the Theory of Constraints. The central idea is that every system has exactly one bottleneck. The throughput of the entire system is determined by the throughput of that bottleneck. Nothing else matters until you address it.
The corollary is what most organizations are currently ignoring: when you optimize a step that is not the bottleneck, you don't get a faster system. You get a more congested one. You pile inventory in front of the constraint and call the pile progress.
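Goldratt's logic can be sketched in a few lines. This is a toy model, not anything from The Goal itself: it assumes a simple serial pipeline, and the stage names and capacities are invented for illustration.

```python
# Toy serial pipeline: each stage has a capacity in units per hour.
# System throughput is capped by the slowest stage -- the constraint.

def throughput(capacities):
    """Units per hour the whole pipeline can sustain."""
    return min(capacities.values())

baseline = {"draft": 10, "review": 4, "approve": 8}
print(throughput(baseline))       # 4 -- "review" is the bottleneck

# Double a non-bottleneck stage: system throughput doesn't move,
# work-in-progress just piles up in front of "review".
faster_draft = {**baseline, "draft": 20}
print(throughput(faster_draft))   # still 4

# Improve the constraint itself: the whole system speeds up.
faster_review = {**baseline, "review": 8}
print(throughput(faster_review))  # 8
```

Doubling "draft" capacity changes nothing downstream; only raising the constraint's capacity raises the system's.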
Knowledge work organizations run on information and decisions, not widgets. If your constraint is the quality of decisions getting made, or the speed at which the right information reaches the right people, or the clarity of what problem you're actually trying to solve — then AI tools that make execution faster don't move the needle. They accelerate work flowing into a bottleneck that hasn't changed.
This is what the productivity data is showing. Most organizations have deployed AI where deployment was easiest: execution. Writing faster, summarizing faster, generating faster. The NBER data reflects the pattern: across nearly 6,000 firms, the most common use of AI was text generation with large language models. But execution was not the constraint. The work piles up in front of the same bottlenecks it always did, only now there's more of it.
What the Bottleneck Actually Is
The question most executive teams need to be asking is simple: "Where does value actually stop moving in this organization?"
The answers are rarely about execution speed. They're about:
- Decision latency — how long it takes for the organization to act on information it already has.
- Information asymmetry — where the people closest to the problem lack the context to make good calls, and the people with the context are too far from the problem to apply it.
- Strategic ambiguity — where teams are executing at high speed toward objectives that haven't been clearly defined or haven't been revisited since conditions changed.
These are the constraints that determine organizational throughput. AI tools don't address any of them by default. Deployed without that understanding, they make the symptoms worse. More output, same bottlenecks, more noise between the leadership team and the signal they need.
Try this diagnostic. Imagine you suddenly had 10,000 people added to your organization tomorrow. What would you do with them? Most leadership teams struggle to give a specific answer. That's the point. If capacity isn't the constraint, adding more of it doesn't help — it just creates more work flowing into the same bottlenecks.
Where the Gains Are Actually Coming From
The fact that AI isn't showing up in the macro data isn't proof that AI doesn't deliver. Sector-level data tells a different story.
Kansas City Fed research shows real productivity gains at the industry level. MIT studies document 20-40% improvements in targeted applications. The gains are real, and the pattern behind them is consistent: they concentrate in organizations that identified an actual constraint and redesigned work around removing it, not organizations that purchased AI tools and applied them to existing processes.
This matches the computer era exactly. The macro data lagged for nearly two decades. Disciplined organizations were pulling ahead the whole time. By the time the aggregate numbers caught up in the mid-90s, the competitive gaps were already set.
The organizations that get this right in the current cycle will share a characteristic: they asked what was actually limiting their throughput before they decided where to deploy the technology. Some of those constraints are in execution. Many more are in decision-making, information flow, and organizational structure. That's the work that produces real gains rather than faster activity reports.
What This Requires
The aggregate data describes a gap between deploying AI and capturing value from it. That gap is a strategy problem, and it doesn't close on its own.
It requires understanding your organization well enough to know where value actually gets stuck. It requires the discipline to redesign work around removing real constraints rather than deploying technology against visible friction. And it requires treating the current period as the beginning of a multi-year transformation, not a productivity initiative with a quarterly return.
The computer paradox took nearly twenty years to resolve in the macro data. This cycle will move faster. But the organizations that outperform will be the ones that figured out early what everyone else eventually learned: the technology was never the hard part.
Phase2 partners with enterprise organizations on AI strategy, architecture, and delivery. Reach out if you're exploring how AI can fit in your organization.