Don't Trade Skyscrapers for Tents: Why the AI Fog Calls for Living Plans, Not Smaller Bets

The AI fog is real. The choice it's asking you to make — between skyscrapers and tents, between long bets and short optionality — is not.

In Harvard Business Review's April 2026 essay, UC Berkeley's Toby Stuart argues that the most consequential effect of artificial intelligence may be the one we can't see: the uncertainty it creates about everything else. Confidence in a specific vision of the future is the foundation of long-term investment, Stuart writes. Without that confidence, the calculus of capital allocation breaks down. His warning is striking — leaders, peering into the fog, will be tempted to trade the potential future gains of skyscrapers and railways for the temporary utility of tents and bicycles. His prescription is to optimize for the unknown.

Stuart's diagnosis is correct. His prescription is incomplete.

The fog is an execution problem dressed as a strategy problem

The instinct to retreat into optionality — smaller bets, shorter horizons, more flexibility — feels like prudent strategy. It isn't. It's an admission that your planning system can't keep up with reality. When a leader looks at a five-year capital plan and decides to pitch a tent instead, they aren't responding to genuine uncertainty about whether the building is needed. They're responding to a more uncomfortable truth: they have no mechanism to update the plan as conditions change.

That's the strategy-market gap we've written about before. The strategic intent is sound. The market is moving. And the artifact in between — the plan, the roadmap, the OKR sheet — is static. It was written in conditions that no longer exist, and updating it requires a quarterly all-hands and three weeks of reconciliation. By the time the new version ships, the fog has moved again.

In clear weather, this is wasteful. In fog, it's catastrophic. The longer your re-planning cycle, the shorter your effective horizon, regardless of how ambitious the original bet was. Stuart's tents-versus-skyscrapers framing is what happens when leaders feel the cost of re-planning so acutely that they preemptively scale down ambition to match it.

The Coordination Tax compounds in fog

Every shift in conditions creates a coordination event. Teams need to re-align. Decisions need to be re-validated. Dependencies need to be re-checked. In a static planning system, that work is largely manual — meetings, decks, threads, hallway conversations — and the cost of doing it scales with the number of people involved.

This is the Coordination Tax: the hidden charge a company pays every time reality changes. In stable conditions, it's an irritation. In the AI fog, where assumptions are revisited monthly instead of yearly, it becomes the binding constraint on horizon. Most organizations don't conclude that their long-term bet was wrong. They conclude that the cost of staying aligned to it is too high. So they shrink the bet to fit the coordination budget.

Stuart's metaphor inverts neatly here. The skyscraper isn't expensive because of the steel. It's expensive because of the schedule — every floor depends on the floors below it staying coordinated for years. If you can't trust the coordination, you don't build skyscrapers, no matter how much steel you have. You build tents.

Living Plans: same horizon, adaptive route

The alternative isn't to shrink the bet. It's to fix the artifact.

A Living Plan is a long-term direction wired to short-term reality. It commits to the destination — the skyscraper, the railroad, the franchise — but lets the path through the fog re-draw itself continuously as new information arrives. The destination is anchored. The route is liquid.

The mechanism is the GPS framework: Goals, Plans, Status, kept in continuous relationship. Goals describe the long horizon. Plans describe the current best path to get there. Status reflects what's actually happening on the ground today. When status changes, plans update against it. When plans update enough, the system surfaces the question of whether the goal itself still holds — instead of burying that question in a quarterly review where it's already too late to ask.
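The loop described above can be sketched in a few lines of illustrative Python. This is not an implementation of any real product, just a minimal model of the idea: the goal stays anchored, the route re-draws as status arrives, and accumulated churn surfaces the goal itself for review. All names and thresholds here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LivingPlan:
    """Toy model of the GPS loop: Goal anchored, Plan liquid, Status folded in."""
    goal: str            # the anchored destination
    route: list          # current best path: an ordered list of steps
    revisions: int = 0   # how many times the route has re-drawn

    def reconcile(self, status: dict) -> str:
        """Fold new ground truth into the plan.

        Steps that status marks as blocked are routed around; when the
        route has churned more than it currently holds, the question of
        whether the goal itself still stands is surfaced (hypothetical
        threshold) instead of waiting for a quarterly review.
        """
        blocked = [step for step in self.route if step in status.get("blocked", [])]
        if blocked:
            # Re-route: drop blocked steps, keep the destination fixed.
            self.route = [step for step in self.route if step not in blocked]
            self.revisions += len(blocked)
        if self.revisions > len(self.route):
            return "review-goal"
        return "on-course"

plan = LivingPlan(goal="skyscraper",
                  route=["permits", "foundation", "steel", "floors"])
print(plan.reconcile({"blocked": ["permits"]}))  # route re-draws; goal unchanged
print(plan.route)
```

The design choice the sketch makes visible: status never edits the goal directly. It edits the route, and only sustained divergence between route and status escalates to the goal, which is the distinction between fog and signal drawn later in this piece.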

This is the move Stuart's framework is missing. He's right that the fog destroys confidence in static forecasts. He's wrong that the answer is to reduce the size of the bet. The answer is to make the plan capable of absorbing the fog without breaking the bet underneath it.

We've called this approach Adaptive Strategies — the recognition that the era of the five-year plan as a fixed document is over, and the alternative isn't no plan, but a plan that learns. The AI fog doesn't change that conclusion. It accelerates it.

What this looks like in practice

A coordination layer that maintains a Living Plan does three things a static system can't.

It keeps the goal visible while the path changes, so teams don't lose direction every time conditions shift. It re-routes work when status diverges from plan, so the cost of re-alignment doesn't scale with headcount. And it surfaces the moments when the goal itself needs revision — distinguishing fog (the path is unclear) from genuine signal (the destination is wrong).

This is what an Intelligent Management System is for. Not a dashboard. Not a project tool. A live model of how the business actually runs, continuously reconciled against the strategy it was built to execute. AI's role here isn't to replace the judgment of the people making the bet. It's to maintain the reality the bet is being made against — closing the gap between what the plan assumes and what the world is currently doing. In clear weather, that's a productivity gain. In the AI fog, it's the difference between staying in the skyscraper business and quietly migrating to tents.

The takeaway

Stuart is right that the fog is here, and right that it punishes confident long-term forecasts. He is wrong that the answer is to make smaller bets. Smaller bets don't reduce uncertainty. They just lower the ceiling.

The fog doesn't ask you to shrink your horizon. It asks you to upgrade your coordination. Optimize for the unknown by building a system that can absorb the unknown — not by abandoning the kinds of investments that only pay out at scale.

The skyscrapers will still get built. They'll just get built by the companies that figured out how to keep the steel coordinated while the weather moves around them.
