Rethinking Economic Measurement in the AI Era
A Dispatch from the National Association of Business Economics (NABE) Policy Conference

I had the pleasure of attending this year’s NABE Economic Policy Conference, “The Great Realignment: Navigating AI, Demographic, and Geoeconomic Change,” and left the meetings energized by both the intellectual honesty and forward-looking tone of the discussions.
The central thread running through the conference was clear: artificial intelligence is no longer a speculative tailwind but an active force reshaping economic performance in real time. Much of the dialogue focused, appropriately, on productivity: where gains may emerge, how quickly they may diffuse, and what they mean for labor markets and capital allocation.
But an equally important theme, and one that resonated deeply with me, was measurement.
As AI accelerates the pace of economic activity, our traditional statistical frameworks (survey-based, lagged, and often revised months later) are increasingly strained in their ability to capture what is happening on the ground. Multiple panels touched on the need for new data architectures: higher-frequency, more granular, and alternative datasets capable of observing the economy as it evolves rather than after the fact.
As AI accelerates economic activity, the institutions and frameworks we rely on to measure it will have to evolve just as quickly.

One highlight of the conference was the discussion between Jeff Frieden (a valued contributor here on the Atlas Substack) and MIT’s Daron Acemoglu on the emerging political and economic order. Their conversation explored how technological transformation, industrial policy, and geopolitical fragmentation are interacting to reshape globalization’s next chapter. It was a thoughtful and nuanced dialogue that underscored how economics and political economy are becoming increasingly inseparable in the AI era. Still, the question remains: how will governments and individuals come to terms with the vast and frenetic change brought on by this most recent wave of innovation, and what will that mean for institutions going forward?

Another memorable moment was seeing Christine Lagarde recognized with the Paul Volcker Lifetime Achievement Award. It was a fitting tribute to her stewardship across international finance and central banking, and the standing ovation in the room reflected the depth of respect she commands across the global policy community. But it also felt emblematic of a deeper institutional transition underway. Central banking itself is being pulled into a new measurement regime, one where policymakers are increasingly expected to interpret higher-frequency signals, alternative datasets, and AI-derived indicators alongside traditional releases. The frameworks built for slower, lower-resolution economic data are now being stress-tested by a world that moves, and can be observed, in real time.
Stepping back, what struck me most was the degree of openness, among economists in both the public and private sectors, to new tools and new methodologies. There is growing recognition that understanding the modern economy will require integrating machine learning, alternative data, and interdisciplinary approaches alongside traditional macro frameworks. If economic measurement becomes AI-enabled and more granular, the practical implications are significant: policy reaction times compress, informational advantage shifts toward those who measure activity first, and headline GDP becomes insufficient as a standalone signal.
The future of economic measurement is being written now, and it will be faster, more granular, and increasingly AI-enabled. At Atlas, we are building for that world, not just to interpret the data, but to measure the economy as it unfolds.

