We’re joined today by Matilda Bailey, a networking specialist whose work provides a unique vantage point on the convergence of technology and energy infrastructure. As the AI boom fuels an unprecedented demand for electricity, utilities like Duke Energy are striking massive deals with tech giants such as Microsoft and data center operators like Compass. These agreements, now totaling 4.5 gigawatts, are reshaping the energy landscape, raising critical questions about scale, financial stability, and the long-term viability of these enormous investments.
With data center power contracts recently increasing to 4.5 gigawatts, can you describe the scale of this new energy demand? What are the key steps, from generation to transmission, that a utility must take to meet these massive new requirements for clients like Microsoft and Compass?
The scale is staggering, and it’s hard to overstate. When we talk about 4.5 gigawatts, you have to visualize what that means in the physical world. A single gigawatt is roughly the output of a traditional nuclear reactor, so we’re essentially talking about dedicating the output of more than four such reactors just to these new data center contracts. For a utility, this isn’t just about flipping a switch; it’s a monumental undertaking. It begins with planning for new generation capacity, whether that means expanding existing plants or building new ones, and continues through engineering and constructing entirely new transmission lines and substations capable of handling such a concentrated load. It’s a multi-year process that requires immense coordination and capital to ensure the grid can deliver that power reliably, 24/7, without disrupting service for millions of other customers.
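To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The one-gigawatt-per-reactor figure and the assumption of a constant 24/7 load are rough illustrative values, not actual plant ratings or contract terms.

```python
# Rough scale check for the contracted data center load.
# The ~1 GW-per-reactor figure and the constant 24/7 load are
# illustrative assumptions, not actual ratings or contract terms.

CONTRACTED_GW = 4.5        # total contracted data center capacity
REACTOR_OUTPUT_GW = 1.0    # rough output of one large nuclear reactor
HOURS_PER_YEAR = 8760

reactor_equivalents = CONTRACTED_GW / REACTOR_OUTPUT_GW
annual_energy_twh = CONTRACTED_GW * HOURS_PER_YEAR / 1000  # GWh -> TWh

print(f"~{reactor_equivalents:.1f} reactor-equivalents of capacity")
print(f"~{annual_energy_twh:.0f} TWh per year if run as a constant load")
```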
When building infrastructure for massive new data centers, how do financial mechanisms like minimum billing and refundable payments work in practice? Could you walk us through an example of how these provisions ensure that existing residential customers are not subsidizing these large-scale commercial projects?
These financial tools are absolutely critical for protecting the public. Imagine a tech company requests a new, high-capacity transmission line that costs millions to build. The utility can’t just pass that cost onto its existing residential ratepayers. Instead, the contract includes provisions like a refundable payment, where the tech company essentially prepays for the construction of that dedicated infrastructure. Then, there’s minimum billing. This ensures that even if the data center’s construction is delayed or its power usage is lower than projected at first, the tech company is still required to pay a predetermined minimum amount each month. This guarantees a revenue stream for the utility to cover its investment, effectively insulating residential customers from paying for a project that doesn’t directly serve them. It’s about making sure these massive corporate ventures pay their own way from day one.
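Here is a minimal sketch of how a minimum-billing settlement might work, assuming a simple take-or-pay structure; the rate, contracted minimum, and usage figures are hypothetical and chosen only to illustrate the mechanics described above.

```python
# Hypothetical monthly settlement under a minimum-billing provision.
# Rate, minimum take, and usage figures are illustrative only.

def monthly_bill(actual_mwh: float, rate_per_mwh: float, minimum_mwh: float) -> float:
    """Bill the greater of actual usage or the contracted minimum."""
    billable_mwh = max(actual_mwh, minimum_mwh)
    return billable_mwh * rate_per_mwh

# A delayed data center drawing only 20,000 MWh against a 100,000 MWh
# minimum still produces revenue that covers the dedicated infrastructure.
print(monthly_bill(actual_mwh=20_000, rate_per_mwh=60.0, minimum_mwh=100_000))   # 6,000,000.0
print(monthly_bill(actual_mwh=150_000, rate_per_mwh=60.0, minimum_mwh=100_000))  # 9,000,000.0
```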
Investors have shown some hesitation about the massive capital spending required for AI, even when tech companies’ core businesses are strong. From your perspective, how do long-term energy contracts provide stability for both the utility and the tech company amidst this market uncertainty?
That hesitation from investors is palpable: they see these astronomical capital expenditure figures and worry about how soon those investments will pay off. Long-term energy contracts act as a powerful anchor in that uncertainty. For the utility, such a contract is a guaranteed, multi-decade revenue stream from a creditworthy client, which justifies the massive upfront investment in new power plants and grid infrastructure. For the tech company, it secures a predictable, long-term operational cost for its most critical resource: electricity. This allows them to tell a more stable story to their own investors, assuring them that despite the high initial build-out costs, the ongoing energy expenses are locked in and managed. It transforms a volatile variable into a fixed, predictable line item, which is exactly the kind of certainty markets crave.
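To illustrate the point about volatility, here is a small sketch comparing a hypothetical fixed contract price with a volatile spot market for the same load; every number below is invented for illustration.

```python
# Fixed contract price vs. volatile spot prices for the same monthly load.
# All prices and the load figure are hypothetical.

spot_prices = [45, 120, 38, 95, 210, 52]   # $/MWh over six months
contract_price = 70.0                       # $/MWh locked in long term
monthly_load_mwh = 100_000

spot_costs = [p * monthly_load_mwh for p in spot_prices]
contract_cost = contract_price * monthly_load_mwh

print(f"spot cost range: ${min(spot_costs):,.0f} - ${max(spot_costs):,.0f}")
print(f"contract cost:   ${contract_cost:,.0f} every month")
```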
Some analysts compare the AI boom to past investment frenzies, questioning the economic timeline. As a foundational energy provider, how do you assess the long-term risk of this boom, and what specific metrics or signals do you monitor to validate these decade-spanning infrastructure investments?
That’s a very real concern, and we approach it with a healthy dose of pragmatism. While some see a speculative bubble, from an infrastructure standpoint, the demand feels incredibly tangible. We aren’t just looking at stock prices; we’re signing legally binding, long-term contracts for massive power blocks. A key metric for us is the commitment from a diverse set of major players—we’re not just seeing one company go all-in, but giants like Microsoft and operators like Compass, backed by serious institutional investors. Furthermore, the CEO’s comment that there’s “a lot more in the hopper” isn’t just talk; it reflects a deep and growing pipeline of formal requests. We validate these investments by the sheer number of serious, well-capitalized customers demanding these decade-plus contracts. This isn’t a fleeting trend; it’s a fundamental shift in the economy’s backbone.
Given the strong pipeline for more data center deals, can you elaborate on the scale of future demand you are preparing for? Beyond just adding gigawatts, what are the most complex logistical or grid-management challenges you anticipate as this trend continues to accelerate?
The pipeline suggests that the 4.5 gigawatts we’re discussing now is just the beginning; the demand curve is steepening dramatically. The most complex challenge isn’t simply generating more power, but managing the grid with this new type of industrial load. These data centers are incredibly power-dense, concentrating massive energy needs into relatively small geographic footprints, which puts immense strain on local transmission and distribution systems. We have to think about grid stability, power quality, and the integration of renewable sources to meet corporate sustainability goals. The real puzzle is building a smarter, more resilient grid that can dynamically balance these huge, constant loads with the needs of all other residential and commercial customers, especially during peak demand periods. It’s a logistical and engineering challenge of a whole new order.
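As a simplified illustration of that concentration problem, the sketch below checks whether a hypothetical local substation could absorb a dense new load at peak. All figures are invented, and real interconnection studies model power flow, contingencies, and voltage rather than a single subtraction.

```python
# Simplified headroom check for a concentrated new load on one substation.
# Capacities and loads are hypothetical; real planning uses full power-flow
# and contingency studies rather than a single subtraction.

substation_capacity_mw = 800.0
existing_peak_load_mw = 550.0
new_data_center_mw = 300.0   # dense, near-constant load

headroom_mw = substation_capacity_mw - existing_peak_load_mw
shortfall_mw = max(0.0, new_data_center_mw - headroom_mw)

if shortfall_mw > 0:
    print(f"Upgrade needed: {shortfall_mw:.0f} MW short at peak")
else:
    print(f"Fits with {headroom_mw - new_data_center_mw:.0f} MW to spare")
```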
What is your forecast for the intersection of AI development and energy infrastructure over the next decade?
Over the next decade, I believe the relationship between AI and energy will become completely symbiotic. We’re currently focused on how energy powers AI, but soon the focus will shift to how AI optimizes energy. The massive power demands will force an acceleration in grid modernization, the development of new energy sources, and smarter distribution technologies. In return, AI will become the brain of that new grid, used to predict demand with incredible accuracy, manage energy flows in real-time, prevent outages, and integrate intermittent renewables seamlessly. The infrastructure we are building today to feed the AI boom will, in turn, become the platform that a much more intelligent and efficient energy future is built upon. It’s a challenging but incredibly exciting feedback loop.
