
Inside the Race to Build—and Power—the World’s Largest Data Centers

How Modular Nuclear Power and 4D Planning Are Reshaping Hyperscale AI Data Centers


Kathleen Moore


You know that something extraordinary is happening on the outskirts of Abilene, in the Big Country part of Texas, even before you arrive. By 6 a.m., a line of cars stretches a mile from the highway to where thousands of workers are building one of the biggest data center sites in the world. 

Preston Williams of DPR Construction, the project’s general contractor, is part of the team building the Abilene data center. Its size is staggering. One of the largest such centers in the U.S., and possibly the world, the site will span 4 million square feet, roughly 70 football fields, and include eight buildings that will each house up to 50,000 Nvidia artificial intelligence (AI) chips. The center’s power needs will approach 1.2 gigawatts, rivaling the output of a nuclear plant. The center will be part of the $500 billion Stargate program, which will provide the massive infrastructure needed to power the boom in AI. “It really is a place to test how to plan a job correctly, because you have to have some way to organize the madness,” Williams says.
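The figures above are easy to sanity-check with a little arithmetic. A back-of-the-envelope sketch, assuming a 360-by-160-foot football field including end zones and all eight buildings at their full quoted chip capacity:

```python
# Rough sanity check of the quoted Abilene site figures.
# Assumptions (not from the article): a football field incl. end
# zones is 360 ft x 160 ft, and every building hits 50,000 chips.

site_sqft = 4_000_000
field_sqft = 360 * 160               # 57,600 sq ft per field
print(site_sqft / field_sqft)        # ~69.4 -> "roughly 70 football fields"

site_watts = 1.2e9                   # 1.2 GW
chips = 8 * 50_000                   # 400,000 accelerators site-wide
print(site_watts / chips)            # 3,000 W per chip, incl. cooling and overhead
```

The 3 kW-per-chip figure covers the whole site load, so it folds in cooling, networking, and power-conversion overhead, not just the accelerator itself.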

Williams sat down with the Bentley Horizons podcast to talk about the massive project, in an episode that’s all about hyperscale data centers—about what it takes to plan, build, power, cool, and secure megaprojects that rival the size of New York’s Central Park.

The episode also features David Lawson of Assystem, one of the world’s top nuclear engineering firms; Dara Khera and Sukshma Paranjpe of the startup Bohm; and David Ayeni, global director of Infrastructure Cloud Partner Experience at Bentley Systems. You can read more about David’s work on the blog. Join the conversation below:


A Soaring AI Workload

Demand for AI infrastructure, already growing at a rapid clip, is set to rise further, with some analysts predicting that the workload of data centers will triple in the next five years alone. For these massive projects, 4D planning, which links 3D models to construction schedules, can be key. It allows users to design and visualize the project and get maximum stakeholder buy-in before any concrete is poured. Williams notes the advantages of using the 4D planning capabilities in Bentley’s SYNCHRO platform, particularly when so many workers at the Texas site are new hires. “The model paints a picture of the scope of work they’re going to install even if they’ve not built it before,” he says.
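At its core, 4D planning means attaching schedule dates to model elements so the build sequence can be replayed day by day. A minimal sketch of that linkage in Python (the class and function names here are illustrative, not SYNCHRO’s actual API):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Element:
    """A 3D model element, e.g. one concrete footing."""
    element_id: str
    description: str

@dataclass
class Task:
    """A schedule activity linked to the model elements it installs."""
    name: str
    start: date
    finish: date
    elements: list = field(default_factory=list)

def elements_active_on(tasks, day):
    """Return the IDs of elements under construction on a given day --
    the core query behind a 4D 'time slider' view."""
    return [e.element_id
            for t in tasks if t.start <= day <= t.finish
            for e in t.elements]
```

Dragging the time slider in a 4D tool amounts to running this query for each date and highlighting the returned elements in the 3D view.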

The Henry Ford Approach to Nuclear Power and Data

Another challenge is powering the power-hungry centers. That’s where nuclear power and small modular reactors, or SMRs, come in, says Assystem’s Lawson. He says SMRs—which are smaller than traditional nuclear power plants and can be partially built from prefabricated components—offer an important advantage in the standardization and modularity of their designs. Lawson calls it “the Henry Ford approach to nuclear power plants.” It allows them to be built faster and more cheaply than conventional reactors.

“What we’re seeing is a huge demand for reliable baseload power,” meaning power that is available 95% of the time or more, Lawson says. “Data [centers] rely on continuous power, something which solar and wind—at least in the U.K.—can’t quite provide. They need the baseload power, and in reality, in the fight against climate change, only nuclear can provide that kind of power level.”

Trusting Trustlessness

The demand for hyperscale data centers is also creating market conditions in which new alliances and partnerships form fast. As head of Infrastructure Cloud Partner Experience, Ayeni leads that effort for Bentley. “No one company can solve the problem of infrastructure projects,” Ayeni tells the podcast. “My role is all about finding best-in-breed partners all through the infrastructure lifecycle.”

One of those new partners is the construction startup Bohm. Khera, the company’s CEO, tells the podcast that outside of a few lucky hydropower-rich places like Scandinavia or Quebec, nuclear is the ideal carbon-neutral power source for data centers.

SMRs, of course, need to be deployed with strict security and safety controls and in compliance with regulations. Here, Khera says, “trustlessness”—a core concept in blockchain technology and cryptocurrencies—would be key.

“When we say ‘trustlessness,’ we mean your data doesn’t go to a centralized entity,” Khera says. “By distributing this to thousands of different nodes, it effectively becomes tamperproof, like Bitcoin.”
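The tamper-evidence Khera describes comes from chaining cryptographic hashes: each record is hashed together with the hash of the previous one, so altering any earlier entry invalidates everything after it. A toy illustration in Python—this is a generic hash chain, not Bohm’s actual system, and it omits the distribution across nodes that makes a ledger trustless:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash, so changing
    any earlier record changes every later hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Chain records into a tamper-evident log."""
    prev, chain = GENESIS, []
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; any edited record breaks verification."""
    prev = GENESIS
    for blk in chain:
        if blk["prev"] != prev or block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True
```

Replicating such a chain across many independent nodes is what removes the need to trust any single operator: a tampered copy simply fails verification against the others.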

The Bentley Horizons podcast is hosted by Tomas Kellner, Bentley Systems’ Chief Storyteller, and Paul Wilson, chair of the advisory board for the online publication SmartCitiesWorld.net. Listen to the Bentley Horizons podcast episode about data centers and their construction here.

