OpenAI is pouring an estimated US$500 billion (C$665 billion) into a network of United States data centres branded Stargate, positioning itself as an infrastructure heavyweight alongside energy utilities and cloud giants. Yet insiders and analysts remain split on whether the pace of construction matches a clear long-term plan.

Technology analyst Rob Enderle warned that the firm “has gone off the rails,” arguing it lacks the solid governance needed to win cautious enterprise clients, a view he shared with Fortune last week.

The comment followed chief executive Sam Altman’s recent “code red” memo calling for sharper focus on core products after a rush of model, app and tool launches. Short-term delivery continues, but debate over strategy shadows the physical build.

Rapid expansion masks strategic tension

Construction crews in Abilene, Texas, have already energised parts of Stargate I, and the site has received Nvidia GB200 racks and begun running early training jobs, according to a July OpenAI blog post. It forms the first slice of at least 10 gigawatts of planned compute capacity across several states, a scale comparable to a medium-sized provincial power grid.

OpenAI claims the programme will create more than 100,000 jobs during construction and operations, spreading benefits to electricians, equipment operators and local suppliers. Investors applaud the momentum, but some staff say rapid iteration strains the safety review cycles that once defined the organisation. Governance became a flashpoint after Altman's brief removal in 2023, and uncertainty lingers over how independent oversight will evolve as the data-centre footprint grows.

Stargate aims at hyperscale reach

OpenAI’s stated goal is to become a “hyperscaler” that offers chips, platforms and finished applications under one roof, echoing the vertically integrated models of Amazon, Microsoft and Google. Futurum Research chief executive Daniel Newman supports the ambition, telling Fortune the spending fits “a multi-decade super-cycle” in artificial intelligence and could mirror the streaming pivot that lifted Netflix. He argues that unmet AI demand justifies the billions earmarked for concrete, cables and cooling.

Still, industry veterans note that hyperscale power prices are rising and grid connection queues are lengthening in several U.S. regions, realities that could delay or reprice parts of the build. For OpenAI, keeping investors, regulators and customers aligned while steel rises may prove as hard as training its next model.

The coming year will test whether governance reforms can keep pace with physical growth. If cost overruns or power shortages bite, skeptics may cite Enderle's doubts about the foundation. Should demand hold and new capacity switch on smoothly, Newman's super-cycle thesis may look prescient. Either path will shape the broader AI infrastructure race and inform how much capital future projects can raise. Confidence, like computation, remains a finite resource.