In a news conference on Tuesday, President Trump announced the Stargate Project, which he called “the largest AI infrastructure project, by far, in history.”
With the CEOs of OpenAI, Oracle, and SoftBank at his side, Trump said that these companies and other private sector partners will invest up to US $500 billion in building data centers across the United States, with the first $100 billion coming this year. The announcement came the day after Trump rescinded former President Biden’s executive order on AI, which aimed to increase safeguards for the technology.
While details about Stargate are scarce, experts in AI, energy, and data centers had a range of reactions to the news.
What AI Experts Say About Stargate
Shelly Palmer, a tech pundit and consultant, argued that Stargate will give the United States a strategic advantage, and that it will bring benefits we can’t yet imagine. He wrote:
As for the 100,000 jobs the project is supposed to create? Some construction jobs will be created as the data centers are built, but many more (millions more) will be created as the data centers come online. We’ve never had a compute cloud like this—there’s literally no way to calculate the economic impact of this amount of AI compute. It will be massive.
There are many tech skeptics, and it has become fashionable to denigrate and vilify big tech. To me, the Stargate Project is the first step in securing the future of the U.S. economy as well as our digital and cyber security. Every business will benefit from the power and promise of AI, and—like it or not, believe it or not—warfare will be dominated by AI. Today, the U.S. has a clear lead. The Stargate Project will help ensure it stays that way.
But not everyone is so bullish. The noted AI critic Gary Marcus responded to a post on X (formerly Twitter) from OpenAI CEO Sam Altman saying that the project will be “great for our country” (itself in response to Elon Musk casting doubt on the project’s financing). Marcus took issue with Altman’s rosy optimism:
Like a lot of what Sam says, this is based on conjecture, or in this case multiple conjectures:
1) The entirely speculative conjecture that LLMs or something else OpenAI figures out how to build will be enormously profitable. So far the (cost of) infrastructure field-wide ($250B, perhaps) has enormously outweighed total revenue, perhaps 50:1.
2) The entirely speculative conjecture that any profits will actually do much to help the American people, as opposed to just enriching those who own that infrastructure. Yes, some people will be employed building data centers; but if the data centers work towards better AI, many others will lose their jobs. Net effect is entirely unclear.
Meanwhile, Doug Calidas, senior vice president of government affairs for the advocacy group Americans for Responsible Innovation, told IEEE Spectrum that the core of this initiative may not be new.
My sense is that it’s mostly a repackaging of commitments that have already been made (especially by SoftBank) coupled with an aspiration to raise even more money to get to this higher target. Given the extreme level of investment interest in this area and the players involved, I think it’s likely but not certain they’ll be able to pull it off, especially since Trump seems to be allowing them to tap funding from the Middle East. The spectacle surrounding the announcement and the public support for the project from President Trump will likely make it easier for them to hit their target.
What Energy Experts Say About Stargate
The biggest ongoing discussion in the energy sector over the last year has centered on how to meet the coming onslaught of electricity demand from AI operations. Researchers have predicted that U.S. electricity demand will grow by as much as 15.8 percent over the next four years, led by the power demands of AI and data centers, and those figures predate whatever additional electricity demand might come from Stargate’s projects.
So when Trump announced the initiative, it was his accompanying energy plans that raised eyebrows among the energy crowd. On Monday, his first day in office, Trump declared a national energy emergency, halted off-shore wind development, and paused disbursement of funds from the Infrastructure Investment and Jobs Act (IIJA) and the Inflation Reduction Act (IRA), which largely support clean energy projects.
Line Roald, an electric power systems expert at the University of Wisconsin-Madison, calls the moves “a huge contradiction,” and says she is concerned about tech companies getting preferential treatment when connecting their data centers to the grid.
It’s strange that at the same time that Trump is expressing his support for AI initiatives, he is also trying to restrict development of new wind generation. Wind is a cheap source of electricity that could help support the needs of new AI infrastructure. To support these electricity needs, we also need new power plants and transmission lines. This costs a lot of money to build, and is typically covered by all consumers in the region where the data center is built. As data centers are getting interconnected to the grid, they should pay their fair share for the expansion of the grid. Otherwise, electricity prices for everyone could rise.
Costa Samaras, director of the Wilton E. Scott Institute for Energy Innovation at Carnegie Mellon University, says the rapid growth and localized impacts of data center electricity use are the biggest challenges to a large AI rollout, but they can be managed.
The easiest way to get things on the grid quickly is to bring your own power. BYOP. And an even better way is to not just bring your own, but to bring enough for the community. AI electricity load will only break the grid if we’re not proactive and don’t come together to manage it appropriately by deploying lots of new clean electricity, by maximizing energy efficiency, and by deploying virtual power plants. If we want to ensure our AI competitiveness and our national security, we don’t have the luxury of taking cheap, clean energy off the table.
Thomas Wilson is principal technical executive at the independent, non-profit Electric Power Research Institute (EPRI). His organization doesn’t comment on specific political announcements, but he offers some general observations on powering data centers:
New energy generation takes time to deploy, but the shortest lead times right now are wind, solar, batteries, and natural gas. So if the data center community is interested in speed, that’s what they’ll be looking at, in addition to placing longer-term bets like advanced nuclear. New transmission takes time too. Data centers that operate flexibly, lowering or self-powering compute when the grid is stressed, require less grid buildout. And if tech companies can spread this compute over multiple connected facilities separated by tens of kilometers rather than concentrating them all in one area, this will give them access to multiple existing transmission lines. Both strategies could help them get connected faster.
What Data Center Experts Say About Stargate
Data center providers have reason for excitement here. Kevin Cochrane, CMO of Vultr, a cloud infrastructure company, was hopeful that this would be a boon to the industry and also increase much-needed geographic diversity of computing access:
Stargate will act as a catalyst for data center providers of all types, across all geographies, to recognize the importance of building out the capacity needed to support a wholesale transformation of the cloud stack and businesses around the globe. Every national government needs to have a strategy for the build-out of critical infrastructure to support AI. Data center capacity needs to be more broadly distributed across regions; specifically, capacity optimized for deploying next-generation GPUs with optimal energy efficiency and sustainability in mind. As with the building of data center capacity to support the internet revolution at the turn of the century and later the build-out of data center capacity to support new cloud services, we need to see another hyper-expansion of data center capacity worldwide.
Josh Mesout, chief innovation officer of another cloud computing provider, Civo, was also hopeful, but expressed concern that highly coveted GPUs are in short supply and available only to huge enterprises, as Civo noted in a recent report. Mesout warns that this huge investment shouldn’t just benefit OpenAI (as the Financial Times just reported it would).
Any government backing for AI initiatives should be met with support. Much of the promised (initial) $100 billion will need to be put towards overcoming the GPU gap we’ve seen in our research, along with improvements to energy infrastructure to keep data centers running. Most importantly, the benefits of AI should be for everyone. Public and private organizations across all sectors stand to gain a huge amount from AI use, improving the lives of their customers and users. We’ve seen a major shift in the industry away from training models and towards more costly inferencing, so projects like Stargate ought to focus on keeping the cost of accessing GPUs as low and flexible as possible for businesses. Whilst large firms like OpenAI and Oracle certainly have the chops to deliver, it’s vital that all that funding isn’t just funneled into applications like ChatGPT.
Data center provider IREN builds facilities powered by 100 percent renewable energy. IREN’s chief commercial officer, Kent Draper, while also excited, was concerned that energy considerations will be the bottleneck in this endeavor. He also warned of the need for public investment in AI.
This announcement underscores the importance of next-generation data center capacity for supporting the growth of AI. It marks an inflection point whereby the highest levels of government and largest companies in the world are collaborating to address the data center supply shortage.
Access to power is the key bottleneck to new data center development. Timelines for securing new grid interconnections to build out data centers are longer than ever before. The challenge will be managing already strained grids and ensuring power is available for data center development.
It’s likely that a large amount of hyperscaler compute will be for proprietary use. Hopefully increasing public investment in AI infrastructure will help mitigate the risk of centralization of compute.