The Three Mile Island nuclear power plant in Middletown, Pennsylvania, the site in 1979 of the worst nuclear accident in U.S. history, will be back in operation in 2028 to power Microsoft data centers. JIM LO SCALZO (EFE)

Share this article if you believe the dramatic AI infrastructure frenzy of Big Tech's capex, and an AI race that might have harmful consequences, is an important cause for concern.


Good Morning,

How do we frame the new data center reality of the United States? AI data centers could need 10 gigawatts of additional power capacity in 2025 alone. (That's more than the total power capacity of Utah.) Where is that energy going to come from?

AI’s Power Requirements

According to a RAND report, if exponential growth in chip supply continues, AI data centers will need 68 GW in total globally by 2027 — almost double the global data center power requirement of 2022, and close to California's 2022 total power capacity of 86 GW. That's a doubling in just five years.

Here's the thing: we keep revising our projections upward!

AI to drive 165% increase in data center power demand by 2030 – Goldman Sachs (Feb 4, 2025)

  • OpenAI is considering 16 states for data center campuses as part of Trump’s Stargate project. It’s not clear whether it even has the supposed $100 billion initial spend for 2025. The scale of Stargate is Elon Musk-esque, even as Altman is now trading insults with his former co-founder.

  • BigTech: just four companies have committed $320 billion to capex spending, much of it going to AI infrastructure like data center campuses, servers and GPUs.

Training could demand up to 1 GW in a single location by 2028 and 8 GW — equivalent to eight nuclear reactors — by 2030, if current training compute scaling trends persist. – RAND

But are numbers like those even rational? We don’t yet have fusion, small modular reactors, or other alternatives that could deliver that.

The 16 states OpenAI is currently considering are Arizona, California, Florida, Louisiana, Maryland, Nevada, New York, Ohio, Oregon, Pennsylvania, Utah, Texas, Virginia, Washington, Wisconsin and West Virginia.

Back in December I argued the price might be high for the power required. Tycoons like Sam Altman, and the energy startups they back as VCs (note the conflicts of interest), will ask things of our energy grid that we’ve never asked of America before.

Keep in mind that while AI models pose risks to society and jobs, data centers pose risks to human health that are being overlooked and under-reported. And despite Macron’s tone-deaf “plug, baby, plug” comment, the data center projects announced in France are funded by the UAE and a Canadian fund.

Goldman Sachs Research forecasts global power demand from data centers will increase 50% by 2027 and by as much as 165% by the end of the decade (compared with 2023), writes James Schneider, a senior equity research analyst covering US telecom, digital infrastructure, and IT services.

If AI Goes Exponential in Demand

  • Global AI data center power demand could reach 68 GW by 2027 and 327 GW by 2030, compared with total global data center capacity of just 88 GW in 2022.

  • Individual AI training runs could require up to 1 GW in a single location by 2028 and 8 GW by 2030, although decentralized training algorithms could distribute this power requirement across locations.
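For a sense of what those projections imply, here is a quick sketch of the implied annual growth rate, comparing the 88 GW 2022 total-capacity baseline with the hypothesized 327 GW of AI demand by 2030 quoted above (the two figures measure slightly different things, as the bullet notes, so treat this as a rough illustration only):

```python
# Implied compound annual growth rate (CAGR) if demand goes from the
# ~88 GW 2022 baseline to the projected 327 GW by 2030.
def cagr(start_gw: float, end_gw: float, years: int) -> float:
    """Compound annual growth rate as a fraction."""
    return (end_gw / start_gw) ** (1 / years) - 1

rate = cagr(88, 327, 2030 - 2022)
print(f"Implied growth: {rate:.1%} per year")  # roughly 18% per year
```

Even under this rough framing, sustaining roughly 18% compound growth in power demand for eight straight years is what the "exponential" scenario actually asks of the grid.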

Power Infrastructure Delays

  • Insufficient power generation is increasing wait times for grid connections, with grid connection requests taking four to seven years in key regions like Virginia, according to RAND.

  • Transmission line projects face complex multistate permitting processes and local opposition, delaying power delivery to suitable sites.

  • Data centers struggle with local and state permits, particularly for on-site backup generators and environmental impact assessments.

  • Environmental commitments and regulations limit the use of readily available power sources, forcing reliance on harder-to-scale renewable options.

Human Safety Concerns

  • One of the last executive orders signed by Joe Biden prohibits this infrastructure from being built in areas with a high risk of cancer due to pollution.

  • Acceleration and authorization of small modular nuclear reactors to meet rising demand is now inevitable, as are partnerships with energy (natural gas) companies.

  • One of the last executive orders signed by former President Joe Biden (EO 14141, January 14, 2025), titled Advancing United States Leadership in Artificial Intelligence Infrastructure, implicitly acknowledges that data centers are harmful to health (El País).

“The executive order acknowledges for the first time the public health impact of data centers,” says Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside and an expert on AI sustainability.

We aren’t just talking about higher energy prices; we’re talking about consequences for American public health that will get buried. I’ve been in discussion with Marco Antonio Lueras about the impact of this Great Datacenter Boom on water and energy. While an Energy Score for LLMs is one thing, there are bigger power issues looming.

Let’s now explore Marco’s deep dive on our topic.

Gridlocked 🔅

Marco is the author of the self-published book The Metaphysics of AI and Policing.


  • Our article continues after his deep dive.

By: Marco Antonio Lueras, February 5, 2025

Biography

Marco Antonio Lueras is the Co-Founder and Executive Director of New Mexico Conservation Alliance, a non-profit based in the Southwestern U.S. with the mission of empowering New Mexico’s youth to become stewards of their environment by connecting them with meaningful service opportunities in collaboration with water and environmental organizations. An avid researcher, Marco serves on several advisory boards for companies working in the intersections of AI and water. Marco currently coordinates the Eastern New Mexico Sentinel Landscape Partnership, a multi-organization partnership that focuses on environmental resilience. Marco is passionate about environmental conservation, human rights, and the pursuit of knowledge. He recently published a book on the intersection of AI, philosophy, and policing, titled The Metaphysics of AI and Policing, exclusively available on Apple Books.

Summary

Article Summary: 🤖 The AI race to the hypothesized singularity has provided humanity with tangible tools: chatbots, image generators, virtual companions… the list goes on. These tools provide humanity with quick and digestible information, but most importantly, they express our most wanton desire for production. As such, the energy grids of the past may indeed be rendered obsolete if they do not adapt to the tides of change.

Gridlocked

Image created by Google Gemini

The Energy of Expression

“According to the law of conservation of energy, energy cannot be created or destroyed, although it can be changed from one form to another” (Florida State College).

The average human body burns between 1,600 and 2,450 Calories per day, depending on sex, weight, age, and activity level. Dietary Calories are actually kilocalories, measures of the potential energy (i.e., heat) in food. One kilocalorie is equivalent to roughly 0.001162 kilowatt-hours (kWh). This means that:

At most, 2,450 Calories in a day multiplied by 0.001162 kWh comes out to about 2.85 kWh of energy spent by the human body per day.

Training a large language model like ChatGPT has been found to consume upwards of 1,300 megawatt-hours (MWh) of energy, or 1,300,000 kWh:

“Luccioni and her colleagues ran tests on 88 different models spanning a range of use cases, from answering questions to identifying objects and generating images. In each case, they ran the task 1,000 times and estimated the energy cost. Most tasks they tested use a small amount of energy, like 0.002 kWh to classify written samples and 0.047 kWh to generate text” (The Verge).

In other words, our bodies are far more efficient at converting energy than artificial intelligence (AI) is. By now it is no secret that AI is an energy-intensive process that strains our outdated power grids and natural resources. The question must be asked, then: how much energy are we willing to sacrifice for algorithmic expression?
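A minimal sketch of this comparison, treating dietary Calories as kilocalories (1 kcal ≈ 0.001162 kWh) and using the ~1,300 MWh training estimate quoted above:

```python
# Rough comparison: daily human metabolic energy vs. one LLM training run.
# Assumption: dietary "Calories" are kilocalories (1 kcal ≈ 0.001162 kWh).
KWH_PER_KCAL = 0.001162

human_daily_kwh = 2450 * KWH_PER_KCAL   # ~2.85 kWh per day
training_kwh = 1_300 * 1_000            # 1,300 MWh expressed in kWh

ratio = training_kwh / human_daily_kwh
print(f"Human body: ~{human_daily_kwh:.2f} kWh/day")
print(f"One training run ≈ {ratio:,.0f} human-days of metabolic energy")
```

By this rough measure, a single training run consumes the metabolic energy of hundreds of thousands of human-days.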

DeepSeek shook up Western AI markets when it introduced itself as a more cost- and energy-efficient model than competitors such as OpenAI’s ChatGPT. This is due to differences in model architecture:

“At their core, these AI models work differently. DeepSeek uses what’s called a Mixture-of-Experts (MoE) approach, which is like having a team of specialized experts where only the most relevant ones are called upon for each task. With 671 billion parameters (think of these as the model’s knowledge points), DeepSeek activates only a subset of its parameters for each request, enhancing efficiency. This Mixture-of-Experts approach allows DeepSeek to optimize both performance and resource usage, dynamically adapting to different types of queries. In contrast, ChatGPT uses a traditional transformer model, which is like having all experts working on every task – more consistent but potentially less efficient,” (Data Camp).

A growing trend in Western AI has been to scale hardware at an immense pace; creating more hardware channels for algorithms to flow through is seen as the solution for more accurate and faster chat models. Yet accuracy and speed are gained at the cost of efficiency and natural resources, including water. Let’s turn to another quick math example:

The human body is composed of at least 60 percent water. Water in the human body regulates temperature, transports nutrients, and removes waste; it is the most fundamental element of life itself. Depending on age, sex, activity level, and climate, most humans use between 2.7 and 3.7 liters of water per day. This equates to roughly 11 to 16 cups of water per day.

Regarding water footprint, a recent study found that ChatGPT uses 500 milliliters (approx. 2.1 cups) of water for every series of 5 to 50 prompts. Two cups of water may not seem like a large amount, but compound it with the fact that ChatGPT averages 300 million users per week (GPT Stats). If each user accounts for the allocated 2.1 cups, that equates to 630 million cups of water used per week to cool OpenAI’s data centers. That is 39,375,000 gallons of water per week.
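The weekly water figure above is straightforward to reproduce:

```python
# Water-footprint arithmetic from the paragraph above.
CUPS_PER_SESSION = 2.1        # ~500 mL per 5-50 prompt session
WEEKLY_USERS = 300_000_000    # average weekly ChatGPT users (quoted above)
CUPS_PER_GALLON = 16

cups_per_week = CUPS_PER_SESSION * WEEKLY_USERS      # 630 million cups
gallons_per_week = cups_per_week / CUPS_PER_GALLON   # 39,375,000 gallons
print(f"~{gallons_per_week:,.0f} gallons per week")
```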

The energy and natural resources needed to produce technological expression are readily quantifiable. The results of the quick analysis at the beginning of this article are hard to swallow. The fact of the matter is simple: if AI giants continue to scale their hardware without finding sustainable solutions to the issue of efficiency, more energy and more natural resources will be needed to power our digital world. The rate at which renewable energy is used across this industry must accelerate to survive.

You Get a Data Center, You Get a Data Center, Everyone Gets a Data Center!

Indeed, the rate at which renewable energy is used across this industry must accelerate to survive. Yet, as AI demand grows, so too does the need for physical infrastructure, leading to an unprecedented expansion of data centers across the U.S. and globally. Tech giants like Microsoft, Google, and Amazon are investing billions into new facilities to support AI workloads, which require immense computational power. The result? Data center hubs are emerging in states like Virginia, Texas, and Georgia, where land and energy resources are relatively abundant. Abroad, countries like Sweden, Norway, and Singapore are also seeing increased investment due to favorable climate conditions, infrastructure, and government incentives.

However, this expansion comes with an undeniable cost: energy consumption. AI-driven data centers require enormous amounts of electricity, straining already outdated power grids and placing greater pressure on natural resources. But where does all this energy come from? The answer varies drastically depending on location, corporate sustainability goals, and government policies.

Non-Renewable Energy in Data Centers

Despite growing efforts toward sustainability, many data centers still rely heavily on non-renewable energy sources such as coal, natural gas, and oil. For example, in Texas and Georgia, where fossil fuels remain dominant, data centers often draw from conventional power grids. A 2023 report from the International Energy Agency (IEA) found that while the tech industry is working toward greener energy solutions, many hyperscale data centers continue to contribute significantly to carbon emissions.

A notable case is Northern Virginia’s “Data Center Alley,” the world’s largest concentration of data centers. As AI models become more complex and energy-intensive, the region’s power infrastructure has struggled to keep up. Dominion Energy, the area’s primary utility provider, has warned of potential power shortages as electricity demand skyrockets. While efforts are being made to integrate renewable sources, fossil fuels still play a major role in keeping AI systems operational.

The Shift Toward Renewable Energy

On the other hand, some companies are leading the charge toward sustainable AI infrastructure. Google has committed to operating on carbon-free energy by 2030, investing heavily in wind and solar projects to power its facilities. In Nebraska, for instance, a new Google data center is largely sustained by wind energy. Microsoft has also signed multiple power purchase agreements (PPAs) to secure renewable energy for its operations, including offshore wind farms developed in partnership with Ørsted in Europe.

Scandinavia, with its naturally cool climate and abundant hydropower, has emerged as a leader in sustainable AI data centers. Facebook (Meta) operates a major facility in Luleå, Sweden, which takes advantage of the region’s renewable energy sources and low ambient temperatures to reduce both power consumption and cooling costs.

Yet, even as companies push toward greener solutions, the underlying issue remains: AI is inherently energy-intensive. Training a large language model can require millions of kilowatt-hours of electricity, while real-time AI interactions continue to draw power with each user query. With water-intensive cooling systems and the rapid proliferation of AI-driven applications, the industry must evolve beyond scaling hardware alone. Efficiency, both in energy consumption and model architecture, must become the new frontier for sustainable AI.

The Nuclear Solution

As AI data centers continue to expand, their demand for energy has pushed industries and governments to rethink how power is generated and distributed. While renewable energy sources like wind and solar have gained traction, they come with limitations: intermittency, land use, and scalability challenges. To meet AI’s growing power needs without overwhelming existing grids or increasing reliance on fossil fuels, some experts argue that nuclear energy is the most viable long-term solution.

Why Nuclear?

Unlike fossil fuels, nuclear energy produces no carbon emissions during operation, making it one of the cleanest large-scale energy sources available. It also offers a key advantage over renewables: reliability. Unlike wind and solar, which depend on weather conditions, nuclear power plants generate a constant, uninterrupted energy supply—a crucial factor for AI data centers that operate 24/7.

The efficiency of nuclear energy is another factor. A single uranium fuel pellet—about the size of a fingertip—contains as much energy as a ton of coal, 149 gallons of oil, or 17,000 cubic feet of natural gas. This extreme energy density makes nuclear power well-suited for meeting the colossal demands of AI infrastructure without requiring massive amounts of land or resources.

The Push for Nuclear-Powered Data Centers

Tech giants are starting to recognize nuclear energy’s potential in solving AI’s energy dilemma. In 2023, Microsoft made headlines by posting a job listing for a nuclear technology expert, signaling its interest in integrating small modular reactors (SMRs) into its energy strategy. SMRs are a next-generation nuclear technology that offers a safer, more flexible alternative to traditional large-scale reactors. These compact reactors could be deployed near data centers, providing direct, emission-free power without overloading local grids. Microsoft also signed a PPA with Helion, a nuclear fusion start-up backed by Sam Altman.

Google has expressed interest in nuclear energy as part of its carbon-free commitment, and some industry analysts predict that hyperscale AI companies will begin investing directly in nuclear projects. A growing number of start-ups, such as Oklo and NuScale, are developing advanced nuclear reactors tailored for industrial applications, including AI infrastructure.

The Challenges of Nuclear Adoption

Despite its advantages, nuclear energy is not without obstacles. High upfront costs, regulatory hurdles, and public skepticism have slowed its widespread adoption. Traditional nuclear plants can take decades to build, and while SMRs offer a faster deployment timeline, they still require government approval and significant investment.

Another concern is nuclear waste. While modern reactors are designed to minimize and recycle spent fuel, long-term storage solutions remain a contentious issue. However, proponents argue that nuclear waste challenges are manageable compared to the ongoing environmental damage caused by fossil fuels.

A Necessary Path Forward?

As AI technology advances and data centers become even more energy-intensive, the industry must confront a harsh reality: the current energy grid is not built to handle AI’s exponential growth. While renewables will play a crucial role, they may not be enough to sustain the future of AI without nuclear energy as part of the mix.

If AI is to continue evolving without depleting natural resources or accelerating climate change, a large-scale shift toward nuclear-powered data centers may be inevitable. The question is not whether nuclear energy will play a role in AI’s future, but rather how quickly governments and corporations will embrace it.

Grid(locked) out of Heaven

AI is “evolving” at an unprecedented pace, but its rapid expansion is shackled by a fundamental truth: energy is finite. The more we scale AI, the more we push against the hard limits of our power infrastructure, natural resources, and sustainability efforts. If artificial intelligence is to continue to expand, it must do so without consuming the planet in the process.

The concept of the singularity, the hypothetical moment when AI surpasses human intelligence and becomes self-improving, has long been a topic of speculation. But even if AI reaches that threshold, its existence is still constrained by the laws of physics. Intelligence, no matter how advanced, requires energy to function. The singularity, if it ever occurs, will not be an event of pure abstraction; it will be an energy event, a transformation of matter into computation on a scale we have never seen before. Or perhaps may never see. Discussing the singularity is important in the context of energy because this is exactly what OpenAI is in pursuit of.

In pursuit of nothing, we stand to lose everything.

The problem is, the energy demands of AI are already unsustainable. As AI models become more powerful, they demand more data, more computation, and more electricity. The power required to train advanced neural networks is outpacing our ability to generate clean energy at scale. While wind, solar, and hydropower offer partial solutions, they are ultimately constrained by geography, weather, and infrastructure challenges. Nuclear power, for all its potential, faces political and economic roadblocks.

If AI continues to expand at its current rate, we risk reaching a bottleneck: a gridlocked future where progress is no longer limited by algorithmic breakthroughs, but by the availability of energy itself. We already see early warning signs—rolling blackouts in regions overloaded by data centers, water shortages linked to server cooling, and the growing carbon footprint of AI-driven industries.

This raises an existential question: is AI’s expansion sustainable, or will it collapse under its own energy demands?

Historically, every major leap in human technology has been accompanied by an energy revolution. The Industrial Age was powered by coal. The 20th century was defined by oil, natural gas, and nuclear energy. Now, as AI accelerates toward the unknown, we have yet to determine what will power this new era.

One possibility is that AI itself will become the catalyst for its own energy solution. Advancements in quantum computing, fusion energy, and self-optimizing grid systems could shift the balance, enabling intelligence to grow without depleting the Earth. Alternatively, AI may be forced to self-limit, constrained by the thermodynamic realities of an already overburdened planet.

Either way, we are standing at a crossroads. The future of AI is not just a question of algorithms or processing power—it is a question of sustainability, of how much energy we are willing to sacrifice for intelligence.

The singularity, if it ever comes, will not arrive in silence. It will arrive humming with the sound of turbines, cooling fans, and reactors, demanding more power than we have ever produced. Whether we unlock the energy to sustain it, or find ourselves gridlocked out of heaven, remains to be seen.

End of Marco’s deep dive.

Postscript: Datacenter Mania 2025

Editor’s note continues:

We’ll never forget the weird air of urgency that characterized 2025, with our biggest corporations going all-in on AI data center campuses and a 60%+ jump in capex on AI infrastructure.

Why this is Global

A lack of data center infrastructure in the United States could shift construction to other countries:
  • U.S. companies are exploring expansion in countries offering better power availability and faster permitting.

  • Countries with more compute access can deploy AI at larger scale, potentially gaining economic and military advantages (RAND).

At present (February 2025), Goldman Sachs Research estimates the power usage of the global data center market at around 55 gigawatts (GW). This comprises cloud computing workloads (54%), traditional workloads for typical business functions such as email or storage (32%), and AI (14%).

By modeling future demand for each of these workload types, our analysts project power demand will reach 84 GW by 2027, with AI growing to 27% of the overall market, cloud dropping to 50%, and traditional workloads falling to 23%.

  • That is, in two years we are expected to go from 55 GW to 84 GW. A jump of 29 GW.
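The Goldman Sachs workload mix quoted above can be turned into absolute gigawatts with a few lines:

```python
# Goldman Sachs workload-mix projection, converted to gigawatts.
total_gw = {"2025": 55, "2027": 84}
mix = {
    "2025": {"cloud": 0.54, "traditional": 0.32, "ai": 0.14},
    "2027": {"cloud": 0.50, "traditional": 0.23, "ai": 0.27},
}

for year, shares in mix.items():
    gw = {k: total_gw[year] * v for k, v in shares.items()}
    print(year, {k: round(v, 1) for k, v in gw.items()})
# AI roughly triples in absolute terms: ~7.7 GW -> ~22.7 GW
```

The headline 29 GW jump understates the AI story: within the mix, AI's absolute share roughly triples in two years.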

The current global market capacity of data centers is approximately 59 GW. Roughly 60% of this capacity is provided by hyperscale cloud providers and third-party wholesale data center operators.

  • Winners in cloud computing tend to be the winners of AI hyperscale data center campuses as well, which makes this a winner-takes-all scenario for global capitalism.

Many of the health risks will be caused by the pollution data centers generate. “It is well known that cars pollute, and, precisely for that reason, there are strict regulations to control and reduce the gases they emit. But data centers are growing at such a dizzying rate that by 2028 they will exceed the emissions of the entire California vehicle fleet — even if we add 35 million vehicles — according to the Department of Energy’s recent projection of data center energy demand,” says Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside.

Data centers at this scale will present energy and environmental risks that are receiving little coverage in terms of public health considerations.

“The great known unknown isn’t what AI becomes, it’s how we will be able to power it.”


Plug, Baby, Plug

In America they say “drill, baby, drill,” but in Europe the metaphors are a bit more obscure.

OpenAI’s construction of the data centers in Abilene, Texas, is currently underway. Part of the financing comes from SoftBank and includes third-party debt financing that is considered highly leveraged. SoftBank’s Masayoshi Son is looking at a technique called project financing, which is often associated with oil and gas projects.

One scenario under discussion would have SoftBank, OpenAI and partners Oracle Corp. and Abu Dhabi’s MGX contribute in equity about 10% of the overall cost and tap debt markets for much of the rest. – Bloomberg.

MGX: a lesser-known but crucial player in the venture, MGX is the Abu Dhabi investment group providing additional financial backing. So the UAE is implicated in both Stargate and the data center projects in France.

Supercharging Energy Grids

Recent estimates calculate that AI is responsible for 100 terawatt-hours (TWh) of electricity globally in 2025, with worst-case forecasts predicting a surge to 1,370 TWh by 2035. In the United States, data center electricity consumption is expected to rise from 4.4% of total electricity use in 2023 to 6.7-12% by 2028, driven largely by AI. To meet this soaring demand, the U.S. is adding 46 gigawatts of natural gas capacity by 2030, equivalent to the entire electricity system of Norway. – Source.
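To put the quoted 4.4% to 6.7–12% share range in absolute terms, here is a back-of-envelope sketch; the ~4,000 TWh figure for total annual U.S. electricity use is our round-number assumption, not from the source:

```python
# Rough absolute figures implied by the data center share range above.
US_TOTAL_TWH = 4000  # hypothetical round figure for annual U.S. electricity use

dc_2023 = US_TOTAL_TWH * 0.044        # ~176 TWh
dc_2028_low = US_TOTAL_TWH * 0.067    # ~268 TWh
dc_2028_high = US_TOTAL_TWH * 0.12    # ~480 TWh

print(f"2023: ~{dc_2023:.0f} TWh")
print(f"2028: ~{dc_2028_low:.0f}-{dc_2028_high:.0f} TWh")
```

Under that assumption, the high end of the range implies U.S. data centers nearly tripling their absolute electricity consumption in five years.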

Read AI and Electricity

At the global level, AI data center electricity use currently accounts for less than 0.3% of worldwide electricity demand. In 2025, we’re leaving that reality behind forever. This energy demand is going to become a major problem, meaning the AI race is also a GPU and energy race. China’s green-tech prowess is going to catch up to the U.S. eventually. As you might know, China now holds 80% of global manufacturing capacity for solar PV and over 70% for lithium-ion batteries. These advantages scale into things like robotics and powering ever more data centers with sustainable technologies. So we now have energy ratings for chatbots and LLMs, but that’s not the point.

The worrying environmental cost of AI is obvious even at this nascent stage of its evolution.

This reliance on fossil fuels risks derailing the energy transition and global climate goals, as AI’s energy consumption increasingly competes with efforts to decarbonize the grid. Beyond electricity use and emissions, AI’s growth also raises concerns about its impact on water consumption, air pollution, electronic waste, and critical materials, in addition to public health concerns around pollution (Salesforce).

Read iea.org Report

International Energy Agency. Electricity 2024 (IEA, 2024).

BigTech’s capex for data centers creates massive problems for society at large, both for energy and the environment. These aren’t being addressed with any oversight on the global stage, so what remains is raw power, BigTech politics, and lobbying in the wild.

The International Energy Agency estimated that the electricity consumption of data centres could double by 2026, and suggested that improvements in efficiency will be crucial to moderate this expected surge.

It really is a kind of AI mania, and it will likely deteriorate our environment without much accountability.

The broader implications of this demand suggest that data centers could consume up to 9% of U.S. electricity generation by 2030, nearly double current levels. France’s artificial intelligence sector will receive 109 billion euros ($112.6 billion) of private investment in the “coming years,” so it’s not just the U.S. Anyone who wants to keep up with the U.S. will need to do the same, from China and the U.K. to Germany and India.

Image of the interior of the U.S. Lawrence Livermore National Laboratory, a facility where nuclear fusion has been achieved. HANDOUT (AFP)

Towards Nuclear and Fusion

Big Tech has already taken its first steps toward the nuclear age, even though nuclear is a declining energy source in the West (where plant closures outpace openings), with some major exceptions: the United States, France, the United Kingdom and several Eastern European countries.

The AI race isn’t just about data centers or language models; it’s about power, energy and fusion technologies. China’s new military base near Beijing is reportedly being built at a scale, with bunkers, the likes of which the world has never seen.

An energy tech race is also about space technologies and mining the Moon and other bodies in the solar system. AI and emerging technologies are connected in ways DARPA may be considering, but China has likely been planning for them for decades.

Each year we will need more GW of power to meet the demands of our new AI models and to power their increasing and perhaps exponential capabilities. The key bottleneck will be energy, and the U.S. is not very well positioned.

OpenAI is aiming to build five to 10 data center campuses in total, although executives said that number could rise or fall depending on how much power each campus offers. OpenAI is taking on a fair bit of debt just to survive this era of hypergrowth.

The great known unknown isn’t what AI becomes, it’s how we will be able to power it.

Addendum: EU refuses to stay on the Outside

European start-ups confident AI race not lost to the US and China | Euronews

European Commission President Ursula von der Leyen emphasized that the race for AI leadership is “far from over” at the Paris AI summit in early 2025, whose declaration the U.S. and UK refused to sign.

A World “Forced to Keep up”, Generative AI FOMO

The U.S. hyperscalers have been too successful in their lobbying and public relations. Now Europe and the Middle East think they have to invest billions of dollars to keep up in AI infrastructure. The Middle East will also directly profit from this European Union FOMO spending.

All of these plans mean data center and AI infrastructure roll-outs will increase fossil fuel usage, pollution, health risks and energy prices globally, simply due to data center campus FOMO. Mind you, these regions and ecosystems don’t have the AI talent, venture capital infrastructure or other competitive moats to compete on a level playing field with either the U.S. or China.

So what are they investing in exactly? The myth of Sovereign AI? That’s good for Nvidia. It’s a nice idea in theory. Can countries maintain their unique values and languages via their own models? Is that great spending by Governments and smaller countries?

JD Vance Speech Paris AI Summit February 12th

  • My take on this.

  • This is a 15-minute speech full of the Trump administration’s take on AI, with the U.S. at the head.


  • JD Vance’s speech and the U.S. “drill, baby, drill” approach to AI infrastructure are going to divide the world further. However, that’s just my opinion.

Disclaimer: Due to the length of this article, significant typos and grammatical errors might exist. I will be continuing to edit as usual but due to time constraints might not have caught them all.

Support independent journalism and curation that is not Billionaire owned or influenced.

References

  1. https://www.goldmansachs.com/insights/articles/ai-to-drive-165-increase-in-data-center-power-demand-by-2030

  2. https://english.elpais.com/technology/2025-02-07/us-admits-data-centers-are-harmful-to-health.html

  3. https://www.politico.eu/article/emmanuel-macron-answer-donald-trump-fossil-fuel-drive-artificial-intelligence-ai-action-summit/

  4. https://www.nature.com/articles/d41586-024-02680-3#ref-CR1

  5. https://illuminem.com/illuminemvoices/china-is-building-a-giant-laser-to-generate-the-energy-of-the-stars-satellite-images-appear-to-show

  6. https://huggingface.github.io/AIEnergyScore/

  7. https://www.rand.org/content/dam/rand/pubs/research_reports/RRA3500/RRA3572-1/RAND_RRA3572-1.pdf

  8. https://www.cnbc.com/2025/02/10/frances-answer-to-stargate-macron-announces-ai-investment.html

  9. https://www.cnbc.com/2025/02/06/openai-looking-at-16-states-for-data-center-campuses-tied-to-stargate.html

  10. https://www.rand.org/pubs/research_reports/RRA3572-1.html

Among many others. Not an exhaustive list.

Read More in AI Supremacy