Sam Altman: AI Water Fears Are “Fake” — But Energy Is Real


OpenAI CEO Sam Altman is pushing back against criticism of AI’s resource footprint.

At the India AI Impact Summit, he dismissed claims that ChatGPT uses “gallons of water per query” as:

“Completely untrue. Totally insane.”

But while he called water fears exaggerated, he acknowledged that AI’s energy consumption is a legitimate concern.

💧 The Water Debate

Data centers traditionally use water for cooling servers.

Critics argue:

• AI expansion = massive water draw
• Local water systems could face stress
• Cooling demand will rise as AI scales

A recent report from Xylem and Global Water Intelligence projected that water drawn for cooling could more than triple in 25 years as computing demand grows.

Altman says per-query water claims circulating online are disconnected from reality.

And it’s true that:

• Some modern data centers use air cooling
• Others use closed-loop systems
• Some avoid water entirely

The real issue isn’t one ChatGPT prompt.

It’s aggregate infrastructure growth.
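That tension between a tiny per-query figure and a large aggregate can be sketched numerically. The inputs below are assumptions for illustration: the per-query figure (~0.000085 gallons, about 0.3 mL) is the estimate Altman himself has published for ChatGPT, while the query volume is a hypothetical scale, not a reported number.

```python
# Back-of-envelope: tiny per-query water use vs aggregate draw.
# Illustrative assumptions, not measurements:
#   - 0.000085 gal/query is the figure Altman has published;
#   - 2.5 billion queries/day is a hypothetical scale.
GALLONS_PER_QUERY = 0.000085   # ~0.32 mL per query
QUERIES_PER_DAY = 2.5e9        # assumed daily volume
ML_PER_GALLON = 3785.41

daily_gallons = GALLONS_PER_QUERY * QUERIES_PER_DAY
yearly_gallons = daily_gallons * 365

print(f"Per query: {GALLONS_PER_QUERY * ML_PER_GALLON:.2f} mL")
print(f"Per day:   {daily_gallons:,.0f} gallons")
print(f"Per year:  {yearly_gallons / 1e6:.1f} million gallons")
```

Under these assumptions, a fraction of a milliliter per query still sums to hundreds of thousands of gallons a day, which is why the aggregate, not the single prompt, is the number that matters.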

🔋 Energy Is the Bigger Question

Altman admitted total energy use is a fair concern.

Not per query.

But at scale.

As AI adoption explodes, so does compute demand.

According to the IMF, global data center electricity use in 2023 already rivaled the annual consumption of entire countries such as Germany or France.

That was just the beginning.

🧠 AI vs Human Energy Comparison

Altman also made a controversial comparison:

People criticize AI training energy…

But it takes 20 years of food and life energy to “train” a human.

He argues the fair comparison is:

Energy per answer (AI inference)
vs
Energy per human answer

Inference — once a model is trained — is far less energy-intensive than training itself.

Measured that way, Altman suggests AI may already be competitive.
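Altman's per-answer framing reduces to a back-of-envelope calculation. Every input below is a rough assumption made for illustration: 0.34 Wh per query is the inference estimate Altman has published, 20 W and 100 W are textbook averages for human brain and whole-body metabolism, and the 60-second answer time is hypothetical.

```python
# Order-of-magnitude comparison: energy per answer, AI vs human.
# All inputs are rough assumptions for illustration:
#   - 0.34 Wh/query is Altman's published ChatGPT estimate;
#   - 20 W (brain) and 100 W (whole body) are textbook averages;
#   - 60 s is an assumed time for a human to compose an answer.
WH_PER_JOULE = 1 / 3600

ai_wh_per_answer = 0.34                           # assumed inference cost
human_seconds = 60                                # assumed answer time
brain_wh = 20 * human_seconds * WH_PER_JOULE      # brain-only metabolism
body_wh = 100 * human_seconds * WH_PER_JOULE      # whole-body metabolism

print(f"AI inference: {ai_wh_per_answer:.2f} Wh per answer")
print(f"Human brain:  {brain_wh:.2f} Wh per answer")
print(f"Human body:   {body_wh:.2f} Wh per answer")
```

With these assumed inputs, the AI figure lands between brain-only and whole-body human metabolism for a single answer, which is the rough parity Altman is gesturing at. The comparison is only as good as its assumptions.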

⚖️ The Pushback

Not everyone agrees.

Zoho founder Sridhar Vembu criticized equating AI systems with human beings.

Critics argue:

• Human energy consumption isn’t optional
• AI energy use is additive, not replacement
• AI scaling could strain grids

Communities have already pushed back.

In San Marcos, Texas, a $1.5B data center project was rejected after public opposition over energy and water strain.

🌍 The Bigger Energy Shift

Tech leaders increasingly argue that AI expansion requires:

• Faster renewable deployment
• Nuclear expansion
• Grid modernization

Without new supply, AI demand could:

• Raise electricity prices
• Increase emissions
• Clash with net-zero goals

📌 The Core Question

Is AI a net efficiency gain?

Or a massive new energy load?

Altman’s position:

Water panic is overstated.
Energy demand is real.
The solution is more generation — especially clean energy.

The future of AI may depend less on algorithms…

And more on megawatts.