Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling water stays in the computer, except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
It’s used like sweating. We lose heat by having water evaporate off our skin. Right now, get warm water, put it on your arm, then blow on it. It gets cold until it has fully evaporated. For water to change from the liquid to the gaseous phase it needs energy. Think of water molecules as holding hands in a liquid: if one of them wants to come free and fly through the air, it needs to somehow get the energy to break free from the grip of the others first. When water evaporates from your arm it takes this energy in the form of heat; it takes heat and uses it to get to the gaseous phase. As long as there is water on your arm, it can be cooled that way.
That’s what data centers do as well. They take water to cool their processors and then let part of it evaporate into the air. That way, the water that remains is like your arm - it gets cool quickly.
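To put rough numbers on that, here is a back-of-the-envelope sketch; the 300 W GPU and the latent-heat constant are approximate assumptions, not figures from this thread:

```python
# Back-of-the-envelope: how much heat does evaporating one liter of water soak up,
# and how does that compare to a gaming GPU? (Constants are approximate.)
LATENT_HEAT_KJ_PER_KG = 2260   # latent heat of vaporization of water, roughly 2,260 kJ/kg
GPU_POWER_W = 300              # hypothetical GPU drawing a few hundred watts

heat_absorbed_kj = 1.0 * LATENT_HEAT_KJ_PER_KG   # 1 liter of water is about 1 kg
heat_absorbed_kwh = heat_absorbed_kj / 3600       # 1 kWh = 3,600 kJ
gpu_heat_per_hour_kwh = GPU_POWER_W / 1000        # essentially all of the GPU's power becomes heat

print(f"Evaporating 1 L of water absorbs about {heat_absorbed_kwh:.2f} kWh of heat")
print(f"That is roughly {heat_absorbed_kwh / gpu_heat_per_hour_kwh:.1f} hours of heat from a {GPU_POWER_W} W GPU")
# -> ~0.63 kWh, i.e. around two hours of gaming heat per liter evaporated
```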
It’s very effective. But if you live in a small town and next door there’s a massive datacenter that takes out all the groundwater and basically just boils it until it disappears, you might get angry after a while.
The simple answer is that your A/C dumps heat outside using big metal fins. It’s not terribly efficient, but it works well at that scale.
Dissipating heat straight into the air doesn’t cut it for the amount of heat some data centers need to get rid of, so they use evaporative coolers.
The phase change of evaporating water from liquid to gas absorbs approximately 7x more heat energy than taking room-temperature water and getting it up to the boiling point in the first place.
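A quick sanity check of that ratio with textbook values for water (a sketch; the exact factor depends on the starting temperature you assume):

```python
# Compare the heat needed to warm water to boiling with the heat needed to actually evaporate it.
SPECIFIC_HEAT_KJ_PER_KG_K = 4.186   # liquid water
LATENT_HEAT_KJ_PER_KG = 2257        # vaporization at 100 C
delta_t = 100 - 25                  # warming from ~25 C room temperature up to boiling

sensible_heat = SPECIFIC_HEAT_KJ_PER_KG_K * delta_t   # ~314 kJ/kg just to reach 100 C
ratio = LATENT_HEAT_KJ_PER_KG / sensible_heat         # ~7.2x more to push it into the gas phase

print(f"To boiling: {sensible_heat:.0f} kJ/kg, to evaporate: {LATENT_HEAT_KJ_PER_KG} kJ/kg, ratio ~{ratio:.1f}x")
```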
Essentially they stick their large metal fins from the AC into a large pool of water and boil it off. This gets rid of the energy with a much smaller and cheaper system, but uses up water.
Edit: To clarify on the water from your home AC: the water is actually collected INSIDE, on the chilling unit, because lowering the temperature decreases the air’s ability to hold water; it’s then pumped outside. Data centers recirculate most of their air, letting in only a small amount of fresh air that they pre-chill, rather than constantly letting in fresh air from outside like your home does.
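Rough sketch of where that indoor condensate comes from, using the Magnus approximation for saturation vapor pressure; the 30 C / 60% RH indoor air and 12 C coil temperature are assumed example values:

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation for the saturation vapor pressure of water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def absolute_humidity_g_per_m3(temp_c, relative_humidity):
    """Grams of water vapor held per cubic meter of air."""
    e = saturation_vapor_pressure_hpa(temp_c) * relative_humidity
    return 216.7 * e / (temp_c + 273.15)

before = absolute_humidity_g_per_m3(30, 0.60)   # muggy 30 C air at 60% RH: ~18 g/m^3
after = absolute_humidity_g_per_m3(12, 1.00)    # air cooled to a 12 C coil can hold only ~11 g/m^3

print(f"About {before - after:.0f} g of water drips out of every m^3 of air the AC cools")
```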
Home HVAC units don’t let in fresh air all of the time; they recycle air.
The HVAC system no, the home itself, yes.
Depending on how old your home is of course, newer homes tend to have lower exchange rates.
Also, datacenters don’t have windows, or doors constantly letting people in and out between cooled areas and the outside.
My HVAC system has an inside and outside air source.
Server farms use water to cool computers; it’s like water-cooled computers but on a bigger scale.
Air conditioners condense water from the atmosphere the same way water forms on your shower mirror: the mirror is colder than the fog.
If you’re familiar with condensation and the rain cycle it should help you understand further
They use evaporative cooling for a chiller plant most likely.
Why would they design around evaporative cooling when water consumption is a problem?
Because evaporative cooling is much cheaper and easier to accomplish at scale, and megacorps don’t care about long-term resource constraints until it begins to affect their wallets.
Also, places in red states allow free or cheap polluting and waste.
Because it’s cheap and easy.
Because it’s cheap, easy, compact, well understood, and makes the numbers look good. The number in question is the ratio of energy used by the entire facility to energy used by the silicon alone (PUE, power usage effectiveness). The alternative is dissipating heat from dry radiators, but that pushes this number to something like 3; evaporative cooling gets it closer to 1.2.
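A tiny sketch of what those numbers mean in practice; the 3 and 1.2 PUE values are the figures from the comment above, used here for illustration only:

```python
# PUE = (total facility energy) / (energy used by the IT hardware itself).
# Everything above 1.0 is overhead: cooling, fans, power conversion, lighting.
def overhead_per_it_kwh(pue):
    return pue - 1.0

for label, pue in [("dry radiators", 3.0), ("evaporative cooling", 1.2)]:
    print(f"{label}: PUE {pue} -> {overhead_per_it_kwh(pue):.1f} kWh of overhead per kWh of compute")
# -> 2.0 kWh of overhead vs 0.2 kWh for every kWh the chips actually use
```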
sounds like we need to charge them more for water
Instead, WE are paying more for water and power to subsidize them.
Because they are assholes who hate the environment. The same reason they are using fossil fuels to power their slop centers instead of renewables.
most effective form of heat transfer
Because line must go up
it wasn’t a problem before they started doing this
What I don’t get is how the water is “consumed”. It’s not like it’s gone, right? It evaporates and then just comes back down as rain, surely?
Same with water consumption of a sweater or a steak.
There probably is some good reason for measuring it like that but conceptually I don’t get it.
Even though there is loads and loads of water on the planet, the amount of fresh/drinkable/usable/accessible water is tiny. This water evaporates and rains back down, but this will most likely fall over the ocean, or on land and go into the ground, or into some other unusable area/form.
Water suitable for human use is a scarce commodity and needs to be preserved. Of all the water lost to the atmosphere from server cooling systems, almost none of it can be recaptured again.
Why are they not using a closed loop system with condensers collecting the evaporated water?
I’ll give you one gue$$
A condenser would release the same amount of heat that they are trying to dissipate.
Collect and condense the hot water vapor, concentrate the heat until you’ve got steam; then pump it through a steam turbine recapturing that energy as electricity.
I’m sure there’s some difficulties and nuances I’m not seeing right away, but it would be nice to see some sort of system like this. Most power plants generate heat, then turn that into electricity. Data centers take electricity and turn it back into heat. There’s gotta be a way to combine the two concepts.
The difficulty, to put it in very simple terms, is that physics doesn’t allow that. The less simple explanation is a thermodynamics textbook, and trust me, you don’t want that.
Everything generates heat. Everything. Everything. Anything that seems to generate “cold” is generating more heat somewhere else.
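To attach a rough number to why this low-grade heat is so hard to recover: the Carnot limit caps the fraction of heat that can ever be turned back into work, and data-center exhaust is only a few tens of degrees above ambient. The temperatures below are illustrative assumptions, not measured values:

```python
# Carnot limit: maximum fraction of heat convertible to work = 1 - T_cold / T_hot (kelvin).
def carnot_efficiency(t_hot_c, t_cold_c):
    return 1 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

# Assumed: ~50 C server coolant rejected against ~25 C ambient air
print(f"Data-center waste heat: {carnot_efficiency(50, 25):.1%} theoretical maximum")   # ~7.7%

# Compare with steam in a real power plant at ~550 C
print(f"Power-plant steam:      {carnot_efficiency(550, 25):.1%} theoretical maximum")  # ~64%
```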
Yeah, thermodynamics is a thing. I’m not trying to claim some free-energy system that could power the whole data center; but if you could re-capture some of the waste heat and convert it back into electricity, putting that energy to work instead of just venting it to atmosphere, it could potentially help offset some of the raw electrical needs. An efficiency improvement, that’s all.
Yeah, take heat pumps for example, or even CPU water coolers: the heat is carried away from where it’s hot to somewhere it can be radiated off, until the heat-conducting material and its surroundings reach equilibrium.
You can bet your ass that these US data centers are just brute-forcing heat exchange via evaporation instead, to make the initial investment cheaper. It’s the equivalent of burning coal instead of going straight for the renewable but initially more costly option when it comes to energy production.
Actually more, because thermodynamics is a cruel mistress.
A condenser can be as simple as a glass dome in a cool room. There is no need for any electricity or heat.
Note your use of the word “cool.”
Pretty sure the glass dome traps the heat they’re trying to dissipate.
It’s literally there to let evaporated water cool and become liquid again. 🤦♂️
And what happens to the heat? Heat can’t just magically disappear which means water can’t cool without heat being able to dissipate somewhere. So it would have to dissipate heat into the dome. What happens to the dome if you keep pumping hot vapor into the dome? It heats up. If it heats up the water vapor stops cooling and the entire cooling system stops working.
I’m not saying it couldn’t work in theory, I’m saying it doesn’t work in practice because the dome would have to be insanely big, maybe the size of a small nation.
Hadley cells bring the moisture inland, do they not, where it condenses and rains, flowing back towards the ocean, where it again evaporates, travels inland, rains, and goes back to the ocean.
This is the complicated part of water consumption: saving water in the Netherlands won’t make rain in Morocco.
However, there is only so much rainwater stored in the ground at a given time and brought by the rivers. This water needs to go mostly to agriculture, then human consumption, and finally industry. Once it’s back in the clouds we don’t fully know where it will fall again, let alone whether it will come back polluted.
Sure, it’s a renewable resource; the problem starts when you use the water faster than the rate at which it renews, especially during summer. In Europe the problem will get even worse with global warming. The alpine glaciers are disappearing, meaning we’ll lose a major water reserve for summer.
Just cause you use water in one place doesn’t mean it’ll come back in the same place.
It does though doesn’t it
I mean eventually yeah, but not fast enough for you to keep using it that way.
Especially now that air holds more moisture since rising temperatures keep the atmosphere warmer and rain is less frequent.
Most people are not directly collecting rain to drink.
Yea we are
To add to what others said, it’s a tradeoff.
Your gaming PC not only runs up your electric bill from the wall, but the AC as well. It has to work to get all that heat out.
This is the equivalent of water-cooling your PC and piping the loop to a hot tub outside. It would heat the tub and evaporate its water faster, but it’s basically free and uses basically no electricity.
That’s the tradeoff. It’s water evaporation instead of heat pumps: trading water usage for a big reduction in electricity usage, which in some cases is well worth it.
And what if you live in a cold climate, you say? Well, evaporative cooling is most cost efficient in hot and (ironically) dry climates.
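A rough sketch tying this back to the ~1.9 L/kWh figure in the original question: if every kWh of heat were rejected purely by evaporation, the latent heat of water alone implies about 1.6 L/kWh; reported figures tend to be a bit higher, likely because they also count things like cooling-tower blowdown and indirect water use (the constant below is an approximation):

```python
LATENT_HEAT_KJ_PER_KG = 2260   # kJ absorbed per kg of water evaporated (approx.)
KJ_PER_KWH = 3600

liters_per_kwh = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG   # 1 kg of water is about 1 liter
print(f"~{liters_per_kwh:.1f} L of water evaporated per kWh of heat rejected")
# -> about 1.6 L/kWh from the phase change alone, in the ballpark of the ~1.9 L/kWh headline number
```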
This taps into the fact that electricity is largely a steam reaction: even if you don’t see it, the electricity you’re using is made by turning water into vapor and running it through turbines, whether by burning coal, by “boiled wind” from water sources creating wind power, or even in nuclear reactors, which are often boiling-water reactions driving turbines. That creates a net loss of “water” if we don’t have condensation facilities to turn the “air” back.
that water usually goes through a heat exchanger in a closed loop. there’s a reason most power plants are built by lakes.
also, explain “boiled wind”?
If it’s truly a closed loop, why do you need a lake? A true closed loop has zero need for local water sources. Otherwise there’s some sort of negative being compensated for, and in the case of local water sources, there isn’t enough infrastructure if any of that water OR HEAT leaves the system faster than it enters.
to be the other end of the heat exchanger?
Water exchange above large bodies of water causes thermodynamic exchanges that affect the speeds of wind currents.
that’s not really a big factor.
Yeah, because you’ve measured the water intake and export of every large body of water. I forgot, you’re obviously an expert who knows how to tell when a data center takes more water than a town. Love your stern optimism; maybe, like, wander off somewhere else so you can feel important in your views, because it ain’t working with me here, bud.
no, you obviously don’t want people to talk to you. that’s fair.
The water is for sure going up there with help; it won’t come back down in equal measure without help. Its dynamics are completely spun out.
At this point, I wouldn’t be surprised if it turned out that they’re deliberately destroying the environment for some nefarious purpose, e.g. maybe they think it’s easier to rule the masses if natural resources are very scarce.
I think it’s more likely that yellow journalism is making an issue out of something that isn’t as big a deal as it’s made out to be.
Since when does yellow journalism care about the environment?
When haven’t they?
Fear and anger sell.
This AI shit is the leftist version of “illegal immigrants are stealing yur jobs”
To be fair, the “infinite scaling” vision Altman and such are selling is quite a dystopia. And they are the ones pushing it.
It’s not reality at all. But it’s kinda reasonable for people to hate that specifically.
Isn’t everything infinitely scaling?
No.
The path I see forward for ML is small, task-specific models running on your smartphone or PC, with some kind of BitNet architecture so they use basically no power.
That’s the hope, anyway, but all the pieces already exist. BitNet works, extreme task-specific training works per a paper that just came out, and NPU frameworks are starting to come together.
If that sounds incompatible with corporate AI, that’s because it is.
No