Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling water stays in the computer, except in exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
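As a sanity check on that 1.9 L/kWh figure, here is a rough back-of-envelope sketch. It assumes the heat is rejected entirely by evaporating water (latent heat of vaporization about 2257 kJ/kg, and 1 kg of water is about 1 L); the real figure also folds in cooling-tower blowdown and upstream water use.

```python
# Rough check: liters of water evaporated to reject 1 kWh of heat.
# Assumes all heat is removed by evaporation (the theoretical best case).
LATENT_HEAT_KJ_PER_KG = 2257.0  # water's heat of vaporization, kJ/kg
KJ_PER_KWH = 3600.0             # 1 kWh = 3.6 MJ

liters_per_kwh = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG  # 1 kg of water ~ 1 L
print(f"~{liters_per_kwh:.1f} L evaporated per kWh of heat")  # ~1.6 L
```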

  • exocortex@discuss.tchncs.de · 24 hours ago

    It’s used like sweating. We lose heat by having water evaporate off our skin. Right now, get warm water and put it on your arm, then blow on it. It feels cold until it has fully evaporated. For water to change from the liquid to the gaseous phase it needs energy. Think of water molecules as holding hands in a liquid: if one of them wants to break free and fly through the air, it first needs to somehow get the energy to escape the grip of the others. When water evaporates from your arm it takes this energy in the form of heat; it absorbs heat and uses it to reach the gaseous phase. As long as there is water on your arm, it can be cooled that way.

    That’s what data centers do as well. They take water to cool their processors and then let part of it evaporate into the air. That way the water that remains is like your arm: it gets cool quickly.

    It’s very effective. But if you live in a small town and next door there’s a massive datacenter that takes out all the groundwater and basically just boils it until it disappears, you might get angry after a while.

  • BlameThePeacock@lemmy.ca · 1 day ago (edited)

    The simple answer is that your A/C dumps heat outside using big metal fins. It’s not terribly efficient, but it works well enough at that scale.

    Dissipating heat into the air doesn’t cut it for the amount some data centers need to get rid of, so they use evaporative coolers instead.

    The phase change of evaporating water from liquid to gas absorbs approximately 7x more heat energy than taking room-temperature water and getting it up to the boiling point in the first place.

    Essentially they stick their large metal fins from the AC into a large pool of water and boil it off. This gets rid of the energy with a much smaller and cheaper system, but uses up water.
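To put rough numbers on that ~7x claim, here is a short sketch using standard textbook values (specific heat of liquid water ~4.186 kJ/(kg·K), latent heat of vaporization ~2257 kJ/kg):

```python
# Compare the energy to heat water from room temperature to boiling (sensible
# heat) against the energy absorbed by actually vaporizing it (latent heat).
C_P_KJ_PER_KG_K = 4.186    # specific heat of liquid water, kJ/(kg*K)
LATENT_KJ_PER_KG = 2257.0  # heat of vaporization at 100 C, kJ/kg

sensible = C_P_KJ_PER_KG_K * (100 - 20)  # heat 1 kg from 20 C to 100 C
ratio = LATENT_KJ_PER_KG / sensible

print(f"sensible: {sensible:.0f} kJ/kg")   # ~335 kJ/kg
print(f"latent:   {LATENT_KJ_PER_KG:.0f} kJ/kg")
print(f"ratio:    {ratio:.1f}x")           # ~6.7x, the '7x' quoted above
```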

    Edit: To clarify on the water from your home AC: the water is actually being collected INSIDE, on the chilling unit, since lowering the temperature decreases the air’s ability to hold water; it’s then pumped outside. Data centers recirculate most of their air, letting in only a small amount of fresh air that they pre-chill, rather than constantly letting in fresh air from outside like your home does.

      • BlameThePeacock@lemmy.ca · 1 day ago

        The HVAC system, no; the home itself, yes.

        Depending on how old your home is, of course; newer homes tend to have lower air-exchange rates.

        Also datacenters don’t have windows, or even doors constantly letting people in and out of cooled areas and outside.

  • x4740N@lemmy.world · 21 hours ago

    Server farms use water to cool computers; it’s like water-cooled PCs but on a bigger scale.

    Aircons condense water from the atmosphere, the same way water forms on your mirror after a shower: the mirror is colder than the fog.

    If you’re familiar with condensation and the rain cycle it should help you understand further
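A rough sketch of why an aircon drips, using the Magnus approximation for saturation vapor pressure. The temperatures and humidity below are illustrative assumptions, not measured values:

```python
# Why an A/C drips: cooler air holds less water vapor, so moisture condenses
# out on the cold coil. Uses the Magnus approximation for saturation vapor
# pressure; the temperatures and humidity below are illustrative assumptions.
import math

def saturation_vapor_pressure_hpa(t_c: float) -> float:
    # Magnus formula, valid roughly from -45 C to 60 C
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def absolute_humidity_g_m3(t_c: float, rh_percent: float) -> float:
    e = saturation_vapor_pressure_hpa(t_c) * rh_percent / 100.0
    return 216.7 * e / (t_c + 273.15)  # ideal-gas conversion, g/m^3

outdoor = absolute_humidity_g_m3(30, 60)   # muggy 30 C air at 60% RH
at_coil = absolute_humidity_g_m3(12, 100)  # air chilled to 12 C, saturated

print(f"condensed: ~{outdoor - at_coil:.1f} g of water per m^3 of air cooled")
# ~7.5 g/m^3 ends up dripping out of the unit
```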

  • Fiery@lemmy.dbzer0.com · 1 day ago

    What I don’t get is how the water is “consumed”. It’s not like it’s gone, right? It evaporates and then just comes back down as rain, surely?

    Same with the water consumption of a sweater or a steak.

    There probably is some good reason for measuring it like that but conceptually I don’t get it.

    • cRazi_man · 1 day ago (edited)

      Even though there is loads and loads of water on the planet, the amount of fresh/drinkable/usable/accessible water is tiny. That water evaporates and rains back down, but it will most likely fall over the ocean, or on land where it goes into the ground or some other unusable area/form.

      Water suitable for human use is a scarce commodity and needs to be preserved. Of all the water lost to the atmosphere from server cooling systems, almost none can be recaptured.

          • Darkassassin07@lemmy.ca · 12 hours ago

            Collect and condense the hot water vapor, concentrate the heat until you’ve got steam; then pump it through a steam turbine recapturing that energy as electricity.

            I’m sure there’s some difficulties and nuances I’m not seeing right away, but it would be nice to see some sort of system like this. Most power plants generate heat, then turn that into electricity. Data centers take electricity and turn it back into heat. There’s gotta be a way to combine the two concepts.

            • SaltSong@startrek.website · 4 hours ago

              The difficulty, to put it in very simple terms, is that physics doesn’t allow that. The less simple explanation is a thermodynamics textbook, and trust me, you don’t want that.

              Everything generates heat. Everything. Everything. Anything that seems to generate “cold” is generating more heat somewhere else.
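One way to see the limit is the Carnot bound: even an ideal heat engine can only convert a small fraction of low-grade waste heat into work. A sketch with illustrative (assumed) temperatures of ~45 °C server coolant and ~25 °C ambient:

```python
# Carnot limit: the maximum fraction of heat convertible to work between a
# hot source and a cold sink. Temperatures below are illustrative assumptions.
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    t_hot_k, t_cold_k = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

waste_heat = carnot_efficiency(45, 25)    # warm server coolant vs. ambient
steam_plant = carnot_efficiency(550, 25)  # superheated steam vs. ambient

print(f"data-center waste heat: {waste_heat:.1%} max")   # ~6%
print(f"power-plant steam:      {steam_plant:.1%} max")  # ~64%
```

Real heat engines recover well under these ideal bounds, which is why power plants run very hot and why data-center waste heat mostly goes to district heating rather than back into electricity.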

              • Darkassassin07@lemmy.ca · 3 hours ago

                Yeah, thermodynamics are a thing. I’m not trying to claim some free energy system saying you could power the whole data center; but if you could re-capture some of the waste heat and convert it back into electricity, putting that energy to work instead of just venting to atmosphere, it could potentially help offset some of the raw electrical needs. An efficiency improvement, that’s all.

                • dubyakay@lemmy.ca · 2 hours ago

                  Yeah, take heat pumps for example, or even CPU water coolers: the heat is carried away from where it’s hot to somewhere it can be radiated off, until the heat-conducting material reaches equilibrium with its surroundings.
                  You can bet your ass that these US data centers are just brute-forcing heat exchange via evaporation instead, to make the initial investment cheaper. It’s the equivalent of burning coal instead of going straight for the renewable but initially more costly option for energy production.

                • Goodeye8@piefed.social · 24 hours ago

                  And what happens to the heat? Heat can’t just magically disappear, which means the water can’t cool unless the heat can dissipate somewhere. So it would have to dissipate into the dome. And what happens to the dome if you keep pumping hot vapor into it? It heats up. If it heats up, the water vapor stops cooling and the entire cooling system stops working.

                  I’m not saying it couldn’t work in theory; I’m saying it doesn’t work in practice, because the dome would have to be insanely big, maybe the size of a small nation.

      • Melvin_Ferd@lemmy.world · 1 day ago

        Hadley cells bring it inland, do they not? There it condenses and rains, flowing back towards the ocean, where it again evaporates, travels inland, rains, and returns to the ocean.

    • Ziggurat@jlai.lu · 1 day ago

      This is the complicated part of water consumption: saving water in the Netherlands won’t make rain in Morocco.

      However, there is only so much rainwater stored in the ground at a given time and brought by the rivers. This water needs to go mostly to agriculture, then human consumption, and finally industry. Once it’s back in the clouds we don’t fully know where it will fall again, let alone whether it will be polluted.

      Sure, it’s a renewable resource; the problem starts when you use the water faster than the rate at which it renews, especially during summer. In Europe the problem will get even worse with global warming: the alpine glaciers are disappearing, meaning we’ll lose a major water reserve for summer.

        • 4am@lemmy.zip · 1 day ago

          I mean eventually yeah, but not fast enough for you to keep using it that way.

          Especially now that the air holds more moisture, since rising temperatures keep the atmosphere warmer and rain is less frequent.

  • brucethemoose@lemmy.world · 1 day ago (edited)

    To add to what others said, it’s a tradeoff.

    Your gaming PC not only runs up your electric bill directly, but through the AC as well: it has to work to get all that heat back out.

    What data centers do is the equivalent of water cooling your PC and piping the loop to a hot tub outside. The tub would heat up and evaporate water faster, but it’s basically free and uses basically no electricity.

    That’s the tradeoff: water evaporation instead of heat pumps. It trades water usage for savings in electricity usage, which in some cases is well worth it.

    And what if you live in a cold climate, you say? Well, evaporative cooling is most cost efficient in hot and (ironically) dry climates.

  • Thoath@leminal.space · 1 day ago

    taps the fact that electricity is a steam reaction: even if you don’t see it, the electricity you’re using is made by decompressing water into vapor, whether by burning coal through turbines, “boiled wind” from water sources creating wind power, or even nuclear reactors, which are often a boiling-water reaction going through turbines, creating a net loss of ‘water’ if we don’t have natural condensation utilities to convert the ‘air’ back.

    • lime!@feddit.nu · 23 hours ago (edited)

      that water usually goes through a heat exchanger in a closed loop. there’s a reason most power plants are built by lakes.

      also, explain “boiled wind”?

      • Thoath@leminal.space · 20 hours ago (edited)

        If it’s truly a closed loop, why do you need a lake? A true closed loop has zero need for local water sources. Otherwise there’s some sort of negative they’re compensating for, which, in the case of local water sources, means there isn’t enough infrastructure if any of that water OR HEAT leaves the system faster than it enters.

      • Thoath@leminal.space · 20 hours ago

        Water exchange above large bodies of water causes thermodynamic exchanges that determine the speeds of wind currents.

          • Thoath@leminal.space · 13 hours ago

            Yeah, because you’ve measured the water intake and export of every large body of water. I forgot you’re obviously an expert who knows how to read when a data center takes more water than a town. Love your stern optimism. Maybe, like, wander off somewhere else so you feel important in your views, because it ain’t happening with me here, bud.

            • lime!@feddit.nu · 11 hours ago

              no, you obviously don’t want people to talk to you. that’s fair.

    • Thoath@leminal.space · 1 day ago

      The water is for sure going up there with help; it won’t come back down in equal measure without help. Its dynamics are completely spun out.

  • rumschlumpel@feddit.org · 1 day ago (edited)

    At this point, I wouldn’t be surprised if it turned out that they’re destroying the environment on purpose for some nefarious reason, e.g. maybe they think it’s easier to rule the masses if natural resources are very scarce.

    • Melvin_Ferd@lemmy.world · 1 day ago
      1 day ago

      I think it’s more likely that yellow journalism is making an issue out of something that isn’t as big a deal as it’s made out to be.

        • Melvin_Ferd@lemmy.world · 1 day ago (edited)

          When haven’t they?

          Fear and anger sell.

          This AI shit is the leftist version of “illegal immigrants are stealing yur jobs”

          • brucethemoose@lemmy.world · 1 day ago

            To be fair, the “infinite scaling” vision Altman and such are selling is quite a dystopia. And they are the ones pushing it.

            It’s not reality at all. But it’s kinda reasonable for people to hate that specifically.

              • brucethemoose@lemmy.world · 1 day ago (edited)

                No.

                The path I see forward for ML is small, task specific models running on your smartphone or PC, with some kind of bitnet architecture so it uses basically no power.

                That’s the hope, anyway, but all the pieces already exist: bitnet works, extreme task-specific training works per a paper that just came out, and NPU frameworks are starting to come together.

                If that sounds incompatible with corporate AI, that’s because it is.