• ddh@lemmy.sdf.org
    10 hours ago

    No, the current corporate approach to large-scale AI typically uses a lot of water. It is not the only possible approach.

    Let’s also not forget that there are places where water is plentiful and these levels of consumption would be OK. The problem is that the data centers end up competing with other uses for that water.

  • reddig33@lemmy.world
    12 hours ago

    Interesting article. I do wonder why more of the data centers aren’t located in colder climates. I’ve read that people are investigating piping and reusing the waste heat to warm homes and businesses in colder cities. There was one data center (not AI related) that was using its waste heat to warm an indoor pool.
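    Rough numbers, just to show the scale of the waste heat (these figures are my own assumptions, not from the article):

    ```python
    # Back-of-envelope: how many homes could a data center's waste heat serve?
    # All figures are illustrative assumptions, not values from the article.

    it_load_mw = 10.0        # assumed IT load; nearly all of it ends up as heat
    avg_home_heat_kw = 5.0   # assumed average winter heating demand per home
    recovery_fraction = 0.7  # assumed share of the low-grade (~30-40 °C) heat that
                             # can actually be recovered, usually via heat pumps

    usable_heat_kw = it_load_mw * 1000 * recovery_fraction
    homes_heated = usable_heat_kw / avg_home_heat_kw
    print(f"~{homes_heated:.0f} homes heated by a {it_load_mw:.0f} MW data center")
    # -> ~1400 homes under these assumptions
    ```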

    The water usage they mention looks like it’s for cooling. I’m guessing they mean evaporative coolers? If so, the water wouldn’t have to be potable. Requiring the systems to use grey water might help. I also wonder whether they could be located on the coastline and just use filtered seawater.
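    If it really is evaporative cooling (my guess, not something the article confirms), the physics gives a rough sense of how much water that takes:

    ```python
    # Back-of-envelope: water evaporated per kWh of heat rejected by an
    # evaporative cooling tower, using the latent heat of vaporization of water.
    # Real towers also lose water to drift and blowdown, which this ignores.

    latent_heat_mj_per_kg = 2.26   # MJ needed to evaporate 1 kg of water (approx.)
    heat_rejected_mj = 1.0 * 3.6   # 1 kWh of heat = 3.6 MJ

    water_kg = heat_rejected_mj / latent_heat_mj_per_kg   # 1 kg of water ≈ 1 litre
    print(f"~{water_kg:.1f} L of water evaporated per kWh of heat rejected")
    # -> ~1.6 L/kWh, which adds up fast for a large AI cluster running flat out
    ```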

    The other interesting point was about the amount of electricity these places use to generate content. Some device manufacturers are moving toward on-device AI. I’m guessing that would use a lot less electricity.

    And then weren’t there some recent advances in AI models that don’t require as much computing power? Maybe that will help dial back their electricity use.
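    I don’t know which advances the article had in mind, but one concrete, widely used example is weight quantization, which is also part of what makes on-device AI practical. The model size and precisions below are assumed numbers, just for illustration:

    ```python
    # One common way newer models need less compute/memory: lower-precision
    # (quantized) weights. Parameter count and byte widths are assumptions.

    params = 7e9        # assumed parameter count of a mid-size model
    bytes_fp16 = 2      # 16-bit weights
    bytes_int4 = 0.5    # 4-bit quantized weights

    size_fp16_gb = params * bytes_fp16 / 1e9
    size_int4_gb = params * bytes_int4 / 1e9
    print(f"fp16: ~{size_fp16_gb:.1f} GB, 4-bit: ~{size_int4_gb:.1f} GB")
    # -> ~14 GB vs ~3.5 GB: small enough to run locally on a laptop or phone,
    #    and cheaper to serve from a data center as well.
    ```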

    It just seems like in the race to get ahead in AI, people are throwing money and processing power at the problem without worrying about the environmental consequences. Of course that’s usually the way it’s done, all in the name of making money. It’s good to see the industry being called out for it.