Wait… What? The article seems to imply that the water is consumed, but it’s referencing the water used in cooling loops.
Data centers don’t have “water cooling loops” that are anything like the ones in consumer PCs. To maximize cooling capacity, a lot of them use some form of evaporative cooling, which means some of the water simply evaporates into the atmosphere (after which it would need to be purified again before it could be used for human consumption).
From what I can find, it also seems some data centers just pipe in clean ambient-temperature water, use it to cool the servers, and then pipe it right back out into the municipal sewer system. Which is even more stupid, because you’re taking potable water, sending it through systems that should be pretty clean, and then mixing it with wastewater. If anything, that should be considered “gray water”, which is still fine to use for things like flushing toilets.
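A rough back-of-envelope sketch of why evaporative cooling genuinely “consumes” water: every joule of heat rejected by evaporation costs roughly the latent heat of vaporization of water. The numbers here are approximations, not measurements from any particular facility.

```python
# Approximate latent heat of vaporization of water (MJ per kg).
# Real towers also lose water to drift and blowdown, so this is a floor.
LATENT_HEAT_MJ_PER_KG = 2.26

def liters_evaporated_per_mwh(heat_rejected_mwh=1.0):
    """Minimum water evaporated to reject a given amount of heat."""
    heat_mj = heat_rejected_mwh * 3600.0        # 1 MWh = 3600 MJ
    kg = heat_mj / LATENT_HEAT_MJ_PER_KG        # mass that must evaporate
    return kg                                    # 1 kg of water ~= 1 liter

# Roughly 1,590 liters evaporated per MWh of heat rejected.
```

So a facility rejecting megawatt-hours of heat around the clock sends a substantial, continuous stream of water into the atmosphere, which is the “consumption” the article is getting at.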
As with everything else, we need the government to regulate it because otherwise the corporations don’t really give a shit.
I would be really surprised if anyone is cooling data centres with city water except in an emergency; that’s so unbelievably expensive. (I could see water drawn directly from a lake, though that has its own issues too.) I recall saving millions just by adjusting the fill target on an evaporative cooling tower so it wouldn’t overfill (the levels were really cyclic, and the targets weren’t tuned for them), and that was only a fraction of what it would have cost if we’d used pure city water.
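The fill-target tuning described above can be sketched in a few lines: if the basin level cycles with load and the fill target is set too close to the overflow drain, the controller tops the basin up and the cycle’s peak then dumps make-up water straight down the drain. All names and numbers here are invented for illustration.

```python
import math

def level_cycle(t_hours, fill_target, amplitude, period=24.0):
    """Basin level over a daily load cycle, oscillating around the
    level the controller tops it up to (arbitrary units)."""
    return fill_target + amplitude * math.sin(2 * math.pi * t_hours / period)

def overflow_loss(fill_target, amplitude, overflow_level=100.0, hours=24 * 30):
    """Total make-up water dumped over `hours` whenever the cyclic
    level rises past the overflow drain."""
    loss = 0.0
    for t in range(hours):
        loss += max(0.0, level_cycle(t, fill_target, amplitude) - overflow_level)
    return loss

# A target set without accounting for the cycle overflows at every peak;
# one tuned below (overflow_level - amplitude) never does.
naive = overflow_loss(fill_target=98.0, amplitude=5.0)
tuned = overflow_loss(fill_target=94.0, amplitude=5.0)
```

The fix is just headroom: keep the fill target at least one cycle-amplitude below the overflow level, and the same cooling happens with none of the dumped water.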
This is correct. You don’t need potable water for cooling systems. Releasing vapor returns the water to the natural cycle it came from, without adding any more heat to the environment than you already were.
The environmental cost of AI needs to be measured in gigawatt hours, distributed over different energy generation methods.
Adding heat to the system isn’t a big deal if you’re powered by solar energy, for example.
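The point about distributing energy use over generation methods can be made concrete: the same gigawatt-hours carry very different environmental costs depending on the grid mix. The intensity figures below are rough lifecycle estimates (gCO2e/kWh) of the kind published by the IPCC, used here only for illustration.

```python
# Rough lifecycle carbon intensities, grams CO2e per kWh (illustrative).
INTENSITY_G_PER_KWH = {"coal": 820, "gas": 490, "solar": 41, "wind": 11, "nuclear": 12}

def tonnes_co2e(gwh, mix):
    """Emissions in tonnes CO2e for `gwh` of energy drawn from `mix`,
    a dict of source -> fraction (fractions should sum to 1)."""
    kwh = gwh * 1_000_000                       # 1 GWh = 1,000,000 kWh
    grams = sum(kwh * frac * INTENSITY_G_PER_KWH[src]
                for src, frac in mix.items())
    return grams / 1_000_000                    # grams -> tonnes

# 100 GWh on a coal-heavy grid vs. a mostly-solar one:
coal_heavy = tonnes_co2e(100, {"coal": 0.6, "gas": 0.3, "solar": 0.1})
mostly_solar = tonnes_co2e(100, {"solar": 0.8, "wind": 0.2})
```

Under these assumptions the coal-heavy mix emits well over an order of magnitude more CO2e for identical energy use, which is why “gigawatt hours, by generation method” is the right unit of account.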
My work drilled water wells for evaporative cooling in their datacenter.