We don’t know how much water data centers use. We just know it’s a lot
This article is two years old.
How did I completely miss that date. Oh my gosh. Good catch!
That’s interesting, I’ve never thought of data centers using water before, but it sounds like they use it for evaporative cooling. Wouldn’t that mean the water’s not really “lost” so much as it’s returned to the environment?
I saw a video recently about how the trouble with desalination (turning salt water into fresh water) is that it takes a lot of energy to evaporate the water. Sounds like some smart people need to get together and start cooling data centers with salt water, turning it into fresh water as a byproduct.
Running salt water through any kind of cooling system is going to cause huge problems.
Salt is corrosive. Metal will degrade rapidly from salt water running through it.
Screw data centers, I want to see desalination combined with nuclear power plants. They literally generate power by boiling water, it’s a match made in heaven.
We just need a few more advances in technology to remove impurities from brine and we’d also corner the table salt market.
Reactors that can do that have existed since the 70s and maybe even earlier.
https://en.m.wikipedia.org/wiki/BN-350_reactor
The only reason there aren’t more reactors like that is because most governments have barely been allowing the construction of new reactors, period.
Why would a data center need to continuously consume water to cool itself? Leaks?
Evaporative cooling systems, such as cooling towers: the water evaporates, so it’s non-recoverable.
The article, however, mentions that 3/4 of the water use cited is indirect, through power generation.
Didn’t know those were a thing
Water is extremely important in most large scale cooling systems, whether it be swamp coolers (aka evaporative cooling) or traditional HVAC (aka chillers).
That water will be recovered as rain.
But probably will end in ocean
And evaporate to become rain again and again.
I mean, sure, but that’s not ideal for us
It will rain somewhere. Generally places that already have rain. If you’re counting global amount, we have plenty of fresh water, but we don’t have it in the places where we need it.
That can still turn into a local deficit in areas with little rainfall
Evaporative coolers are cheap. It can be done with non-evaporative coolers, but that’s far more expensive to build.
Not to mention a much higher carbon footprint.
The reason evaporative coolers are cheap is that they use a fraction of the electricity that chillers do.
And note that the majority of data center water usage is indirect, via power generation, so using less water on site but more power (and therefore more water indirectly) is both more expensive and less efficient.
Unfortunately, evaporative coolers are the best way to go, for now.
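To make that trade-off concrete, here’s a rough back-of-envelope sketch in Python. The structure is the point, not the numbers: every figure below (evaporation per kWh of heat, electricity overhead, water consumed per kWh of generation) is an illustrative assumption, not something from the article.

```python
# Sketch of the accounting behind the evaporative-vs-chiller argument above:
# total water = on-site water + water embedded in the electricity the
# cooling system itself consumes. All figures are placeholders.

def total_cooling_water(heat_kwh, onsite_liters_per_kwh_heat,
                        electricity_kwh_per_kwh_heat, gen_liters_per_kwh):
    onsite = heat_kwh * onsite_liters_per_kwh_heat
    indirect = heat_kwh * electricity_kwh_per_kwh_heat * gen_liters_per_kwh
    return onsite + indirect

heat = 1_000_000  # kWh of heat to reject; arbitrary example size

# Evaporative towers: water evaporates on site, but fans and pumps draw little power.
evaporative = total_cooling_water(heat, onsite_liters_per_kwh_heat=1.5,
                                  electricity_kwh_per_kwh_heat=0.02,
                                  gen_liters_per_kwh=1.8)

# Chillers/dry coolers: no on-site evaporation, but several times the electricity.
chiller = total_cooling_water(heat, onsite_liters_per_kwh_heat=0.0,
                              electricity_kwh_per_kwh_heat=0.25,
                              gen_liters_per_kwh=1.8)

print(f"evaporative: {evaporative:,.0f} L  chiller: {chiller:,.0f} L")
```

Which design comes out ahead on total water depends entirely on what you plug in, especially how much water the local grid’s generation mix consumes per kWh; the claim above is that the electricity term usually dominates.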
When calculating water use, it’s important to look not only at the water used directly to cool data centers, but also at the water used by power plants to generate that 205 TWh.
The researchers also tracked the water used by wastewater treatment plants due to data centers, as well as the water used by power plants to power that portion of the wastewater treatment site’s workload.
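If it helps to see those pieces laid out, here’s a minimal sketch of that accounting. The 205 TWh figure is the only number from the comments above; every other input is a named placeholder for something the researchers had to estimate, not real data.

```python
# Components tracked: direct cooling water, water used by power plants to
# generate the data centers' electricity, water used by wastewater treatment
# plants on data-center load, and the power-plant water behind that treatment
# load. electricity_twh would be the 205 TWh cited above; the other inputs
# are placeholders.

KWH_PER_TWH = 1e9

def total_footprint_liters(direct_cooling_l, electricity_twh, gen_l_per_kwh,
                           wastewater_treatment_l, treatment_electricity_twh):
    generation_water = electricity_twh * KWH_PER_TWH * gen_l_per_kwh
    treatment_generation_water = treatment_electricity_twh * KWH_PER_TWH * gen_l_per_kwh
    return (direct_cooling_l + generation_water
            + wastewater_treatment_l + treatment_generation_water)
```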
From Google’s blog:
Last year, our global data center fleet consumed approximately 4.3 billion gallons of water. This is comparable to the water needed to irrigate and maintain 29 golf courses in the southwest U.S. each year.
From the WaPo article:
A large data center, researchers say, can gobble up anywhere between 1 million and 5 million gallons of water a day — as much as a town of 10,000 to 50,000 people.
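A quick arithmetic check on the two quoted comparisons, using only the figures in the quotes:

```python
# WaPo: 1-5 million gallons a day, "as much as a town of 10,000 to 50,000 people"
print(1_000_000 / 10_000, 5_000_000 / 50_000)   # 100.0 100.0 gallons per person per day

# Google: 4.3 billion gallons a year across the fleet, "29 golf courses" worth
print(f"{4.3e9 / 29:,.0f} gallons per golf course per year")  # ~148 million
```

The town comparison implies about 100 gallons per person per day at both ends of the range, which is roughly in line with typical US residential use per person, so it’s at least internally consistent.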
They compare it to residential use, and I wonder whether they add up all those sources on the residential side too when comparing?
For California at least, residential use is about 10% of all water usage, IIRC. So if data centers are dwarfed by that, it’s not a big concern in the big picture.
The issue I guess is when data center usage sucks up all the local supply. State and region wide they don’t use much but they do use a lot in one small area.