haxor@derp.foo to Hacker News@derp.foo · English · 1 year ago
Critics Furious Microsoft Is Training AI by Sucking Up Water During Drought (futurism.com)
birdcurtains@lemmy.world · 1 year ago
Couldn't they just spend the money and not use evaporative cooling? It's a solvable problem.
Veltoss@lemmy.world · 1 year ago
Yeah, it's a "this is cheaper and we're greedy" problem, but people will add this to their AI fearmongering and hating circlejerks. Apparently they're looking into nuclear reactors for this, which doesn't waste as much water, from what I understand.
lud@lemm.ee · 1 year ago
I don't understand how a nuclear reactor could cool a datacenter. Could you explain?
CADmonkey@lemmy.world · 1 year ago
Could supply power for better A/C? I admit that doesn't make sense, but then neither does it make sense to cool a data center with evaporative water cooling as if it were a hit-and-miss engine from the 1910s, so I dunno.
lud@lemm.ee · 1 year ago
Cooling with evaporative cooling makes some sense since it works, but it's absolutely not ideal. It should be a closed water (or other fluid) loop, and where possible datacenters should be built in cooler climates.
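For scale, here is a back-of-envelope sketch of why evaporative cooling is so water-hungry. It assumes the idealized case where all rejected heat goes into evaporating water at its latent heat of vaporization (~2,260 kJ/kg); real cooling towers are less ideal, so treat this as a lower bound, not a measured figure:

```python
# Back-of-envelope: water evaporated per kWh of heat rejected by an
# idealized evaporative cooling tower. Assumes every joule of heat
# goes into the latent heat of vaporization of water.
LATENT_HEAT_KJ_PER_KG = 2260  # latent heat of vaporization of water
KJ_PER_KWH = 3600             # 1 kWh = 3.6 MJ

def litres_evaporated_per_kwh() -> float:
    """Litres of water evaporated to reject 1 kWh of heat (1 kg ~ 1 L)."""
    return KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG

print(f"~{litres_evaporated_per_kwh():.1f} L of water per kWh of heat rejected")
```

So even in the ideal case, every kilowatt-hour of server heat carried away evaporatively consumes on the order of one and a half litres of water, which is why a closed loop looks attractive despite the higher capital cost.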
V H@lemmy.stad.social · 1 year ago
Latency limits datacenter placement a lot, but for batch jobs like AI training it's certainly an option.
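The latency constraint above can be sketched with a rough rule of thumb: light in optical fiber travels at roughly two-thirds of c, about 200 km per millisecond one way, so round-trip time grows quickly with distance (the 200 km/ms figure is an approximation and ignores routing and switching overhead):

```python
# Rough physical lower bound on round-trip latency over fiber.
# Assumes ~200 km per millisecond one way (~2/3 the speed of light);
# real-world RTTs are higher due to routing and switching.
FIBER_KM_PER_MS = 200

def rtt_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a given distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (100, 1000, 4000):
    print(f"{km:>5} km -> ~{rtt_ms(km):.0f} ms RTT")
```

A datacenter 4,000 km away adds ~40 ms of round trip before any processing happens, which matters for interactive serving but is irrelevant to a multi-week training run, hence the point about batch jobs.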