While the mass adoption of AI has transformed digital life seemingly overnight, regulators have fallen asleep on the job in curtailing AI data centers’ drain on energy and water resources.
I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
Same. I think I've read that a single GPT-4 instance runs on a 128-GPU cluster, and ChatGPT can still take something like 30 s to finish a long response. An H100 GPU has a TDP of 700 W. Hard to believe that uses only 10x more energy than a search that completes in milliseconds.
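A rough back-of-envelope using the numbers in that comment (128 GPUs, 700 W each, 30 s), compared against the oft-cited ~0.3 Wh estimate for a classic web search. This is an upper bound that assumes one request monopolizes the whole cluster; in practice inference servers batch many requests at once, which is presumably how the real ratio gets down toward 10x:

```python
# Assumed figures: 128 GPUs at 700 W TDP, 30 s of generation,
# ~0.3 Wh per traditional web search (an old, widely cited estimate).
gpus = 128
tdp_w = 700        # watts per H100 at full TDP
seconds = 30

cluster_w = gpus * tdp_w           # total cluster draw in watts
joules = cluster_w * seconds       # energy over one 30 s response
wh = joules / 3600                 # convert joules to watt-hours

search_wh = 0.3
print(f"Cluster draw: {cluster_w / 1000:.1f} kW")        # 89.6 kW
print(f"Energy if one request owned the cluster: {wh:.0f} Wh")
print(f"Ratio vs a {search_wh} Wh search: {wh / search_wh:.0f}x")
```

With no batching the ratio comes out in the thousands, not 10x, so the per-request number must be amortized over many concurrent users.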