Following for the results of your work here so I can use it in the future.
From this 2023 paper, it looks like if all Nvidia AI servers were running 24/7, you'd get an energy consumption of about 5.7–8.9 TWh per year. Nvidia servers make up 95% of the AI market (according to the paper), so that'd be pretty close to what AI servers as a whole would consume.
The paper also estimates that about 20% of the crypto-mining GPUs that are no longer mining Ethereum have been converted to AI, which contributes another 16.1 TWh per year.
This doesn't include all AI hardware, but it should cover the majority of it.
Between those two sources (taking the midpoint of the Nvidia range, 7.3 TWh, plus the 16.1 TWh), that gives 23.4 TWh per year, or about 0.08 exajoules per year per this converter. That's 22% of Sri Lanka's energy consumption (Sri Lanka being the lowest country on that list).
So AI uses as much energy in a year as Sri Lanka uses in about 3 months, at least as of 2023. I'll see if I can find a more recent study.
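If anyone wants to check the arithmetic, here's a quick sketch in Python. The Sri Lanka total isn't from a separate source; it's just back-calculated from the 22% figure above.

```python
# Back-of-the-envelope check of the numbers above.
# The Sri Lanka total is implied by the 22% figure, not an independent source.

WH_PER_TWH = 1e12   # 1 TWh = 1e12 Wh
J_PER_WH = 3600     # 1 Wh = 3,600 J
J_PER_EJ = 1e18     # 1 EJ = 1e18 J

nvidia_twh = (5.7 + 8.9) / 2   # midpoint of the paper's Nvidia server range
crypto_twh = 16.1              # repurposed ex-Ethereum mining GPUs
total_twh = nvidia_twh + crypto_twh

total_ej = total_twh * WH_PER_TWH * J_PER_WH / J_PER_EJ
sri_lanka_ej = total_ej / 0.22           # implied annual total for Sri Lanka
months = 12 * total_ej / sri_lanka_ej    # months of Sri Lanka's consumption

print(f"AI total:  {total_twh:.1f} TWh/yr = {total_ej:.3f} EJ/yr")
print(f"Sri Lanka: ~{sri_lanka_ej:.2f} EJ/yr (implied by the 22% figure)")
print(f"That's about {months:.1f} months of Sri Lanka's consumption")
```

Which works out to roughly 0.084 EJ per year and about 2.6 months of Sri Lanka's consumption, so the numbers above hold up.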
So that assumes AI requests use 100 percent of the hardware 100 percent of the time.
Yes, but those servers are pretty AI-specific, so that's a decent assumption. It looks like Nvidia is drastically ramping up production of these servers, so current electricity use might be about 10x that figure; I'm working on it.
100% utilization 100% of the time? That seems like an unlikely figure, right?
100% utilization, yes, but server uptimes are in the 99 percent range.
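To put rough numbers on that: the estimate scales roughly linearly with average utilization if you treat power draw as proportional to load (which ignores idle power, so this is only a sketch, and the utilization values are made up for illustration).

```python
# How the ~23.4 TWh/yr estimate scales with average hardware utilization.
# Assumes power draw is proportional to load (ignores idle draw);
# the utilization values are illustrative assumptions, not measurements.

FULL_UTILIZATION_TWH = 23.4  # the 24/7, 100%-utilization estimate above

for utilization in (1.00, 0.99, 0.75, 0.50):
    twh = FULL_UTILIZATION_TWH * utilization
    print(f"{utilization:>4.0%} average utilization -> ~{twh:.1f} TWh/yr")
```

Even at 50% average utilization you'd still be in the same ballpark, order-of-magnitude-wise.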
This is the kind of comment I love on Lemmy.
There are plenty of countries missing from that rankings list, and I bet those are the ones using less energy. Especially considering microstates like the Vatican, the statement could be technically correct.