this post was submitted on 13 Mar 2025
712 points (98.9% liked)

Fuck AI

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

[–] clonedhuman@lemmy.world 9 points 1 day ago (2 children)

Now guess how much power it took for each one of those wrong answers.

The upper limit for AI right now has nothing to do with the code or with the companies writing it. The upper limit is the amount of power it takes to generate even simple answers (and a wrong answer doesn't take any less power than a right one).

Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.

https://www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption
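For anyone who wants to check that arithmetic, here's a quick back-of-the-envelope sketch. The ~10 MWh/year per-home figure is an assumption back-solved from the article's "130 US homes" claim, not a number it states directly:

```python
# Rough check of the figures quoted above.
gpt3_training_mwh = 1_300      # estimated GPT-3 training energy (MWh)
netflix_hour_kwh = 0.8         # estimated energy per streamed Netflix hour (kWh)

hours_of_netflix = gpt3_training_mwh * 1_000 / netflix_hour_kwh
print(f"{hours_of_netflix:,.0f} hours of streaming")  # 1,625,000

# "About 130 US homes for a year" implies roughly 10 MWh per home per year
# (assumed average, back-solved from the article).
assumed_home_annual_mwh = 10
print(f"~{gpt3_training_mwh / assumed_home_annual_mwh:.0f} home-years")  # ~130
```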

If the AI wars between powerful billionaire factions in the United States continue, get ready for rolling blackouts.

[–] Grimy@lemmy.world 4 points 1 day ago* (last edited 1 day ago) (1 children)

It's a drop in the bucket compared to what's actually causing damage, like vehicles and air travel.

Estimates for [training and building] Llama 3 are a little above 500,000 kWh, a value that is in the ballpark of the energy use of a seven-hour flight of a big airliner.

https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans-and-large-language-models/

That's roughly one month of electricity for around 570 average American homes.
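Rough sketch of where that lands, assuming an average US home draws about 880 kWh a month (the per-home consumption figures here are assumptions, not from the linked post):

```python
# Back-of-the-envelope check on the "570 homes" figure.
llama3_training_kwh = 500_000        # estimate quoted above
assumed_home_monthly_kwh = 880       # assumed average monthly use of a US home

homes_for_one_month = llama3_training_kwh / assumed_home_monthly_kwh
print(f"~{homes_for_one_month:.0f} homes for one month")  # ~568

# The same energy expressed as full home-years (assuming ~10,500 kWh/year per home):
assumed_home_annual_kwh = 10_500
print(f"~{llama3_training_kwh / assumed_home_annual_kwh:.0f} home-years")  # ~48
```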

That being said, it's a malicious and poorly framed comparison. It's like comparing the cost of building a house to the cost of staying in a hotel for one night.

The model, once trained, can be constantly re-used and shared. The Llama models have been downloaded millions of times. It would be better to compare training to the cost of making the movie itself.

An average film production with a budget of $70 million leaves behind a carbon footprint of 3,370 metric tons – that’s the equivalent of powering 656 homes for a year!

https://thestarfish.ca/journal/2025/01/understanding-the-environmental-impact-of-film-sets
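To put the two side by side in the same units, here's a rough sketch. The grid emission factor (~0.4 kg CO2 per kWh, roughly the US average) is an assumption, not a figure from either article:

```python
# Llama 3 training vs. a mid-budget film, both expressed in metric tons of CO2.
llama3_training_kwh = 500_000          # estimate quoted above
assumed_grid_kg_co2_per_kwh = 0.4      # assumed US grid average emission factor

llama3_tco2 = llama3_training_kwh * assumed_grid_kg_co2_per_kwh / 1_000
film_tco2 = 3_370                      # quoted footprint of a $70M production

print(f"Llama 3 training:     ~{llama3_tco2:.0f} t CO2")        # ~200
print(f"$70M film production: {film_tco2:,} t CO2")              # 3,370
print(f"Ratio:                ~{film_tco2 / llama3_tco2:.0f}x")  # ~17x
```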

[–] queermunist@lemmy.ml -1 points 1 day ago

The water consumed by data centers is a much bigger concern. They're straining already strained public water systems.

[–] fishy@lemmy.today 2 points 1 day ago

Time for nuclear to make a comeback.