I see where you're coming from, but the planet was not on the verge of a climate breakdown at the time. I think that's the implicit context in the blog post that your reply misses.
The marginal impact of AI on global climate probably rounds down to zero, or at least is not the thing driving us over the cliff:
> Global energy consumption in 2023 was around 500,000 GWh per day. That means that ChatGPT’s global 3GWh per day is using 0.0006% of Earth’s energy demand.
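(For what it's worth, the quoted percentage checks out as straightforward division; a minimal sanity check using only the figures from the quote:)

```python
# Figures taken from the quote above; "per day" cancels out in the ratio.
global_energy_gwh_per_day = 500_000   # ~500 TWh/day of global energy use
chatgpt_gwh_per_day = 3               # quoted estimate for ChatGPT

share = chatgpt_gwh_per_day / global_energy_gwh_per_day
print(f"{share:.4%}")  # -> 0.0006%
```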
How much of that 500 TWh would disappear if you round to zero everything that consumes 3 GWh per day or less? And keep in mind you are calculating for a single app (ChatGPT), not for an entire industry or technology. Don't look at "civil aviation" but at each airline in particular.
I suspect nearly all of it. You framed it such that it appears insignificant, but with this framing nothing is significant.
ChatGPT is probably the majority of LLM use today, so this is not the equivalent of claiming that neither your car nor mine nor anybody else's uses more than 3 GWh per day and therefore cars don't use a significant amount of energy collectively.
Also, though this is not really relevant to your major incorrect point, airlines aren't part of civil aviation.
I guess I'd frame it another way: how many LLM conversations a month would you have to have to increase your energy consumption by 1%, versus how many more hot showers would it take to do the same?
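As a rough sketch of that comparison, assuming ~3 Wh per LLM query (a commonly cited ballpark), ~2 kWh for an electric hot shower, and ~30 kWh/day of household electricity use (all illustrative assumptions, not measurements):

```python
# Back-of-envelope comparison; every number here is a rough, illustrative assumption.
wh_per_query = 3                       # ~3 Wh per LLM query (commonly cited ballpark)
wh_per_shower = 2_000                  # ~2 kWh for a 10-minute electric hot shower
household_wh_per_month = 30_000 * 30   # ~30 kWh/day of household electricity

one_percent = 0.01 * household_wh_per_month           # 9,000 Wh per month
print(f"{one_percent / wh_per_query:.0f} extra queries per month")   # ~3000
print(f"{one_percent / wh_per_shower:.1f} extra showers per month")  # ~4.5
```

On those (very rough) numbers, a handful of extra hot showers a month costs about as much electricity as thousands of LLM conversations.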
if you are actually worried about your ChatGPT energy usage, skip a hot shower or spend a few hours less playing Cyberpunk 2077 and a few more hours reading an old book.
I definitely think breaking out ChatGPT on its own is too fine-grained. At the very least, one should lump together all LLM training and use, or perhaps all deep learning training and use.
I suspect that at a "sensible" level of breakdown (trying to avoid the "How long is my coastline?" problem), which is presumably something akin to Zipfian, the main uses will actually account for most of the energy usage.
If anyone can find a breakdown at a level of granularity at which "all deep learning" would make sense, let me know: so far my Googling has led primarily to either detailed breakdowns of energy _sources_, or high-level breakdowns of use at the level of "industry", "agriculture", or "residential use".
That may be true, but I guess my approach would be that, as a general principle, we shouldn't be using things that increase energy usage for little real gain, and this is something in my professional field that I can personally do something about. It does come somewhat from a place of feeling that the world as a whole is absolutely failing to get to grips with the significant large-scale changes that are needed.
a single hot shower uses more watt-hours than most people spend on ChatGPT in a week, or a month. if people really want to cut down on how many watt-hours they're using on frivolous things, there are dozens of things they should do first. if people are spending more time worrying about their ChatGPT energy use than they are going vegan or getting used to cold showers, then they have been grossly misinformed about what the relative impact is
(to be clear: I'm not vegan. but most people, incl. myself, could be quite easily, and it would save several orders of magnitude more energy and water)
Ultimately we're trying to persuade people to reduce their quality of life in order to reduce their energy footprint. That's already a difficult sell. It becomes an impossible sell when any reductions that might be achieved are immediately offset by a new energy-hungry technology coming online - especially when the long-term effect (goal, even) is to reduce yet further the quality of life of everyday people.
convincing people to change anything is very hard, so why are people spending time wringing their hands over LLM use? if you convinced one person to go vegan or get into cold showers for a year, you'd do more for the planet than by convincing a hundred people not to pick up ChatGPT
I'm actually not wringing my hands over people choosing to use LLMs - I'm wringing my hands over the irresponsibility of the tech world in introducing and making available a technology whose existence will be responsible for a big chunk of the expected doubling of datacentre energy use over the next decade.
But the environmental aspects aren't even my main concern: my main concern is where that energy is coming from - and how the idea of Three Mile Island being restarted solely for Microsoft's benefit marks a huge shift (that no-one seems to have noticed) towards infrastructure that was previously at least nominally for public use being reserved for the use of a private corporation.
I'm also annoyed by the kleptocratic attitude towards training data, but I'll happily accept that reasonable minds may differ on that subject.
So while I don't use LLMs myself for ethical reasons, I don't have a problem with other people choosing to use them.
The reason we have any environmental regulations at all is the mass death and disease caused by technology. Far more people died due to the agricultural revolution than will ever die due to climate change in the next few decades under the worst predictions.
The Black Death alone killed 25-50 million people in 7 years.