In my math and CS classes, we often have the option to use generative AI. Do you think the energy demands of AI inference are so great that it would be better for students like myself to lay off it as much as possible, or are those demands not significant enough to merit a habit change? I've seen mixed opinions online but I've never heard from an EA-aligned person on this question.

I am curious to hear your perspectives. 

Unfortunately, most estimates of LLM energy use are somewhat out of date due to the rise of reasoning models. A small amount of personal usage is probably still not that energy-intensive, but I don't think it's negligible anymore.

The most up-to-date estimates of AI energy use I've seen are in this paper; I recommend looking at Table 4. For the o3 reasoning model, which is probably the closest analogue to today's reasoning models, a short query costs something like 7 Wh, a medium query 20 Wh, and a long query 30 Wh. A non-reasoning model like GPT-4o was much less intensive, at around 0.4 Wh for a short query, but in my experience the results tend to be a lot worse.

So if you end up making around 10 medium-length queries to a reasoning model over the course of a project, that adds up to 0.2 kWh; 100 queries would be 2 kWh. Typical household energy use is something like 30 kWh per day. So the impact is small but non-negligible: there are probably other things you can do that will have a bigger impact on your energy use.
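The arithmetic above is easy to sketch as a back-of-envelope calculation. The per-query figures below are the rough Table 4 values quoted above, and the 30 kWh/day household baseline is the typical figure mentioned; everything else (the dictionary keys, the function name) is just illustrative scaffolding:

```python
# Back-of-envelope: energy cost of LLM queries vs. daily household use.
# Per-query Wh figures are rough values from the paper's Table 4;
# the household baseline (~30 kWh/day) is a typical figure.

WH_PER_QUERY = {
    "reasoning_short": 7,      # o3-style reasoning model, short query
    "reasoning_medium": 20,    # medium query
    "reasoning_long": 30,      # long query
    "non_reasoning_short": 0.4 # GPT-4o-style model, short query
}

HOUSEHOLD_KWH_PER_DAY = 30

def project_energy_kwh(n_queries, query_type="reasoning_medium"):
    """Total energy in kWh for n queries of the given type."""
    return n_queries * WH_PER_QUERY[query_type] / 1000

for n in (10, 100):
    kwh = project_energy_kwh(n)
    pct = 100 * kwh / HOUSEHOLD_KWH_PER_DAY
    print(f"{n} medium reasoning queries = {kwh:.1f} kWh "
          f"({pct:.1f}% of a 30 kWh household-day)")
```

This reproduces the numbers above: 10 medium queries come to 0.2 kWh (under 1% of a household-day) and 100 queries to 2 kWh (about 7%).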

Personally, I would be more worried about cognitive offloading: overreliance on AI can hamper your ability to learn if you hand the mentally difficult tasks over to it.
