The article rightly calls attention to growing AI energy use and documents widely varying projections. However, a nod to the Jevons paradox is an insufficient rebuttal to efficiency gains. As the authors note, the internet did *not* come to consume half of all energy, but only about 2%. Missing here are the substantial efficiency efforts in AI, especially for inference, which is the main worry as use proliferates. Specialized chips like Groq's claim 10x the efficiency of GPUs on existing algorithms, and new algorithms that avoid matrix multiplications (Zhu et al., 2024) claim inference at ~13 watts on FPGA hardware, close to what the human brain uses.
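To make the matmul-free point concrete: the efficiency claim rests on constraining weights to {-1, 0, +1}, so a matrix-vector product reduces to additions and subtractions, which are far cheaper than multiplications on FPGA hardware. A minimal sketch of that idea (my own illustration, not the authors' implementation):

```python
import numpy as np

def ternary_matvec(W_ternary, x):
    """Compute y = W @ x where W has entries in {-1, 0, +1}.

    No multiplications are needed: each output is a sum of the inputs
    whose weight is +1, minus the inputs whose weight is -1.
    """
    y = np.zeros(W_ternary.shape[0])
    for i, row in enumerate(W_ternary):
        y[i] = x[row == 1].sum() - x[row == -1].sum()  # adds/subtracts only
    return y

# Sanity check against an ordinary matrix multiply
rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8))  # random ternary weight matrix
x = rng.standard_normal(8)
assert np.allclose(ternary_matvec(W, x), W @ x)
```

Removing the multiplier units is what makes low-power FPGA inference plausible; the accuracy cost of ternary quantization is the open question the paper addresses.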