Stating "Software engineers are all-in on AI" because of an increase in github projects being created is hilarious. I didn't realise creating a github repo made someone a software engineer. If only I had known this I wouldn't have bothered learning all the other stuff!
> The report estimates that training the latest frontier large language models, such as xAI’s Grok 4, can generate over 72,000 tons of carbon-equivalent emissions.
That seems pretty trivial, relative to the ~38 billion tons emitted globally per year?
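Back-of-envelope, taking both figures in this thread at face value:

    # Scale comparison using the two numbers quoted above
    training_emissions_t = 72_000           # tons CO2e for one frontier training run
    global_emissions_t = 38_000_000_000    # ~38 billion tons CO2e per year, globally

    share = training_emissions_t / global_emissions_t
    print(f"{share:.2e}")  # ~1.9e-06, i.e. roughly 0.0002% of one year's emissions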
> The report estimates that carbon emissions from models with the least efficient inference are over 10 times as high as those with the most efficient inference. DeepSeek’s V3 models were estimated to consume around 23 watts when responding to a “medium-length” prompt, while Claude 4 Opus was estimated to consume about 5 watts.
This makes absolutely no sense. I suppose they meant watt-hours, and even then that's a strange way to express carbon emissions...
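Right: watts measure power, not energy, so they can't be converted to emissions on their own. A minimal sketch of the calculation the report presumably intends, reading the quoted figure as watt-hours per response and assuming a grid carbon intensity (the ~480 gCO2/kWh figure is my assumption of a global average, not the report's number):

    # Energy per response -> carbon per response
    energy_wh_per_response = 23.0           # the article's figure, read as watt-hours
    grid_intensity_g_per_kwh = 480.0        # assumed; varies widely by region

    energy_kwh = energy_wh_per_response / 1000.0
    co2_grams = energy_kwh * grid_intensity_g_per_kwh
    print(f"{co2_grams:.1f} g CO2e per response")  # ~11 g under these assumptions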
The "China leads in robotics" seems to be unaffected by AI. The China line is basically on the same trajectory since 2012. The chart does no belong in the article.
> The capabilities of AI models have improved with incredible speed over the past decade, and as the graph above shows, progress seems to be accelerating.
Errr… no? Every discipline is clearly hitting a plateau so far. Some started recently and hence haven't plateaued yet (competition maths), but going by the earlier graphs, they all will.
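Benchmark scores are bounded above (usually at 100%), so any fast-rising curve has to flatten eventually. A sketch with made-up scores, fitting a saturating logistic curve to show the implied ceiling:

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, ceiling, k, t0):
        # Saturating curve: score approaches `ceiling` as t grows
        return ceiling / (1 + np.exp(-k * (t - t0)))

    # Hypothetical benchmark scores (percent) -- illustrative only
    years = np.arange(7)                    # years since the benchmark launched
    scores = np.array([12, 25, 45, 68, 82, 90, 94])

    (ceiling, k, t0), _ = curve_fit(logistic, years, scores, p0=[100, 1, 3])
    print(f"fitted ceiling: {ceiling:.1f}%")  # the plateau the comment predicts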
Because up until now I have observed the exact opposite, which is consistent with expectations: https://coding2learn.org/blog/2013/07/29/kids-cant-use-compu...
https://news.ycombinator.com/item?id=47758028
Source: https://hai.stanford.edu/ai-index/2026-ai-index-report
> Training AI models can generate enormous carbon emissions
Sure, but what I'd really like to see is a graph of how much carbon is generated serving these models globally.
The absence speaks volumes.
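For a sense of what that graph would take, here's a rough serving-side estimate. Every input below is a placeholder assumption of mine, not a figure from the report:

    # Hypothetical inference-side carbon estimate -- ALL inputs are assumptions
    queries_per_day = 1_000_000_000         # assumed global query volume
    energy_wh_per_query = 3.0               # assumed average energy per response
    grid_intensity_g_per_kwh = 480.0        # assumed average grid intensity

    daily_kwh = queries_per_day * energy_wh_per_query / 1000.0
    annual_tons_co2 = daily_kwh * grid_intensity_g_per_kwh * 365 / 1_000_000
    print(f"~{annual_tons_co2:,.0f} t CO2e/year under these assumptions")  # ~525,600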