Graphs that explain the state of AI in 2026 (spectrum.ieee.org)

by bryanrasmussen 62 comments 115 points


[−] fyrn_ 26d ago
Worth calling out that AI sentiment among young people is not nearly so rosy: https://news.gallup.com/poll/708224/gen-adoption-steady-skep...
[−] BerislavLopac 26d ago
That's temporary. They will adapt and find ways to use it to its full potential, just as has happened with every technological shift in history.
[−] free_bip 26d ago
Would you mind asking your crystal ball some other questions - like what those ways of using it are exactly?
[−] bigbugbag 26d ago
When exactly did that happen?

Because up until now I have observed the exact opposite, which is consistent with expectations: https://coding2learn.org/blog/2013/07/29/kids-cant-use-compu...

[−] johnnienaked 13d ago
Metaverse?
[−] whateveracct 26d ago
don't the young usually pick up new tech faster?
[−] claudiug 26d ago
Bullshit. You hear that you are not needed, that your data is not yours. The AI lovers think: "humans also consume energy".
[−] johnnienaked 26d ago
This seems to be a slow discovery of the inherent limitations.
[−] i_love_retros 26d ago
Stating "Software engineers are all-in on AI" because of an increase in github projects being created is hilarious. I didn't realise creating a github repo made someone a software engineer. If only I had known this I wouldn't have bothered learning all the other stuff!
[−] tqi 26d ago

> The report estimates that training the latest frontier large language models, such as xAI’s Grok 4, can generate over 72,000 tons of carbon-equivalent emissions.

That seems pretty trivial relative to the roughly 38 billion tons emitted globally per year?
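A quick back-of-the-envelope check of that ratio (a sketch, assuming the ~38 billion figure refers to annual global CO2-equivalent emissions in tons, as the comment implies):

```python
# Sanity check: Grok 4 training emissions vs. one year of global emissions
grok_training_tons = 72_000   # the article's estimate for training Grok 4
global_annual_tons = 38e9     # assumed: ~38 billion tons CO2e emitted per year globally

fraction = grok_training_tons / global_annual_tons
print(f"{fraction:.2e}")      # ~1.89e-06, i.e. roughly 0.0002% of a year's emissions
```

So even taking the article's estimate at face value, a single frontier training run is a rounding error against the global total, which is the commenter's point.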

[−] amelius 26d ago
Also nobody will ever have a moat, so the graph of investor stupidity is going through the roof.
[−] eulgro 26d ago

> The report estimates that carbon emissions from models with the least efficient inference are over 10 times as high as those with the most efficient inference. DeepSeek’s V3 models were estimated to consume around 23 watts when responding to a “medium-length” prompt, while Claude 4 Opus was estimated to consume about 5 watts.

This makes absolutely no sense. I suppose they meant watt-hours, and that's a weird way to explain carbon emissions...
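The unit confusion the comment points at: watts measure power (a rate), watt-hours measure energy, and emissions scale with energy consumed. A minimal sketch of the conversion, assuming the article's figures are actually watt-hours per response and using a hypothetical grid carbon intensity:

```python
# Watts are power (J/s); watt-hours are energy. Carbon emissions scale with
# energy, so per-response figures only make sense in watt-hours (Wh).
GRID_INTENSITY_G_PER_KWH = 400  # hypothetical grid carbon intensity, gCO2e/kWh

def grams_co2(energy_wh: float) -> float:
    """Convert energy per response (Wh) to grams of CO2-equivalent."""
    return energy_wh / 1000 * GRID_INTENSITY_G_PER_KWH

# Assuming the quoted 23 and 5 are Wh per medium-length prompt:
print(grams_co2(23))  # DeepSeek V3: 9.2 g CO2e per response
print(grams_co2(5))   # Claude 4 Opus: 2.0 g CO2e per response
```

Note the grid intensity is an assumption for illustration; the actual emissions per response depend entirely on which grid powers the data center.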

[−] HelloMcFly 26d ago
Besides China's lead in robotics, the Grok emissions charts are what most leap off the page.
[−] xnx 26d ago
The "China leads in robotics" trend seems to be unaffected by AI. The China line has been on basically the same trajectory since 2012. The chart does not belong in the article.
[−] cloud-oak 26d ago

> Training AI models can generate enormous carbon emissions

Sure, but what I'd really like to see is a graph for how much carbon is generated serving these models globally.

[−] themafia 26d ago
Profits generated by AI:

The absence speaks volumes.

[−] hydrocomplete 26d ago
I still don't understand the State of AI in 2026.
[−] illiac786 26d ago

> The capabilities of AI models have improved with incredible speed over the past decade, and as the graph above shows, progress seems to be accelerating.

Errr… no? Every discipline is clearly hitting a plateau so far. Some started recently and hence haven't yet (competition maths), but based on the past graphs, they will all plateau too.

[−] bix6 26d ago
China’s robotics lead holy cow.