The AI Layoff Trap (arxiv.org)

by armcat 104 comments 62 points

[−] drivebyhooting 32d ago
The trick is bypassing the human consumer as well. Companies satisfy (human) consumer needs as a byproduct of profit maximization. But human consumers are inefficient. They have to sleep, require medical care, etc.

A purely machine economy would be far more efficient. Therefore in the limit we should eliminate reliance on human labor and consumption to build a more perfect and efficient world.

[−] leokennis 32d ago
Idea! Maybe these now redundant humans can be turned into a kind of battery, so they serve as a source of energy for the machines?

Perhaps it's then smart to give the humans a brain/computer interface, to make them dream/think they are living in a normal society so they don't revolt.

[−] treetalker 32d ago
Agents everywhere!!

Do you like what I've done with the place?

[−] atmavatar 32d ago
A tragedy is that, supposedly, the original idea was to use human minds for processing power, which would have been far superior to the thermodynamically laughable idea of using them as power sources.
[−] giacomoforte 32d ago
You jest, but isn't this the logical conclusion? A sufficiently smart AGI has no need for humanity, at all.
[−] rickydroll 31d ago
I know people like to make Matrix jokes about what the future of AGI would be like, but I suspect it would be more like the relationship between humans and Cheela in Dragon's Egg.
[−] Incipient 32d ago
But then, what need do we have for the AI?
[−] librasteve 32d ago
viz. Iain M. Banks' Culture, where the AIs keep humans as pets
[−] paulpauper 32d ago
The "economy on a chip" thought experiment .
[−] samrus 32d ago
Humans consume to fulfill needs; how do those needs get fulfilled in a post-human economy?
[−] Eddy_Viscosity2 32d ago
Large populations of humans not getting their basic needs fulfilled is a feature. That way, those who do have lots of resources get even more intense feelings of superiority and high status. That is, after all, the point of wealth accumulation (at least past a certain point, where more money does not change your life quality meaningfully). For status, it's the delta between you and others that matters most. If raising your own wealth has diminishing quality-of-life returns, then lowering other people's is the only effective way of increasing the delta.
[−] exitb 32d ago
Having large amounts of people with unfulfilled needs is not exactly a novel idea.
[−] ithkuil 32d ago
Just train machines on the huge corpus of human needs so they can need things like no human has needed things before.

What can possibly go wrong

[−] xyzal 32d ago
I use AIs for coding with moderate success, but the more I work with them, the more I am convinced that "intelligence on tap" is a pipe dream, especially in domains where logical thinking in novel (i.e. not-in-dataset) contexts is required.

Recently, I tasked one with studying a new Czech building permit law in conjunction with some waste disposal regulations, and the result was just tragic. The model (Opus 4.6) just could not stop drawing conclusions from obsolete regulations in its training dataset, even when given the full text of the new law. The usual "you are totally right" also applied, and its conclusions were most of the time obviously wrong even to a human with cursory knowledge of the subject.

I ended up studying the relevant regulations myself over the weekend.

[−] semiinfinitely 32d ago
Neo-Luddism dressed up in economic jargon. The authors suggest the only effective tool is to tax companies based on how much automation they achieve. Penalizing efficiency is a guaranteed recipe for stagnation; if we'd done this at any point in our past, we would not have made it out of the Dark Ages.
[−] bwhiting2356 32d ago
If robotics progress starts to pick up, I'll take this more seriously. Right now, there's practically infinite demand for labor in construction, manufacturing, agriculture, and many other industries. All kinds of good projects could be happening, and if you dig into why they aren't, labor-intensive work is a factor. Why didn't the hydroponics project take off? Why is that still an empty lot instead of a new home? Why isn't there live theatre in this small city? Why is there a pothole in the bike lane?
[−] efitz 32d ago
AI layoffs are very shortsighted IMO and should be viewed by investors as a sign of weakness in management or the business itself.

If everyone is going to increase productivity by some factor k per employee, then kx is the new norm of overall productivity of x employees.

If you lay off some percentage y of your workforce, then your expected gains will only be kx(100-y)/100. In other words, you will not realize the same productivity gains as your competitors that chose not to lay off.

Yes I realize it is more complex than that, because of reduced opex, but there are diminishing returns very quickly.
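The arithmetic above can be sketched in a few lines of Python (the numbers here are purely illustrative, not from the paper):

```python
def total_output(k: float, x: int, layoff_pct: float) -> float:
    """Total productivity of x employees, each boosted by factor k,
    after laying off layoff_pct percent of the workforce."""
    retained = x * (100 - layoff_pct) / 100
    return k * retained

# 100 employees, AI doubles per-employee productivity (k = 2):
baseline = total_output(k=1.0, x=100, layoff_pct=0)   # 100.0: pre-AI output
keep_all = total_output(k=2.0, x=100, layoff_pct=0)   # 200.0: adopt AI, keep everyone
cut_30   = total_output(k=2.0, x=100, layoff_pct=30)  # 140.0: adopt AI, cut 30%
```

So under this toy model, the firm that cuts 30% of staff still grows output, but ends up well behind a same-sized competitor that kept its headcount, which is the point being made (opex savings aside).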

[−] rvz 32d ago
Let’s take AGI to its inevitable raw conclusion. Not by the definition (ab)used by clueless VCs screaming about abundance, but by what is already happening using the worst case:

The abundance of mass layoffs and job displacement due to funding and building of AI systems is the true definition of AGI.

We might as well get there faster instead of delaying it. You have already seen Oracle and Block attributing their layoffs to AI, so it is happening right now.

So why delay any further? Let's just get it over with.

[−] rsalus 32d ago
Great paper.

> If AI displaces human workers faster than the economy can reabsorb them

Big if.

[−] khalic 32d ago
While I agree with the general sentiment that this requires monitoring and study, the abstract is _very_ tendentious, presents multiple hypotheses as facts, and doesn't provide any measurement or alternatives to their preferred solution.

This isn’t a scientific study, it’s a militant manifesto

[−] josefritzishere 32d ago
Applying a Pigouvian tax to AI companies might be difficult. None of them actually generates a profit, nor has a model to ever do so. I think the hype machine needs to stop spinning before a solution like that becomes practical.
[−] yobbo 32d ago
Start by shifting taxation from worker incomes to corporate incomes?
[−] incomingpain 32d ago
The way I see OP: It's the Luddite Fallacy.

Textile workers worried about machinery saying it's going to result in collapse of the economy when they lose their job.

But every time a new technology threatens some jobs through increased efficiency, the freed-up investment can be placed elsewhere, creating new jobs. Different kinds of jobs.

It's not different this time. Each time, the Luddite movement has been wrong. They are driven solely by their own selfish concerns, and their demands to stop technological improvement will not be heeded.

[−] isoprophlex 32d ago
So... the solution is basically "pay tax on the demand that you're destroying".

We can all hate on the premise (ai is good enough to do this) and/or the solution presented (centrally enforced taxation), but you gotta admit:

the messaging from SV's AI leaders about how "ai will take all your jobs" is confused as fuck, because if so, who will be on the consuming end of things?

[−] tiveriny 32d ago
[dead]
[−] slopinthebag 32d ago
Still looking for the AI in the room. Where is it exactly? Surely nobody is claiming LLMs are AI?
[−] paulpauper 32d ago
If AI displaces human workers faster than the economy can reabsorb them, it risks eroding the very consumer demand firms depend on.

That is a huge "if" though. I am not sure either that the latter falls from this. When the US transitioned away from assembly lines or agriculture dominated, it's not as if consumer spending consequently collapsed.