The trick is bypassing the human consumer as well.
Companies satisfy (human) consumer needs as a byproduct of profit maximization. But human consumers are inefficient. They have to sleep, require medical care, etc.
A purely machine economy would be far more efficient. Therefore in the limit we should eliminate reliance on human labor and consumption to build a more perfect and efficient world.
Idea! Maybe these now-redundant humans can be turned into a kind of battery, so they serve as a source of energy for the machines?
Perhaps it's then smart to make the humans have a brain/computer interface, to make them dream/think they are living in a normal society so they don't revolt.
A tragedy is that supposedly the original idea was to use human minds for processing power, which would have been far superior to the thermodynamically laughable idea of using them as power sources.
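Why "thermodynamically laughable": a back-of-envelope check, using standard physiology figures (roughly 2000 kcal/day of food intake, dissipated almost entirely as heat) rather than anything from the thread:

```python
# An adult consumes roughly 2000 kcal/day of chemical energy and radiates it
# away as low-grade heat, which works out to an average power of ~100 W.
KCAL_TO_J = 4184                           # joules per kilocalorie
daily_intake_j = 2000 * KCAL_TO_J          # ~8.4 MJ of food energy in per day
avg_power_w = daily_intake_j / 86_400      # divide by seconds per day

print(f"average human power throughput: {avg_power_w:.0f} W")
```

Every watt you could "harvest" from a human battery first has to be fed in as food, so the scheme is a strict net energy sink, not a source.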
I know people like to make Matrix jokes about what the future of AGI would be like, but I suspect it would be more like the relationship between humans and Cheela in Dragon's Egg.
Large populations of humans not getting their basic needs fulfilled is a feature. That way, those who do have lots of resources get even more intense feelings of superiority and high status. That is, after all, the point of wealth accumulation (at least past a certain point, where more money does not change your life quality meaningfully). For status, it's the delta between you and others that matters most. If raising your own wealth has diminishing quality-of-life returns, then lowering other people's is the only effective way of increasing the delta.
I use AIs for coding with moderate success, but the more I work with them, the more I am convinced that "intelligence on tap" is a pipe dream, especially in domains where logical thinking in novel (i.e. not-in-dataset) contexts is required.
Recently, I tasked it with studying a new Czech building permit law in conjunction with some waste disposal regulations, and the result was just tragic. The model (opus 4.6) just could not stop drawing conclusions from obsolete regulations in its training dataset, even when given the full text of the new law. The usual "you are totally right" also applied, and its conclusions were most of the time obviously wrong even to a human with cursory knowledge of the subject.
I ended up studying the relevant regulations myself over the weekend.
Neo-Luddism dressed up in economic jargon. The authors suggest the only effective tool is to tax companies based on how much automation they achieve. Penalizing efficiency is a guaranteed recipe for stagnation, and if we'd done this at any point in our past, we would never have made it out of the Dark Ages.
If robotics progress starts to pick up, I'll take this more seriously. Right now, there's practically infinite demand for labor in construction, manufacturing, agriculture and many other industries. All kinds of good projects could be happening, and if you dig into why they aren't, labor-intensive work is a factor. Why didn't the hydroponics project take off? Why is that still an empty lot instead of a new home? Why isn't there live theatre in this small city? Why is there a pothole in the bike lane?
AI layoffs are very shortsighted IMO and should be viewed by investors as a sign of weakness in management or the business itself.
If everyone is going to increase productivity by some factor k per employee, then kx becomes the new norm of overall productivity for x employees.
If you lay off some percentage Y of your workforce, then your expected gains will only be k · x · (100 − Y)/100. In other words, you will not realize the same productivity gains as your competitors that chose not to lay off.
Yes I realize it is more complex than that, because of reduced opex, but there are diminishing returns very quickly.
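The arithmetic above as a minimal sketch, with hypothetical values for the multiplier k and headcount x:

```python
def total_output(k, x, layoff_pct=0.0):
    """Output of x employees whose per-head productivity is scaled by k,
    after laying off layoff_pct percent of the workforce."""
    remaining = x * (100 - layoff_pct) / 100
    return k * remaining

k, x = 1.5, 100                      # hypothetical: 50% per-employee gain, 100 staff
keep_all = total_output(k, x)        # full workforce captures the whole multiplier
cut_20 = total_output(k, x, 20)      # laying off 20% forfeits a fifth of the gain
print(keep_all, cut_20)              # 150.0 120.0
```

The point of the comment is visible in the numbers: the firm that cuts 20% ends up at 120 units of output while competitors sit at 150, even though both enjoy the same per-employee multiplier.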
Let’s take AGI to its inevitable raw conclusion. Not by the definition (ab)used by clueless VCs screaming about abundance, but by what is already happening, taking the worst case:
The wave of mass layoffs and job displacement driven by funding and building AI systems is the true definition of AGI.
We might as well get there faster instead of delaying it. You have already seen Oracle and Block attributing their layoffs to AI, so it is happening right now.
So why delay any further? Just get it over with.
While I agree with the general sentiment that this requires monitoring and study, the abstract is _very_ tendentious, presents multiple hypotheses as facts, and doesn’t provide any measurements or alternatives to its preferred solution.
This isn’t a scientific study, it’s a militant manifesto
Applying a Pigouvian tax to AI companies might be difficult. None of them actually generate profit, nor have a model to ever do so. I think the hype machine needs to stop spinning before a solution like that would become practical.
Textile workers worried about machinery, saying it was going to result in the collapse of the economy when they lost their jobs.
But every time there's a new technology that threatens some jobs through increased efficiency, the freed-up investment can be placed elsewhere, creating new jobs. Different kinds of jobs.
It's not different this time. The Luddite movement has been wrong each time. They are concerned solely with their own selfish interests, and their demands to stop technological improvement will not be heeded.
So... the solution is basically "pay tax on the demand that you're destroying".
We can all hate on the premise (ai is good enough to do this) and/or the solution presented (centrally enforced taxation), but you gotta admit:
the messaging from SV's AI leaders about how "AI will take all your jobs" is confused as fuck, because if so, who will be on the consuming end of things?
> If AI displaces human workers faster than the economy can reabsorb them, it risks eroding the very consumer demand firms depend on.
That is a huge "if" though, and I am not sure the latter follows from it either. When the US transitioned away from assembly lines, or away from agriculture-dominated employment, it's not as if consumer spending consequently collapsed.
> Perhaps it's then smart to make the humans have a brain/computer interface, to make them dream/think they are living in a normal society so they don't revolt.
Do you like what I've done with the place?
What could possibly go wrong?
> If AI displaces human workers faster than the economy can reabsorb them
Big if.