Two big-tech FAANG jobs; org is 95% H-1B engineers from China/India. Tons of resumes from American grads somehow never hit my desk, while we continue to interview random candidates from India with some low-quality US master's from Mississippi State. The candidate has spent the last year locked in a room memorizing algorithm interview questions.
I’ve seen two interns through to FTE jobs at a FAANG; both were at least second-generation immigrants (so citizens, I guess, though I never asked). So kids are still getting jobs, and they come from reasonable universities. I haven’t seen UW resumes yet (my alma mater); for some reason those kids are all scooped up before I can take a look (I work at an office in Seattle, so I find it weird that we can’t hire from the local university).
The weird thing is seeing American kids from a top-20 American school (Duke, Carnegie Mellon, etc.) mixed with a much larger majority of H-1Bs from absolute no-name schools. It's almost like the bar for Americans is higher.
So I want to talk about the Claude Mythos mega-model that had a real marketing push in the last week. Marcus Hutchins (of WannaCry fame) posted about the economics of this [1].
The upshot here is that this was a longstanding BSD bug, but mostly because nobody is paying BSD bug bounties, and because a null-pointer dereference may induce a crash but rarely (if ever) leads to privilege escalation the way buffer overflows generally do, so it isn't as high-priority. The estimated cost is $20-50k of tokens.
$50k gets you a lot of developer time, upwards of 2 months. As has been stated in a bunch of threads, much smaller models could also find this bug. The defense is "they didn't" and "once you know it's there, it's easier to find" but again, nobody has been paying BSD bug bounties. Put another way: far fewer people have been looking.
My point is that the economics of these models are still highly debatable. What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.
And we've arrived at the real goal here: to reduce head counts. Why? To suppress labor costs. To get people who didn't get laid off to do more free work.
In addition, AI capex is used as an excuse to cut costs, i.e., reduce labor, for exactly the same reasons.
Lastly, the threat of this happening in the near to medium future is also used to scare labor and reduce costs.
Sufficiently large companies can't grow anymore. Their only path forward is to raise prices and reduce costs. For tech companies, labor is a significant cost. That's what's going on here.
Software devs lost their pricing power due to LLMs but not exactly how most people think.
What's missed is the value of knowing "how exactly does this functionality work for this specific case?" or "can we implement this tiny one-off feature in some legacy code base?" Both are why you kept the guy who wrote it around. You couldn't really replace him, because digging into what he wrote was hard.
Now, LLMs can do that stuff better than the guy that wrote it.
Software devs were non-fungible. Now they're commodities. When things become commodities, they lose their value.
I'm not sure why I haven't heard people talk about this aspect. It's the biggest effect on jobs.
Can anyone provide historical data for "job busts" or other declines in tech employment, massive layoffs, or hiring freezes? I seem to read about something like this every few years. I'd like some data to see whether this trend is stronger than the previous ones or not.
The only reason to blame AI is that it doesn't live up to its promises, so corps are betting that scale will solve the problem. So datacenter after datacenter gets built in areas that can't supply them with water/power.
And that money sink is causing the layoffs. If AI really helped like it's expected to, you would grab any dev you could so that you could have an army of 100x devs.
Humans are still cheaper for the real costs. Augmented humans are on average a few percentage points better.
Nothing happens in a vacuum. "Tech companies" act in a coordinated fashion despite claiming they don't. Remote work was great until it wasn't. This is a coordinated action to shed all the mass gathered during ZIRP. "Market forces" is one naive explanation, but as history has shown, factors extraneous to market forces are at work. This period will be written about, analyzed, and diced in a thousand different ways by future "thought leaders".
I think a lot of people don't see the historical trend of where investment moves around. The 90s dot-com era was driven by massive investment in e-commerce (remember that term?) businesses. It created a bubble, which then burst.
When bubbles burst, investors change tack, and when investors pull back, companies turtle and do layoffs.
After Y2K, investment moved on to real estate. In 2008, that bubble burst. It moved on to social media next.
I would argue that bubble burst somewhat silently during COVID with the quiet deaths of Xitter and Facebook (from a standpoint of cultural relevance), and investment transitioned towards and into its current AI bubble.
Investors are always going to move on once the lustre of a field has waned, and I'd hazard that we'll see investment move somewhere other than software tech next (if I had to prognosticate, I'd say it's moving into robotics/drones).
It is the perfect storm: companies overhired during COVID by 1000%.
Then COVID went away, life went back to "normal", and there were tons of juniors everywhere who had moved to IT because of WFH and the salary.
Then came AI. Within the IT field, I am seeing firsthand companies firing by the thousands because of AI.
In the past only blue-collar work was affected, but now it's everybody: no matter your degree or experience, you either already had a hard time or you will.
If Google Quantum computing somehow finds a massive breakthrough, we are all fu :)
Would really appreciate an archive link to read this, even though it appears to be another meta-analysis wrapped in an "everything will be fine" narrative, judging from the limited previews I've been able to glean and The Economist's general lean.
The problem with the tech layoffs is that it’s poisoning the well downstream. Smaller employers have repeatedly cited their layoffs as justification for abysmal, unlivable salaries, and demanding those of us looking for work suck it up and deal with it while they search for bottom-dollar unicorns.
AI isn’t replacing the IT crowd outside of the expected junior roles, and even that’s starting to rebound as executives realize Juniors were how they got “white glove service” for themselves - a Senior Engineer isn’t going to wipe their ass for them, job market or not, because said Engineer’s time is infinitely better spent on literally anything else.
One other thing I’ll note is that the layoffs also seem to be remnants of the Brogrammer hustle culture: tech folks were enjoying more time for themselves to grow or live life outside of work specifically thanks to a few good years of paying down technical debt with properly-staffed teams, but the grifters up top see anything less than a 9-9-6 as somehow stealing from the employer and slash accordingly. The remnants are expected to do more work for less pay, with AI tools somehow filling the gap (even though these same employers often don’t want to pay for proper tooling to maximize use of AI).
This is definitely an industry downturn as those who stand to gain maximize their immediate returns. From the perspective of the C-Suite and Boards, the safest (albeit unethical) move is to betray: if AI is a bust, they’ll have made their wealth and can fuck off; if AI eliminates jobs and work, they believe their wealth will protect them in the future dystopia of their own creation.
It’s in that context (“fuck you got mine”) that the broader narrative fits with the myriad of puzzle pieces out there (higher interest rates, stock pumps, circular financing, tariffs, aging population, AI, etc).
Another reason is that wages don't keep up with the cost of living. For instance, in the UK it makes little sense to be in IT unless the work is something boring and typing React boilerplate feels easier to you than stacking shelves or running deliveries.
"America’s Census Bureau estimates that just 28% of firms in the San Francisco metropolitan area use AI regularly as part of their day-to-day operations. In America as a whole, adoption is much lower."
Reciprocal tariffs have put the non-tech and tech economy in stasis (except for hardware for AI). They are also better than tax breaks and will supercharge bottom lines for large corporations once reclaimed, and if prices remain high.
Also, if you want to test/force AI adoption, you have to apply pressure by firing some people.
Now wars will put us into further stasis or decline via increased inflation pressure.
- Tech or engineering company hires foreign resident engineer.
- That engineer works their way up in the company to a position where they are in control of hiring.
- They hire more people like them.
We used to accuse start-ups of hiring for cultural fit (people like them), which looked like sexism and ageism. Same thing, different group.
At this point everyone should know "cultural fit" is synonymous with "preference"/"discrimination".
[1]: https://www.tiktok.com/@itsmarcushutchins/video/762774007353...
Jevons paradox: as the efficiency of using a resource improves, total consumption of it often rises rather than falls.
> Don't blame AI
Some may blame AI, but the opposite is also true: The believers use the bust as "proof" that AI coding "works".
It's just that the tech workers are the canaries in the coal mines for the other white collar knowledge workers.
This is "AGI".