Why are executives enamored with AI, but ICs aren't? (johnjwang.com)

by johnjwang 168 comments 109 points

[−] SirensOfTitan 49d ago
I think executives are excited about AI because it confirms their worldview: that the work is a commodity and the real value lies in orchestration and strategy.

It doesn't help that the West has a clear bias wherein moving "up" means moving away from the work. Many executives don't know what good looks like at the detail level, so they can't evaluate AI output quality.

[−] peacebeard 49d ago
This is definitely part of it.

I think another part of it is that AI tools demo really well, which makes it easy to hide how imperfect and limited they are behind a contrived or cherry-picked example. Not a lot of people have good intuition for this yet. Many people understand that "a functional prototype is not a production app," but far fewer understand that "an AI that can be shown writing functional code is not a software engineer," because this reality is evolving so rapidly.

In that rapidly evolving reality, people are seeing a lot of conflicting information, especially when you consider that much of it is motivated (e.g., "AI is bad because firing engineers is bad," which, frankly, will not be compelling to some executives out there). Whatever the new reality turns out to be, we're not going to find it out one careful step at a time. A lot of lessons are going to be learned the hard way.

[−] ryandrake 48d ago
Another thing that I think is critical: AI (by default) responds to you like the leadership-idealized version of an employee: Totally subservient, humble, pleasant, endlessly excited, and always, always telling you that you're absolutely right. AI is like an energetic intern who knows his place on the totem pole and wants to please. The kind of Yes Man, I'll Do It Man that CEOs consider the ideal employee.
[−] Esophagus4 49d ago

> AI tools demo really well

Yes, and they work really well for the kind of small side project an exec has probably used to try out an LLM.

But writing code in one clean, discrete repo is only part of shipping something, especially at a large org.

Over time I think tooling will get better at the pieces surrounding the code itself, but the human coordination and dependency pieces are still tricky to automate.

[−] gerdesj 49d ago
MD here, of a really small company (and I'm not a doctor).

I'm (mildly) excited by LLMs because I love a new shiny tool that does appear to have quite some utility.

My analogy these days is a screwdriver. Let's ignore screw development for now.

The first screwdrivers, which we still use, are slotted and have a habit of slipping sideways and jumping (camming out). That's err before LLMs ... something ... something.

Fast forward and we have Phillips and Pozi and electric drivers. Yes, there were ratchet jobs, and I still have one, but the cordless electric drilldriver is nearly as magical as the Dr Who sonic effort! That's your modern LLM, that is.

Now a modern drilldriver can wrench your wrist if you are not careful and don't brace properly. A modern LLM will hallucinate like a nineties raver on ecstasy, but if you listen carefully, phrase your prompts carefully, ignore the chomping teeth, and keep them hydrated, you may get something remarkable out of the creature 8)

Now I only use Chat at the totally free level but I do run several on-prem models using ollama and llama.cpp (all compiled from source ... obviously).

I love a chat with the snappily named "Qwen3.5-35B-A3B-UD-Q4_K_XL", but I'm well aware that it is like an old-school Black & Decker off of the noughties and not like my modern DeWalt wrist knackerers. I've still managed to get it to assist me in getting PowerDNS running with DNSSEC and Lua, and in configuring LACP and port channel/trunking and that on several switch brands.

You?

[−] Eufrat 49d ago

> I'm (mildly) excited by LLMs because I love a new shiny tool that does appear to have quite some utility.

I really think a lot of folks were conned by a smooth operator and a polished demo, so now everyone has to suffer through having this nebulous thing rammed down our throats, regardless of its real utility, because people at higher pay grades believe it has utility.

It feels like a lot of “AI is inevitable; you are failing to make this abundant future inevitable by your skepticism.”

[−] tdeck 48d ago

> Your team's screwdriver usage is only 30% of the company's target. 80% of other teams at Taylor Manufacturing Co. are leveraging Screw Driving tools more often as a regular part of their daily work. If your team doesn't improve, we'll need to come up with a retraining plan.

> But we're the accounting team?

> Doesn't matter. This is a SD-Native company now. We believe everyone can be more productive with an SD-based workflow.

[−] sigbottle 49d ago
Classic Kafka trap! The mere sign of resistance is taken as a sign of a deeper psychological incompatibility that must be worked through until you agree with the state.
[−] smrtinsert 48d ago
The dreaded "not aligned"!
[−] gamblor956 48d ago
This is actually a really good analogy, because electric drivers are the wrong tool 99% of the time and most people don't use them.
[−] gerdesj 47d ago
I drive an EV and am a complete tool!
[−] indistinction 49d ago

>A modern LLM will hallucinate like a nineties raver on ecstasy but if you listen carefully and phrase your prompts carefully and ignore the chomping teeth and keep them hydrated, you may get something remarkable out of the creature 8)

Like what - the world's most advanced blowjob?

[−] gerdesj 49d ago
Perhaps I should have gone for Sherlock Holmes doing morphine as an analogy. Mind you the '90s raver fits for some models or is it the prompter ...
[−] indistinction 49d ago
[dead]
[−] smrtinsert 48d ago
One hundred percent. Leadership is consolidating around this like it's the rebirth of offshoring.

For the record, all your prompts are tracked and easily viewable by whoever oversees the tooling at your company. Don't prompt more than you have to, and certainly don't give it your best ideas. This is "value at scale."

[−] argee 49d ago
Jeff Bezos famously said “your margin is my opportunity,” I feel like Steve Jobs could’ve just as easily said “your slop is my opportunity.” (And he sort of did with “insanely great”)
[−] pigpop 49d ago
The reasons given in the article are much more compelling.
[−] enoint 49d ago
I don’t think so. The article is about a manager who thinks in systems and an individual contributor preoccupied with process work. The GP comment gets closer to incentives.
[−] echelon 49d ago
Work is delivering value.

Yes, we have craftsmanship, but at the end of the day everything is ephemeral and impermanent and the world continues on without remembering us.

I think both the IC and executive are correct in superposition.

[−] bitwize 49d ago
Indeed. Even the ur-craftsman, John Carmack, says that delivering value to customers is pretty much the only thing that matters in development. If AI lets you do that faster and cheaper, you'd be a fool not to use it. There's a reason it's virtually a must in professional software engineering now.
[−] glouwbug 48d ago
Right, but Carmack has always delivered. I need someone without Carmack’s ability to show me what value they can deliver with AI
[−] whattheheckheck 47d ago
They can't; it's your job to train them or take their salary
[−] pron 49d ago
As someone who is both an IC and leads other developers, I disagree with the explanation. As a technical lead, I can predict the quality of the outcome much better with people than with LLMs, and the "failure modes" are much more manageable. As a programmer, I am actually more impressed with AI agents, but in an informed and qualified way: their debugging ability wows me; their coding ability disappoints and frustrates me.

I think the simple explanation for why executives are so hyped about AI is that they're not familiar with its severe current limitations. For example, Garry Tan seems to really believe he's generating 10 KLOC of working code per day; if he were a working developer, he would know he isn't.

[−] fcarraldo 49d ago
AI allows executives to spend R&D money on a flywheel that builds more, faster, without hiring more people. It makes every individual employee able to deliver more.

ICs dislike this because it raises expectations and puts the spotlight on delivery velocity. In a manufacturing analogy, it's the same as adding robots that enable workers to pack twice as many pallets per day: you work the same hours, but you're more tired, and the company pockets the profits.

Software Engineers are experiencing, many for the first time in their careers, what happens when they lose individual bargaining power. Their jobs are being redefined, and they have no say in the matter - especially in the US where “Union” is a forbidden word.

[−] paxys 49d ago
You must be living in a different universe if you think ICs aren't enamored with AI. Every developer I know basically can't operate now without Claude Code (or an equivalent).
[−] MarkSweep 49d ago
In addition to the reasons in the article, one thing I've noticed among some executives and product managers is that their experience with LLM coding tools causes them to lose respect for human software engineers. I've seen managers lose all respect for engineering excellence and assume anything they want can be shat out by an LLM on a short deadline. Or assume that because they were able to vibe-code something trivial, like a blog, they don't need to involve engineers in the design of anything; engineers should just be code monkeys who follow whatever design the product managers vibed up. It is really demoralizing to be talked to as if the speaker is prompting an LLM.
[−] UtopiaPunk 49d ago
I don't buy it. Executives worry about labor costs, ARR, RoI, etc. The grandest promises of AI are that executives will make a lot more money with a lot fewer employees. Of course they are pushing it!

ICs worry about doing their job (either doing it well because they care about their craft, or doing it well enough because they need to pay the bills). AI doesn't really promise them anything. Maybe they automate some of their tasks away, but that just means they'll take on more tasks. For practically any IC, there is no increase in wealth and no reduction in labor time. There is only a new, quietly lingering threat of being laid off if an executive decides they're not needed anymore.

That's the difference in enthusiasm about AI.

[−] Aperocky 49d ago
IC here, enamored with LLMs: my implementation speed used to be the bottleneck on what I could do, both professionally and personally, and now only thoughts and ideas are the limit. That is incredibly exciting.

Thoughts and ideas as in "I will implement this in this structure, with these tradeoffs; it will work with these 4 APIs and have no extra features, and here's how I (or the LLM with tools) am going to run it and test it".

Thoughts and ideas not as in "build Facebook". A lot of people think AI can do that; it won't (though it might pretend to), and it will just lead to failure.

My competitive edge did not diminish, it expanded.

[−] exolymph 49d ago
People who will get paid more if AI eliminates jobs (in theory, anyway — execs aren't necessarily owners) versus people whose jobs will be eliminated.
[−] bad_username 49d ago
I do not think most executives are particularly enamored with AI. They are mostly driven by the fear of missing out. More precisely, their thought process is: if they bet on AI and fail, they can plausibly claim it was the technology's fault (not good enough, poorly suited to the business, etc.). But if they pass on AI by choice and their competition succeeds, they will be blamed personally. The more hyped a technology is, the stronger this calculus is for managers. It's like Pascal's wager, in a way.
[−] keeda 49d ago
Look at any of the large developer surveys out there: AI adoption is up to 80-90%; ICs absolutely are enamored with AI too. HN, and social media in general, is largely an echo chamber of the loudest voices, which tend to skew negative, and does not reflect the broader reality. If HN were to be believed, most of Big Tech would be dead instead of thriving more than ever.

That said, the central point of TFA is spot-on, though it could be made more general, as it applies to engineering as well as management: uncertainty rises sharply the higher you climb the corporate and/or seniority ladder. In fact, the most important responsibility at higher levels is to take increasing ambiguity and transform it into much more deterministic roles and tasks that can be farmed out to many more people lower on the ladder.

The biggest impact of AI is that most deterministic tasks (and even some surprisingly ambiguous ones) are now spoken for. This happens to be the bread and butter of the junior levels, and it is where most of the job displacement will happen.

I would say the most essential skill now is critical thinking, and the most essential personality trait is being comfortable with uncertainty (or as the LinkedInfluencers call it, "having a growth mindset.") Unfortunately, most of our current educational and training processes fail to adequately prepare us for this (see: "grade inflation") so at a minimum the fix needs to start there.

[−] try-working 49d ago
I'm an IC and I love it. Executives have the wrong concept of AI. For them it's chat + magic, and then it does everything. You can't work with people who have incorrect concepts about how the world works. Best ignore them.
[−] ilaksh 49d ago
An executive's job is to increase profit. Reducing headcount is a primary way to do that, and AI is the most promising way to reduce the need for employees.

Executives do not need fully functional systems from AI for their own daily work. Nothing falls over if their report is not quite right, so for their own purposes the AI output they see looks complete.

But also, AI is good enough to accelerate software engineering. To the degree that there are problems with the output, well, that's why they haven't fired all the engineers yet. And executives never really cared about code quality; that was always the engineers' problem.

What I'm trying to build for my small-business client right now is not engineering, but it still requires some remaining employees. He's already automated a lot of it. I'm trying to make a full version of his little call center that can run on one box, like an H200, which we can rent for around $3.59/hr. If I remember correctly, that's approximately the cost of one of his Filipino employees.
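
For a rough sanity check on that comparison, here is my own back-of-envelope arithmetic (the only figure taken from the comment is the $3.59/hr rental rate; the 24/7 utilization and 30-day month are illustrative assumptions):

```python
# Back-of-envelope cost of renting one H200 box around the clock.
# The $3.59/hr rate comes from the comment above; round-the-clock
# utilization and a 30-day month are assumptions for illustration.
HOURLY_RATE_USD = 3.59

hours_per_day = 24
days_per_month = 30

daily_cost = HOURLY_RATE_USD * hours_per_day
monthly_cost = daily_cost * days_per_month

print(f"Daily:   ${daily_cost:.2f}")    # $86.16
print(f"Monthly: ${monthly_cost:.2f}")  # $2584.80
```

At roughly $2,600 a month for an always-on box, the comparison to a full-time salary is at least plausible on its face, though the real bill depends on actual utilization and provider pricing.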

Where we are headed is that the executives are themselves pretty quickly going to be targeted for replacement. Especially those that do not have firm upper class social status that puts them in the same social group as ownership.

[−] yalogin 49d ago
The bigger question is: if AI cuts development time by 10x (assume it does, for this conversation) and products can be released immediately, will companies keep pushing products and functionality out every week or month? They still have to wait and watch adoption, feedback, etc. to see whether something works. Sure, AI speeds up development, but to what end? It's not like Meta is going to compress 5 years of Instagram features into 1! No one has the pipeline built up for that. So I'm not sure how it fits into the overall company strategy. Is it only helping to fire people for now, and that's it?
[−] kerblang 49d ago
I'm not gonna say "incorrect" like the absolutists. It's an interesting hypothesis, at least.

But I will insist that executives are more driven by FOMO than a teenager.

[−] fooker 49d ago
The premise is wrong. Plenty of ICs are 'enamored' with AI.

If you are not, you either have a boring job, or you don't have any ideas worth prototyping asynchronously. Or you haven't tried AI in the last ~3 months.

[−] 3form 49d ago
There are two views, non-technical and technical.

For the non-technical view, the current meteoric rise of AI comes down to the fact that "AI" is generally synonymous with "it can talk". It never _really_ registered with the wider audience that image recognition, various filters, or whatever classifiers they stumbled upon were AI as well. What we have now is AI in the truest sense. And executives are primarily non-technical.

As for the technical people, we know how it works, we know how it doesn't work, and we're not particularly amused.

[−] skeletoncrew2 49d ago
I’d posit that AI is good at the tasks managers have to do: theirs is a world composed primarily of processes and procedures set up by humans, about other humans. In other words, it is just like an AI trained on text. At the worker level, you have to interact with the real, outside world in some way. If I could have AI take the wheel for every SharePoint tracker that management manages to cook up, I’d be raving about it too.
[−] monarchwadia 49d ago
The premise is incorrect. Plenty of ICs are enamored with AI. And plenty of executives are skeptical of it.
[−] fwip 49d ago
Real answer - it's because an LLM is better than you at the things you suck at.

For executives, that's writing code. For ICs, it's other stuff.

[−] bitwize 49d ago
AI is the much-hoped-for MBA's Stone, the magical substance which transmutes engineering work (costly) into managerial work (valuable).
[−] Eufrat 49d ago
They don’t have to use the tech, except maybe superficially. They are either being explicitly misled by salespeople or, as others have mentioned, it is simply a vehicle to confirm their own biases or their annoyance at having to pay peons. It’s up to the grunts to actually make this work.

It’s like Marc Andreessen bloviating about how AI will replace everyone except him.

To be fair, some of this is understandable. At some level, you’re just going to see things as a bullet point in a daily/monthly/quarterly report and possibly a 10-minute presentation, and you’re implicitly trusting that the folks under you have condensed the information into something meaningful.

[−] LogicFailsMe 49d ago
I think ICs feel threatened because they're told from day one that they are at-will employees who can be terminated at any time, with or without cause.

On top of that, places like Amazon extol the virtues of only working on projects that can be completed with entirely fungible staffing, and Google tries ever so hard to electroplate this steaming turd of an ideology with iron pyrite by calling fungibles "generalists."

So along come AI coding agents, which I love as an IC because they excel at tedious work I'd rather not do in the first place, yet I get why others see them as a threat. But I really think it's no more of a threat than any other empty promise to cut costs with the silver bullet of the month, and we just have to let the loudmouths insist otherwise until the industry figures out this isn't a magic black box. They never learn, do they? Maybe their jobs depend on never learning.

[−] pjmlp 49d ago
ICs see teams being reduced: as individual productivity increases, the number of FTEs per project goes down, and the superfluous folks are shown the door.

Meanwhile executives see the money related numbers go up.