I am leaving the AI party after one drink (lara-aigmueller.at)

by speckx 130 comments 121 points

[−] legitster 49d ago

> I don’t want to feel this kind of “addiction.”

> I don’t want to depend on something doing the work I earn money with.

> I don’t want to give up my brain and become lazy and not think for myself anymore.

There are a lot of good reasons we should be skeptical of AI and not give up on essential skills. But sometimes I want to shake these people by the shoulders. Do you drive an automatic car? Do you use a microwave? Do you buy food from a grocery store? Do you own power tools?

The entire point of civilization and society is that we are all "addicted" to technology and progress. But the invention of the plow did not, in fact, make us lazier or stop us from using our brains. We just moved on to the next problems. Maybe the Amish have it right and we should just be happy with a certain level of technology. But none of us have "lost" the ability to go backwards if we really wanted to.

You can finally ask a computer to think and solve problems, and it will! People act like this is a brave new world, but this is literally what computers were supposed to be doing for us 50 years ago! If somebody finally came out with a fusion reactor tomorrow I would half expect people to suddenly come out and say "Oh, I don't think I can support this. What about the soul of solar panels? I think cheap electricity is going to make things too easy."

[−] mr_mitm 49d ago
Douglas Adams really put it best:

> “I’ve come up with a set of rules that describe our reactions to technologies,” writes Douglas Adams in The Salmon of Doubt.

> 1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

> 2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

> 3. Anything invented after you’re thirty-five is against the natural order of things.

[−] tines 49d ago
The problem is that you're likening fundamentally unlike things. AI isn't like a microwave or an automatic car or a power tool. It does not augment you. As I said elsewhere: AI is not a bicycle for the mind, it's an easy chair. You will lose more than you ever gain.
[−] MisterTea 49d ago

> Do you drive an automatic car? Do you use a microwave? Do you buy food from a grocery store? Do you own power tools?

None of these things allow you to turn your brain off while the machine does the work.

I still have to DRIVE the car and all the thinking that goes with that. It's not a robotaxi.

I still have to acquire and prep the food I am microwaving. It's not a replicator.

I still have to know what I want to eat before grocery shopping and prepare the food. It's not a take out restaurant.

I still have to know how to use the power tools to carefully shape something into a fine piece of furniture and not a pile of splintered firewood. Power tools can't operate on their own unless aliens are involved (see Maximum Overdrive).

These are better analogies:

Do you take a taxi or public transport? Those let you turn your brain off while someone or something does the driving work.

Do you go to a restaurant where you can pick what you want, turn your brain off and wait for a delicious (or not) meal?

Do you order takeout where you can order what you want from the comfort of your home, turn your brain off and enjoy the meal when it arrives? Then reheat the leftovers in the microwave.

Do you use a fabrication service where you send them a drawing, turn your brain off, and they ship you an assembled thing?

[−] cush 49d ago

> I want to shake these people by the shoulders... Do you use a microwave?

Microwaves aren't doing active problem solving though. It seems what the author is trying to say is that they enjoy problem solving and find coding a rewarding and creative experience. Sure, microwaves save at-home cooks time, and they might enjoy zapping a frozen dinner, but the author is a chef who enjoys writing their own recipes and cooking from scratch. AI isn't just the microwave; it's also the chef.

> None of us have "lost" the ability to go backwards if we really wanted

This absolutely isn't true. Using Google Maps quickly makes people poorer at navigation; skills need to be practiced. The author thinks letting AI into their kitchen to cook for them will change them cognitively, make them lazy, and erode their skills. And that would be true.

What it sounds like you're getting at, but never said, is that there might be newer skills on the other side that are even more rewarding, which may be true. But if history is any indication, there will be no shortage of folks who like things the old way and want to use their meat brains to provide bespoke goods and services that AI can't.

[−] coxmi 49d ago
This is an insane claim:

> The entire point of civilization and society is that we are all "addicted" to technology and progress.

Technology is like much of material reality, in that we can think whatever the hell we like about its various forms, especially so if we’re surrounded by it.

[−] f1shy 49d ago
The line is fine. Even if I use GPS a lot, I still try to keep my ability to interpret maps and find my way.

Same with calculators: even though they are dirt cheap today, they are often not allowed in school, because being able to do math without one is a valuable skill.

So maybe there are two groups of things: some where using them costs you nothing, and some where using them costs you a valuable ability.

[−] andai 49d ago
Reposting a comment (and one of the replies) since it's relevant:

---

It occurred to me on my walk today that a program is not the only output of programming. The other, arguably far more important output, is the programmer.

The mental model that you, the programmer, build by writing the program.

And -- here's the million dollar question -- can we get away with removing our hands from the equation? You may know that knowledge lives deeper than "thought-level" -- much of it lives in muscle memory. You can't glance at a paragraph of a textbook, say "yeah that makes sense" and expect to do well on the exam. You need to be able to produce it.

(Many of you will remember the experience of having forgotten a phone number, i.e. not being able to speak or write it, but finding that you are able to punch it into the dialpad, because the muscle memory was still there!)

The recent trend is to increase the output called programs, but decrease the output called programmers. That doesn't exactly bode well.

See also: Preventing the Collapse of Civilization / Jonathan Blow (Thekla, Inc)

https://www.youtube.com/watch?v=ZSRHeXYDLko

---

Munksgaard 1 day ago:

Peter Naur had that realization back in 1985: https://pages.cs.wisc.edu/~remzi/Naur.pdf

[−] allenrb 49d ago
For what it’s worth (i.e., absolutely nothing), I agree with her 100%. I didn’t get into this field in order to prompt an AI to take care of the details. I got into it because I love the details.

I’m a strong performer on a good team at a company many people would want to work at… and I know the clock is ticking. Sooner or later, I will be too slow.

I’m not going to claim that this is the wrong way to go. It’s obviously the future, and the future doesn’t care what allenrb does or does not want. I’m somewhat hopeful that power and cooling requirements will come down by multiple factors of 10x over time, reducing the environmental damage.

The fact is, I love what I’ve been able to do “the old way” and just don’t feel the urge to move on. So it goes.

[−] mstank 49d ago
While I applaud her and wish her well — writing like this reminds me of a couple of things.

First, my aging father insisting on navigating with his unfortunately fading memory instead of Google Maps. Some people just won’t pick up technology, out of habit or spite, even if it hinders them.

Second, a quote I read here that I’ll paraphrase: “you can be the best marathon runner in the world and still lose a race to a guy on a bike.” Know the race you’re racing. It often changes.

I think it’s valid and commendable to keep the old ways alive, but also potentially dangerous to not realize they’re old ways.

[−] firmretention 49d ago
I've found two personal use cases for LLM generated code:

(1) I have an idea for some app, but either I feel it won't be useful enough/save me enough time to justify developing it, or I simply don't feel the problem is interesting enough to be motivated by it. In that case, a vibe coded tool is perfect. It generally does one simple thing, and I don't care about long term maintenance, because it just needs to keep doing that thing.

(2) Adding a feature to an open source project. Again, it's a case of "I want this feature, but am not willing to spend the time needed to implement it." Even a relatively simple open source project can take a day or two just to get a basic understanding of the code and where I need to make the changes. Now I can often just get a functioning vibe-coded implementation within a few hours.

(2) leaves me with some unsettling feelings about how this will affect the future of open source software. Some of the features I've implemented this way may very well be useful to other users, but I can't in good conscience just dump a vibe-coded pull request on a project and expect them to do the work of vetting it. But if I didn't have the energy to implement the change myself, I'm definitely not going to bother going through all the LLM-generated code and cleaning it up to the standards of the project. Whereas before I didn't have a choice, and the idea of getting the change ready for a PR was much less daunting, since I understood the problem space and solution well.

So at least for myself, I can see a future where many of the apps I use are bespoke forks of popular applications. Extrapolate that to many, many people and an interesting landscape emerges.

[−] trinsic2 49d ago
Yeah, I think this article put a finger on what I was feeling after using Claude Code for the first time to convert a PDF to a Markdown document[0]. I think I will update my article with these thoughts. Thanks for touching on something I had been feeling. It also felt like I was cheating. I also used CC to update the version of my SSG, and that was fine because I did not want to spend my time dealing with that. But there are certain projects where I can see myself not feeling good about using the tool for help.

[0]: https://www.scottrlarson.com/publications/publication-my-fir...

[−] Nevermark 49d ago
Some of us have been waiting our whole lives for a comprehensive DWIM command.

> DWIM is an embodiment of the idea that the user is interacting with an agent who attempts to interpret the user's request from contextual information. Since we want the user to feel that he is conversing with the system, he should not be stopped and forced to correct himself or give additional information in situations where the correction or information is obvious. [0]

— Warren Teitelman and his Xerox PARC colleague Larry Masinter, in 1981

[0] https://en.wikipedia.org/wiki/DWIM

[−] haolez 49d ago
I don't have a stake in AI, but more and more I see the following patterns:

- People who give in to AI do so because the technical merits suddenly became too big to ignore (even seasoned developers who were previously against it)
- People who avoid AI center their arguments on principles and personal discomfort

Just from that, you can kind of see where this is going.

[−] basket_horse 49d ago
Why can’t people just acknowledge that AI is good at some things and bad at others? Why does every post say AI is either groundbreaking or terrible? Get a grip, people. It’s a tool.
[−] politelemon 49d ago

> When you know CSS well (and I would consider myself as someone who does), you quickly find weird and broken things in the generated code.

You have encountered https://en.wiktionary.org/wiki/Gell-Mann_Amnesia_effect

[−] PeterStuer 49d ago
I am the opposite. I do not understand how people do not get at least 5x more productive with GenAI.

Maybe it is because I do not do much front-end design. Maybe it is because I'm a bit more diligent than your average "viber", or maybe because it is easier for me to spot a suboptimal solution or to challenge it with edge cases from experience.

But these people turning their backs, not in principle, because that I fully understand, but because of underperformance?

Maybe their expectations are way out there? Maybe (most likely) it is the application domain? Maybe plainly a skill issue?

But seeing how GenAI is plowing through fields, I would not turn my back on it even if it wasn't there (yet) in my domain.

[−] thepra 49d ago
That's true for new projects: I've found that as a bootstrap mechanism it is not ideal; it takes away your hard-earned preferences/know-how and leaves you barely engaged on the technical path. On the other hand, I already have quite big codebases, some legacy and inherited, where a vast amount of context exists to work with and to figure out fixes/bugs/improvements, and Claude Code excels 95% of the time at figuring out the existing flows and functionality. If a vague or precise issue is known that affects some files or user workflows, it can do a good job of pointing out what the issues are, with a high chance of being right.
[−] tjchear 49d ago
Here’s my experience: just yesterday I had to tackle a task that would have taken a backend engineer and a frontend engineer several days, so I tasked several Claude Code agents to work on it autonomously. With the time freed up, I didn’t just twiddle my thumbs. I used it to read up on a topic that was making the rounds yesterday and gained a better understanding of it, something hard to do when you juggle both a job and raising a family. I can then reinvest that learning in other projects.

Just my two cents. No matter whether you use AI or not, I’m sure you’ll gain something.

[−] piloto_ciego 49d ago
There are always a lot of people afraid of change. People in my rough age bracket have lived in a time where change usually sucked for us, so we tend to look at this latest change and think, "oh my god, not again."

It's fear of being worthless in a society where previously being "smart" (because they could write code) made them special. The truth is, and it's a bummer, that none of us are special anymore. That's OK; get over it and go build cool things! Life is short.

[−] TheGRS 49d ago
I know I read several posts like these every day, but I've been thinking more and more that I should stick to the chat window and having AI guide me instead of doing the work for me. Great tool for showing me what to do when I don't know and offering me guidance, and rubber ducking of course, but I definitely lose the context and understanding when I just let it rip and write everything for me.
[−] cowlby 49d ago
Who else struggles with both sides of this? My engineer side values curiosity, brain power, and artisanship. My capitalist side says it's always the product, not the process. My formula is something like this: product = money, process = happiness, money != happiness, no money = unhappiness.

I think the optimal solution is min/maxing this thing. Find the AI process that minimizes unhappiness, and maximizes money.
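Taken literally, that min/max reads like a small constrained optimization: among workflows that clear some happiness floor, pick the one that pays the most. A toy sketch in Python (the workflow names, scores, and floor are all made up for illustration):

```python
# Toy model of the "minimize unhappiness, maximize money" formula above.
# Workflows and scores are invented; both axes on arbitrary 0-10 scales.
workflows = [
    # (name, money, happiness)
    ("hand-craft everything", 4, 9),
    ("full vibe coding", 8, 3),
    ("AI for boilerplate, craft the core", 7, 7),
]

HAPPINESS_FLOOR = 5  # keep the process bearable; "no money = unhappiness" handled below

# Filter out workflows that make the process miserable...
viable = [w for w in workflows if w[2] >= HAPPINESS_FLOOR]
# ...then maximize money among what remains.
best = max(viable, key=lambda w: w[1])
print(best[0])  # prints "AI for boilerplate, craft the core"
```

The point of the toy is just that the constraint changes the answer: pure money-maximization would pick full vibe coding, but the happiness floor rules it out.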

[−] itomato 49d ago
What did you first order at the bar? Did it burn? Did you become a teetotaler?

Latecomers lack the hundreds of iterations and the experience that comes with it. The senses haven't been trained.

There's a business here. Not one I want to be in, or one without major ethical drag, but a viable one. Fend off extinction, get fit.

[−] heathrow83829 49d ago
You are lucky that you have the luxury of being able to work at a normal pace and not be pressured to go "even faster".

My boss is constantly saying "go faster, go faster". AI, unfortunately, is the only way to keep up if you want to keep your job in this day and age.

[−] peacebeard 49d ago
Pretty insubstantial, high-level tour of broad AI pushback. It goes from "It was just not 'elegant.'" [sic] to "I don't want to give up my brain" [sic].
[−] api 49d ago
I'm a little tired of the environmental argument against AI. It feels contrived, like people are fishing for a "problemism" to use to oppose it to avoid harder discussions.

Let's compare AI to one typical 20 mile round trip commute. I asked Gemini and Claude and compared to see if the results looked good, but feel free to check.

One ~20 mile round trip commute: about 5700 Wh in an EV, about 27000 Wh in a gas car (due to thermal efficiency).

Comparing to the EV that's about 1,400 ChatGPT queries, 2,800 AI code completions, and 380 AI image generations.
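For what it's worth, those figures roughly check out as back-of-envelope arithmetic. A quick sanity-check script (every per-item energy number below is a rough assumption, not a measurement; real values vary by model and vehicle):

```python
# Rough check of the commute-vs-AI energy comparison above.
# Assumed figures (estimates only):
EV_WH_PER_MILE = 285        # typical EV efficiency
GAS_WH_PER_GALLON = 33_700  # energy content of a gallon of gasoline
GAS_MPG = 25                # typical gas car fuel economy
QUERY_WH = 4                # commonly cited per-ChatGPT-query estimate
COMPLETION_WH = 2           # assumed per code completion
IMAGE_WH = 15               # assumed per image generation

commute_miles = 20
ev_wh = commute_miles * EV_WH_PER_MILE                # 5700 Wh
gas_wh = commute_miles / GAS_MPG * GAS_WH_PER_GALLON  # 26960 Wh

print(f"EV commute:  {ev_wh} Wh (~{ev_wh / QUERY_WH:.0f} queries, "
      f"~{ev_wh / COMPLETION_WH:.0f} completions, ~{ev_wh / IMAGE_WH:.0f} images)")
print(f"Gas commute: {gas_wh:.0f} Wh")
```

With these assumptions the EV commute lands at roughly 1,400 queries, 2,850 completions, and 380 image generations, close to the figures above, and the gas commute is ~27 kWh, dominated by the fuel's thermal energy rather than useful work at the wheels.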

Ordering lunch on Doordash uses the same power as days and days worth of very heavy AI usage, and that's if the dasher is driving a very efficient car. If they're driving an inefficient gas car it's like weeks of heavy AI usage.

Ultimately what matters is where we get our power. If we are getting it from CO2 emitting sources, what we do with it after that is not relevant. Make AI memes? Order burritos? Boil spaghetti? Who cares. The solution is to replace CO2 emitting sources with cleaner sources.

I also think people are avoiding the big fat elephant: wealth inequality. The whole problem with AI that bothers people is loss of jobs and possible wage suppression. The problem isn't AI, it's inequality and the fact that our system is basically regressive at this point with wealth being actively transferred upward.

But that's a hard complicated discussion and involves confronting powerful forces. It's easier to make stuff up about AI being some uniquely bad energy or water waste when it's not. This is really what "problemism" is all about: using a contrived or exaggerated or mis-attributed problem to avoid a hard or complicated conversation.

[−] machinecontrol 49d ago
How is there such a wide gap between developers? Some are running ten agents pushing thousands of LOC a day, while others are afraid to paste a code snippet from ChatGPT.
[−] rickdeaconx 49d ago
The idea of getting information from other engineers relies on the idea that the other engineers aren't already complacent with AI.
[−] lowlander 49d ago
Patient: "Doctor, it hurts when I do this." Doctor: "Then don't do that!"

I'm finding that how you choose to use it makes all the difference in whether it's useful. I understand the reticence to jump on the hype train; it's taken some reps to find the parts of building with AI that I don't like, learn how to navigate them, and keep the AI from making choices I wouldn't make or that are low quality.

> asking for a recommended tech stack

this is up to you. you can just tell it what tech stack to use. better yet, bootstrap the project yourself and give it to AI as the starting point. nobody is saying AI has to make these choices for you and you're not allowed anymore.

> I wasn’t happy with some of them because of my own experiences in the past... Even when deciding against something for a reason, Claude Code tried to push me back on the suggested track.

this kind of sounds like many human teammates at work... you don't always like their suggestions or they aren't convinced by your arguments? the difference being with AI you can just tell it what to do, no persuasion required.

nothing about AI prevents you from thinking about design choices, architecture, data modeling, or even the minutiae if you want to. the only thing telling AI to do those things for you is you!

[−] ikidd 49d ago
Good lord, don't hire a plumber to fix the pipes, you'd better do it yourself then. Maybe you like plumbing; then plumb. But most people will just pay for the person that does a good enough job that they can take a shower in the morning. Personally I think I do a better job so I do it myself, but my wife sure as hell wouldn't, and shouldn't.

The sheer selective blindness is mind boggling. I get so sick of seeing this virtue signalling.

[−] tim-tday 49d ago
Best argument I’ve seen
[−] par 49d ago
Any sufficiently advanced technology is indistinguishable from magic. -Arthur C Clarke
[−] TaupeRanger 49d ago

> I don’t want to feel this kind of “addiction.”

Getting a feeling of "wanting to keep going" with something does not automatically make it an "addiction".

> I don’t want to depend on something doing the work I earn money with.

A tale as old as time, and a valid feeling, though not particularly helpful to dwell on since the technology will never go away and never get worse than it is right now.

> I don’t want to give up my brain and become lazy and not think for myself anymore.

Your brain will think about other important things and you don't need to become "lazy" just because a machine is doing something that used to require more effort on your part.

> I enjoy technical discussions with (human) co-workers.

So what? You can still have those discussions.

> I enjoy reading blog posts and tutorials and learning from other developers.

So what? You can still read those blogs, but the subjects might shift away from coding minutiae to other topics.

> I want to learn and grow and become better at what I am doing by trial and error and mistakes I make all by myself.

Going forward, that trial and error process will start to happen more at the product/project level rather than the source code level.

> I don’t want to be part of a trend/hype destroying our planet even faster than we already do without it.

It isn't. https://blog.andymasley.com/p/the-ai-water-issue-is-fake

[−] 66yatman 41d ago
Good luck. You didn’t take a big enough sip; you aren’t afraid of all that Dario has said.
[−] aeternum 49d ago
A lot of this is just wrong. AI can now see the output.

It's interesting that highly flawed opinion pieces like this are so popular.

[−] slumpt_ 49d ago
to be clear, the work you’re doing is only relevant to the extent it produces product

nobody cares if you use your brain or not. they do care if you’re efficiently delivering reliable product

[−] helenazdenova 49d ago
[flagged]
[−] jeremie_strand 49d ago
[flagged]
[−] jeremie_strand 49d ago
[dead]
[−] rspoerri 49d ago
Human thinks eating food from fire is bad because it looks charred and you might burn your fingers doing so. /s

I remember the time when people insisted that they would never use a mobile phone. I remember the time when people didn't understand my presentation about the magical "internet" (8th-grade school, in '94).

[−] ognav 49d ago
[flagged]
[−] jr3592 49d ago
I love posts like this. AI is easily the most disruptive thing to hit our industry in over a decade and it feels like one of those "this changes everything" moments. Reading how it's impacting others is cathartic and helps shape my own understanding.

Here are some thoughts I have from reading this article:

> The AI can’t “see” the output, so some responsive refinements were just not correct. Within one CSS rule block there were redundant declarations.

This 1,000%.

Vibe coding has its issues, and for me personally, frontend polish, responsiveness, and overall quality is the most glaring of them, and one that simply re-prompting often can't solve.

Even with the ability to screenshot your UI, that hasn't solved things like glitchy animations. If you want to do anything even remotely above a junior level, like scroll animations, page transitions, etc., good luck. AI will certainly try to do it for you, but inevitably it will not work perfectly, and you will need to manually refine or even rewrite code. When the codebase isn't yours, those rewrites are a lot less fun.

> The guilty conscience at the same time, like I was cheating. I realized that when I move on like this, my project will never truly feel like my own.

I've wrestled with this over the last year, and still do to some extent. I'm trying to shift my perspective and envision myself as a brand new developer maybe 16 or 17 years of age. Would I think this isn't my work? I doubt it. I'd probably just (correctly) assume that this is the state of the art, this is how you do it.

Unfortunately this doesn't fix a bigger problem... I just don't enjoy vibe coding as a craft. There's something special about sitting down in the morning with your coffee and taking on a difficult programming problem. You start writing some code, the solutions start to formalize in your mind, and there's a strong back-and-forth effect where, as you code, the concepts crystallize further. Small wins fuel a wonderful dopamine-hit experience: IntelliSense completions, successful compilations, page refreshes, etc. These are now all replaced with dull moments spent waiting for the agent to return its response, which you then read.

> I’m curious (and a little bit scared) to see where we will go from here. I hope that in the end I can be part of a community that values craftsmanship, individuality and honest, high-quality work.

I really hope so too... But speaking honestly, I think this ship is sailing away quite quickly.

Time is money, and it always has been this way. Very few organizations can afford the luxury of time when building, designing, etc. I see no chance for this genie to go back in the bottle, and I believe it has (and will continue) to fundamentally change the nature of our work.

Over time as these models improve, there's a chance it could dramatically reduce the overall need for developers... It will start with low level teams as we're seeing already, but could expand.

I have been saying this to everyone -- what's your exit strategy?

I'm not saying you need to panic, but you need a plan for what happens if/when salaries tank dramatically. I hate to be "that guy", but in life I've found that expecting the worst isn't always a bad thing. Keep your mood up, prepare for the worst possible outcome, and be pleasantly surprised if that's not what happens.

[−] yomismoaqui 49d ago
I would pay money to read rants on a forum like Hacker News, but from 190X, when cars started sharing the roads with horse carriages.

I bet it would look something like the posts we are seeing today with developers and agentic AI.

[−] JPKab 49d ago
"I'm not going to use this technology that obviously enhances my productivity because ."

I think a lot of people have forgotten why we actually get paid to write code. The person who wants an automated billing system doesn't care if you hand-typed it or not, or if the CSS that would have taken 2 hours to write took 8 seconds via an AI plus 60 seconds of you tweaking a border you didn't like. They just want their billing system. And if you are the person who takes 20x longer to build it, you're going to quickly get outcompeted. Sorry.