I gave up on r/programming after an article I wrote (thoughtfully, without AI, even though the content might not have been super interesting) got mod-slapped with a stickied comment "This content is low quality, stolen, blogspam, or clearly AI generated".
Ironically, that comment was added three months after I posted the article, when it was nowhere near the front page anymore, in a clearly automated and AI-driven review.
Reddit is a low-quality platform, and the sorts of people who would be interested in moderating a popular subreddit like r/programming are even less fit to be moderators than the average moderator. It would be better if people stopped using the platform entirely.
Do you think maybe it's the disclosure about self promotion you added? You explicitly say the purpose of the blog post is to promote your consultancy, so that might be why they marked it blogspam. I know it feels like you're being forthright, but really that you're promoting yourself is implicit in the fact it's a personal blog, so you can leave that out and still be honest.
Yeah, maybe it's that, though I still wouldn't expect someone to categorize the post as blogspam, even if they just glance at it. (At least according to my definition of blogspam, but I guess each has their own.) But yes, pragmatically I should probably remove the disclaimer.
I gave up on Reddit after many years of acting as characters on the Venture Bros subreddit. Every so often I would retire my account and begin a new one. I've had MANY over the years. I've used Reddit "cleanup" apps to remove/clean the content I've created. Good stuff over time, very niche and specific to VB.
I gave it all up when Reddit started recycling my old accounts and reposting my content as if it were new -- but not authored by me, just regurgitated back onto the site.
If that happened to me, you can bet it's happening en masse. Which indicates to me that the site is really dead.
AI programming is fundamentally different from programming, and as such the discussions merit separate forums.
If r/programming wants to be the one solely focused on programming, then more power to them. Discussing both in combination also makes sense, but the value of Reddit is having a subreddit for anything, and “just programming” should be on the list.
> AI programming is fundamentally different from programming
It's really not. Maybe vibecoding, in its original definition (not looking at generated code) is fundamentally different. But most people are not vibe coding outside of pet projects, at least yet.
Hopefully this does not devolve into ‘nuh-uh’-‘it is too’ but I disagree.
Even putting aside the AI engineering part where you use a model as a brick in your program.
Classic programming is based on the assumption that there is a formal, strict input language. When programming I think in that language; I hold the data structures and connections in my head. When debugging I have intuition about what is going on because I know how the code works.
When working on somebody else’s code base I bisect, I try to find the abstractions.
When coding with AI this does not happen. I can check the code it outputs but the speed and quantity does not permit the same level of understanding unless I eschew all benefits of using AI.
When coding with AI I think about the context, the spec, the general shape of the code. When the code doesn’t build or crashes the first reflex is not to look at the code. It’s prompting AI to figure it out.
It is not. One version of a compiler on one platform transforms a specific input into an exact and predictable artefact.
A compiler will tell you what is wrong. On top of that the intent is 100% preserved even when it is wrong.
An LLM will transform an arbitrarily vague input into an output. Adding more specification may or may not change the output.
There is a fundamental difference between asking for “make me a server in go that answers with the current time on port 80” and actually writing out the code where you _have to_ make all decisions such as “wait in what format” beforehand. (And using the defaults is also making a decision - because there are defaults)
Compilers have undefined behaviour, but UB exists in well-defined places.
Even a 100% perfect LLM that never makes mistakes has, by definition, UB everywhere the spec is silent.
Right, they allow for the idea of gradual specification - you can write in broad strokes where you don't care about the details, and in fine detail when you do. Whether the LLM followed the spec or not is mostly down to having the right tooling.
There can't be any interesting discussion about AI programming. Every conversation boils down to what skill files you use, or how Opus 4.6 compares to Codex, or how well you can manage 16 parallel agents.
Seems a lot of commenters here dislike their decision; I like it, though.
LLM-generated projects, articles, and blogs are low-effort products lacking authenticity.
And the discussion of LLMs themselves can in the long run be fairly tiring; follow r/LocalLLaMA for a while and you'll see what I mean. If you are really into LLMs, though, that sub is great.
It is simply not fun to go to a subreddit and see 90% of posts be projects and blogs that were obviously created using AI, with authentic content pushed to the side by the sheer volume of artificial work. r/Python was horrible at one point, but the mods have been stepping up their game.
I created an account and started reading this site primarily for programming news when r/programming took a precipitous dive in quality around 2020 or so. Before that it was one of the few good communities there, but it quickly became show and tell (ironically, this was against its unenforced rules), and any really interesting posts had no discussion. But then I noticed the "Other Communities" tab would show posts from an HN-mirror sub that tracked posts here, and suddenly I was able to get great information. A post about CockroachDB that had 20 boorish comments complaining about its name over there would have its designer over here answering technical questions about its capabilities.
THAT SAID, I think this might be what gets me to go back to that place. I used to come here to read about new Python tooling, the latest database development news, interesting thinkpieces on development practices, etc. Now it's dominated by AI evangelism, "I'm Showing HN™ What I Used My Claude Tokens On :)", AI complaining, AI agent strategies, news about AI's impact on the industry, etc. There are some non-AI posts, but not as many good ones as there used to be, and a lot of the non-AI posts quickly turn out to be AI-written, because the authors respect their own time as writers greatly and my time as a reader not at all. It's ClankerNews; the Hackers are in short supply.
I know this is snarky; I'm sorry ahead of time. But I don't know how else to make this point...
The fact that the people running r/programming don't know to wait until April 2 to publish this tells me that they don't have real-world experience shipping software in a business environment.
We are SO past the point of software being developed without LLMs at _all_, the trend line is never going to reverse. I don't understand the people digging in as zero LLM absolutists.
Maybe this was a genius move made precisely to be ambiguous on whether it was April Fools or not... so that the author can later read the room and clarify whether it was or was not April Fools, without much repercussion either way.
This is to be expected. There's a definite split in the engineering community between those who are embracing AI, and those who are rejecting it. It's now become political, like systemd and wayland.
If you enjoy comedy, you should check the status of subreddits like /r/selfhosted or /r/homelab, etc. I find them interesting because they sit on the edge between computer power-users and software developers. They used to be nice communities.
Now it’s people sharing AI apps that look exactly like other AI apps that they have never heard of [1]
Projects rise and then implode hilariously within a month [2]
An ebook management project grew over a year with a pretty conservative feature set, then in 3 months implemented every ebook feature under the sun, broke everything, and imploded. The funniest thing is when the "AI Slop" callout is itself AI-written and nobody notices. [3]
Like… amazing comedy. Then after the owner deletes the repo, 10 people have to role-play the hero who “has the code” because clicking Fork on GitHub is the sign of a true hacker.
I had almost forgotten about that subreddit. Sadly it has been in a zombie state for years now. Despite having millions of members, you can hardly find even 100+ comments on any post on the front page.
Last time I checked, only political posts (like those about offshore programmers) got any kind of attention. Most technical posts barely get 10 comments. Some of the smaller subreddits (like /r/ProgrammingLanguages) are much better.
Wow that's lovely. Wish we could do that on HN for a bit.
(Yes, I know, I can install an extension or something to hide LLM/AI submissions. I don't want to, and that's not the same thing, and won't have the same effect.)
I use LLMs, I think they are useful, but oh my sweet jesus I am so tired of reading and hearing about them everywhere.
> We also believe that, generally, the community have been indicating that, by and large, they aren't interested in this content.
How can that be true? Reddit is vote-based. So if people weren't interested, they wouldn't vote it up and it wouldn't appear on the front page. Hacker News has no rule banning posts about Barbie and yet, amazingly, Barbie rarely makes it to the front page, because that's how upvotes work.
People clearly are interested enough to vote LLM related posts up, but a bunch of mods who don't like AI are upset enough to want to dictate what others can find interesting. Which is not unusual for Reddit.
The takes on LLM programming on reddit are hilarious and borderline sad. It's way past the point of denial, now into delusions.
They truly believe LLMs are close to useless and won't improve. They believe it's all just a bubble that will pop and people will go back to coding character by character.
Reddit is doomed anyway. People are using AI to start threads, and other people are using AI to comment on these threads. You can never know what you're interacting with.
> Ironically, that comment was added three months after I posted the article, when it was nowhere near the front page anymore, in a clearly automated and AI-driven review.
Still salty about it.
I generally fire-and-forget on Reddit, except in more niche communities where domain experts sometimes actually comment.
We're telling them what to do in a loop. Instead we should be declaring what we want to be true.
> But most people are not vibe coding outside of pet projects, at least yet.
Major corporations have had outages thanks to AI slop code. Lol the idea that people aren't vibe coding outside of pet projects is hilarious.
That genie's not going back into the lamp.
(Heck, I've leaned on LLMs to generate damned SwiftUI code for me.)
There are some true gems, though, usually in smaller, focused subreddits.
[1] https://old.reddit.com/r/selfhosted/comments/1r9s2rn/musicgr...
[2] https://old.reddit.com/r/selfhosted/comments/1rckopd/huntarr...
[3] https://old.reddit.com/r/selfhosted/comments/1rs275q/psa_thi...
https://sciactive.com/human-contribution-policy/
IMHO, Mitchell Hashimoto[^1] is a good example for the community of how to cooperate with modern LLMs.
[^1]: https://github.com/mitchellh
Which is fair, but just be honest about it.
/r/programming was already unappealing because they tend to be late to surface interesting content in comparison to HN.
> Please don't post comments saying that HN is turning into Reddit. It's a semi-noob illusion, as old as the hills.
If only, just this once, it were true. Sigh.
/r/assembly bans all discussion of 4GL
LLM programming isn't going away by not talking about it. It's time to move on, and eventually consider farming.