Stream of consciousness, Sept. 28th, 2025: AI & Coding

In the two or so years since ChatGPT came out, I never published anything related to AI. That's a bit odd, given that I'm working in a fast-moving startup and it has been on my mind every single day since then. I refrained from writing about it because I generally try to stay away from participating in hypes, and for the longest time it felt like there were only two camps: those who think it will make all programmers redundant by next month at the latest (the date keeps moving), and those who think it's a big bubble that will be gone next year. Usually, on hyped topics, reality ends up somewhere in the middle. I'm happy to accept that reality, but with hyped topics the discussion is so heated that it doesn't make sense to argue at that level in the beginning. It's mostly screaming into the void.

The other reason I didn't want to write about it is that it was too early to have an opinion on the topic. Things were moving fast, and admittedly, the jumps in quality between early models (say, ChatGPT 3.5 to ChatGPT 4) were huge. It was impossible to get a feeling for the trajectory we're on, because the technology was so new and every new release brought big improvements. I don't think that has necessarily changed, but the improvements seem more incremental now, so I feel a little more comfortable thinking about where we might be two years from now. My thinking will likely be wrong, though.

I still don't want to write about any direct predictions. I have a couple of potential routes in mind, just because I like to be prepared for what the future may hold, but they're mostly based on gut feeling, probably not that surprising, and therefore not really valuable to talk about. I may write them out one day, just because I haven't, but today is not the day.

What I do want to write about, though, is the effect AI has on software engineering. My exposure to this comes from two streams: the people I work with day to day, and the people I interview for open roles (which in my case involves a small code challenge). It seems like a large percentage of developers have simply decided to outsource everything they do to AI. It makes sense when you think about it: programmers often solve problems in a programmatic way; they automate things. If you have a hammer, everything looks like a nail, so automation is what we do, where we can. Additionally, many developers are lazy. They have often been praised for this, because it supposedly predestines them to find great solutions to problems nobody else thought about. In my experience, most lazy developers do not spark that genius. They're just too lazy to write good code, to put in the work on the boring things that would give a huge return, or to write tests or documentation.

And it seems like a large number of software engineers, across most seniority levels, have the idea that if they just hand all their tasks to AI, they can mostly get paid to write an instruction every couple of minutes while watching YouTube on their main monitor. I don't know how big this group actually is, but I know that it's more vocal, both online and in the interviews I conduct. The ratio of good submissions (which may include AI usage) to complete AI slop is roughly 1 to 10 right now. There seems to be a group of people that actually thinks companies are willing to pay them six-figure salaries for typing "Claude, fix this" into a textbox.

There is also the other extreme: people who, on principle, ban AI from their development workflow because they deem it bad and consider themselves more productive without it. Research seems to back their thinking, for now. If I had to pick a group, I'd rather choose this one over the vibe-hype bros. But I think neither group is correct. I think as an industry, we have largely failed to act like grown-up professionals who are presented with a new tool and need to figure out what to do with it.

I'd love to see way more discussion about this: how AI was used to actually make something easier or better. Not as a silver bullet that just one-shotted a feature it read from a Linear issue via an MCP server.

To be honest, I have no idea where I want to go with this; I've lost track and am mostly ranting at this point. Maybe it's just that, a rant about an industry that just threw away all the funny pictures of apes they spent their life savings on to follow a new golden calf.