April 15th, 2025

The Post-Developer Era

Human developers are still crucial in software development despite AI advancements. AI tools need human oversight, and the job market challenges stem from economic factors, not AI replacement. Coding skills remain valuable.


The article reflects on the ongoing relevance of human developers in the face of advancing AI technologies, particularly large language models (LLMs). Initially, there was widespread belief that AI would soon replace human developers, especially after the release of GPT-4 in 2023. However, the author argues that while AI tools are increasingly used in coding, they do not operate independently and still require skilled human oversight. For instance, at Google, AI contributes to about 25% of code, but human developers remain essential for guidance and quality control. The author shares experiences from teams using AI tools like Devin, which struggled to complete tasks effectively, reinforcing the idea that human developers are still necessary. The job market remains challenging, not due to AI replacing developers, but because of economic factors and misconceptions about AI's capabilities. The author believes that as companies recognize AI's role as an enhancer rather than a replacement, hiring may increase. Despite concerns about the future and the potential for a "developer renaissance," the author maintains that coding skills will continue to be valuable.

- Human developers remain essential in software development despite advancements in AI.

- AI tools are increasingly used but require human oversight for effective implementation.

- The job market for developers is tough due to economic factors, not AI replacement.

- Companies may begin hiring more as they recognize AI's supportive role in development.

- Coding skills will continue to be valuable in the evolving tech landscape.

20 comments
By @dasil003 - 26 days
This whole post-developer idea is a red herring fueled by investor optics.

The reality is AI will change how software is built, but it's still just a tool that requires the same type of precise thinking that software developers do. You can remove all need to understand syntax, but the need for translating vague desires from laypeople into a precise specification will remain—there's nothing on the AI horizon that will solve for this, and savvy tech leaders know this. So why are we hearing this narrative from tech leaders who should know better? Two reasons:

First is that the majority of investor gains in the US stock market have been fueled heavily by AI hype. Every public tech CEO is getting questions from analysts about what their AI strategy is, and they have to tell a compelling story. There's very little space or patience for nuance here because no one really knows how it will play out, but in the meantime investors are making decisions based on what is said. It's no surprise that the majority of execs are just jumping on the bandwagon so they have a story to keep their stock price propped up.

Second, and perhaps more importantly, regardless of AI, software teams across the industry are just too big. Headcount at tech companies has ballooned over the last couple of decades due to the web, the smartphone revolution, and ZIRP. In that environment, the FAANGs of the world were hoarding talent just to be ready to capitalize on whatever came next. But the ugly truth is that a lot of the juice has already been squeezed, and the actual business needs don't justify that headcount over the long term. AI is a convenient cover story for RIFs they would have done anyway; it just ties them up with a nice bow for investors.

By @rmnclmnt - 26 days
The other day I was reviewing the work of a peer on a relatively easy task (using an SDK to manage some remote resources given a static configuration).

Several times I found myself asking "why did you do it this way? This is so contrived." I should have known better, because of course the answer started with "… I know, but Copilot did this or that…". And of course there were no tests to properly validate the implementation.

The sentiment expressed in the article, that developers won't even bother to validate the output of coding assistants, is real. And that's one of the biggest shames of the current hype: quality has already been decreasing for the past 10 years, and the signs indicate it will only go downhill from here.

By @osigurdson - 26 days
The "AGI is right around the corner" argument is effectively corporate malpractice. No, shareholders don't want you to sit around and do nothing (other than a few layoffs here and there) while waiting for AGI.
By @ern - 26 days
I think the crux of this post is spot-on: we’re nowhere near a “post-developer” era. LLMs are great accelerators, but they still need a competent human in the loop.

That said, I do think he understates how much the value of certain developer skills is shifting. For example, pixel-perfect CSS or esoteric algorithmic showmanship used to signal craftsmanship — now they’re rapidly becoming commoditized. With today’s tools, I’d rather paste a screenshot into an LLM and tweak the output than spend hours on handcrafted layouts for a single corporate device.

By @mleroy - 26 days
Over the past 30 years, computers and software have dramatically transformed our world. Yet many sectors remain heavily influenced by their analog history. My understanding is that the HN community has always recognized that the future is already here, just not evenly distributed across work environments, administration, and general processes. Didn't many of us believe that numerous jobs could be replaced by a few lines of code if inputs and outputs were properly standardized? The fact that this hasn't happened or has occurred very slowly due to institutional inertia is another story altogether. Whether software development will become a "bullshit job" or how the world will look in a few years remains unknown. But those who constantly praise their work as software developers while simultaneously acknowledging that other non-physical jobs and processes could be fundamentally overhauled are living in a cognitive bubble—something I wouldn't have expected in this community.
By @stana - 26 days
After some experience, picking up a new programming language is not such a big challenge; mastering the new language's ecosystem is. AI might help you generate code faster, but churning out code has never been much of an issue. Bigger-picture system design, building something that scales and is maintainable, is the challenge.
By @notepad0x90 - 26 days
A big part of coding is understanding the code and making decisions about what needs to be added, removed, or changed. LLMs can code, but even if they generated perfect code every single time, someone would still need to read the code and make decisions. Others speak about understanding business logic, interfacing with clients and stakeholders, etc. I get that, but even setting that aside, someone will always need to decide how things should be done and improved. LLMs are not going to benchmark your code, and they will never understand the developer's or client's intent perfectly.

Why are LLMs in this context being viewed as more than really powerful auto-complete/IntelliSense? I mean, correct me if I'm wrong, but aren't LLMs still years away from auto-generating a complete, complex codebase that works without fixing a lot of bugs and assumptions?

By @garylkz - 26 days
We've supposedly been on the verge of AGI since, like, the 1970s. Every time there's an advancement in AI, there's speculation that "robots are gonna take our jobs in 10 years."

That said, at least we have reached the phase where AI tools are commercialized, so that's another +1 I guess.

By @xnx - 26 days
There used to be a job of a "Typist". Now everyone does their own typing.

In the near future we'll probably see a lot more subject matter experts creating their own tools instead of requiring a dedicated person to translate their requirements.

By @colesantiago - 26 days
This "AGI" definition is extremely loose depending on who you talk to. Ask "what does AGI mean to you?" and sometimes the answer is:

1. Millions of layoffs across industries due to AI, with some form of questionable UBI (not sure this works)

2. $100BN in profits (the Microsoft/OpenAI definition)

3. Abundance of slopware (the VC definition)

4. Raising more money to reach AGI/ASI

5. Any economically significant job that a human can do

6. Safe AI (the researchers' definition)

7. All of the above that AI could possibly do better

I am sure there must be an industry-aligned, concrete definition that everyone can agree on, rather than these goalpost-moving definitions.

By @terminatornet - 25 days
> It’s like a tag team wrestling match; when I hit a task that Claude would excel at, I tap out and let him tackle it.

Please do not anthropomorphize the AI, it's not a he.

By @fire_lake - 26 days
The post-developer era will be the post-white-collar era.

Surely the level of intellectual difficulty in software engineering is similar to practicing law, medicine, or banking?

By @mediumsmart - 25 days
The client sends the new menu as a Word document. Select all > copy > paste this wall of text into an LLM > please proofread > please turn into semantic HTML > please translate into German, French, and Italian. That takes the three minutes a developer needs to formulate a prompt that will get a result with only two bugs.

Know if it actually is your tool.
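The workflow this commenter describes is a simple prompt chain. A minimal sketch, assuming a hypothetical `complete(instruction, text)` callable standing in for whatever LLM API is actually used (the function name and prompt wording here are illustrative, not from the original comment):

```python
def publish_menu(raw_menu_text, complete):
    """Chain the steps described above: proofread -> semantic HTML -> translations.

    `complete(instruction, text)` is a hypothetical stand-in for an LLM API
    call: it takes an instruction plus the working text and returns the
    model's output as a string.
    """
    # Step 1: proofread the pasted wall of text.
    proofread = complete("Please proofread this text.", raw_menu_text)
    # Step 2: convert the cleaned text into semantic HTML.
    html = complete("Please turn this text into semantic HTML.", proofread)
    # Step 3: translate the HTML into each target language.
    return {
        lang: complete(f"Please translate this HTML into {lang}.", html)
        for lang in ("German", "French", "Italian")
    }
```

Each step feeds the previous step's output forward, which is why the whole thing takes only as long as formulating the prompts.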

By @andrewstuart - 26 days
Anyone who thinks AI will replace programmers isn’t doing much AI assisted coding.

AI is at best very helpful.

It’s a very very long way from making programmers obsolete.

By @intelVISA - 26 days
Truth is we've been post-developer for a while; powerful machines and high-level abstractions have turned good developers into managers, translating business objectives into software products and managing the outcomes with much less typing of code -- they're still called software engineers for legacy and cultural reasons, of course.

Sadly most shops greatly overlook this reality, stuck in 1991, and continue to add redundant layers and staff thinking it makes them 'agile'.

By @Fruitmaniac - 25 days
Saying AI writes code is like saying your clarinet plays music.
By @killjoywashere - 26 days
Someone kill that muppet that jumps out of the margin halfway down. Please.
By @ZainQasmi - 26 days
Funny how a foreign country got America to compromise on its core value of free speech that we used to lecture Europeans on.
By @wyclif - 26 days
> Americans inexplicably re-elected a wildly incompetent conman to be president

In fact, this was totally and easily explicable. The dominant political party tried to convince Americans that their current leader was not suffering from severe cognitive decline and was "sharp as a tack" (a Biden admin talking point). NPCs lied to Americans about Trump being "easy to beat" by Kamala, and lied to them about Trump calling Charlottesville protestors "very fine people," thinking those fake attempts to make Trump look like he was endorsing Nazis wouldn't backfire explosively.

So, no, it was not "inexplicable" at all; it was rather a whiplash effect initiated by media narratives originating in the Democrat ecosystem. And don't forget: Trump, Elon, and Rogan are all ex-Democrats. I wonder, too, if the author was one of those people who deluded himself into thinking "Kamala is a great candidate" against all the evidence.

So the "Russiagate" hoax failed, "Kamala is awesome" failed, elites orienting themselves around appeasing far-left pressure groups failed, smug contempt for middle America failed (astonishingly), and yet it is "inexplicable" why Orange Man Bad won the election!

By @simonw - 26 days
"If you’re passionate about software development, or if you see it as the best opportunity for you to earn a high salary that’ll lift you into the upper middle class, I really hope you won’t let yourself get discouraged by AI hype."

+100 to that