Why is everyone trying to replace Software Engineers?
The article argues that software engineers are not at immediate risk of being replaced by AI, emphasizing the need for better communication with non-technical colleagues to highlight their problem-solving role.
The article discusses the ongoing discourse about the potential replacement of software engineers by AI technologies, particularly large language models (LLMs). Despite advancements in AI's ability to generate code, the author argues that software engineers are not at immediate risk of being replaced. LLMs struggle with problems outside their training data, and their reasoning capabilities are limited. The perception that software engineers could be easily replaced stems from a lack of understanding among non-technical colleagues about the complexities of software development. Many view software engineering as a simple translation of English into code, overlooking the intricate problem-solving involved. The author emphasizes the need for engineers to improve communication with non-technical team members to convey the value of their work effectively. By fostering a better understanding of the software development process, engineers can enhance collaboration and demonstrate their role as problem solvers rather than mere coders. This shift in perspective can help colleagues recognize that software engineers are integral to a company's success, rather than just a cost center.
- Software engineers are not in immediate danger of being replaced by AI.
- Many non-technical colleagues lack understanding of the complexities of software development.
- Effective communication between engineers and non-technical staff is crucial.
- Engineers should view themselves as problem solvers to enhance their perceived value.
- Fostering understanding can improve collaboration and business outcomes.
Related
Engineering over AI
The article emphasizes the importance of engineering in code generation with large language models, highlighting skepticism due to hype, the need for structural understanding of codebases, and a solid technical foundation.
Ask HN: SWEs how do you future-proof your career in light of LLMs?
The integration of large language models in software engineering is rising, potentially diminishing junior roles and shifting senior engineers to guiding AI, necessitating adaptation for career longevity.
AI-assisted coding will change software engineering: hard truths
AI-assisted coding is widely adopted among developers, enhancing productivity but requiring human expertise. Experienced engineers benefit more than beginners, facing challenges in completing projects and understanding AI-generated code.
AI Coding Is Based on a Faulty Premise
The article warns that increasing reliance on AI in software development may lead to poor quality, echoing past software crises. It emphasizes the necessity of human intuition and communication in coding.
The LLM Curve of Impact on Software Engineers
Large Language Models (LLMs) impact software engineers differently: junior engineers benefit greatly, mid-level engineers face limitations, senior engineers are skeptical, while Staff+ engineers use LLMs for innovation and prototyping.
0 - https://www.kalzumeus.com/2011/10/28/dont-call-yourself-a-pr...
Adding to this point, LLMs won't fully understand the problem, so they're simply incapable of developing complete solutions. I've had a lot of success with these tools in developing functions, but at that point I've already architected and designed the solution. Even so, it saves time. These should be regarded as developer productivity tools, not developer replacement tools.
"Replacement" is smoke and mirrors at this point; anyone who seriously tries it will quickly fail, with today's technology.
> CEO: Fantastic! I've already fired all of the doctors and nurses, how soon can it start doing surgeries?
> Scientist: ...
A better metaphor for software is writing a book. Everyone sees writing a book as borderline undoable, and the expectation is that it takes months and months. I think that software and writing have a lot in common (including lots of text, and even the verb we use: "write"), and that explaining ourselves as book-writers would help set better expectations.
They (CEOs and executives, shareholders, etc) don't like how much it costs to make software. They think software engineers make too much money.
You (a human) are always a cost they will try to reduce.
There's nothing special about SWEs vs any other class of worker. If they can be replaced or eliminated, they will be, just like any other worker.
1) We're expensive (at least for Poland: I can aim for ~6-8k PLN monthly after a few years of experience, while other engineers, even civil/mech, will make around 4-6k monthly with similar experience) [edit: and the minimum wage is 3.5k after taxes]
2) The work appears to be replaceable (you can replace us with LLMs to an extent; you can't do that with other jobs)
Says who? How do you know if solving programming problems is not just advanced pattern matching?
The fact that we find most (if not all) of our solutions on SO or talking to colleagues points towards the pattern matching hypothesis quite strongly.
To be clear: I am talking about the programming aspect of the job only, not the other important things (talking to customers/product managers, crafting solutions, etc.)
In many domains, the scope and complexity of software systems goes beyond the ability of a single software engineer to manage. A coordination layer becomes necessary when the number of engineers required goes beyond a threshold (say 5 or so). When the development effort must be coordinated over extended periods (say several months or years), mechanisms to raise capital and manage risk become necessary. These functions are why companies exist.
Consider that a massive increase in software engineer productivity will make coordination unnecessary for many kinds of software. In the market that opens up, companies with expensive executives, middle management and coordination inefficiencies will not be competitive. Smaller shops with a solo engineer or a team of less than 5 will outcompete larger players because their costs will be significantly lower. Massive one-size-fits-all products will be harder to justify when a small dev shop can quickly build or customise software for the unique requirements of a business or niche.
Before the CEOs stop needing engineers, engineers will stop needing CEOs and managers to coordinate their efforts and raise capital.
I realize they think they can finally do it by themselves, or rather by hooking up a bunch of fresh bootcamp devs with AI; and I look forward to watching them crash, burn and come back begging for help.
It will grow the sector and massively inflate the combinatorial explosion of system × system × system. Programmers will use AI as a tool in their toolbox -- I already do for boilerplate and tests and explaining code -- and they'll use it to write ever more software to work with ever more systems.
What it may do though is cut the bottom off the field. Very junior programming jobs could vanish, creating more of a learning cliff to really get into it. This is happening in every discipline and has been for a while. It's a real problem for our traditional method of school followed by entry level job. I think the future's going to be some kind of school integrated with apprenticeship followed by internship followed by a real job. Unfortunately that may suck in a lot of ways. Fields that are like that require too much 'paying your dues.'
Giving your product the ability to talk & work directly with the customer seems to me a much more valuable activity than covering all the edge cases that would arise during the development of such a solution with an LLM.
Remember, everything is a function [0] and LLMs support calling them. The trick is having the skill and domain expertise to establish bounded contexts and prompts that align with realistic product use cases and transition between them cleanly.
I know there is a lot of doubt that users would enjoy using a chat interface, but I strongly believe it would be popular as long as it actually works. If it's not well designed and the GUI tools are faster even for a novice, you will have wasted a lot of time/money for nothing.
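To make the "everything is a function" point above concrete, here is a minimal, vendor-neutral sketch of the tool-calling pattern; the function name, schema, and dispatch shape are hypothetical illustrations, not any particular SDK's API:

```python
import json

# A plain business function the product already exposes internally.
def lookup_order_status(order_id: str) -> dict:
    # In a real product this would hit a database or an internal API.
    return {"order_id": order_id, "status": "shipped"}

# The JSON-schema description the model sees. This is where the "bounded
# context" lives: the model can only act through functions we explicitly expose.
TOOLS = {
    "lookup_order_status": {
        "description": "Look up the shipping status of a customer's order.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
        "handler": lookup_order_status,
    }
}

def dispatch(tool_call: dict) -> str:
    """Execute a tool call of the shape {"name": ..., "arguments": "<json>"}
    that an LLM with function-calling support typically emits."""
    tool = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return json.dumps(tool["handler"](**args))

# Example: the model decided the user is asking about order 1234.
print(dispatch({"name": "lookup_order_status", "arguments": '{"order_id": "1234"}'}))
```

The design point is that the product's existing functions become the bounded context: the model can only do what the exposed schemas allow, which is where the skill and domain expertise mentioned above come in.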
I don't buy it. I also don't think it'll replace writers or artists. Just the low-brow, chum-bucket stuff, which in programming terms is a todo app or a web form.
Now you have a pneumatic crane; you can move stacks and stacks of things around with ease. You need fewer strong guys around, and instead there is a new need for a smaller number of people who are well trained in operating this pneumatic crane safely. You'll get more work done with fewer people.
That's what's happening with software engineering and LLM copilots; it's an industrialization of coding. You don't need 20 engineers to get that product out. You need a few that are really good operators of the machinery. Adept at prompting and reviewing code instead of writing code.
The idea that this is possible justifies the AI bubble and the software bubble at the same time.
The former own land, companies, and similar assets, and they profit from the difference between the value of what is sold and the wages they pay the labourers who produce that value.
Those people profit from wages going down.
One way to do that is to gain leverage over the job market by threatening to remove access to labourers. Unless those labourers are willing to accept a lower wage they can be replaced by AI and nobody will need them anymore.
Another way they can increase their profits is to force labourers to use GenAI tools to produce their work. There's constant pressure from the capital class to, "always improve productivity!" This is just another example of that.
I suspect there will come a point, perhaps, where some companies will decide to let developers go who refuse to use GenAI (or simply not hire them to begin with).
I don't know that we will see the technology become sophisticated enough to replace human developers entirely.
Companies will always look for ways to save money, and headcount is usually the biggest cost.
We'll slowly see more AI intersect with other jobs as people with hobbies who can engineer start making things.
Assuming AI actually delivers what it promises in the first place.
The job won't last forever, but it will outlive almost every other job.
Even were it not the case for disposable CRUD Android/web apps that represent the bread and butter for half the industry, the effect on the structure of popular media is alarming in ways I don't think anyone has a hope of understanding yet. I imagine kids coming up now will not be glued to their phones anywhere near as much as the current batch are, or if they are glued to something, perhaps it is a conversational agent prompting them through an earpiece or similar.
Hand-wringing about the quality of LLM-driven app development really misses the point of all of this. We're currently using an extremely novel technology to emulate aspects of our now-defunct technology (which I believe includes the web), in much the same way fax gateways at one time were a popular application for email.
This can't possibly be true. But if it were it would be highly illegal.
But it will most likely lower their pay into "normal" engineering compensation territory.
Equally obvious: nobody is going to replace the most expensive kind of employee, the CEO.
Less obvious point: software engineers sometimes say no. LLMs don't. They'll always give you something, even if it doesn't work or has security holes. The idea of shipping worse products at higher speed is incredibly tempting to sales types.
If you think the app store is bad now, wait until you see what level of AI sloppification is possible when anyone can turn out hundreds of clones of any popular apps.
> We need to start meeting our colleagues where they are and explain what we do in ways that make sense to them. The goal is not to turn them into engineers but to help them get a high level understanding of what it takes to build a software product.
See, why stop at software engineers? Because coding is text based? There is a whole class of IT middle managers in non-tech companies making good money due to "responsibility" and "team supervision". How about they start explaining the value that they bring?
If it is not more than the usual:
- check a list of incoming jobs to be done submitted by other departments
- assign the jobs to be done to someone on their team, mostly the person who worked in the same area before
- ask every person (daily/weekly) for status updates and an estimated completion date for the jobs assigned
- ask if the job was done sufficiently and can be reported as completed
- report the weekly/monthly completion rate and hours spent to their supervisor.
- every now and then review contractor bids for open RFPs
then the current state of LLMs can do this just fine.
Is there a good reason not to eliminate most of the little kingdoms in a large org and instead invest the money saved into more AI supervision, better QA and a lot more marketing?
Left to their own devices, software engineering would be just as abusive and exploitative as truck driving or automotive repair. They've already made great strides in this direction in terms of how people are hired (asking ACM/"Leet" code questions) and managed ("agile" nonsense).
The thing is it doesn't really matter what tools you use. The problems are complex and they don't become simpler just because you use an LLM. We will gain a bit of efficiency and then spend that gain on whatever new problems LLMs throw up plus the constant stream of new problems the world generates.
Software engineers are not getting replaced by LLMs any more than they are by UML, visual programming tools, RPA, MDD, code generators or any of the other fads that have come and gone over the years.
(None of that has anything to do with whether it's just for the world to operate this way. Personally I'd like to see all engineer wages go down and owner/shareholder returns go down and all that money go to lesser-paid roles or cheaper prices. But we live in a version of capitalism that doesn't presently seem to offer a way to do that...)
Personally I think programming will be 10x as efficient in the not-so-distant future (25 years maybe), and that everyone vaguely understands this: that what we're doing today is in some sense a waste of effort. Similar to building a road or railroad with hundreds of manual laborers instead of a few big machines.
That aside, it's interesting that software engineering will probably be among the first to be destroyed by ASI. This is simply because it's easy to do reinforcement learning on.
we're living in times where enshittification is a reliable strategy to make money and the people doing it don't care about long-term outcomes as long as they can cash out before they happen
this is almost the entire american business landscape today