January 21st, 2025

Ask HN: Can we just admit we want to replace jobs with AI?

The discussion on AI models emphasizes concerns about job automation and the implications of Artificial General Intelligence, highlighting the need for honest dialogue to prepare society for its challenges.

The discussion surrounding artificial intelligence (AI) models, both open and closed source, highlights a growing concern about the impact of automation on jobs. With models like DeepSeek-R1, Llama, and Claude, there is a prevailing sentiment that the focus of AI development is primarily on automating tasks rather than advancing humanity. OpenAI's definition of Artificial General Intelligence (AGI) emphasizes the creation of systems that can outperform humans in economically valuable work, which raises questions about the future of employment. Despite some optimism among AI researchers about the potential benefits of AI, there is a call for a more realistic acknowledgment of the challenges posed by AGI. The argument suggests that a candid discussion about the implications of AGI is necessary for society to prepare adequately for its arrival, rather than fostering unrealistic expectations that could lead to failure.

- The focus of AI development is increasingly on job automation.

- OpenAI defines AGI as systems that outperform humans in economically valuable tasks.

- There is a disconnect between optimism in AI research and the potential negative impacts on employment.

- Honest discussions about AGI's implications are essential for societal preparedness.

- Acknowledging the challenges of AGI can help mitigate future failures.

53 comments
By @gnfargbl - 3 months
But how is an individual supposed to "prepare" for AGI?

The outcome of further automation will be to concentrate even more capital in an even smaller number of hands. It's that increasing inequality which is the problem, not AGI.

By @gregjor - 3 months
Who hasn’t admitted this already, either outright or by their actions?

Going back to the industrial revolution, the people who control capital and production have openly sought to replace human labor with automation. Nothing new, just that this time it threatens “knowledge workers” and the techie HN crowd. If you hear someone claim that they want to “advance humanity” with automation they profit from, you can safely assume they don’t mean it sincerely.

By @rustcleaner - 3 months
As for why people losing economic power is a bad thing, see CGP Grey's Rules for Rulers videos. AI is about to make a whole bunch of people superfluous, and superfluous people might as well be rhinos on a safari as the techno-war overlords cruise by in their jeeps. It means our societies will degrade into natural-resource-rich but skill-poor countries: disenfranchised poor hordes kept in check by the zookeepers.

Are you ready for your Brave New World? :^)

By @Gud - 3 months
I would absolutely embrace AI if the means of production were owned collectively, not by a privileged few.

It seems to me that the path we are on will lead us closer to a Terminator-like future, and is less likely to lead to a positive one.

Our leadership are greedy fools.

By @Tade0 - 3 months
My position is that the demographic collapse in countries with a near 100% literacy rate is going to more than make up for any jobs replaced by AI.

It is said that AI will make 200 million jobs redundant. China's workforce alone shrank by 80 million during the previous decade, according to their 2020 census.

Overall, I think we need to increase our efforts with AI, particularly to make it more energy efficient, as it is currently simply too expensive to run, if we wish not to bear the consequences of a globally ageing, literate population.

By @dabe19 - 3 months
When I look at the progress and advancements in industrial automation as a blueprint (a field I have worked in for my entire professional career), and at my time building a few toy 'agentic AI' systems as a hobby and for education, I don't think the jobs are going anywhere.

They're going to change. Where humans sit in the loop and how they program will change, probably toward a blend of procedural control and object-oriented processes, where the agents are treated as objects and the process flow is defined in procedural methods (see the sketch below). But don't freak out; there's still a ton to do.

There are many design principles we can leverage to enhance human participation and fulfillment instead of just replacing people. Lights-out manufacturing hasn't taken off, so why would we think a lights-out society would work any better?

Edited for formatting.
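
A purely illustrative sketch of the blend described in the comment above; the names here (Agent, fulfill_order) are hypothetical and not from any specific framework. The agents are plain objects with narrow capabilities, while the process flow that composes them lives in an ordinary procedural function where a human decides the sequence and where review sits.

    # Hypothetical sketch: agents as objects, process flow as a procedural method.
    from dataclasses import dataclass

    @dataclass
    class Agent:
        """An agent treated as an object with one narrow, well-defined capability."""
        name: str

        def run(self, task: str) -> str:
            # Stub: a real agent would call an LLM or a tool here.
            return f"[{self.name}] handled: {task}"

    def fulfill_order(order_id: str) -> list[str]:
        """Procedural process flow: a human-authored sequence that composes the agents."""
        triage = Agent("triage")
        planner = Agent("planner")
        reviewer = Agent("reviewer")  # a human reviewer could sit in this slot instead
        return [
            triage.run(f"classify order {order_id}"),
            planner.run(f"draft a fulfillment plan for order {order_id}"),
            reviewer.run(f"check the plan for order {order_id}"),
        ]

    if __name__ == "__main__":
        for line in fulfill_order("A-123"):
            print(line)

The point of this shape is that the procedural function remains the human-owned part: swapping an agent for a person, or inserting an approval step, changes one line rather than the whole system.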

By @RachelF - 3 months
Yes, the goal of businesses is to reduce costs, and labor is a big expense.

It's harder to do than it looks. Offshoring to places with lower wages has kept wages for many white- and blue-collar workers in the West from growing, but it is hard to do correctly.

I suspect AI, or AGI, or AI with agents, will take longer to actually work reliably than many expect.

By @oytis - 3 months
I understand why _they_ would want to replace jobs with AI, but not why _we_ would want it. The democratic world as we know it is largely based on labour, and especially intellectual labour, having a lot of leverage. If that is no longer the case, most people are not needed any more, and their opinion doesn't matter; as simple as that.

Not sure how you (as a worker) want to prepare for that - maybe by becoming depressed, quiet and atomized in advance. US leaders (political and big tech) seem to be preparing exactly for that now, but my assumption is you are not one of them.

By @jdietrich - 3 months
None of this is news to blue collar workers. We've spent the last 300 years replacing skilled manual workers with machines.
By @munksbeer - 3 months
In my opinion, strong AI is inevitable. And I also think superhuman AI is inevitable, though where that point lies in the future is guesswork.

Between those two points is a period of instability. Judging from history, many people will be impacted, and those people may not transition to new jobs. Though in my case, if I were to lose my software engineering job because AI had become good enough to replace me completely, then my ego thinks we'd be much closer to superhuman AI than I thought, and all bets are off. Software developers don't just write code. They think, and think, and think, and second-guess, and look at the larger stack impact, and scaling, and so on. AI may well get to that level, but IMO there is a reason we still don't see robots fixing plumbing, and it is because the reasoning part of these jobs is much deeper than some people realise.

Once we get to the point that huge numbers of white-collar reasoning workers can be replaced, then I optimistically think we'll also be closer to reducing scarcity. What I'd hope for is that we would adjust to other types of work that are hard to predict at the moment, just as every other leap in technology has led us to do. I know the strong narrative is that AI can simply do those too, but try to imagine a world where we have superhuman ability to produce but no consumers, because no one has the means to pay for the produce. That just doesn't work. These so-called "elites" will have no one to sell to. So it seems self-evident to me that a balance will be found. If it isn't, people will elect governments that force that balance. Maybe there's a rough period ahead, but I don't subscribe to the doom theories on the economic side.

What I do unfortunately subscribe to is doom of another type. Once near-superhuman AI is possible, it feels impossible to stop it from being widely available, eventually. And given how many humans there are on the planet, I feel there will be some who are deranged enough to use the technology for planet-wide destruction of some sort. That feels close to inevitable.

By @Lockal - 3 months
According to their clear definition, "AGI will be achieved once OpenAI has developed an AI system that can generate at least $100 billion in profits"[1]. To reach this amount they can just do nothing, as they have secured $500 billion in investment[2]. So consider "AGI" to have already happened; yet nothing happened, only slop.

[1] https://gizmodo.com/leaked-documents-show-openai-has-a-very-...

[2] https://www.forbes.com/sites/dereksaul/2025/01/21/oracle-sto...

By @physicsguy - 3 months
It feels to me like people will start retraining sooner rather than later. If you feel your job is threatened, you're likely to look at options that can't be easily displaced. I did a Physics degree and PhD, and my 'backups' are to do something in the very heavily regulated nuclear industry (currently crying out for people in the UK) or to go into teaching.

I think the people who will be most easily displaced are those who don't have an additional, combined skillset. I see a lot of software engineers with a CS background whose skillset is only writing software. I think people from mixed backgrounds are likely to do better in a big disruption, since they've got another thing to jump to, perhaps in a different industry.

By @denkmoon - 3 months
Do we, so far, have any economic analysis or research showing AI is actually reducing jobs?
By @starchild3001 - 3 months
I graduated from undergrad 25 yrs ago. I worked in 5 different business sectors with highly overlapping but also quite distinct skill sets (4.5 yrs of academia/writing papers, 5 yrs of telecom/signal processing engineering, 7 yrs of quantitative finance, 1 yr of mobile consumer-device intelligence, 7.5 yrs of information retrieval).

Every ~5 yrs I've had to reinvent myself.

So, what about the disruption from AI? Just suck it up, ok? You have to reinvent yourself anyway if you are in the tech sector. Start moving already. Every tech disruption is a challenge and opportunity.

By @wickedsight - 3 months
I don't think we're replacing anything. Just looking at my own workplace, we could have multiple-X increases in productivity and still have enough work to do. In IT, for example, we're building nice software, but some bigger projects have been running for years and are nowhere near done. Accelerating this would mean we can do more projects with the same budget, not that we will slash people and keep doing the same number of projects.

Sure, this won't be true for every role and organization, but for many this will definitely be true.

By @cadamsdotcom - 3 months
Let’s chat about lawnmowers.

There was a time when cutting grass was done with scythes: long, specially shaped blades. All day back and forth swinging the blade to get the grass to just the perfect length. Swing, take a step, swing. It was backbreaking labor in the sun. So lawns were probably quite expensive, and having your own was probably pretty flashy; or maybe they were public, provided by the state to the people, or maybe both.

Along comes the string of inventions that led to lawnmowers. Now, anyone can mow grass in an afternoon. Gone are the lawn-scything jobs! Did the sky fall then?

Of course we know how this played out. Some lawn-scythers lamented the loss of their work, but they’re forgotten by history. Other clever lawn-scythers went and trained as lawnmower operators, while a few even cleverer ones became lawnmower mechanics, a few became engineers so they could design lawnmowers, and some started lawnmower factories and employed plenty of workers making far more lawnmowers than anyone thought possible.

Lawnmowers aren’t free, but they’re super cheap and getting cheaper, they’re abundant, and the real cost now is labor, which keeps going up. So what do you do? You pay a lawn-mowing person to take care of your lawn while you work your high-paid office job.

Whenever someone says AI is coming for the jobs, ask them exactly which AI model is coming for the jobs, which tool built by which startup is coming for the jobs, and, if so, why that startup is still hiring.

By @globular-toast - 3 months
What we need to admit is why we are so scared of this. Do you want to work? If nobody has to work then we are all permanently on holiday. Isn't that good? What are you afraid of here? We just get to spend all day making or finding things fun. Are people afraid because they have no meaning in their lives other than work? Do you need a boss? Are you scared because everyone will be equal and you won't be "special" any more? What is it? I'd really like to know specifically what people are afraid of.

Please just think about it a bit. Everyone seems to be thinking "I'll lose my job!" but what you need to be thinking is "everyone loses their job".

By @insane_dreamer - 3 months
> just tell the truth plainly that everybody hasn't actually thought about what happens when AGI comes

They have, and they will reap the benefits as the "alphas"; what happens to the rest of humanity is someone else's problem.

By @antquinonez - 3 months
There are uses of AI that do not replace people but rather hand them tools that enable and empower them to create as they never have before. I hope to talk about what I’m building in the next few days.
By @akmarinov - 3 months
I thought everyone already knew this.

IT, doctors, and lawyers will be mostly gone. Probably most office-worker jobs that I’m not thinking of, too.

The best thing to do is become a nurse; that’ll be bulletproof until AGI makes progress on robotics.

By @jkhdigital - 3 months
All major technological advancements have eliminated some jobs and created others. In general, producing more with less human input is how society becomes richer. Is it a tragedy that we no longer need telephone switchboard operators, ice cutters, and lamplighters? There were surely countless mini-tragedies as the people who made their livelihoods that way struggled with stagnant wages and unemployment, but the next generation became free to pursue other more productive forms of work and accumulated greater wealth as a result.
By @purplethinking - 3 months
Yes, of course. But it's also inevitable and, overall, good. If we stagnate, then we are headed towards certain doom via climate change, nuclear war, an asteroid, a volcanic eruption leading to cooling and crop failure, or a pandemic. If we are to survive and thrive long term, we need to become true masters of our environment, and that means we need to be smarter, stronger and more productive.

I think this is my main disconnect with the pessimists, I don't see "stop AI progress" as a valid option anyway.

By @hiAndrewQuinn - 3 months
I don't think anyone serious was ever claiming they weren't trying to "replace jobs with AI", except maybe as a PR thing. Indeed, this claim falls straightforwardly out of the numbers: human labor is the most expensive component of most industries in the developed world, so naturally we'd like a substitute. It's a much more obvious near-term target than, say, innovating over a decade or two to reduce the global price of commodity rice by a cent per kilogram.
By @revskill - 3 months
If you take automation and AI to the extreme, the world only needs ONE person to manage the robotic AI that does the rest of production. What else do you need? More people?
By @Mukina - 3 months
Good question.

My sense is that there is a lot of compute capacity, data acquisition, etc., required to create any dent in this type of automation.

Likely the next 2 years are focused on more “compute”, “data center capacity”, and “energy”. That allows for a) better training and b) more inference load, and hence prepares for more automation.

Much like the Y2K era of 30 years back, we are entering the new era of YNA, an era whY Not Automate.

By @Throw83949489 - 3 months
> we aren't advancing humanity with AI, it is just focused on automating out jobs.

We are doing both!

You are assuming people like to work, and jobs are something good. Without jobs, people will have less stress and more free time. They will spend more time on leisure activities, with their families, and so on.

Also, jobs contribute to carbon emissions: we have to maintain cities and large offices, and cars emit CO2 for the commute...

By @smgit - 3 months
It's not going to happen overnight. But even if something radical happens, it will probably play out like Marshall Brain describes in Manna - https://marshallbrain.com/manna1

Not all bad.

By @xgstation - 3 months
From a company's perspective, I think the answer is yes.

But from an individual's perspective, I don't think that is the case. Since AlphaGo was first released and beat world-class players, have all those players gone? Not really; it has even prompted more people to study Go with AI instead.

As a software engineer myself, do I enjoy using AI for coding? Yes, for much trivial and repetitive work. But do I want it to take my coding work fully away so that I can just type natural language? My answer is no. I still need the dopamine hit from coding myself, whether for work or for hobby time, even if I am rebuilding some wheels other folks have already built, and I believe many of us are the same.

By @jakeinspace - 3 months
What still has value when intellectual/knowledge work becomes free?

1) Land

2) Natural Resources

3) Energy

4) Weapons

5) Physical/in-person Labour

That means that, unless there are big societal changes, you had better have at least a few of those available to you. A huge percentage of Americans own no land and no investments in durable assets. That means that, at best, they are thrown into a suddenly and massively oversupplied physical labour market in order to make a living.

At the geopolitical level, a country like the United States, with so much of its current wealth tied to amorphous things like IP, the petro dollar, and general stability, certainly has all 5 of the above categories covered. However, there's a lot of vulnerable perceived value in the stock market and housing market which might vanish if the world is hit with a sudden devaluation of labour. Even if the US is theoretically poised to capitalize off of AGI, a sudden collapse of the middle class and therefore the housing market and therefore the entire financial sector might ruin that.

UBI is the bare minimum I can imagine preventing total societal collapse. Countries with existing strong safety nets or fully socialist systems might have a big advantage in surviving this. I feel a certain sense of comfort living in Quebec, where, although many aspects of government are broken, I can at least count on a robust and cheap hydroelectric-supplied grid and reasonable social safety nets. Between AI and climate change, I feel like there are worse places to be this century.

By @ergocoder - 3 months
Yes. Who doesn't want to admit that?

This is not just for businesses but for everything in life.

For example, we all buy robot vacuums, right? People even rearrange furniture to make sure it is compatible with the robot vacuum. Everyone wants it to do more and be more reliable.

By @donanon - 3 months
This is a great thread! Thanks, everyone, for the serious, polite exchange!

It seems to me that AI (today) allows amateurs to generate low-quality professional work. Cheaply. Disrupting the careers of many creative professionals. Where does that lead?

By @laptopdev - 3 months
We don't align our current actions with AGI. Rather, we align our actions with a presumption of what we think AGI is to become (assuming some inevitability).

Some people believe AGI is imminent; others believe AGI is here now. Observe their behavior, calm your anticipation, and satisfy your curiosity rather than seeking to confirm a bias on the matter.

The tech is new in my experience, and accepting claims beyond my capacity to validate such a grand assertion would require me to take on faith the word of someone I don't know and have never seen, who likely generated such a query in the first place beyond the context length of ChatGPT mini.

By @randoments - 3 months
In the whole history of the world, humans have always striven to automate their jobs, and yet here we are with a single-digit unemployment rate.
By @raylad - 3 months
Throughout human existence there have been only a few means of distribution of resources.

In hunter-gatherer days within small bands, and also to a large extent within families (though perhaps a bit less now), the method was Generalized Reciprocity[1], where people basically take care of each other and share. This was supported by the extremely fertile and bountiful forests and other resources that were shared among early people.

Later, the large "water monopoly" cultures like ancient Egypt had largely Redistributive economies, where resources flowed to the center (the Pharaoh, etc.) and were distributed again outwards.

Finally we got Market Exchange, where people bargain in the marketplace for goods, and this has been able (to a greater or lesser degree) to distribute resources efficiently for hundreds of years. Of course we have some redistributive elements like Social Security and Welfare, but these are limited.

Market Exchange now relies on basically everyone having a job or another means of acquiring money. But with automation this breaks down, because jobs will more and more be taken by AIs and then by robotic AIs.

So only a few possibilities are likely: either we end up with almost everyone a pauper, with dole payments increased just to the point where there's no revolution, or we somehow end up with everyone owning the AI resources and their output, filling the role the forests and other ancient resources played in hunter-gatherer days, and everyone can eventually be wealthy.

It looks as if, at least in the USA, we are going down the path of having a tiny oligarch class with everyone else locked out of ownership of the products of AI, but this may not be true everywhere, and perhaps somehow won't end up being true here.

[1] Stone Age Economics by Marshall Sahlins https://archive.org/details/stoneageeconomic0000sahl/page/n5...

By @downboots - 3 months
If the perceived value of laying you off exceeds the perceived value of keeping you employed... it seems like work and value have decoupled.
By @beardyw - 3 months
If we assume that very many people will be unemployed there will need to be a complete change in political focus towards supporting society rather than the individual. That is, wealth will need to be directed to those who need it rather than those who can grab it. The USA is one of the least politically prepared for that, but contains the wealth to resolve it. We will live in interesting times.
By @tim333 - 3 months
>Can we just be real and just say we aren't advancing humanity with AI, it is just focused on automating out jobs.

No because that's not real. AI will automate jobs but also advance humanity.

It's like saying, can't we be real and admit that tractors and farm machinery were not advancing humanity, they were just replacing farm jobs? They replaced some farm jobs, but they also provided food abundance and let people go off and work as personal trainers and the like to help people lose the weight, and as scientists, artists, etc.

By @Agraillo - 3 months
Why can't other AI tasks also be in focus? Like having fun with some creativity. I remember once creating a story with an LLM where we switched sides. I still remember some of the twists, and it's interesting that I can no longer attribute them to either of us :)
By @donanon - 3 months
Newb: What a great thread of informed, courteous folks! Thanks!

Simplistic: a major impact today is that amateurs using AI can generate tons of low-quality professional work for peanuts. Overall quality suffers, but profits soar, or at least rise enough. Where does that lead?

By @EZ-E - 3 months
I still don't think AI will displace a lot of jobs. Am I the only one left? I'm sure it has had an impact on copywriting/translation, but I don't see a tsunami.
By @usrnm - 3 months
The whole idea behind computers has always been replacing people; it's the foundation of our industry. From the very first scientific calculations that used to be performed by people before computers, replacing human labour is what allowed IT to grow so fast and so big. Most people on this site enjoy their fat paychecks because one person writing code can replace thousands of people who used to do things manually decades ago, so drop the shocked Pikachu face. AI is just another tool for doing what we've been doing forever.
By @627467 - 3 months
Can we just admit that many jobs either have very crappy parts to them or are entirely crappy, and wouldn't it be nice if those could be replaced with technology?
By @concordDance - 3 months
> Can we just be real and just say we aren't advancing humanity with AI, it is just focused on automating out jobs.

These are the same thing.

Automation does advance humanity. The reason for the current world prosperity has been lots of automation.

(There is the separate and much more concerning risk of humans going the way of the horse, but that does not seem to be what you are concerned about)

By @HarHarVeryFunny - 3 months
Another point to acknowledge is that this job elimination, even if it temporarily boosts corporate profits, is not going to be good for the economy, nor is it going to boost GDP.

Cheap AI labor will depress wages and therefore reduce consumer demand. This is not a recipe for GDP growth or a vibrant economy, any more than outsourcing to reduce labor costs has been.

The people trying to sell AI as good for the economy will no doubt tell you that companies don't want to reduce your salary or lay you off - that they will be happy to keep payroll the same, or increase it (increased consumer spending = GDP growth!) by employing you as a highly paid AI whisperer. Yeah, no, companies are looking to reduce payroll.

By @postepowanieadm - 3 months
Of course we do, just not our own jobs.
By @ben_w - 3 months
> Can we just be real and just say we aren't advancing humanity with AI, it is just focused on automating out jobs.

I believe the goal is to advance humanity by automating out jobs.

I don't know if that is wishful thinking or if it will actually work (Nash equilibria and politics make me suspect the former), but the idea that it was possible to do both at the same time was already a widespread meme in the spaces that culturally drive the sort of AI research we now see.

The idea that we can have both is also why optimistic people sometimes bring up (amongst other things) UBI in this context.

I'm really not sure what an individual can do to "prepare" for AGI; it's like trying to guess in 1884 whether you should bet on laissez-faire USA, the British Empire, or that manifesto you've heard of from Marx a few decades ago. And then deciding that, no matter what, the world will always need more buggy whips and humans will never fly in heavier-than-air vehicles.

By @qrsjutsu - 3 months
> the more people can be prepare

Nope. People will prepare as much as engineers care, but engineers don't care. Educating the people is tedious. It's easier to "manipulate"/direct them, which is the job of representatives, who are about status and money and being groomed by the industries, which are exclusively about money.

People are fine without AGI until they are not. That's another 15 years at least.

If you want to worry, worry about local solutions to climate change mitigation where you need old school manpower, big machines, shovels and logistics.

By @gunian - 3 months
It's just a new evolutionary/eugenic variable. The people who can get out, switch to trades and manual labor, learn to fish, minimize costs, etc. will pass the test and survive.

Then robots will become a thing and the same will happen for manual labor. I think the non-manual-labor working class was chosen first because they have demonstrated problematic sentiments to the powers that be.

A lot of people might die, or UBI may become a thing. Migration to less developed areas might happen, although I doubt the powers that be will allow cross-border movement; they love their fiefdoms.

The barons will always be there unless there is a Skynet-type event that wipes out humanity. Same shit, different day. If you're a peasant like me, not a good dev, without any dev education or connections, you most probably will die. It is what it is.