February 25th, 2025

Tell HN: Y Combinator backing AI company to abuse factory workers

Optifye.ai faces criticism for using AI in factories, allegedly dehumanizing workers while benefiting wealthy owners. Concerns arise about ethical implications and the tech industry's focus on profit over human dignity.

Optifye.ai, a company backed by Y Combinator, is criticized for its use of artificial intelligence in factory settings, which allegedly dehumanizes workers and treats them as disposable entities. The founders, described as privileged individuals with limited real-world experience, are accused of prioritizing the well-being of wealthy manufacturing company owners over the welfare of their employees. They reportedly promote their technology as a means to reduce stress for owners while increasing it for workers. The company's approach has raised concerns about the ethical implications of AI in the workplace, suggesting a future where technology exacerbates existing inequalities rather than benefiting humanity. Critics argue that this model reflects a troubling trend in the tech industry, where the focus is on profit and efficiency at the expense of human dignity.

- Optifye.ai is criticized for dehumanizing factory workers through AI technology.

- The founders are described as privileged individuals lacking real-world experience.

- The company's technology aims to reduce stress for wealthy owners while increasing it for workers.

- Concerns are raised about the ethical implications of AI in the workplace.

- The situation reflects broader issues in the tech industry regarding profit over human dignity.

44 comments
By @mzajc - about 1 month
Has this been manually delisted from the frontpage? It doesn't say it's flagged, but it's no longer on there.

EDIT: looks like it's #157 now, I think I just misunderstood the ranking mechanism.

By @simonbarker87 - about 1 month
I used to run a small factory (two assembly lines, 10 people) and something like this would have been useful, not to force people to work harder but to optimise movements and points of friction. I would actively encourage and reward people for making suggestions, and we had a process in place to test whether changes made things better (and not just faster - we included easier, simpler, more enjoyable etc in the test).

Sadly it's not about the tool in this case; it's how it's being promoted and positioned. The line "know who's working and who's not" on their website says it all, sadly.

By @infecto - about 1 month
The only truly embarrassing thing is that nobody at YC advised them that this ad is 1) terribly amateurish and 2) likely to look extremely bad. Not only did they not advise them, they in fact published it to their own feed. Weird optics. I am not familiar with what I presume are Indian factories, other than the wild lack of safety (thanks, YouTube), but I cannot see how this video does anything positive.

As for the product itself, I don't think it is unusual. These types of measurement systems are not new and can be helpful for a factory; like all things, it boils down to the owners/managers of said factory, not the tool.

By @2099miles - about 1 month
Tbh this isn't that crazy. If you hire someone for a job with an expected output of 10 items per hour, and that number is reasonable because a bunch of other workers you hired for the same job are hitting it, and one guy hits 1 per hour, then that guy shouldn't be doing that job.

The outrage should be focused on the absolute meme of their ad video, cuz they were like "let's literally have a convo with an individual, but refer to them as a workspace, have them give painfully human responses, and then just shit on them anyway, impersonally".

The product is not crazy. The video is wild.

By @solumos - about 1 month
Is it dystopian, or is it just real-time performance monitoring poorly marketed by inexperienced founders?

There are tools like this for tracking git commits and velocity (that I’ve been on the receiving end of). It probably makes less sense in that context, but if your job is a repetitive task, I don’t think it’s necessarily abuse or dystopian to track it.

Monitoring bottlenecks isn’t a bad thing. They probably could have chosen an example where the solution to the bottleneck didn’t involve berating a low performer (e.g. adjusting the line to add another station or similar)

By @astonex - about 1 month
The product should be marketed as a tool for monitoring and highlighting bottlenecks in the manufacturing production line in order to help maintain peak output. This is a completely reasonable thing to want; it's no different from monitoring microservices and their latencies/loads.

The video they made, however, where they berate and meanly put down an individual employee, is so far from acceptable. That's not how personnel performance issues should be managed in the real world; it's completely devoid of human empathy. It shows the founders (and did YC view and approve this?) are lacking in some important areas.
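Purely as an illustration of that microservices analogy, and not Optifye's actual method: a minimal sketch of summarizing per-station cycle times the way a latency dashboard summarizes services, with made-up station names and timings.

```python
# Hypothetical sketch: summarize per-station cycle times the way one would
# summarize per-service latencies, to locate line bottlenecks (not Optifye's code).
from statistics import median, quantiles

# Assumed input: seconds per completed unit, keyed by station (made-up numbers).
cycle_times = {
    "station_1": [42, 45, 41, 44, 43],
    "station_2": [40, 39, 41, 38, 40],
    "station_3": [65, 70, 68, 90, 72],  # slower: the likely bottleneck
}

def summarize(samples):
    """Return median and p95 cycle time, like a latency dashboard would."""
    p95 = quantiles(samples, n=20)[-1]  # last of 19 cut points ~ 95th percentile
    return {"median_s": median(samples), "p95_s": p95}

stats = {station: summarize(times) for station, times in cycle_times.items()}

# The line's throughput is limited by its slowest station (the bottleneck).
bottleneck = max(stats, key=lambda s: stats[s]["median_s"])

for station, s in stats.items():
    print(f"{station}: median={s['median_s']:.1f}s p95={s['p95_s']:.1f}s")
print(f"Likely bottleneck: {bottleneck}")
```

The point of the framing: the slowest station's cycle time bounds the whole line's throughput, just as the slowest dependency bounds a request's latency.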

By @danpalmer - about 1 month
This sort of performance management is unfortunately necessary. The problem is that we need tools for it to be built by people who can empathise with those subjected to them, and who want to do the right thing, and not these sorts of folks who are too immature and inexperienced to get it right.

My previous company ran a warehouse and there was a clear bell curve of productivity. Most people were fine, some were excellent, but some were below the level that was realistically achievable. We did careful and considerate analysis and it helped improve productivity.

When done badly however you end up with management using productivity tracking as a lever to increase productivity across the curve. Amazon driver delivery quotas are a great example – people urinating in bottles is clearly a symptom of the quota being too high. Unfortunately software built naively to help bring up the bottom 10% can too easily be used to force up the productivity of the other 90%.

By @oefrha - about 1 month
Brought to you by the VC famous for InstallMonetizer? Make no mistake, it'll basically back anything that makes money; there's no moral high ground. And like it or not, this kind of AI (or should I say A-eye) is here to stay.
By @leonheld - about 1 month
The response here would be significantly different if this was about measuring the performance of software engineers in wealthy countries.
By @philipjoubert - about 1 month
As someone who grew up in a 3rd world country and whose mother owned a clothing factory, this product seems...fine? The response is an indication of how little people know about how their t-shirts and shoes are made.
By @asimpleusecase - about 1 month
I worked in a factory during university, making playground equipment out of metal. Every task had a reporting system and the company knew exactly who was producing what. This was a long time ago; they still used desktop calculators with paper tape to do the calculations. I was only there for a relatively short time, I just wanted to work and make money for school.

We were promised productivity bonuses. My section of the factory achieved 147% of expected production and another had done almost 180%, but when the bosses did the bonus talk they added in the trucking division and showed 30% productivity due to excessive "dead heads" (empty trucks) returning from equipment deliveries. That was a management issue: they were not very good at finding return loads for those trucks. So the production bonus was calculated on the total score of the enterprise, and we got almost nothing on the factory floor.

When my time there was done, a friend in accounting told me that I had rated as the most productive worker in the factory; he had seen the numbers, as there was a ranking sheet calculated every week (remember, paper tape calculators). No one at the company ever told me, no real reward, no approach about coming back to the factory.

The point of this post is that this type of thing, monitoring the employees, has been around for a very long time. It is the humans running the factory who decide whether to reward, encourage, and promote those who work hard, or not.
By @acc_297 - about 1 month
> Boost your assembly line efficiency by up to 30%

Ethics of this aside, the above claim must be dubious. I would think the majority of manufacturing inefficiencies are due to downtime from raw material shipping delays or machine breakdowns… of course I'm in no position to offer an informed opinion, but just based on the product website I have a hard time taking this stuff seriously.

Monitoring of factory workers isn’t hard to do with current surveillance and 1 or 2 humans in the loop

By @klkvsk - about 1 month
Software cannot abuse workers. Managers can, with or without software.

We've had automated KPI-measuring tools since punch clocks. Nowadays it's OK in some companies to install remote access software to monitor employees' screens. It's nothing new. It's just collecting data. The question is what bosses will do with this data: abuse, or develop?

I have no hate towards those guys. No love also. It's just business.

By @friendly_chap - about 1 month
That post was pulled partly because of my comment. I commented this:

"While I see the economical usefullness, this sounds like the worst possible application of AI.

Using AI to surveil is building hell on earth. AI should be used to help people work less/easier, not whip them into working more."

Which ended up on the top of the thread. Was surprised to wake up this morning and see it gone.

LinkedIn post I made about this:

https://www.linkedin.com/posts/crufter_today-y-combinator-de...

By @pluto_modadic - about 1 month
It's clear there's a large fragment of AI founder types that totally had nobody tell them and I hope they faceplant, hard.
By @dauertewigkeit - about 1 month
I find the marketing interesting, considering that this product already exists on other continents ... and it is NOT deployed in factories. It is deployed in office settings. Is this our future? Lots of video evidence: https://www.youtube.com/results?search_query=+Using+AI+to+mo... Expect dishonest marketing from those aspiring to build these surveillance and anti-freedom systems in our western countries.
By @rajnathani - about 1 month
Just like workplace software that analyzes white-collar work, such as this Microsoft product [1] (and the many "bossware" tools we saw during Covid), I don't see how this tool is any different, other than analyzing manual labor instead. Their marketing might have been off, but as long as the camera recordings are securely stored, what could be the issue? This solution is more like an automatic supervisor, and that could help: it could heavily reduce the need for supervisor roles (which also come with biases and potential workplace politics) and thus directly increase salaries for the workers actually performing the manual assembly. Also, there are already cameras on assembly lines for monitoring the environment, and modern solutions to analyze workers and worker safety, such as Invisible AI [2] and Rolloos [3].

[1] https://www.microsoft.com/en-us/microsoft-viva/workplace-ana...

[2] https://www.invisible.ai/

[3] https://www.rolloos.com/en/about-us/mission-vision/

By @rideontime - about 1 month
Thank you, I’ve been seeing the reaction to the announcement but hadn’t yet found the announcement itself.
By @pirate_solo9 - about 1 month
Can anyone share the vision models they might be using and how they might be tuning them?
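Their actual stack isn't public, so any answer is speculation. As a purely hypothetical baseline (made-up file path, ROI, and threshold), a system like this could start from simple frame differencing over a per-workstation region of interest before graduating to trained detection or pose models:

```python
# Hypothetical sketch only: a crude "is anything moving at this workstation"
# signal via frame differencing with OpenCV. A real product would presumably
# use trained detection/pose models; Optifye's actual stack is not public.
import cv2

VIDEO_SOURCE = "line_camera.mp4"   # made-up path; could also be a camera index
ROI = (100, 200, 320, 240)         # made-up x, y, w, h of one workstation
MOTION_THRESHOLD = 0.02            # fraction of ROI pixels that must change

cap = cv2.VideoCapture(VIDEO_SOURCE)
ok, prev = cap.read()
active_frames = total_frames = 0

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    x, y, w, h = ROI
    a = cv2.cvtColor(prev[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    b = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(a, b)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    changed = cv2.countNonZero(mask) / float(w * h)
    total_frames += 1
    active_frames += changed > MOTION_THRESHOLD
    prev = frame

cap.release()
if total_frames:
    print(f"Motion detected in {active_frames / total_frames:.0%} of frames")
```

In practice one would expect fine-tuned person or pose-estimation models per station, but that is an assumption rather than anything the company has disclosed.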
By @lokar - about 1 month
Having been in this industry for a long time now, I see a disappointing trend in tech, and in US society in general: viewing everything as zero sum.

I’m probably naive, but I remember in the past tech focusing on innovation that would generate enough gains for everyone to get a share (or at least the gain to the tech company did not come at the expense of someone else)

Now, more and more I see business plans that are zero sum. Using tech to take from someone else, not growing the pie.

This matches a general trend in public life in the US of viewing everything as a zero-sum contest.

By @exsomet - about 1 month
It seems like there’s been a rash of these instances lately where someone does something, says something, or builds something like this that’s not just offensive but unconscionable. And there’s always the predictable outcry and then usually that’s the end of it.

It's given me pause to think about why that pattern has been established, and I think the simple answer is that there are no consequences. The people we see in the news doing horrific things for attention are doing it because there is no mechanism to hold them accountable. Product launches like this - where the explicit purpose is to degrade and exploit humans - happen, meme video and all, because these people will not face any consequences for it (and the potential benefit to them is massive if it takes off).

Yelling and screaming about how horrible it is doesn’t really do anything and it’s not an effective use of time or energy. I wonder what society could do - not conceptually, but practically - to establish consequences for someone who launches a product like this.

By @boruto - about 1 month
The secondary school the founder went to is a dead giveaway. Of course YC would fund them. Anyone would fund them, in fact.
By @sub7 - about 1 month
Overpaid software engineers here please spend a moment to talk about your 3pm massage and free omakase

Or just go work a real widget business and you will realize that optimizing worker efficiency is critical and useful

By @thatjoeoverthr - about 1 month
Imagine the features you could add to this. Like a robot that walks around behind the workers and gives well-timed corrective communications with a whip.
By @igleria - about 1 month
An abhorrent product that will no doubt have abhorrent customers, and maybe one or two that will use it for good.
By @kundi - about 1 month
It's not the only company with low morals that YC is sponsoring. There have been dozens of copycats shamelessly copying other products, some of them open source.

It is disappointing to see YC sinking to new lows without any proper accountability, just greed.

By @anymouse123456 - about 1 month
No comment really, just came here to remind everyone about, "The Yes Men."

NSFW warning: https://www.youtube.com/watch?v=8_bWAF-XxQM

The video features a tear-away business suit that reveals a gold lamé stretch suit with a giant, inflatable phallus that has a monitor at the end and controls letting a manager electrocute underperforming factory workers from the beach.

This was presented live and in person to an unwitting, but credulous breakout session at the WTO in 2003 to awkward applause.

By @emorning3 - about 1 month
My first full-time job out of high school was unloading trucks of coffee and sugar, by hand.

Trust me when I say that the motto on the loading dock is 'fuck me, fuck you'.

By @laidoffamazon - about 1 month
I can't call this slavery, but their attitude towards workers who make $1 an hour is sadly extremely common among wealthy Indian people. They just don't see the non-English-speaking poor in India as human, and very likely don't see the people below their status in America as human either. It's disgusting, and it's the reason I can't identify with my ancestry.
By @fraaancis - about 1 month
It's too bad the AI can't just do the sewing.
By @mythrwy - about 1 month
I don't predict much success for their venture
By @lm28469 - about 1 month
Science sans conscience ("science without conscience")... we haven't progressed one bit in the last 500 years
By @mattmaroon - about 1 month
This seems like a tempest in a teapot. You're automating what factory floor managers already do? Hitler! From the title I thought they had developed a robot that flogs 6-year-old workers or something.

If I were YC, though, I'd probably have a rule about startups not using "backed by Y Combinator" logos on their homepage like Optifye does. YC's a pretty low touch investor at the seed round level, their startups could do lots of things YC didn't expect, didn't know about, and couldn't prevent.

By @neuroticnews25 - about 1 month
I think people are scared this will increase the power of the companies over workers. But a worker’s negotiating position is based on their indispensability and productivity, and on average this won’t change that. I know in reality it's often the fog of war that makes the work bearable. And many good managers more or less consciously keep it that way. But fundamentally, how can more insight and more truth be bad? My resources are limited and I'd rather spend them on a good worker than a slightly worse one. I thought that was the whole point of meritocracy.
By @oulipo - about 1 month
This kind of product is really shameful, and peak capitalism... looking at people as mere robots to serve you. Disgusting.
By @GroupNebula563 - about 1 month
The fact that they're backing this is ridiculous.
By @rimbo789 - about 1 month
This product should be banned and these guys should be forced to work a year at a textile mill for trying to build it
By @metacritic12 - about 1 month
Yeah, it has terrible optics, yet it's clearly coming and going to be normalized. The question is who does it and how it's organized. If this company doesn't do it, the next will.

In certain roles, AI micromanagement clearly will create higher performance. Add the marketplace of capitalism and it'll all compete away.

There are certain roles, like artists, where this is wholly the wrong solution: monitoring whether an artist is at her desk will produce badly performing artists, and it will show. In those roles, these tools won't apply.