August 1st, 2024

Study: Consumers Actively Turned Off by AI

A study in the Journal of Hospitality Marketing & Management found that mentioning "artificial intelligence" in product descriptions reduces emotional trust and purchase intentions, especially for high-risk items.

A recent study published in the Journal of Hospitality Marketing & Management reveals that consumers are increasingly turned off by products marketed with the term "artificial intelligence." The research, which involved 1,000 respondents, found that mentioning AI in product descriptions significantly lowers emotional trust and decreases purchase intentions. Lead author Mesut Cicek from Washington State University noted that emotional trust is crucial in how consumers perceive AI-powered products. In experiments, participants showed a marked preference for a smart television when the term "artificial intelligence" was omitted from its description. This aversion was particularly strong for high-risk purchases, such as expensive electronics and medical devices, where consumers are more concerned about potential financial loss or safety risks.

The study's findings suggest a broader trend of growing consumer skepticism towards AI, coinciding with a report from Gartner indicating that the hype surrounding generative AI has peaked. Companies are increasingly incorporating AI claims into their products, despite unresolved issues and high costs, leading to consumer fatigue. Cicek advises marketers to reconsider how they present AI in product descriptions, suggesting that emphasizing features and benefits without using AI buzzwords may be more effective, especially for high-risk items.

42 comments
By @mrtksn - 9 months
Just an anecdote, but I had a small mobile app that let people pay for premium features, and (surprisingly to me) many people happily paid. Although the app had very positive reviews, it wasn't growing fast enough to turn into a meaningful income.

So I told myself, why don't I add AI stuff to it, like an AI assistant as the primary way of interaction? Everyone loves AI, it's the future!

Then app retention tanked, along with install numbers, and no purchases were made after that. People didn't even bother leaving bad reviews.

Maybe we are in this strange situation where the people who make the products are hyped about this new tech, but the consumers really hate it.

Like the artificial sweeteners maybe?

"0 calories and the same taste with the sugar? Why would anybody ever use sugar again, right? Lets short the sugar cane and corn fields and put all our money into artificial sweeteners production equipment and chemicals"

By @ogou - 9 months
I travel often in Europe and see these AI assistants on many websites and apps now. Two things generally happen when I have tried to use them. First, nothing actionable is possible. They can't actually do anything. No trip changes, refunds, connections, or baggage tracing. It takes a significant amount of time to get to the response, "Sorry, I can't help with that, please call xxx during business hours or visit our website at xxx." Second, they invariably end up as marketing funnels with upsells offered in place of solutions. I see that as the main source of anger from others in airports. They try to deal with something and end up in marketing loops.

When I see AI assistance as a travel feature I assume it is not only going to be useless, but actively disruptive to my experience.

By @Quothling - 9 months
> they found that products described as using AI were consistently less popular.

Branding the mistakes LLMs make as "hallucinations" was sort of brilliant in my opinion; it was a good way to disguise the fact that LLMs are mostly just really lucky. In my anecdotal experience it hasn't worked out too well though, so maybe it wasn't that brilliant? Anyway, part of how AI impacts our business (solar energy + investment banking) has been through things like Microsoft Teams supposedly being able to transcribe meetings with AI. Now, we're an international organisation where English is at best the second language for people, and usually the third, so this probably has an impact on it, but it's so bad. I know I'm not personally the clearest speaker, especially if I'm bored, but the transcripts the AI makes of me are so hilariously bad that they often become the foundation for the weekly Friday meme in IT. Which may be innocent enough, but it hasn't been very confidence building for the higher-ups who thought they wouldn't have to have someone write a summary for their meetings, and in typical top brass style didn't read the transcripts until there was some contract issue.

This, along with how often AI "hallucinates", has meant that our top decision makers have decided to stop any AI within the Microsoft platform. Well, everything except the thing that makes PowerPoint presentations pretty. So our operations staff has had to roll back Copilot for every non-IT employee. I don't necessarily agree with this myself; I use GPT quite a lot, and while GitHub Copilot might "just" be fancy auto-complete, it's still increased my productivity quite a lot as well as lowering my mental load of dealing with most snippets. That isn't how the rest of the organisation sees it though: they see the mistakes AI makes in areas where no mistakes are allowed, and they consider it untrustworthy. The whole Microsoft debacle (I used ChatGPT to tell me how to write "debaclable") where they wanted to screenshot everything all the time really sank trust with our decision makers.

By @danpalmer - 9 months
Important clarification: consumers are turned off by AI marketing material.

I'd be interested to see real studies into consumer satisfaction with AI features in existing products. My gut feeling is that people don't like (visible) AI in things they use, but that's biased by me reading online reporting about the failings of these features, so I wouldn't be too surprised if it turns out people mostly like them.

By @ksaj - 9 months
They're turned off by the lingo showing up everywhere, regardless of its relevance (or lack thereof). It's similar to when the Internet first started becoming commercial, and people were revolted by terms like "information superhighway" etc. being thrown around so excessively.

Eventually everybody grew into it, and we can't imagine life without the Internet. AI is having that same moment right now. The breathless hype bombardment will eventually give way to normalcy just the same.

By @everdrive - 9 months
AI is being pushed everywhere, and I couldn't hate it more. I don't want an AI assistant. I don't want to "talk" to a computer. I don't want a company diving mindlessly into the next trend just because they're afraid of being left behind. As others have said, it's a sign that a company doesn't really know what it's doing. I hear executives brag that they put all their emails through an AI assistant. This just tells me two things: they're apparently bad at articulating themselves, and no one is actually reading their emails.
By @netcan - 9 months
I don't think it's disillusionment with the tech. Consumers barely touch the tech. It's disillusionment with the marketing.

I think this goes beyond just consumers. If you listen in on a random company's strategic planning... any project with "AI" in its title is likely to be bullschtick.

"AI enabled" is our 2024 "now with electrolytes."

By @zacksiri - 9 months
I wonder if the same sentiment applies to products using .ai TLDs.

The thing about AI is it reminds me of what Steve Jobs said about speeds and feeds.

People care about "1000 songs in your pocket", not "30GB HDD". AI seems to be the "30GB HDD", and people don't always relate AI to how it's going to help them.

By @blibble - 9 months
AI becoming associated in consumers' minds with cheap, useless garbage is the best thing that could have happened.

AI is the shovelware Wii game of the 2020s.

I wonder what this will do to the YC S24 batch... https://www.ycombinator.com/companies?batch=S24

By @Animats - 9 months
LLMs have solved the problem of blithering at scale. Unfortunately, this mostly benefits advertisers.
By @Lerc - 9 months
This does not surprise me; this is a new technology that people are wary of. People are far more aware of marketing techniques these days and know that when a device is advertised as having trendy feature X, it probably doesn't have anything to do with the meaningful aspects of the feature.

Consider how many atomic-themed products there were in the '60s. We can be thankful that the only way most of these were actually atomic is that they were made of atoms. The naivety of those days has gone. Information (true or not) travels so quickly now that everyone is a bit jaded. Many are outright nihilistic.

It's worth noting that people's opinion of AI in a product is distinct from the actual AI itself.

I'm not in a position to find the reference right now, but I remember reading about a study which showed that when people were given artworks to judge, they felt the ones they were told were by AI were inferior. This was independent of whether each piece was actually by an AI or a human.

By @EdwardDiego - 9 months
The fact that every C-suite in America simultaneously decided "We need AI in our product. Look, just jam it in somehow, you'll make it fit" is, I'm sure, unrelated.

AI is a keyword for investors currently, that's all. Like all the companies that previously sprouted a blockchain unrelated to their core product.

By @KronisLV - 9 months
I really like something like GitHub Copilot and Copilot Chat, because they help with boilerplate and simple functions, as well as lower the barrier for doing some exploratory work and iterating. Something like Phind is even better in those cases where you care about looking into the actual sources that are returned, as opposed to just testing the output for your needs.

I also like general purpose chat/writing/instruct models and they're a nice curiosity; even the image generation ones are nice for placeholder assets, texture work, or some sometimes-goofy art. HuggingFace has some nice models and the stuff that people come up with on Civitai is sometimes cool! Video generation is really jank for now; I wonder where we'll be in 20 years.

Overall, I'm positive about "AI", but maybe that's because I seek it out and use it as a tool, as opposed to some platform shoving it down my throat in the form of a customer service bot that just throws noise at me while having no actual power to do anything. There are use cases that are pleasant, also stuff like summarizing, text prediction, writing improvements, but there are also those that nobody asked for.

By @solarkraft - 9 months
It means unreliable. I saw an ad for a cool scheduling app. Once they mentioned it's "AI powered" (why??) I started worrying about it randomly missing events.
By @fch42 - 9 months
There is value way beyond the immediate service given in a human-to-human interaction; we're social animals after all and tend to derive a form of pleasure out of "being with others". In the same sense that granny went to pray the rosary Saturday evening before mass, because that also got her the coffee and cake with her friends ahead of it, and the chats afterwards. Just like the person at the till chatting with the cashier, the person talking to the surgery's receptionist (and not just about booking in with the doc), or the talk with the pharmacist about how and when best to take the prescription medicine (and since we meet monthly for that anyway... give me the gossip and the events in town as well) - this used to happen because, quite frankly, "efficient" communication and "efficient" human-to-human interaction is not what we seek. It'd be rude.

And that's where the whole chatbot thing, AI or no, falls a little flat. No matter how smarmy or even actually helpful that chat thing is, it doesn't shake my hand, won't do me favours, probably won't take me to the shops after work, or ask me whether I need help or why I look sad. Nor compliment me on my eyes.

I don't know; at least in me, the herd animal roars loudly enough that I feel unvalued whenever I am forced to interact non-physically. The distance can be felt. And to me, that is not a "protective distance" but an "excluding" one. Makes me more lonely.

By @doodaddy - 9 months
As a term, “AI” has become synonymous with “new”. And “new” has become synonymous with “good”. We are in an age where age is a liability. If it’s not new it’s old, and old isn’t good. And it’s all sad because it’s simply not true. But marketing teams wide and far are pushing this narrative. And I think companies are encouraging their marketing teams to do it because it’s an easy way to dress up a half-baked product, hiding it behind a buzzword.

But I don’t think it’s a signal to consumers as much as a signal to investors and competitors. Since when does the consumer care about which technology was used to build a product? Has a consumer ever said “wow this product is so good it must have been built with Rust!” The bottom line is it shouldn’t matter and it doesn’t. People need features, not technology, even though many of them confuse the two.

By @sensanaty - 9 months
Once you look past the initial reaction of "oh cool, a computer can do this!?", you quickly start running into the limitations. And if you're in a product team building some AI feature (aka calling an OpenAI API 90% of the time), you realize even more how laughably useless it is in actuality.

For example, Notion. They have an AI feature, and it's hilariously useless. It can barely even summarize things, let alone write the rest of the document for you. Yet they push it constantly onto you with no way of disabling the crap.

What I have noticed is that the marketing people are in love with it, because it lets them generate the useless drivel they spam people with more easily than before. I suspect they never used their brains much, but now they don't even have to at all!

Coincidentally, scammers also love it for similar reasons...

By @tmpfs - 9 months
Think about trying to get customer service from any company now: we all end up talking to AI bots that don't actually help at all. No wonder the public perception is bad; our experience of AI is often based on these automated services, which actually make our lives more frustrating.
By @senectus1 - 9 months
A few days ago I sat in on a weekly MS meeting where they spruik their products.

AI/Copilot/ChatGPT was mentioned 100 times in the 34 minutes I was in it. (I was tallying, 'cause I'm sick of it.)

And that's not including text in the presentation, just the times the words were said.

By @indigo0086 - 9 months
One thing I and probably many others notice about AI integration is that it puts incorrect operation on the user's plate. If the user gets a wrong answer, well, just remember AI tends to do that. People are trying to find ways to make a truly intuitive AI interface that isn't just a text box you chat with. With adequate error recognition and correction, integrating AI shouldn't be something the user notices other than the time it takes to process. That would be another issue, as integrating AI just adds a time lag to those operations, which can depress user sentiment in another way.
By @iainctduncan - 9 months
This is not at all surprising to me. In general, companies underestimate the intelligence of the consumer and are pretty bad at imagining being a consumer looking at their company. We are all being inundated with AI hype, and it's almost all telling companies "you can save money by getting AI to do a mediocre-to-outright-shitty version of this work instead of hiring people".

As a consumer, why would I choose a company optimizing for their margins at the expense of my experience? Touting AI has become a warning sign that the company is doing this.

By @isleyaardvark - 9 months
It's quite simple. Consumers realize either consciously or not that saying "AI" means the added AI will make the product:

1. cheaper

2. worse

It's invariably used as a cost-cutting measure that is quick, cheap, and worse. Companies use AI art because they are too cheap to hire a real artist who would make better art. Companies use AI chatbots because they are too cheap to hire real customer service agents who could actually help people.

If a company slapped a label on a bag of chips that said "Now with fewer chips and less flavor!", I'm sure that would turn off consumers as well.

By @charlieyu1 - 9 months
Not surprising at all; there is a big trend of using AI to cut costs while producing stuff of minimal quality. Once the initial hype is over, consumers will hate how repetitive it is.
By @ecjhdnc2025 - 9 months
It's this simple: generative AI is poisoning culture.

Everyone knows this -- even the people pushing it. We know what it is doing to art, what it is doing to just being able to find a book on Amazon, what it is doing to reviews, to website searches, to authenticity everywhere.

Splitting hairs about whether people think it's the terminology or the functionality is absurd. Stop making generative AI products that are the cultural equivalent of breaking into the community pool solely to piss in it.

People don't really want this. They have an instinct that a lot of AI products are little more than grift (trained by their experiences with "Web 3.0"). And this study is showing that.

By @perlgeek - 9 months
I'd love to see more products use some machine learning in ways that fit their current UI paradigms.

For example:

* You enter an ambiguous search term into a search engine. It shows you some results, but it also shows you some buttons to filter by meaning. For example if you've entered "universal", it could give you an option to filter out all pages where "universal" appears as the name of the film studio, without a hack like excluding "universal films", but actually deciding based on context

* If you've touched up a few photos the same way, an image processing program could give you a suggestion for a touch-up for the next photo, along with options to tweak it

* In an email program, if you've moved a few emails to a folder, it could suggest a selection of other emails to move to the same folder, or maybe suggest a mail rule that would do it for you in future (a rough sketch of this one is below).

... and please, provide an option to switch it off.
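Not from the comment, just a minimal sketch of the email-filing idea above, assuming scikit-learn; the folder names, example subjects, and similarity threshold are made-up placeholders:

    # Rough sketch: learn from the emails the user already filed, then suggest a
    # folder for a new message. All data here is invented for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    filed = [
        ("Your invoice for July is attached", "finance"),
        ("Receipt for your recent purchase", "finance"),
        ("Team standup moved to 10am tomorrow", "work"),
        ("Sprint review notes and action items", "work"),
    ]
    subjects, folders = zip(*filed)

    vectorizer = TfidfVectorizer().fit(subjects)
    filed_vectors = vectorizer.transform(subjects)

    def suggest_folder(new_subject, threshold=0.2):
        # Return the folder of the most similar filed email, or None if nothing
        # is close enough; "switching it off" is simply never calling this.
        scores = cosine_similarity(vectorizer.transform([new_subject]), filed_vectors)[0]
        best = scores.argmax()
        return folders[best] if scores[best] >= threshold else None

    print(suggest_folder("August invoice from your hosting provider"))  # likely "finance"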

By @dmd - 9 months
Logitech's *mouse driver* now has AI in it[0].

[0] https://www.logitech.com/en-us/software/logi-options-plus.ht...

By @n_ary - 9 months
For me personally, when I see "AI" suddenly popping up on something I am subscribed to, 8/10 times there is some sort of chatbot suddenly added there; nothing groundbreaking, nothing cool (that didn't exist before), just a random chatbot!

Secondly, I am slightly burnt out from looking at all the "Blockchain", "distributed", "Crypto" stuff for like the last decade, and now suddenly the same products (and products from the same people) did a swap of "crypto" or "blockchain" with "AI". Needless to say, my mind is so jaded from all those scammy blockchain peddlers that when I see an "AI" (e.g. TodoAI, like wtf? I just want to add some text and dates and check those off when I'm done, or just show me a notification if I forgot..), I just feel like someone is trying to scam me.

Also, I noticed that some products suddenly hiked the price after adding "AI" to their feature set, which wouldn't feel scammy on its own (because inflation happened), but if I see some crappification of the product post-AI, it just leaves a bad taste, so the next time I see anything with "AI" as a feature, I assume it is also some crap. I know, don't judge a book by its cover and children must not be judged by the deeds/sins of their fathers, but what can I say.

Also, most of the products are clearly proxying stuff to ChatGPT with a system prompt; you can actually sense it if you have used the OpenAI APIs. Which causes me extra pain, because c'mon, you are selling me a proxy with a prompt and charging me like $5.99/month!
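The pattern is roughly this (a hypothetical sketch using the OpenAI Python client, not any particular product's code; the model name, prompt, and "TodoAI" framing are placeholders):

    # Hypothetical sketch of the "proxy with a prompt" pattern: the whole product
    # is a thin wrapper that forwards the user's text to the OpenAI API with a
    # fixed system prompt. Model name and prompt are placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = "You are TodoAI, a friendly assistant that manages the user's todo list."

    def todo_ai(user_message: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": user_message},
            ],
        )
        return response.choices[0].message.content

    print(todo_ai("Remind me to file my taxes on Friday"))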

By @tauntz - 9 months
Anecdotal evidence, but seeing a Google product release mention anything related to "Gemini AI" makes me subconsciously assume that they made the product worse by including a totally unrelated and barely working feature that nobody asked for. The only reason for most of these features is, I assume, that there's now a requirement for all PMs to ship something with "Gemini AI" in its title or they won't get a promotion ¯\_(ツ)_/¯
By @peanut_worm - 9 months
AI just means “annoying chatbots and low quality content” to most people
By @HPsquared - 9 months
I think it's the association with "AI-generated slop content".

People associate the use of AI in an existing area with being a poor-quality facsimile of the real thing, whatever the "real thing" is. That, or an unnecessary addition causing annoyance (aka Clippy).

On the other hand, for genuinely new use-cases where AI is central and beneficial, I'd be surprised if there was a negative reaction. It is "new shiny thing" vs "cheap plastic imitation".

Reminds me of The Graduate "One word. Plastics."

By @poikroequ - 9 months
AI is not being marketed to consumers. AI is being marketed to investors.
By @kebsup - 9 months
I'm building a flashcard language-learning app, and the Meta ad with the phrase "AI-powered" performs better than all other variations, so it depends.
By @snakeyjake - 9 months
As far as I can tell, AI is only useful for:

1. Making really, really, REALLY, shitty "art"

2. Writing nonsensical and boring short stories or bland written-by-committee memos that make you sound like a soulless AI

3. Creating summaries that are pathetic compared to the first paragraph of any Wikipedia article on literally any topic (also, they are often 100% wrong)

4. Acting as a useless "assistant" (actually worse than useless, since it is both useless and time-wasting) that is a wall between you and a human who can actually do something

5. Looking at images and telling me if there's a cat or a fruit in them

6. Being a worse chatbot than ELIZA was 50 years ago

7. Writing code that, if it is anything more complex than something you can copy and paste from Stack Overflow and have work, you have to spend more time fixing than if you had just written it yourself

But it is very good at bombarding users with an infinite stream of garbage content that is cheap and effortless to create so it will eventually devour everything.

By @samdung - 9 months
In just a few months, every website that has had a customer support chat plugin seems to have become 'AI Powered'. And all they seem to do is go on in circular drivel.
By @GuB-42 - 9 months
Many consumers have a working bullshit detector. And they are starting to understand that "AI" is usually a meaningless buzzword. And people don't like being bullshitted.

And even to those who think there is real meaning behind it: do you really think people want "intelligence" everywhere, artificial or not? People don't want their toaster to be intelligent, they just want it to toast. So what is an AI-powered toaster? A toaster you have to argue with about how you want your bread toasted? And in many cases that's what happens when you replace a push button with an "AI assistant", so people are not wrong about it either.

By @bfrog - 9 months
Yeah I’d rather not be part of the experiment thank you. If something takes intelligence to do I want to talk to a human that can help do it.

AI isn’t going to magically help solve problems with technology. Frankly I’m tired of technology. I miss people.

By @feverzsj - 9 months
AI is still far from profitable, except for scams.