August 23rd, 2024

AI photo editing raises trust issues in photography

Advanced AI tools like Google's Magic Editor are changing perceptions of photography, eroding trust in images as evidence and increasing the potential for misinformation, complicating the discourse on truth and authenticity.

The introduction of advanced AI tools like Google's Magic Editor in the Pixel 9 is fundamentally altering the perception of photography as a reliable representation of reality. Users can now create highly convincing images in mere seconds, blurring the line between real and fabricated visuals. Historically, photographs have been trusted as evidence, but this trust is eroding as the ease of generating realistic fake images increases. The implications are profound, as the default assumption may shift to viewing photos as potentially manipulated. This shift could undermine the societal consensus on truth, particularly in contexts where visual evidence has been pivotal, such as in documenting police brutality or significant historical events. The article highlights concerns about the lack of robust safeguards against misuse of these technologies, suggesting that the current measures are insufficient to prevent the spread of misinformation. As AI-generated images become more prevalent, the burden of proof may shift, complicating the discourse around truth and authenticity in visual media. The future of photography may no longer serve as a straightforward reflection of reality but rather as a medium for subjective interpretation and manipulation.

- AI tools like Google's Magic Editor can create realistic images quickly, challenging the trust in photography.

- The societal assumption that photographs represent truth is shifting towards skepticism.

- The potential for misinformation increases as the ease of creating fake images rises.

- Current safeguards against misuse of AI-generated content are deemed inadequate.

- The future of photography may prioritize subjective interpretation over objective reality.

17 comments
By @kens - 5 months
I'll point out an interesting New York Times article "Ask it no questions: the camera can lie". This article discusses how photography had been seen as a medium of truth, but now "the veracity of photographic reality is being radically challenged" as "technology makes it easy to recompose and combine photographic images, and to do so in a way that is virtually undetectable." By creating a "kind of unsettled and unsettling hybrid imagery based not so much on observable reality and actual events as on the imagination", "photographs will appear less like facts and more like factoids" and will "fundamentally alter not only conventional ideas about the nature of photography but also many cherished conceptions about reality itself."

This article, by the way, is from 1990.

https://www.nytimes.com/1990/08/12/arts/photography-view-ask...

By @ChrisArchitect - 5 months
Didn't we do this a few days ago: (all from The Verge)

Google's 'Reimagine' tool helped us add wrecks, disasters, and corpses to photos

https://news.ycombinator.com/item?id=41312381

The AI photo editing era is here, and it's every person for themselves

https://news.ycombinator.com/item?id=41301612

By @allears - 5 months
This is silly. Our "trust" in photography ended as soon as it was invented. Photo editing was a reality long before computers came around.

By @adolph - 5 months
Errol Morris had some great thoughts to share about the topic some years back:

As I’ve said elsewhere: Nothing is so obvious that it’s obvious. When someone says that something is obvious, it seems almost certain that it is anything but obvious – even to them. The use of the word “obvious” indicates the absence of a logical argument – an attempt to convince the reader by asserting the truth of something by saying it a little louder. [0]

0. https://archive.nytimes.com/opinionator.blogs.nytimes.com/20...

By @alphabetting - 5 months
I know it's not their intention, but this seemed like a great ad for Pixels.

By @AmVess - 5 months
I tried Adobe's beta of Photoshop with AI.

Pretty useless, as it would refuse to process edits because of adult content.

These pictures were of house interiors, some cars. Nothing pornographic, but the flunk rate was high enough for me to give up.

I don't need big brother looking at my work and judging its content.

By @atentaten - 5 months
Photoshop and similar tools were doing this before AI.
By @ziofill - 5 months
When snapping a photo, couldn't a phone produce a hash with date/time/location included? I understand it's not bullet-proof, but at least it would make it possible to verify the original to some degree.
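
A minimal sketch of that idea in Python, using only the standard library; the keyed digest, field names, and key handling here are illustrative stand-ins for whatever a phone's secure hardware would actually sign:

```python
import hashlib
import hmac
import json

def attest_photo(image_bytes, capture_time, gps, device_key):
    """Bind the pixel data to capture metadata with a keyed digest.

    A real phone would sign this with a key held in its secure element;
    the symmetric key here is purely illustrative.
    """
    metadata = {
        "time": capture_time,
        "gps": list(gps),
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
    }
    payload = json.dumps(metadata, sort_keys=True).encode()
    return {
        "metadata": metadata,
        "hmac": hmac.new(device_key, payload, hashlib.sha256).hexdigest(),
    }

def verify_photo(image_bytes, attestation, device_key):
    """Recompute the digests and compare them against the attestation."""
    meta = attestation["metadata"]
    if meta["sha256"] != hashlib.sha256(image_bytes).hexdigest():
        return False  # pixels changed after capture
    payload = json.dumps(meta, sort_keys=True).encode()
    expected = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["hmac"])

# Example: verification fails once the image bytes are edited.
key = b"device-secret"
original = b"...raw sensor bytes..."
att = attest_photo(original, "2024-08-23T10:15:00Z", (37.77, -122.42), key)
print(verify_photo(original, att, key))            # True
print(verify_photo(original + b"edit", att, key))  # False
```

As the comment concedes, this isn't bullet-proof: whoever holds the key can attest anything, so the scheme only helps to the extent the key stays locked in the device.
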
By @Workaccount2 - 5 months
I don't worry too much about photographs. Fake images have been floating around in increasing quantities for decades now. Yeah these new AI fakes are often perfect, better than photoshop, but plenty of people were falling for bad photoshop edits before anyway.

Audio/video on the other hand... I don't think anyone is ready for that. There is no precedent for fake videos full of fake dialog. And I'm not talking about trick golf shots or sound-alike voice overs.

By @adingus - 5 months
Nobody would question that a painting could lie, or be (literally) colored by its creator. Photographs should be no different.

By @animal_spirits - 5 months
My feeling on this is AI will continue to erode our trust in photographs, but it will force us to build trust in _photographers_. There are people who are honest and true, and we will need to spend time and energy seeking them out. There will be a market for more trustworthy photographers.
By @burnte - 5 months
Photographs have been faked, and have been fooling people, since five minutes after the first photograph. AI doesn't make this any worse than Photoshop or the airbrush.

By @mediumsmart - 5 months
How can there be a trust issue without trust? That ship has sailed.

By @retskrad - 5 months
Google is clearly doing whatever it takes to sell one more Pixel device, but why make it so easy to create entirely synthetic photos, thereby normalizing them? With the growing difficulty of distinguishing real from fake on social media, this only adds to the confusion. What societal benefit comes from democratizing the creation of fake images? Is Google crossing a moral line?
By @mannyv - 5 months
Every photograph lies. Every photographer who's any good knows this.

Does the public know that? At some level probably yes.

With AI, though, it becomes more obvious. That picture of Elon Musk and Santa Claus having sex probably isn't real. Neither is the one that shows Obama at Yalta next to FDR.

By @nojvek - 5 months
IMO, for all the god complex Silicon Valley entrepreneurs have about AI (Google, OpenAI, etc.), having fake photorealistic images at the tip of a finger seems like a dangerous precedent.

We are already in the territory of AI-generated images being used for political gain. See Trump using Taylor Swift images, and Trump claiming Kamala crowds are AI-generated.

IMO I'm going to be calling my congressional representatives to push for a law that all AI-generated images that look photorealistic must carry a "Generated by X AI" tag, both visible and embedded in the image metadata.
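
A minimal sketch of the metadata half of that proposal, using Pillow's PNG text chunks; the tag name and tool string are placeholders rather than any real standard:

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Tag an AI-generated image (assumed to exist as synthetic.png) with a
# provenance note stored as a PNG text chunk.
generated = Image.open("synthetic.png")
info = PngInfo()
info.add_text("ai_generated_by", "Example Image Model v1")
generated.save("synthetic_tagged.png", pnginfo=info)

# Anyone can read the tag back later.
tagged = Image.open("synthetic_tagged.png")
print(tagged.text.get("ai_generated_by"))
```

A plain text chunk like this is trivial to strip, which is why the proposal also calls for a visible mark.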

We already have similar laws around counterfeiting currency.

https://www.justia.com/criminal/offenses/white-collar-crimes...

AOC's DEFIANCE bill is a step in the right direction. AI-generated images will only get more realistic. However, we definitely need guardrails on transparency and sharing.

https://ocasio-cortez.house.gov/media/press-releases/rep-oca...

Tangentially, we already have laws making it illegal to impersonate a police officer or federal agent.

I really hope we get sane regulation around images misrepresenting themselves on social media.

By @danjl - 5 months
We love being fooled by fake imagery. Movies have done this for years -- sets that build only the fronts of buildings, lighting that bears no relation to reality, and, of course, visual effects. Even films that you think have no CGI are building fake worlds that we love. The same is true for images and Photoshop. We just don't like being told it is fake because it breaks our beautiful internal vision.