Adobe’s Generative AI Jumps The Shark, Adds Bitcoin to Bird Photo


A seagull flies over a body of water with its wings spread wide. An enlarged inset shows a gold Bitcoin symbol highlighted in the center, indicated by a red arrow from a smaller coin image in the bottom left corner.

Last year, Adobe updated its Firefly generative AI platform multiple times, most recently in September. Over that time, the Lightroom and Photoshop tools that rely on the technology have gotten steadily worse, and the system's decision to add a Bitcoin logo to a photo of a seagull is a perfect distillation of the problem.

Last week, I struggled to get any of Adobe's generative or content-aware tools to extend a background and cover an area for a thumbnail I was working on for our YouTube channel. Before last year's updates, Photoshop handled tasks like this quickly and without issue. Since then, however, it's been a rocky road.

All I was trying to do was make a little bit more room on that side of the frame so I could reposition the camera and lens Chris was using.

A person in a dark jacket and cap gestures animatedly while standing in a snowy landscape. A brown dog with a curious expression sits beside them. Snow-covered trees are blurred in the background.

A man in a black jacket and cap gestures while holding a small animal in his gloved hand. Snowy trees are blurred in the background, suggesting a wintry outdoor setting.

I eventually had to resort to the old-fashioned method and manually cloned out the area to produce the thumbnail we published:

Adobe is apparently aware of the issue behind my request. When I reached out to the company for comment, a representative pointed me to an article on Lightroom Queen, which explains that asking Generative Remove or Generative Fill to work in a space requires selecting the entire subject and anything related to it; otherwise, the tool will try to replace the selection with something new.

“Select the entire object/person, including its shadow, reflection, and any disconnected parts (such as a hand on someone else’s shoulder). Otherwise, the AI tries to rebuild the object based on what’s left behind. For example, if you select a person and miss their feet, Lightroom tries to rebuild a new person to fit the feet,” the article reads.

While this kind of makes sense if you don't think about it too hard, it is also completely counterintuitive to the tool's name and to the result an editor expects.

If I select a body part and ask a tool to fill or remove that space, zero percent of the time would I want it to replace my selection with its eldritch nightmare version of that exact same thing. What I, and any editor doing this, want is for the selection to be removed as seamlessly as possible.

Also, this method does not always work, as I demonstrated last year:

A person sits alone on yellow benches in a spacious, circular auditorium. The seating is arranged in curved rows with light-colored flooring and potted plants lining the perimeter.
Original | Photo by Jaron Schneider
Aerial view of an empty amphitheater with curved yellow benches. A person in gray sits alone on one of the benches, surrounded by potted plants along the perimeter. The setting appears indoors with natural lighting.
An attempt by Adobe Generative Fill to remove the person. The entire person and the surrounding area were selected.

This is a repeat of the problem I showcased last fall when I pitted Apple’s Clean Up tool against Adobe Generative tools. Multiple times, Adobe’s tool wanted to add things into a shot and did so even if an entire subject was selected — which runs counter to the instructions Adobe pointed me to in the Lightroom Queen article.

Adobe also pointed me to an Adobe Community post that has some tips for getting better results out of its generative tools, and while I can confirm these do help, we’re still seeing weird results even if we follow the instructions to the letter.

This loops us back to the Bitcoin situation. Yesterday, photographer Matthew Raifman shared a bizarre result Adobe’s Generative AI produced in Lightroom. The Generative Remove tool saw a selection of a reflection and decided to replace it with a Bitcoin logo.

A seagull with outstretched wings gracefully flies over a sparkling blue body of water, with a blurred background.
The original RAW image. | Photo by Matthew Raifman
A seagull gracefully gliding over calm blue water, with blurred bokeh in the background creating a serene atmosphere.
Adobe’s top suggested Generative Remove option. | Photo by Matthew Raifman

“Adobe has officially jumped the shark. Their AI remove feature in lightroom just added a bitcoin to my gull bird in flight photo,” he shared on Bluesky. “A bitcoin!?!”

A close-up of a shiny gold coin with the Bitcoin symbol embossed on it, set against a blue background. The coin features intricate designs and text around the edges.
A closeup of the bitcoin logo Adobe Generative Remove added to the image.

Raifman shared a screen recording with PetaPixel that verifies this wasn’t added on purpose and was the first suggestion from Adobe’s AI.

To its credit, two of the three options Generative Remove suggested did provide usable alternatives. Unfortunately, the Bitcoin option was listed first, which (whether Adobe intends it or not) signals to an editor that this is what the platform considers the best result.

It's not so much that Adobe's tools don't work well; it's the manner in which they fail. If we weren't trying to get work done, some of these results would be genuinely funny. In the case of the Bitcoin, the tool appears to be replacing the painted pixels with something similar in shape to the detected "object" the user is trying to remove. But that doesn't match how editors expect the tool to perform.

Editors don't want a selection replaced with an object akin to the one they are removing; they want it replaced with what surrounds it. But, somehow, Adobe's AI just isn't built to understand this, and it repeatedly generates the weirdest stuff as a result.

Generative Remove and Generative Fill have become so unreliable that some members of the PetaPixel staff have stopped using them entirely. As I noted above, I had to go back to the manual clone stamp to complete the task I wanted done.

“Overall, Adobe is aware and actively working to resolve,” Adobe tells PetaPixel.

When Adobe is pushing AI as the biggest value proposition in its updates, it can’t be this unreliable. It might be enough to fool shareholders into buying more stock but it’s not going to make actual users — you know, the ones directly contributing to the quarterly profit margins — feel like they’re getting their money’s worth.

“The idiom ‘jumping the shark’ or to ‘jump the shark’ means that a creative work or entity has evolved and reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with or an extreme exaggeration (caricature) of its original theme or purpose,” Wikipedia notes. That feels pretty apt for what is happening here.


