Apple says it plans to introduce generative AI features to iPhones later this year. Exactly what those features are remains unknown, but a recently published research paper suggests one of them may be a new type of editing software that can alter images via text prompts.
It’s called MGIE, or MLLM-Guided Image Editing, and the tech is the result of a collaboration between Apple and researchers from the University of California, Santa Barbara. The paper states that MGIE is capable of “Photoshop-style [modifications]” ranging from simple tweaks like cropping to more complex edits such as removing objects from a picture. This is made possible by the MLLM (multimodal large language model), a type of AI capable of processing both “text and images” at the same time.
In its report, VentureBeat explains that MLLMs show “remarkable capabilities in cross-modal understanding”, yet they have not been widely implemented in image editing software.
Public demonstration
The way MGIE works is pretty straightforward. You upload an image to the AI engine and give it clear, concise instructions on the changes you want made; VentureBeat notes that people will need to “provide explicit guidance”. For example, you can upload a picture of a bright, sunny day and tell MGIE to “make the sky more blue.” It’ll saturate the color of the sky a bit, but the result may not be as vivid as you would like, so you’ll have to guide it further to get the look you want.
MGIE is currently available on GitHub as an open-source project. The researchers are offering “code, data, [pre-trained models]”, as well as a notebook teaching people how to use the AI for editing tasks. There’s also a web demo available to the public on the collaborative tech platform Hugging Face, and we used it to take Apple’s AI out for a spin.
In our test, we uploaded a picture of a cat that we got from Unsplash and then instructed MGIE to make several changes. In our experience, it did okay. In one instance, we told it to change the background from blue to red; MGIE instead made the background a darker shade of blue with static-like texturing. In another, we prompted the engine to add a purple background with lightning strikes, and it created something much more dynamic.
At the time of this writing, you may experience long queue times while attempting to generate content. If the demo doesn’t work, the Hugging Face page links to the same AI hosted on Gradio, which is the version we used. There doesn’t appear to be any difference between the two.
Inclusion in future iPhones
Now the question is: will this technology come to a future iPhone or iOS 18? Maybe. As alluded to at the beginning, Apple CEO Tim Cook told investors that AI tools are coming to the company’s devices later this year, but didn’t give any specifics. We can see MGIE morphing into the iPhone’s version of Google’s Magic Editor, a feature that can completely alter the contents of a picture. If you read the research paper on arXiv, that certainly seems to be the path Apple is taking with its AI.
MGIE is still a work in progress, and its outputs are not perfect. One of the sample images shows a kitten turning into a monstrosity. But we do expect the bugs to be worked out down the line. If you prefer a more hands-on approach, check out TechRadar’s guide on the best photo editors for 2024.