
iPhone Photo Edits Feel Like Magic With a New iOS 16 Tool

This story is part of WWDC 2022, CNET’s complete coverage from and about Apple’s annual developers conference.

Apple will most likely officially announce the iPhone 14 this Wednesday, Sept. 7, at its "Far Out" event. The company is also almost certain to announce when iOS 16, the newest version of the iPhone's operating system, will be released.

iOS 16 brings a bunch of new features to the iPhone, including editable Messages and a customizable lock screen. But the feature that truly grabbed my attention during WWDC 2022 is all about photography, despite taking up less than 15 seconds of the event. 

The feature hasn’t been given a name, but here’s how it works: You tap and hold on a photo to separate a picture’s subject, like a person, from the background. And if you keep holding, you can then “lift” the cutout from the photo and drag it into another app to post, share or make a collage, for example.

Technically, the tap-and-lift photo feature is part of Visual Look Up, which first launched with iOS 15 and can recognize objects in your photos such as plants, food, landmarks and even pets. In iOS 16, Visual Look Up lets you lift that object out of a photo or PDF by doing nothing more than tapping and holding.
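Apple hasn't said exactly how the feature is built, but iOS has shipped public segmentation APIs since iOS 15 that illustrate the same idea. Here's a rough Swift sketch using Vision's VNGeneratePersonSegmentationRequest to mask a subject out of a photo. Note that this public request only handles people (Apple's Visual Look Up model clearly covers more, like that French bulldog), so treat it as an approximation of the technique, not Apple's actual implementation.

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

// A rough sketch of subject "lifting" using Vision's public
// person-segmentation API (iOS 15+). This is not how Visual Look Up
// works internally; it just illustrates separating a subject from
// its background with a mask.
func liftSubject(from image: CIImage) throws -> CIImage? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate          // favor mask quality over speed
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(ciImage: image, options: [:])
    try handler.perform([request])

    guard let maskBuffer = request.results?.first?.pixelBuffer else { return nil }

    // Scale the low-resolution mask up to the source image's size.
    var mask = CIImage(cvPixelBuffer: maskBuffer)
    let scaleX = image.extent.width / mask.extent.width
    let scaleY = image.extent.height / mask.extent.height
    mask = mask.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // Keep only the subject pixels; everything else becomes transparent.
    let blend = CIFilter.blendWithMask()
    blend.inputImage = image
    blend.backgroundImage = CIImage(color: .clear).cropped(to: image.extent)
    blend.maskImage = mask
    return blend.outputImage
}
```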

During WWDC, Apple showed someone tapping and holding on the dog in a photo to lift it from the background and share it in a message. (Image: Apple)

Robby Walker, Apple senior director of Siri Language and Technologies, demonstrated the new tap-and-lift tool on a photo of a French bulldog. The dog was “cut out” of the photo and then dragged and dropped into the text field of a message.

“It feels like magic,” Walker said.

Sometimes Apple overuses the word "magic," but this tool does seem impressive. Walker was quick to point out that the effect is the result of an advanced machine-learning model, accelerated by Core ML and Apple's Neural Engine to perform 40 billion operations per second.
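Apple didn't share implementation details beyond that line, but for context, this is how any app opts a Core ML model into that same hardware acceleration. The model name below is hypothetical; Apple's actual Visual Look Up model isn't public.

```swift
import CoreML

// Sketch: opting a Core ML model into hardware acceleration.
// "SubjectSegmenter" is a hypothetical model name, not Apple's
// actual Visual Look Up model.
let config = MLModelConfiguration()
config.computeUnits = .all  // allow CPU, GPU and the Neural Engine

guard let modelURL = Bundle.main.url(forResource: "SubjectSegmenter",
                                     withExtension: "mlmodelc") else {
    fatalError("Model not bundled with the app")
}
let model = try MLModel(contentsOf: modelURL, configuration: config)
```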

Knowing the amount of processing and machine learning required to cut a dog out of a photo thrills me to no end. New phone features are often expected to be revolutionary or to solve a serious problem. I guess you could say that the tap-and-hold tool solves the problem of removing a photo's background, which to at least some people is a serious matter.

I couldn’t help notice the similarity to another photo feature in iOS 16. On the lock screen, the photo editor separates the foreground subject from the background of the photo used for your wallpaper. This makes it so lock screen elements like the time and date can be layered behind the subject of your wallpaper but in front of the photo’s background. It makes it look like the cover of a magazine.

I tried the new Visual Look Up feature in the public beta for iOS 16, and I'm still impressed by how quickly and reliably it works. If you have a spare iPhone to try it on, a free public beta and a developer beta of iOS 16 are both available.

For more, get all the latest rumors about the Apple iPhone 14.
