
iOS 16 Has a Hidden Photo Tool That’s Like Photoshop for Your iPhone

This story is part of Focal Point iPhone 2023, CNET’s collection of news, tips and advice around Apple’s most popular product.

If you have an iPhone running iOS 16, you have to try one of its best new features. The tool doesn’t have an official name, but it lets you separate a picture’s subject, like a person, from the background. All you need to do is tap and hold on a photo. If you keep holding, you can then “lift” the cutout from the photo and drag it into another app to post, share or make a collage, for example.

iOS 16 debuted alongside the iPhone 14 line with a number of cool new features. Before iOS 16, if I wanted to remove a photo’s background, I would need to use an app like Adobe Photoshop. But what’s great about this tool is that it’s built right into iOS 16, eliminating the need to download a special app or set up an account.

Technically, the tap-and-lift photo feature is part of Visual Look Up, which was first launched with iOS 15 and can recognize objects in your photos such as plants, food, landmarks and even pets. In iOS 16, Visual Look Up lets you lift that object out of a photo or PDF by doing nothing more than tapping and holding.
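
If you’re curious what a do-it-yourself version looks like for developers, here’s a minimal sketch, assuming Vision’s person-segmentation request (available since iOS 15) and a Core Image mask blend. It only handles people, whereas Apple’s subject lifting also works on pets, objects and landmarks, and it is not the actual Visual Look Up pipeline.

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

// Rough approximation of "lifting" a subject: ask Vision for a person mask,
// then use Core Image to keep the subject and make the background transparent.
// Illustration only; this is not the system's Visual Look Up pipeline.
func liftPerson(from cgImage: CGImage) throws -> CIImage? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate                        // favor mask quality over speed
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let maskBuffer = request.results?.first?.pixelBuffer else { return nil }

    let original = CIImage(cgImage: cgImage)
    var mask = CIImage(cvPixelBuffer: maskBuffer)

    // The mask comes back at the model's resolution; scale it to match the photo.
    let scaleX = original.extent.width / mask.extent.width
    let scaleY = original.extent.height / mask.extent.height
    mask = mask.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // Blend: subject pixels from the original, everything else transparent.
    let blend = CIFilter.blendWithMask()
    blend.inputImage = original
    blend.backgroundImage = CIImage(color: .clear).cropped(to: original.extent)
    blend.maskImage = mask
    return blend.outputImage
}
```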

During WWDC, Apple showed someone tapping and holding on the dog in a photo to lift it from the background and share it in a message. (Image: Apple)

Robby Walker, Apple’s senior director of Siri Language and Technologies, first demonstrated the new tap-and-lift tool during WWDC. He tapped and held on a French bulldog in a photo, then dragged the dog’s cutout into the text field of a text message.

“It feels like magic,” Walker said.

Sometimes Apple overuses the word “magic,” but this tool does seem impressive. Walker was quick to point out that the effect was the result of an advanced machine-learning model, accelerated by Core ML and Apple’s Neural Engine to perform 40 billion operations in a second.
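
Apple doesn’t expose the Visual Look Up model itself, but the Neural Engine part of that claim maps onto something developers can see in Core ML. As a rough illustration (with “SubjectSegmenter” as a made-up placeholder model name), this is how an app tells Core ML it may schedule a model across the CPU, GPU and Neural Engine:

```swift
import CoreML
import Foundation

// Hypothetical example: "SubjectSegmenter" is a placeholder model name,
// not Apple's actual Visual Look Up model. The interesting part is the
// configuration: .all lets Core ML run work on the CPU, GPU and
// Neural Engine instead of pinning it to the CPU.
func loadSegmenter() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all

    guard let url = Bundle.main.url(forResource: "SubjectSegmenter",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: url, configuration: config)
}
```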

Knowing the amount of processing and machine learning required to cut a dog out of a photo thrills me to no end. New phone features often need to be revolutionary or solve a serious problem to get attention. I guess you could say the tap-and-hold tool solves the problem of removing a photo’s background, which, for at least some people, is a serious matter.

I couldn’t help noticing the similarity to another photo feature in iOS 16. On the lock screen, the photo editor separates the foreground subject from the background of your wallpaper photo. That way, lock screen elements like the time and date can be layered behind the subject of your wallpaper but in front of the photo’s background. It gives the lock screen a slick magazine-cover vibe.

I’ve used the new Visual Look Up feature many times now, and I’m still impressed by how quickly and reliably it works.
