iOS 16 includes some awesome hidden innovation

I think I wildly underestimated the impact of iOS 16 and its new Lift Subject from Background feature. This is next-level image stuff that fundamentally changes how you can interact with the 15-year-old platform.

Let's start by getting something clear: Apple's next big mobile platform update, iOS 16, is still months away from final release and is currently only in developer beta. The public beta could arrive as soon as next week (the week of July 3). This means that, while I can talk about what I've learned, I can't show you any more than what we all saw during Apple's WWDC 2022 keynote last month.

Granted, the demo of someone grabbing a bulldog out of a photo and casually dropping it into a Message was pretty cool on its own. Actually using it, though, is something else.

Hold it

From what I can tell, it doesn't matter what kind of photo from your library you use, or even its age. Virtually any photo with a clear subject (or subjects) is fair game for the Lift Subject from Background feature.

In my library, I opened photos shot with my iPhone 13 Pro, iPhone 8 Plus, iPhone 7, and iPhone 6 and was able to select subjects in all of them.

As demonstrated in the keynote, you open the photo on the iPhone and place your finger on the subject (or multiple subjects, as it's happy to let you grab a group of people). You know your iPhone has found the subject thanks to a cool visual effect that appears to marquee the subject and place it under your finger's control.

As Apple told me last month, the ability to identify subjects is all part of the company's rapidly developing image-segmentation technology. Apple uses it on the lock screen to put just your image subject in front of the time. In the case of Lift Subject from Background, it lets you select and move the photo subject almost anywhere.
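Apple hasn't published the exact API behind Lift Subject from Background, but the image-segmentation idea it builds on has been available to developers since iOS 15 via the Vision framework. A minimal sketch of how a mask separating a person from the background can be generated (function name and usage are my own, not Apple's implementation):

```swift
import Vision
import CoreGraphics

// Sketch: generate a grayscale mask isolating a person from the background.
// This is a developer-facing approximation of the segmentation concept,
// not the private machinery behind Lift Subject from Background.
func personSegmentationMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate                       // favor mask quality over speed
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The observation's pixel buffer is a mask: white pixels mark the
    // subject, black pixels mark the background.
    return request.results?.first?.pixelBuffer
}
```

The resulting mask can then be composited against the original photo to "lift" the subject onto a transparent background, which is conceptually what the system feature does for any subject type, not just people.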

It's more

I think I understood what I saw during the WWDC keynote demonstration, but it wasn't until I tried the Lift Subject from Background feature myself that I understood the radical iOS change that comes along with it.

Look, it's cool that iOS 16 can identify and lift any subject (person, flower, bird, dog) from a photo. What I didn't understand is how you might move that subject elsewhere. This is not a cut-and-paste feature; it's also not a photo-editing feature, à la the Google Pixel's Magic Eraser. It's more like a mobile platform magic carpet ride.

Once I had a subject selected, I paused for a moment as I tried to figure out what to do with the floating image under my finger. How would I get it to Messages as they did during the WWDC demo?

Instinctively, I kept one finger on the subject and with my other hand I touched the screen and swept up from the bottom to access my home screen. Then I selected Messages.

I found I could hover with the captured image over my messages list and drop it into one of the threads, or go directly to an open message conversation.

Alternatively, I could open a different app like Notes or Keynote and drop the subject there. As long as I held my finger on the captured subject, I could do whatever I wanted with my other hand, including launching new apps or swiping up one-third of the way from the bottom of the screen to access all my open apps and choose the one where I wanted to drop in my subject.
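What makes this work across apps is that the lifted subject appears to travel through iOS's standard drag-and-drop system, the same one apps have used since iOS 11. A sketch of how any app could accept such a drop, assuming the subject arrives as an ordinary image (class and view names are hypothetical):

```swift
import UIKit

// Hypothetical sketch: a view controller that accepts a dragged image,
// such as a subject lifted out of Photos, via standard drag and drop.
final class CanvasViewController: UIViewController, UIDropInteractionDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.addInteraction(UIDropInteraction(delegate: self))
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         canHandle session: UIDropSession) -> Bool {
        // Only accept sessions that can deliver a UIImage.
        session.canLoadObjects(ofClass: UIImage.self)
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         sessionDidUpdate session: UIDropSession) -> UIDropProposal {
        UIDropProposal(operation: .copy)
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         performDrop session: UIDropSession) {
        let location = session.location(in: view)
        session.loadObjects(ofClass: UIImage.self) { items in
            // Place the dropped subject where the finger released it.
            guard let subject = items.first as? UIImage else { return }
            let imageView = UIImageView(image: subject)
            imageView.center = location
            self.view.addSubview(imageView)
        }
    }
}
```

Any app that already implements this delegate would receive the lifted subject for free, which may explain why the feature worked in Messages, Notes, and Keynote without those apps visibly changing.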

I couldn't recall ever seeing iOS work in this fashion before, like a multi-window system.

It's weird, cool, and a distinct departure from previous versions of iOS. We've always had multi-touch, but this is like multi-modal touch -- and with a pretty wild new image feature to boot.

It's possible that Lift Subject from Background will undergo many changes before Apple launches the final version of iOS 16 in the fall, but I don't see it going backward from this near-revolutionary change (which also happens to work in iPadOS 16). It's the start of something big.
