Photoshop and Lightroom are incredibly powerful tools for manipulating images, but since the beginning of time, the most frustrating part of working with these tools has been selecting specific objects to cut them out of an image, move them elsewhere, etc. Over the years, the object selection tools got a lot better, but for complex objects — and especially for masking people — your results still depend on how much patience you have. At its MAX conference, Adobe today announced a number of updates across its photo-centric tools that make all of this a lot easier, thanks to the power of its AI platform.
In an earlier update in 2020, Adobe launched an Object Selection tool that could recognize some types of objects. Now, this tool is getting a lot smarter and can recognize complex objects like the sky, buildings, plants, mountains, sidewalks and more. Maybe more importantly, the system has also gotten a lot more precise and can now preserve fine details, like a person's hair, in its masks. That's a massive time saver.
For those times when you just want to delete an object and fill in the empty space using Photoshop's Content-Aware Fill, the company has now introduced a shortcut: hit Shift+Delete and the object is gone and (hopefully) patched over with AI-created filler.
On the iPad, mobile users can now remove an image's background with a single tap, and they also get one-tap Content-Aware Fill (this is slightly different from the one-click delete and fill feature mentioned above, but achieves a very similar outcome).
iPad users can now use the improved Select Subject AI model as well to more easily select people, animals and objects.
A lot of this AI-powered masking capability is also coming to Lightroom. There's now a 'Select People' feature, for example, that can detect and generate masks for individuals and groups in any image, and you can select specific body parts, too. Unsurprisingly, the same Select Objects technology from Photoshop is also coming to Lightroom, as is the one-click select background feature from the iPad version of Photoshop. There's also now a content-aware remove feature in Lightroom.
All of this is powered by Adobe's Sensei AI platform, which has been a central focus of the company's efforts in recent years. But what's maybe even more important is that these efforts have allowed Adobe to turn these features into modules that it can now bring to its entire portfolio and adapt to specific devices and their use cases. On the iPad, for example, the background selection feature is all about deleting that background, while in Lightroom it is only about selecting it, but in the end, it's the same AI model that powers both.
This is, of course, only a small selection of all of the new features coming to Photoshop and Lightroom. There are also features like support for HDR displays in Lightroom and Camera Raw, improved neural filters, a new photo restoration filter, improvements to Photoshop’s tools for requesting review feedback and plenty more.