Photoshop's new Adobe Firefly AI integration could be a game-changer

Image from a tutorial on the Photoshop update that adds AI generative fill from Adobe Firefly

Adobe may have been slightly late to the generative AI party, but it's been dropping bombshell after bombshell in recent weeks. Shortly after releasing its versatile Adobe Firefly generative AI tools in beta and their integration in Google Bard, it's now just added the tech to its flagship image editor, Photoshop.

The addition means that users of what remains the industry standard image editing software can now quickly add, transform or remove elements in images using simple text prompts (see our pick of the best AI art tutorials and the best Photoshop tutorials).

Adobe Firefly's generative AI technology is available in Photoshop beta for all subscribers from today. The headline feature is Generative Fill, which allows users to select a portion of an image using the lasso or another selection tool and fill it with new imagery generated from a text prompt (see the tutorial above). The tool automatically matches the perspective, lighting and style of the existing image, adding details like shadows or reflections where it thinks they're needed.

The newly generated content is added in non-destructive layers so that edits can always be reversed without impacting the image (see how layers can be stacked and the importance of their order in the costume makeover tutorial below). Other new additions to Adobe Photoshop beta include around 30 new adjustment Presets. These are filters that users can apply to an image to achieve a particular look and feel.

There's also the new Remove Tool, a brush that uses Adobe Sensei AI to quickly eliminate unwanted objects, saving potentially hours of manual work. Meanwhile, a Contextual Task Bar is designed to make common functions more accessible by recommending relevant next steps in several workflows and Enhanced Gradients introduce new on-canvas controls.

The new features feel like a logical and inevitable development in Adobe's expansion of its generative AI tools. We've already seen that Google Bard will be integrating Adobe Firefly, and it makes sense for Adobe's own flagship products to get the same tech. Adobe had already been adding AI-driven tools to Photoshop, such as Neural Filters, which can transform facial features.

"Adobe has a long and established history of AI innovation and the exciting new integration of Firefly into Photoshop (beta) will enable creatives to transform the way they work," Rufus Deuchler, Director of Worldwide Creative Cloud Evangelism at Adobe, told Creative Bloq. "Adobe Firefly is the only AI service currently that produces high quality professional content that is also commercially viable and can be embedded in creative workflows. With Firefly now supporting Generative Fill, Photoshop users will be able to easily extend images and add or remove objects using text prompts, providing a level of control that was unthinkable until today. Generative fill in the Photoshop beta is truly a game-changer."

Photoshop’s Generative Fill looks set to make photo-bashing and collaging a lot quicker and easier, automatically matching the context of an image to save a lot of the more tedious work of adding details like shadows and reflections. It should also save a lot of the time it takes to find the images you want to add to a composition – it might no longer be necessary to trawl through the best stock photo libraries for everything you need.

But perhaps most significantly, it makes experimentation quick and easy, allowing users to test out off-the-wall ideas instantly (or at least as fast as they can type), which is likely to make many users more inclined to try out wild ideas.

Adobe says Firefly has been one of the most successful beta launches in its history, and stresses that Firefly is designed to be safe for commercial use: it's trained on Adobe Stock images rather than on images scraped from the wider internet. Generative Fill also supports Adobe's new Content Credentials 'nutrition labels' – tags designed to clarify that an image has been created by, or edited using, AI.

How to use Adobe Firefly generative AI in Photoshop

Adobe says that Firefly Generative Fill will come to the standard release of Photoshop later in the year, but for now, you'll need to use Photoshop's beta features. This is easy to do. First, you'll need a Photoshop subscription of some kind, whether as an individual app, as part of Adobe's photography package with Lightroom, or as part of a Creative Cloud All Apps subscription (see how to download Photoshop).

Next, open the Creative Cloud desktop app and click on 'Beta apps' in the column on the left. Look for the Photoshop (Beta) app and click the 'Install' button. Once installed, Photoshop beta will appear listed under Installed beta apps. Click the Open button and check you're running the beta version by opening Help > About Photoshop in the menu bar on Windows (on a Mac, you'll see Photoshop (Beta) in the menu bar where it normally just says Photoshop).