Of the many nascent technologies demonstrated at Adobe MAX this year, Project Sky Replace was one of the most intriguing. As the name implies, the tool can automatically replace a boring or overexposed sky in a photo with a more interesting one from another image. It doesn’t sound terribly complex, but there is much more going on behind the scenes to make the final image look as realistic as possible. Digital Trends spoke with Adobe’s Xiaohui Shen, Senior Research Scientist and chief engineer on the Sky Replace project, about how the technology works.
Sky Replace is smart enough to know which areas of an image need to be preserved and which need to be replaced. It can handle relatively complex horizon lines and will automatically create a matte for any shapes, such as buildings and trees, that stick out. But it goes much further than that.
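Adobe has not published how its segmentation model works, but the basic idea of building a sky matte and compositing a new sky through it can be sketched with a deliberately crude heuristic. The threshold values and function names below are illustrative assumptions, not Adobe's method, which relies on learned semantic segmentation rather than color rules:

```python
import numpy as np

def naive_sky_matte(img):
    """Crude stand-in for a learned sky segmenter: flag bright,
    blue-dominant pixels as sky. Thresholds are arbitrary."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    return (b > 150) & (b >= r) & (b >= g)

def composite_sky(img, new_sky, matte):
    """Paste the replacement sky wherever the matte is True,
    leaving foreground pixels untouched."""
    out = img.copy()
    out[matte] = new_sky[matte]
    return out
```

A real system would produce a soft (fractional) matte so that fine structures like tree branches blend smoothly, rather than the hard binary mask used here.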
One of Sky Replace’s most impressive aspects is that it doesn’t just copy and paste one sky over another; it actually adjusts the image foreground to accurately mimic the lighting and colors of the new sky. Several examples of this are given in the above video.
We asked Shen how this was accomplished, and the answer was more complex than we anticipated. It goes far beyond global exposure and white balance adjustments, and uses information from the foreground of the reference image to enhance the foreground of the original.
“We segment the foreground into different semantic regions such as grass, buildings, and people, and compute transfer functions between semantically matched regions,” Shen said. In short, this means grassy areas of the reference image affect only the grassy areas of the original, skin tones affect skin tones, and architecture affects architecture. In this way, the system ensures there are “no very weird colors” in the final result, according to Shen.
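Shen did not specify the form of these transfer functions, but a simple per-region color transfer can be approximated by matching the mean and standard deviation of each channel between corresponding regions (in the spirit of Reinhard-style statistics matching). The function below is a hypothetical sketch of that idea, not Adobe's actual algorithm:

```python
import numpy as np

def region_color_transfer(src, ref, src_mask, ref_mask):
    """Shift the statistics of src pixels inside src_mask toward
    those of ref pixels inside ref_mask, channel by channel.
    src/ref are HxWx3 uint8 images; masks are boolean HxW arrays."""
    out = src.astype(np.float64).copy()
    for c in range(3):
        s = src[..., c][src_mask].astype(np.float64)
        r = ref[..., c][ref_mask].astype(np.float64)
        scale = r.std() / (s.std() + 1e-6)  # avoid divide-by-zero
        out[..., c][src_mask] = (s - s.mean()) * scale + r.mean()
    return np.clip(out, 0, 255).astype(np.uint8)
```

Applied once per matched region pair (grass to grass, skin to skin, and so on), this keeps each adjustment local to its semantic class, which is what prevents, say, sky statistics from tinting a person's face.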
For all its power, Sky Replace is still far from being complete, and Shen was open about its current limitations. Water scenes, for example, can pose a challenge. “When there are no sky reflections in the water, our method generally works well,” Shen said. “At this point, we are not handling the cases with mirror reflections of skies, as the reflections need to be segmented and replaced as well to make the photo look realistic.”
Another challenge comes in the form of original images with highly directional light, including backlit scenes or direct sunlight with strong shadows. The issue here isn’t so much a technical limitation of the algorithm, but rather with the selection of a replacement sky. “The sky to be used for replacement should have very similar lighting directions,” Shen explained.
The software can automatically generate replacement sky suggestions based on the content of the original image. However, Adobe simply doesn’t have a large enough collection of skies to match all the possible variations in lighting. Shen said the team planned to address this issue in the future with an improved image search function.
Sky Replace has obvious value to both casual and professional photographers, but one possibility the Adobe MAX presentation didn’t touch on was applying it to video. While the tool is currently limited to still images, Shen mentioned that video presents an intriguing use for the technology, one the team plans to investigate. In theory, Sky Replace could stand in for shooting on a green screen or going through an involved masking process in post. It could be particularly beneficial to budget science-fiction productions, which could easily replace a boring old Earth sky with an artist’s rendering of a Mars sky, for example.
As with other Adobe sneak peeks, there is no guarantee when, or even if, Project Sky Replace will become a commercial product or make it into a future version of Photoshop. The tool has been in the works for over a year, and Shen confirmed development will continue. This gives us reason to hope that Sky Replace will one day be more than a behind-the-scenes side project.