Google has been a pioneer of computational photography in recent years, and its latest trick, called ‘Photo Unblur’, might be one of its most impressive yet. A Google Photos feature that will initially be exclusive to the Pixel 7 and Pixel 7 Pro, it promises to rescue your new and old photos from blurry oblivion.
Photo Unblur is an expansion of ‘Face Unblur’, which arrived last year on the Pixel 6 and Pixel 6 Pro. The latter has quickly become one of the most popular computational photography features since Google released ‘Night Sight’ on the Pixel 3 in 2018. But it’s also quite different from Photo Unblur, meaning the two will act as complementary modes for different situations.
Both features use machine learning to improve your photos, but Photo Unblur is designed to improve the photos you’ve already taken on any camera. Meanwhile, Face Unblur is a preemptive mode that uses the power of Google’s Tensor chip to detect when someone is moving too fast in your scene. It then automatically takes two photos, which are combined to achieve a sharp, well-exposed shot.
So how exactly does Google’s new Photo Unblur mode work without getting multiple snaps of the same scene? Google hasn’t fully explained its inner workings yet, but we can get a good idea by looking at where it came from.
How does Photo Unblur work?
Photo Unblur didn’t come completely out of the blue – while Google hasn’t explained its inner workings yet, it’s likely built on some existing features we’ve seen in the Google Photos app. And that means it might be available on devices beyond the Pixel 7 and Pixel 7 Pro.
In 2021, the Google AI Blog described the technology behind two new Google Photos features called ‘Denoise’ and ‘Sharpen’. They arrived to help you salvage photos taken in rough conditions or with older phones that had noisy sensors or dated optics. And these probably form the basis of Photo Unblur.
Photo editors have long had sliders to help you adjust noise and sharpness, but Google’s new technology is much smarter than these. For starters, it analyzes your entire image to map noise and blur down to the pixel level, regardless of which camera the photos were taken on.
This crucial step allows noise reduction and deblurring to be applied at a more granular level than older techniques, which also makes them less processor-intensive. And that makes them ideal for running on the device or in the cloud. Once Google has analyzed your image, it can apply its somewhat counterintuitive methods to reduce blur and noise.
These methods are counterintuitive because they involve pushing your photo in the seemingly ‘wrong’ direction, before bringing it back as an improvement on the original. To reduce noise, Google merges noisy pixels (effectively shrinking the image) and then scales them back up while generating finer detail. Sharpening works in a similar way, with Google’s algorithms deliberately blurring the image multiple times in an efficient, phone-friendly process.
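To make the ‘sharpen by blurring’ idea a little more concrete, here’s a toy sketch in Python. It is not Google’s actual implementation – the simple box blur, the `alpha` gain, and the iteration count are all illustrative stand-ins – but it shows the general trick of reblurring an image and amplifying the difference to steepen soft edges:

```python
import numpy as np

def box_blur(img, k=3):
    """Simple k-by-k box blur with edge padding (a stand-in for a
    real estimated blur kernel)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(-pad, pad + 1):
        for dx in range(-pad, pad + 1):
            out += padded[pad + dy : pad + dy + img.shape[0],
                          pad + dx : pad + dx + img.shape[1]]
    return out / (k * k)

def sharpen_by_reblurring(img, alpha=2.0, iterations=2):
    """Deblur an image by blurring it again: each pass adds back an
    amplified copy of the detail the blur removed
    (out = out + alpha * (out - blur(out)))."""
    out = img.astype(float)
    for _ in range(iterations):
        out = out + alpha * (out - box_blur(out))
    return np.clip(out, 0.0, 1.0)  # keep values in a valid pixel range

# Simulate a soft capture: a hard step edge smeared by a blur...
step = np.zeros((16, 16))
step[:, 8:] = 1.0
blurry = box_blur(step)

# ...then recover a steeper edge using only further blurring
sharp = sharpen_by_reblurring(blurry)
```

The gradient across the edge in `sharp` ends up steeper than in `blurry`, which is the whole point: the algorithm never needs to invert the blur directly, only to apply it a few more times – a cheap, phone-friendly operation.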
So how does Photo Unblur build on these techniques? We don’t know the details right now, but a year is a long time in machine learning – and some of Google’s examples during the Pixel 7 launch certainly looked impressive.
The image below, for example, has been impressively rescued from its nearly unusable origins, which appear to have been caused by camera movement and an excessively slow shutter speed.
Since Photo Unblur doesn’t work with two images of the same scene like Face Unblur, it may struggle to be as powerful as this older feature, particularly for issues caused by motion. But we’re looking forward to having a look at our old photos when the Pixel 7 and Pixel 7 Pro are released.
How do you use Photo Unblur?
Google again hasn’t revealed the details of how you’ll use Photo Unblur on the Pixel 7 and Pixel 7 Pro just yet. But it said that in “just a few taps” you’ll be able to remove blur and visual noise in a process that feels as straightforward as last year’s Magic Eraser (for removing unwanted objects).
This process will take place in the Google Photos app, with Photo Unblur initially only available on the Pixel 7 and Pixel 7 Pro. But we expect the technology to be available on all devices running the Google Photos app at a later date.
While Photo Unblur isn’t as automated as Face Unblur, which works during the photo-taking process on phones from the Pixel 6 series onwards, it looks like another very simple example of computational photography improving our snaps, including the old ones we had discarded.
It seems likely that the two modes are complementary, with Face Unblur kicking in (on compatible devices) before taking a photo, and Photo Unblur being useful for old photos taken on any camera. We will be taking a tour of Photo Unblur very soon and will update this article with all our findings.