There are few things more irritating than trying to take a picture of a great hotel view, and finding you’ve only captured a rain-spotted, indoor-light-blurred image that shows your face frowning down at your phone, rather than an envy-inducing cityscape.
Well, tut no longer, because MIT and Google have joined forces to make finding the right filter the only obstacle to getting your perfect shot: their new algorithm removes problems like fences, reflections, and water droplets.
Given a short video, or a burst of photos taken from a few slightly different positions, the algorithm separates the obstruction from the actual scene beyond it, yielding clear photos of both, in case you decide your reflection was actually the better image in the end.
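The key idea is parallax: as the camera moves, the scene behind the glass and the obstruction on the glass shift by different amounts, so the two layers can be told apart across frames. The paper's actual method solves a joint optimization over motion and the two layers, but the intuition can be sketched with a much cruder baseline: if the frames are first aligned to the background, the obstruction drifts from frame to frame, and a per-pixel temporal median suppresses it. Everything below (function names, the pre-aligned-frames assumption) is illustrative, not the researchers' code.

```python
import numpy as np

def estimate_background(frames):
    """Toy background recovery from frames pre-aligned to the background.

    After alignment the scene is static while the obstruction (reflection,
    fence, raindrops) drifts due to parallax, so a per-pixel temporal
    median suppresses it. A crude stand-in for the paper's
    optimization-based layer decomposition.
    """
    stack = np.stack(frames, axis=0).astype(np.float64)
    return np.median(stack, axis=0)

def estimate_obstruction(frames, background):
    """Average positive residual against the recovered background.

    Very roughly hints at the obstruction layer; the paper instead models
    reflections additively and fences with an opacity mask.
    """
    stack = np.stack(frames, axis=0).astype(np.float64)
    return np.clip(stack - background, 0.0, None).mean(axis=0)

# Synthetic demo: a static scene plus a bright spot drifting across the top.
scene = np.zeros((5, 5))
scene[2, 2] = 1.0
frames = []
for i in range(5):
    f = scene.copy()
    f[0, i] += 2.0  # obstruction at a different position each frame
    frames.append(f)

recovered = estimate_background(frames)   # matches `scene`
residual = estimate_obstruction(frames, recovered)
```

In the synthetic demo the median exactly recovers the static scene because the drifting spot occupies each pixel in only one of the five frames; real footage needs the alignment step the paper spends most of its effort on.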
The demonstration video below, accompanying the researchers' SIGGRAPH 2015 paper "A Computational Approach for Obstruction-Free Photography", shows a few examples of the tech in action, removing reflections and physical obstructions such as writing on a window, and improving picture quality.
Amazing! It’ll be great not to have to smash windows to get a decent image. Not that we’ve done that. Nope.
Main image © iStock/Kichigin