You come home after holidaying in the Alps with your family and friends. To your dismay, some of your treasured photographs and videos have been ruined by light's uncanny habit of bouncing off windows and mirrors (read: reflections). So, what do you do? Well, coming soon to a smartphone near you is a novel solution – a sophisticated algorithm – designed by Google and MIT researchers.
The algorithm will allow your smartphone to automatically remove reflections and other unwanted obstructions from the final image. It works by distinguishing the foreground of the image from the background, and identifying which object is ruining your perfect shot.
Tianfan Xue, Michael Rubinstein, Ce Liu and William T. Freeman are the minds behind this futuristic algorithm. They will present their research paper at the SIGGRAPH computer graphics and interaction conference in Los Angeles at the end of this month. At the event they will also demonstrate how the algorithm removes obstructions from a short video sequence taken with a phone.
Michael Rubinstein, a research scientist at Google, states that the basic principle behind the algorithm is the phenomenon of motion parallax. Sounds complicated? Well, it sure is. It is the phenomenon wherein objects that are closer seem to move faster than those that are farther away – yes, just like the view in your car's rear-view mirror. Tianfan Xue remarks that as long as the obstruction itself doesn't move, the algorithm can be used to remove it.
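To get a feel for the parallax idea, here is a minimal toy sketch in Python (purely illustrative, not the researchers' actual code): when the camera moves slightly between frames, points belonging to a nearby obstruction shift more than points on the distant scene, so comparing motion magnitudes separates the two layers. The point counts, shift values and threshold below are all made-up assumptions for the demo.

```python
import numpy as np

# Hypothetical illustration of motion parallax: when the camera pans,
# points on a nearby obstruction (e.g. a reflection on glass close to
# the lens) shift more between frames than points on the distant scene.

rng = np.random.default_rng(0)

# Simulated tracked feature points: (x, y) positions in frame 1.
n_background, n_obstruction = 40, 10
pts1 = rng.uniform(0, 100, size=(n_background + n_obstruction, 2))

# Camera pans right: distant background appears to shift ~1 px,
# the near obstruction ~5 px (assumed values for the toy example).
shift = np.where(np.arange(len(pts1)) < n_background, 1.0, 5.0)
pts2 = pts1 + np.column_stack([shift, np.zeros(len(pts1))])

# Classify each point by the magnitude of its apparent motion.
motion = np.linalg.norm(pts2 - pts1, axis=1)
threshold = motion.mean()  # crude split between the two motion clusters
is_obstruction = motion > threshold

print("background points:", int(np.sum(~is_obstruction)))   # → 40
print("obstruction points:", int(np.sum(is_obstruction)))   # → 10
```

In the real system the motion of every pixel would be estimated from the video itself (e.g. via optical flow) rather than simulated, but the core intuition – fast-moving pixels belong to the near layer – is the same.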
The algorithm's other limitation is that it does not work with subjects in motion – so no sporting events – and, added to that, it does not weave its magic so well in low-light conditions.
Though the discovery is generating considerable interest inside Google's research labs, for now it seems to be one marked for the future, and may not appear on a smartphone near you anytime soon.