Computational imaging has given us some interesting and useful inventions so far, from fake bokeh to capturing the movement of light. This time, scientists have figured out how to capture a clear image from as far as 28 miles (45km) away, despite the Earth’s curvature and the smog in the air.
Drone technology has come along so quickly in such a short space of time. Especially the camera technology. I’m not just talking about the quality of the optics and sensors, either. The “brains” behind the visual systems in drones these days are just nuts. Even modest consumer drones have facial recognition, subject tracking, and similar features, all of which help us achieve the best shot possible.
A team of researchers from MIT and ETH Zurich has now taken things way beyond what is currently available to the masses. Building on those basic visual systems, theirs actually allows you to determine where in the frame the subject is positioned. It also lets you choose the camera angle. If you want full frontal, you’ve got it. 3/4 left or right? No problem, it’s just the flick of a menu item.
A photograph is, by its very nature, a still image. Good ones perform their task very well. As technology has evolved, though, the lines between photo and video have started to become a little blurred. We’ve got live photos, and Instagram supports video and now wants to be SnapChat.
Then we have augmented reality: the combination of computer-generated data and real-world imagery in real time.
The big problem with augmented reality is that the two don’t really mix together too fluidly yet. MIT PhD student Abe Davis has figured out a possible way to solve this problem, a process he calls Interactive Dynamic Video. It uses tiny vibrations picked up in video to simulate real-world movement in still images.
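Davis’s actual pipeline extracts an object’s vibration modes from video and resimulates them, which goes well beyond a snippet. As a toy illustration of the underlying idea — that tiny temporal variations in video carry motion you can amplify (in the spirit of Eulerian video magnification, not Davis’s own code) — a minimal sketch might look like this. The function name and the moving-average band-pass are my own simplifications, not anything from the paper:

```python
import numpy as np

def amplify_vibrations(frames, alpha=10.0, lo=3, hi=9):
    """Amplify small temporal intensity variations in a video.

    frames: array of shape (T, H, W), grayscale video over time.
    alpha:  amplification factor for the band-passed signal.
    lo, hi: window lengths (in frames) of two moving averages whose
            difference forms a crude temporal band-pass filter.
    """
    frames = frames.astype(np.float64)

    def moving_avg(x, w):
        # Average each pixel's time series with a length-w window.
        kernel = np.ones(w) / w
        return np.apply_along_axis(
            lambda s: np.convolve(s, kernel, mode="same"), 0, x)

    # Band-pass = difference of a short and a long moving average;
    # it keeps mid-frequency wobble and discards the static scene.
    band = moving_avg(frames, lo) - moving_avg(frames, hi)
    return frames + alpha * band
```

Fed a clip of a nominally still object, this exaggerates the sub-pixel flicker each pixel sees over time, which is the raw signal Davis’s method builds its physical simulation from.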
A team of researchers from MIT (Tianfan Xue, Michael Rubinstein, Ce Liu and William T. Freeman) is teaming up with Google to present a new algorithm that is able to extract photographic inconveniences such as glare and reflections from photographs. The algorithm can then reproduce the image free of any reflections, in addition to being able to create a separate image of the reflection itself. This kind of problem solving would be especially useful when shooting through glass or a fence, for example.
Windows have been ruining photos ever since the first time a photographer tried shooting through one.
Unless you bring along dedicated contraptions or start messing around with cloths and funny angles, shooting through glass will likely leave an annoying reflection that will make you want to smash it to pieces. (If you’re actually trying to get a reflection, then scratch everything I said; windows are awesome.)
This problem might soon come to an end, though, as researchers say they’ve developed an algorithm that can automatically remove reflections from digital photos. The algorithm can’t remove all types of reflections, but it does an impressive job with the ones it can.
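The researchers’ method is far more sophisticated than anything that fits here, but a classic baseline hints at why this is tractable at all. Under the usual model a shot through glass is image = transmission + reflection, and a reflection only ever *adds* light. So if you have several aligned frames in which the reflection moves while the scene stays put (assumptions that are mine, not the paper’s), a per-pixel minimum already suppresses much of it:

```python
import numpy as np

def min_composite(frames):
    """Crude reflection suppression across aligned frames.

    Assumes a static scene behind the glass, a fixed camera, and a
    reflection that moves between frames. Because the reflection only
    adds light (I = transmission + reflection), the per-pixel minimum
    over the stack approximates the reflection-free transmission layer.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return stack.min(axis=0)
```

Every pixel only needs to be reflection-free in *one* frame of the stack for the composite to recover it, which is why even a handful of frames can help.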
Ah, the smell of fresh rain. Clean, refreshing, and light… there’s nothing else quite like it. And, unfortunately, according to a team of researchers from MIT, that nostalgic aroma of a rain shower could also be loaded with hazardous viruses and bacteria. This interesting discovery was made possible through the use of high-speed cameras, which the scientists used to record “roughly 600 experiments on 28 types of surfaces”, releasing a drop of water onto each of the 28 varying soil surfaces. The scientists then took their high-speed footage and played it back at 1/250 of the actual speed.
Their footage showed that, when the drop made contact with the surface, air bubbles were trapped between the soil and the water. This caused the trapped air to be forced upwards through the drop of rain, effectively creating an aerosol. This entire process, the scientists believe, is essentially how the “smell of rain” is created.
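That “1/250 of the actual speed” figure directly ties the capture frame rate to the playback frame rate. The article doesn’t give either rate, so the 30 fps playback below is purely an assumed example:

```python
def capture_fps(playback_fps, slowdown):
    """Frame rate needed to play footage `slowdown` times slower
    at `playback_fps` without duplicating frames."""
    return playback_fps * slowdown

# Assuming a standard 30 fps playback, 1/250 real speed implies
# the cameras recorded at 30 * 250 frames per second:
rate = capture_fps(30, 250)  # 7500 fps
```

In other words, each real-world second of a splash occupies over four minutes of footage at that playback rate — enough to watch the bubbles rise through the drop.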