A team of researchers from MIT (Tianfan Xue, Michael Rubinstein, Ce Liu and William T. Freeman) has teamed up with Google to present a new algorithm that can separate photographic obstructions such as glare and reflections from photographs. The algorithm can then reproduce the image free of any reflections, and it can also create a separate image of the reflection itself. This kind of processing would be especially useful when shooting through glass or a fence, for example. [Read more…]
One of the things I used to hate while pursuing my path as a photographer was reaching a point where I was uninspired to shoot, or feeling that my best work was behind me and there was nothing new I could do. These creative dry spells used to happen to me from time to time, and I know they happen to everyone, so I want to share what I personally do to overcome this feeling.
Nothing like saddling up and taking a rotting whale carcass out for a ride in the ocean. Especially when there’s a shiver of great white sharks engaged in a feeding frenzy on it. As crazy as that may sound, a researcher and photographer in South Africa can add these exact circumstances to his resume. Seriously.
Take a look:
In order for these tools to succeed in the fields above, let alone in search and rescue missions, commercial delivery and livestock monitoring, they need to be more reliable and able to operate in less-than-ideal weather.
Australian and American researchers took a high-speed camera and set out to find out how ruby-throated hummingbirds cope in turbulent winds.
The study could lead to drones getting ‘tails’.
Last week, we wrote about how researchers at Brown developed code that allows realistic weather alterations in photo editing through text commands. As fate would have it, the trend these days is apparently groundbreaking algorithms. Two days ago, a video was uploaded showcasing Microsoft’s latest advancement in photography; using first-person-view cameras, researchers at the company developed an algorithm that produces what they call a hyperlapse. Watching the video, you’ll probably find yourself surprised by just how fluid everything looks. Keep reading after the break; seeing how it’s done is just as rewarding.
Do you ever notice how sophisticated and easily accessible futuristic technology can look when you’re watching a movie? Just to throw an example out there, remember how subtly awesome it was when all Tony Stark needed to do to paint his armor was ask Jarvis to add some hot rod color? As advanced as technology is these days, Louis C.K. was right; we’re a bit spoiled when it comes to how much we expect. Just the other night, I had a friend complaining that he was stuck on 4G because there wasn’t any LTE in the area.
The bottom line is that efficiency and speed both play a big role in how technology moves forward. As simple as it is to take your phone out and press a button to wake the screen, we found a way to make pressing it unnecessary. As simple as it is to type in a password to buy an app, we replaced it with a fingerprint sensor. And as efficient as it is to Photoshop your pictures to change the weather, we’ve now found a way to let an algorithm do the job for us.
As advanced as smartphone cameras are today, they’re still limited by the size they need to be. As a result, most smartphones have a fixed aperture to save space; the iris itself is made from fixed blades that set the aperture for each camera. But as always, in a time when mobile devices are so ingrained in the modern lifestyle, technology is constantly reaching higher ground. In this case, that higher ground is reached by a new type of iris – one made of chemicals that eliminates the need for physical blades.
Back when social media was still establishing its foundations, things were a bit different. People didn’t care what they typed as comments on MySpace, or how many seizures they’d cause others to have with their profile’s flashing black-and-white Fall Out Boy skin (face it: that was why we all learned HTML in the first place). Where profile pictures were once something you’d only expect high school kids to worry about, things have changed today.
You’ve probably heard of nanotechnology, and you probably know that it’s nothing new. But what if you heard that nanotechnology can now put a camera in your blood vessels?
I’m not talking about just inside your body. I’m talking about something so small that it can travel through the arteries in your body without causing an aneurysm. A scientist named F. Levent Degertekin has developed exactly that. The tiny camera is not just a breakthrough by definition; it’s apparently a breakthrough in performance as well. It was developed to project real-time, 3D, high-definition imaging of the inside of our blood vessels, arteries, and hearts. The prototype developed by the researchers generated images at 60 frames per second.