Watch: Video explains why your camera’s sensor sees the world like an octopus

Mar 29, 2021

John Aldred

John Aldred is a photographer with over 20 years of experience in the portrait and commercial worlds. He is based in Scotland and has been an early adopter – and occasional beta tester – of almost every digital imaging technology in that time. As well as his creative visual work, John uses 3D printing, electronics and programming to create his own photography and filmmaking tools and consults for a number of brands across the industry.

Camera sensors have come a very long way since the technology was first developed in the 1970s. And while they've become much smaller and faster since Steven Sasson's original 3.6kg, 0.01-megapixel digital camera, the basic principle of how a sensor records an image is still pretty much the same – as explained in this wonderfully technical and geeky video from IMSAI Guy.

In it, he begins by explaining the difference between standard front-illuminated sensors and back-illuminated (BSI) sensors – and how the latter is basically the way an octopus's eyes work. He illustrates how a pixel "sees" the light that hits it, and how a sensor arranges all of those pixels in a standard (albeit very large) matrix array to make up the final image.

YouTube video
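
If it helps to picture that row-and-column readout in code, here's a minimal sketch of the idea – purely illustrative, not anything from the video, with read_photodiode() standing in as a made-up placeholder for sampling an actual photodiode:

```python
# A toy "sensor": a 2D matrix of photodiode readings, read out one row at a
# time and sampled column by column, like the matrix array described above.
# read_photodiode() is a hypothetical stand-in, not a real hardware driver.
import random

ROWS, COLS = 4, 6   # a hypothetical 24-"pixel" sensor

def read_photodiode(row, col):
    """Pretend to sample one pixel's photodiode; returns a light level 0-255."""
    return random.randint(0, 255)

def read_out_sensor():
    image = []
    for row in range(ROWS):                 # row select: enable one line of pixels
        line = [read_photodiode(row, col)   # sample every column in that row
                for col in range(COLS)]
        image.append(line)
    return image

for line in read_out_sensor():
    print(line)
```

Real sensors obviously do this in hardware, with amplifiers and analogue-to-digital converters rather than a Python loop, but the row-and-column addressing is the same basic matrix idea.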

It’s fascinating to think that, in principle, just about any of us could order the components from eBay or Amazon to build a sensor (of sorts). Each pixel needs only a photodiode and three MOSFET transistors. Of course, it’d be absolutely huge, with each “pixel” being hundreds of times larger than the pixels on your average sensor these days. A typical through-hole photodiode is around 3mm in diameter, while the pixels on something like the Sony A7R IV are 3.76 microns across (about 1/800th the size).
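
If you want to sanity-check that "1/800th" figure, it's a quick bit of arithmetic – the numbers below are just the ones quoted above:

```python
# Back-of-the-envelope check of the size comparison quoted above.
photodiode_mm = 3.0      # typical through-hole photodiode diameter
pixel_um = 3.76          # Sony A7R IV pixel pitch, in microns

ratio = (photodiode_mm * 1000) / pixel_um
print(f"A 3 mm photodiode is roughly {ratio:.0f}x wider than a 3.76 micron pixel")
# prints roughly 798x - hence the "about 1/800th the size" figure
```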

But equally fascinating is how BSI sensors see light compared to more traditional front-illuminated sensors. The latter see light in much the same way that almost every animal on the planet does… except for one: the octopus, whose eyes see the world in a very similar way to a BSI sensor.

You don’t need to know this stuff to become a better photographer, but it’s absolutely fascinating.

[via Hackaday]
