Samsung is working on creating 600-megapixel sensors
If you needed more proof that the world has gone nuts, it seems that Samsung is working on 600-megapixel sensors. In an editorial post on the Samsung website, Yongin Park, Head of Samsung's Sensor Business Team, stated that the company is prepared to "ride the next wave" of tech innovation and is determined to produce sensors with higher resolution than the human eye.
In just the past couple of years, we've seen smartphone sensors make what seemed at the time to be a ridiculous leap to 64 megapixels, followed very quickly by 108 megapixels, and just last month a 150-megapixel 1″ Samsung sensor was rumoured to be in the works. 600 megapixels is quite a leap beyond even that, but Samsung seems keen to get there.
As Park writes in the post:

"The image sensors we ourselves perceive the world through – our eyes – are said to match a resolution of around 500 megapixels (Mp). Compared to most DSLR cameras today that offer 40Mp resolution and flagship smartphones with 12Mp, we as an industry still have a long way to go to be able to match human perception capabilities.

"Simply putting as many pixels as possible together into a sensor might seem like the easy fix, but this would result in a massive image sensor that takes over the entirety of a device. In order to fit millions of pixels in today's smartphones that feature other cutting-edge specs like high screen-to-body ratios and slim designs, pixels inevitably have to shrink so that sensors can be as compact as possible.

"On the flip side, smaller pixels can result in fuzzy or dull pictures, due to the smaller area that each pixel receives light information from. The impasse between the number of pixels a sensor has and pixels' sizes has become a balancing act that requires solid technological prowess."
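Some rough arithmetic makes that size/resolution trade-off concrete. The pixel pitch values below are illustrative assumptions (0.8µm is the class of pitch used in current high-resolution phone sensors), not Samsung specs:

```python
import math

# Rough sketch of the size/resolution trade-off. Pixel pitch values
# here are illustrative assumptions, not Samsung specifications.

def sensor_area_mm2(pixels: int, pitch_um: float) -> float:
    """Approximate active-area size for a sensor with square pixels."""
    return pixels * (pitch_um / 1000.0) ** 2  # mm^2

# A 108 Mp sensor at a 0.8 um pitch:
print(f"{sensor_area_mm2(108_000_000, 0.8):.1f} mm^2")  # ~69 mm^2

# The same pitch scaled to 600 Mp:
print(f"{sensor_area_mm2(600_000_000, 0.8):.1f} mm^2")  # ~384 mm^2

# Pitch needed to squeeze 600 Mp back into ~69 mm^2:
pitch = math.sqrt(69.0 / 600_000_000) * 1000  # in um
print(f"{pitch:.2f} um")  # ~0.34 um
```

In other words, at today's pixel pitch a 600-megapixel sensor would be several times larger than anything in a phone, and keeping it phone-sized means shrinking the pixels to well under half their current width.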
Samsung's latest flagship, the Samsung Galaxy S20 Ultra, houses their latest 108-megapixel ISOCELL Plus smartphone sensor. And while Samsung's 600-megapixel goal may be high, is there really a need, at least in smartphones? Unless we're going to be printing images that cover our entire field of view, why do we need this kind of resolution? And if we did need it, would we really want it crammed into such a tiny sensor?
I can see the potential need for scientific research or computer vision systems where you want to be able to see and zoom in as much as possible on tiny details, but just for shooting snaps of our drunken friends on a night out?
Under the heading "Aiming for 600Mp for All," the post continues:

"To date, the major applications for image sensors have been in the smartphones field, but this is expected to expand soon into other rapidly-emerging fields such as autonomous vehicles, IoT and drones. Samsung is proud to have been leading the small-pixel, high-resolution sensor trend that will continue through 2020 and beyond, and is prepared to ride the next wave of technological innovation with a comprehensive product portfolio that addresses the diverse needs of device manufacturers. Through relentless innovation, we are determined to open up endless possibilities in pixel technologies that might even deliver image sensors that can capture more detail than the human eye."
Of course, Samsung's latest 108-megapixel sensor doesn't typically shoot 108-megapixel images. Sure, you can if you want, but generally you're only getting a 12-megapixel final result out of it, as the sensor combines 9 individual pixels on the sensor into 1 pixel in the output image. It's a similar principle to Sony's "Quad Bayer" tech, but with 9 pixels instead of 4.
If the 600-megapixel sensors also combine 9 pixels into one, then you're only going to get a roughly 67-megapixel final output at best. I'm sure, as with the current 64-megapixel and 108-megapixel sensors, there will be a way to output the entire pixel array in your images if you need to, but even at 67 megapixels, that seems pretty ridiculous for a phone.
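The 9-to-1 binning described above is easy to sketch: average each 3×3 block of raw pixels into one output pixel. The array sizes here are toy values, not real sensor dimensions, and this ignores the colour filter array for simplicity:

```python
import numpy as np

def bin_pixels(raw: np.ndarray, factor: int = 3) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel,
    the same idea as the 9-to-1 (3x3) binning described in the article."""
    h, w = raw.shape
    assert h % factor == 0 and w % factor == 0
    return raw.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

raw = np.arange(36, dtype=float).reshape(6, 6)  # toy 6x6 "sensor"
binned = bin_pixels(raw)
print(binned.shape)  # (2, 2)

# The same 9:1 ratio applied to the megapixel counts in the article:
print(108 / 9)           # 12.0 -- 108 Mp sensor, 12 Mp output
print(round(600 / 9, 1)) # 66.7 -- a 600 Mp sensor binned the same way
```

Each output pixel effectively gathers light from nine times the area, which is how such tiny pixels can still produce usable low-light images.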
From a tech standpoint, the breakthrough is pretty cool. I just don’t understand the point, when there are far more important things they can be focusing on.
Samsung also says they’re working on sensors that can register smells and tastes. Wait, wasn’t April 1st three weeks ago?
You can read their full post on the Samsung website.
John Aldred is a photographer with over 20 years of experience in the portrait and commercial worlds. He is based in Scotland and has been an early adopter – and occasional beta tester – of almost every digital imaging technology in that time. As well as his creative visual work, John uses 3D printing, electronics and programming to create his own photography and filmmaking tools and consults for a number of brands across the industry.