This is why your 4K camera isn’t really 4K

Aug 5, 2017

John Aldred

John Aldred is a photographer with over 20 years of experience in the portrait and commercial worlds. He is based in Scotland and has been an early adopter – and occasional beta tester – of almost every digital imaging technology in that time. As well as his creative visual work, John uses 3D printing, electronics and programming to create his own photography and filmmaking tools and consults for a number of brands across the industry.

We all harp on about the latest 4K this or 8K that. But do we really know what we’re saying? Most of the time, probably not. It all comes down to how the camera’s sensor actually records each of those pixels in the image, and how much of each one is guesswork.

In this video from Cooke Optics, cinematographer Geoff Boyle explains that it’s basically all down to the nature of the Bayer pattern filter array: what’s really happening when your sensor sees an image, and why your camera’s resolution is lying to you.

[YouTube video: Cooke Optics interviews cinematographer Geoff Boyle]

The whole problem essentially boils down to the fact that a Bayer pattern filter array just doesn’t see all colours for every pixel. Its very design makes this impossible. Out of every four pixels, two see green, one sees red and one sees blue. The camera then uses the information from the surrounding pixels to fill in the gaps.
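
To make that concrete, here’s a minimal sketch in Python with NumPy and SciPy (not anything from the video) of an RGGB Bayer mosaic and a crude neighbour-averaging demosaic. The RGGB layout and the interpolation method are illustrative assumptions; real cameras use far more sophisticated algorithms, but the principle is the same: most of every pixel’s colour is estimated rather than measured.

```python
import numpy as np
from scipy.signal import convolve2d

def bayer_mosaic(rgb):
    """Sample a full-colour (H, W, 3) image through an RGGB Bayer pattern.

    Each photosite keeps only ONE of the three colour values:
    even rows: R G R G ...   odd rows: G B G B ...
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue sites
    return mosaic

def demosaic_naive(mosaic):
    """Rebuild an RGB image by averaging each channel's measured samples locally."""
    h, w = mosaic.shape
    mask = np.zeros((h, w, 3))
    mask[0::2, 0::2, 0] = 1                   # where red was measured
    mask[0::2, 1::2, 1] = 1                   # where green was measured
    mask[1::2, 0::2, 1] = 1
    mask[1::2, 1::2, 2] = 1                   # where blue was measured
    samples = mask * mosaic[:, :, None]       # measured values, split per channel
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    out = np.empty_like(samples)
    for c in range(3):
        summed = convolve2d(samples[:, :, c], kernel, mode="same")
        weight = convolve2d(mask[:, :, c], kernel, mode="same")
        out[:, :, c] = summed / np.maximum(weight, 1e-9)   # weighted local average
    return out

# Round trip: mosaic a test image, then rebuild it, to see how much of the
# final colour information is interpolation rather than measurement.
scene = np.random.rand(16, 16, 3)
rebuilt = demosaic_naive(bayer_mosaic(scene))
print(np.abs(rebuilt - scene).mean())
```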

So, as Geoff describes, a 4K Bayer sensor doesn’t actually record a true 4K image. It records 2x2K in green, 2x1K in red and 2x1K in blue, then interpolates the missing data.
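
Counting the photosites bears that out. A back-of-the-envelope tally, assuming a DCI 4K sensor (4096 x 2160) with an RGGB layout (the exact resolution is just an example):

```python
# Rough tally of measured samples on a 4096 x 2160 RGGB Bayer sensor.
width, height = 4096, 2160

# In every 2x2 block: one red site, one blue site, two green sites.
red   = (width // 2) * (height // 2)       # a 2048 x 1080 grid -> 2,211,840 samples
blue  = (width // 2) * (height // 2)       # a 2048 x 1080 grid -> 2,211,840 samples
green = 2 * (width // 2) * (height // 2)   # two interleaved 2048 x 1080 grids

assert red + green + blue == width * height   # 8,847,360 photosites in total
print(f"green {green:,}  red {red:,}  blue {blue:,}")

# Every other colour value in the final 4096 x 2160 x 3 RGB frame
# has to be interpolated from these measurements.
```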

Fuji attempt to improve upon the Bayer pattern with their X-Trans filter, but it still suffers from the same problem. It does seem to produce a better result, if the wave of adoring Fuji fans is anything to go by, but it’s still not seeing the whole picture.

It’s still guessing the colour in the pixels to help create a complete red, green and blue image.

In the standard definition days, I used to shoot with Sony DSR-500 cameras. These were great big shoulder mount cameras containing three separate sensors, which was common in higher end systems at the time, although not so much today.

When the image entered through the lens, it was split up into its red, green and blue parts. One sensor would see the entire red image, one would see the entire green image and another would see the entire blue image.

Then the signals from these three separate sensors were brought together to create the final result. This is why these cameras were so much more expensive than lower end single chip cameras. They were essentially three cameras in one unit.
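
As a schematic contrast (a toy NumPy sketch, not real camera firmware): a three-chip design simply stacks three full-resolution planes, while a single-chip Bayer sensor of the same size hands you one value per pixel and leaves the rest to interpolation.

```python
import numpy as np

h, w = 576, 720   # a PAL standard-definition frame, roughly DSR-500 territory

# Three-chip: the prism gives each sensor the whole image in its colour,
# so every output pixel has a measured red, green and blue value.
red_chip, green_chip, blue_chip = (np.random.rand(h, w) for _ in range(3))
three_chip_frame = np.dstack([red_chip, green_chip, blue_chip])
print(three_chip_frame.shape)      # (576, 720, 3) -- all measured

# Single-chip Bayer: the same number of photosites yields only one
# colour value per pixel; the other two must be interpolated afterwards.
bayer_readout = np.random.rand(h, w)
print(bayer_readout.shape)         # (576, 720) -- one value per site
```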

With today’s cameras, even many of the high end ones, the single chip Bayer sensor is the standard. And to get true 1080p resolution footage, you actually need a sensor of around 2.7K to get the best quality out of it. As Geoff mentions in the video, this is why Arri produced a 2.7K camera for shooting HD. It’s probably why DJI’s drones also shoot 2.7K, and it’s why companies like Panasonic produce 5.7K cameras for producing 4K footage.
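
The arithmetic is straightforward. Using round figures for the sensor widths mentioned above (2.7K and 5.7K are marketing numbers; the exact photosite counts vary by camera), the oversampling works out at roughly 1.4 to 1.5 times per axis:

```python
# Approximate per-axis oversampling for the sensor/delivery pairs above.
# The sensor widths are rounded assumptions, not exact specifications.
pairs = {
    "2.7K sensor -> 1920-wide HD":  (2700, 1920),
    "5.7K sensor -> 3840-wide UHD": (5700, 3840),
}
for label, (sensor_w, output_w) in pairs.items():
    ratio = sensor_w / output_w
    print(f"{label}: {ratio:.2f}x per axis, ~{ratio**2:.1f}x total photosites")
# ~1.41x and ~1.48x per axis respectively -- about twice as many photosites
# as output pixels, which roughly covers what the Bayer pattern interpolates.
```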

This is also why many people shoot 4K even though they only want to produce a 1080p end product. The increased detail of scaling 4K (or UHD) down to 1080p is noticeable.
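
A rough way to see why, under the same RGGB assumption as above: each 2x2 block of a UHD frame covers one measured red, two measured green and one measured blue photosite, so averaging those blocks down to 1080p backs every output pixel with at least one genuine sample of each colour. A minimal box-downscale sketch (real scalers use better filters):

```python
import numpy as np

def box_downscale_2x(frame):
    """Average each 2x2 block of an (H, W, 3) frame into one output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

uhd_frame = np.random.rand(2160, 3840, 3)   # stand-in for a demosaiced UHD frame
hd_frame = box_downscale_2x(uhd_frame)
print(hd_frame.shape)                       # (1080, 1920, 3)
```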

So, whenever you see a camera with an odd resolution, or something that seems to be way more than people are actually able to watch, now you know why.

13 responses to “This is why your 4K camera isn’t really 4K”

  1. WillMondy

    I thought that the main reason your camera isn’t really 4K is that it’s 3840 pixels wide, which isn’t quite 4000.

    Sigma’s Foveon sensor is the closest you would get to having full RGB 4K without a proper 3CCD and prism setup.

    1. Quasi Annonymous

      4000 isn’t 4K either. 4096 is.

      1. Xystren

        Partially true to a degree.

        It depends on the context as to what “K” stands for. Generally, when we are talking about computers, 1k is equal to 1024, as it is the closest number to 1000 that is a power of 2 (1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, 8192, …). Hence, when talking about computers and computer storage, we are working with 1k = 1024.

        But when we are talking in other, non-computer contexts, 1k is 1000. Hence Y2K is year 2000, not year 2048. One kilogram, or 1 kg, is 1000 grams (not 1024 grams). 1 km = 1000 m. A K in this context means 1000.

        Now the lines and context here blur, since we are talking about digital cameras. But at the same time we are talking about resolution: the physical number of pixels, which isn’t rounded to powers of 2. The total number of pixels is going to be W x H. The total number of pixels divided by 1000 gives you the number of K pixels. Divide the total by 1,000,000 (or 1M) and you get the number of megapixels.

        So essentially it comes down to what number system we are using: is it base 2 (binary), which is used for computers, or base 10, which is the number system we use day to day?

        But when the marketing department gets involved, who knows what numbering system they are using.

        Cheers,
        Xystren

        1. Quasi Annonymous

          I’m not speaking generally, or about what people have been convinced of. I am speaking precisely, and precisely speaking it is 4096. Just because people have rounded and simplified doesn’t change the truth; it just means that a lot of people don’t understand it. HD wasn’t 1000 pixels, was it? Therefore four times the accurate number isn’t 4000.

      2. WillMondy

        It depends upon what the K means.
        In common English it is used for 1000, but in computing it would be 1024, which would give 4000 or 4096.

        Of course you can’t get a 4000 pixel wide video camera, as they are 4096 wide.

    2. jimdog

      3840 X1084 = 4162560 pixels aka 4K

    3. pandacongolais

      3860 x 2160 = Full HD, but often called 4K for short
      4096 x 2160 = 4K

      Or so I have been told by a DJI representative, who said Luc Besson would not be happy to learn that his 4K camera does 3860 x 2160 px, because he bought 4096 x 2160 px. I think they were also suggesting that Luc Besson uses DJI products to shoot movies. Who knows?…

      OK, anyway …

      What I don’t get is that the Bayer matrix is not something new. It is the most common way to filter light before it reaches the photo-sites.
      So we all know that there is no such thing as 1 photo-site = 1 pixel.

      Does Mr Boyle suggest that sensor sellers are stealing pixels, maybe the same way hard drive vendors have been stealing bytes for decades? (1000 bytes does not make 1 kilobyte, except for them. When talking about terabytes, this makes a huge difference.)

      I see two answers:

      – sellers are honest: they put in enough photo-sites to obtain the advertised number of pixels after the Bayer matrix and the demosaicing (sorry, is this word correct? I’m French! I mean the maths that processes the filtered light).

      – sellers are not honest and they talk about the definition of the Bayer matrix, not the definition of the sensor itself, which would mean that Mr Boyle is correct.

      But if Mr Boyle is correct, does that mean that my 36 MPix D800 “invents” pixels from a 36 MPix Bayer matrix, which would output only something like 24 “real” MPix?
      A picture from my camera is made of 36 152 320 pixels. Are 1/3 of them fake?

      1. WillMondy

        Don’t worry, your English was perfect (100 times better than my French!)

        3840 x 2160 is often marketed as 4K or UHD (Ultra High Definition)
        Full HD, or 1080p, is 1920 x 1080 and is sometimes called 2K
        Quad HD is 2560 x 1440 but is often called 1440p

        They don’t make these things easy!

        I think the average photographer doesn’t worry, and the ones that do buy proper 4096 x 2160 cameras anyway. Maybe there should be another little disclaimer written on the packaging explaining the Bayer matrix, as well as the number of bytes in a gigabyte.

  2. Ignasi Jacob

    A lot of HD video cameras have 3 CCD or CMOS sensors. Imagine an 8K Super 35 cinema camera with 3 CCDs…

  3. Peter J Dylag

    Hmmmm? Just another reason to shoot film stock!?

  4. Erik Saari

    If you like what the result is, don’t sweat the numbers.

  5. Christopher R Field

    Nah, my footage looks pretty fantastic actually.

  6. Doug Sundseth

    “Pixel” != “Photo Site”. The pixels are created from the information from each intersection in the grid, since each such intersection has exactly the same quantity of information (2G + 1R + 1B).

    Caveat: There is a fence-post issue around the edge of the photo site array, since the last ring of full intersections is one photo site inside the edge of the array.