How sensor size affects depth of field

Sep 8, 2023

Alex Baker

Alex Baker is a portrait and lifestyle driven photographer based in Valencia, Spain. She works on a range of projects from commercial to fine art and has had work featured in publications such as The Daily Mail, Conde Nast Traveller and El Mundo, and has exhibited work across Europe

Can you achieve beautiful background blur with a small sensor? Yes, you can. According to this video by the Koldunov Brothers, small sensors produce more background blur than larger ones. However, you may not fully appreciate this blur because too little of it fits into the frame.

But of course, it’s more complicated than that. Small sensors often require shorter focal length lenses, which don’t blur the background as effectively. In this video, the Brothers shed light on this controversial subject.

Visualizing sensor sizes

To understand the impact of sensor size on depth of field, imagine a circular image created by a camera lens. Now, picture different-sized camera sensors placed beneath this circle, as if we were using the same lens on various cameras.

  1. 6×6 Medium Format: The largest sensor that can fit within our circular image is the 6×6 medium format film. It’s square and captures a vast area.
  2. 44×33 Modern Digital Medium Format: This sensor is significantly smaller than the film medium format but still quite large.
  3. 35mm Full Frame: This is the format for advanced full-frame digital cameras and was the most common film format in its era.
  4. APS-C (Cropped Frame): This is a widely used modern digital format in the affordable price range. It captures only the central part of the image circle.

As you move from the largest sensor (6×6) to the smallest (APS-C), you’ll notice a gradual loss of detail as parts of the image get cropped by the frame boundaries.
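To put rough numbers on the formats listed above, here is a small sketch that computes each format’s diagonal and its crop factor relative to 35mm full frame. The dimensions are nominal (the exact APS-C size varies by manufacturer), and this calculation is not from the video itself:

```python
import math

# Approximate capture dimensions in mm for the formats discussed above.
# These are nominal values; real sensors vary slightly by manufacturer.
formats = {
    "6x6 medium format film": (56.0, 56.0),   # nominal 6x6 frame is ~56 mm square
    "44x33 digital medium format": (44.0, 33.0),
    "35mm full frame": (36.0, 24.0),
    "APS-C": (23.6, 15.7),                    # varies by brand
}

full_frame_diag = math.hypot(36.0, 24.0)

for name, (w, h) in formats.items():
    diag = math.hypot(w, h)
    crop = full_frame_diag / diag  # crop factor relative to full frame
    print(f"{name:30s} diagonal {diag:5.1f} mm, crop factor {crop:.2f}x")
```

Crop factors below 1 (the two medium formats) mean the sensor captures more of the image circle than full frame does; factors above 1 mean it captures less.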

So how did they do this experiment? Well, first, they took the image circle created by the lens and placed different sensors under it.

They then began to compare the photos taken with different sensors, highlighting three main viewpoints:

1. Depth of field is not dependent on sensor size.

According to this viewpoint, a smaller sensor simply crops the central part of the same image circle, as in the experiment above. The blur at any given point of the projected image is unchanged by the crop, so depth of field is determined by the lens and focusing distance alone, regardless of which sensor sits behind the lens.

2. The smaller the sensor, the shallower the depth of field.

This viewpoint is more technical and is the basis for depth-of-field calculators. To compare the results, images from different sensors need to be enlarged to the same viewing size, and the image from the smaller sensor has to be stretched more. As the image is stretched, it becomes less sharp, and the depth of field decreases. In other words, smaller sensors result in a shallower depth of field and more background blur.
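This stretching argument is what depth-of-field calculators encode: the acceptable circle of confusion (CoC) shrinks in proportion to the sensor, because the smaller image must be enlarged more for viewing. A minimal sketch using the standard thin-lens DOF formulas; the 50mm f/2 lens, the 2 m focusing distance, and the commonly quoted CoC values are illustrative assumptions, not numbers from the video:

```python
def dof(focal_mm, f_number, distance_mm, coc_mm):
    """Near/far limits of acceptable sharpness (thin-lens approximation)."""
    # Hyperfocal distance for the given circle of confusion
    h = focal_mm**2 / (f_number * coc_mm) + focal_mm
    near = distance_mm * (h - focal_mm) / (h + distance_mm - 2 * focal_mm)
    if distance_mm >= h:
        far = float("inf")  # everything beyond the near limit is sharp
    else:
        far = distance_mm * (h - focal_mm) / (h - distance_mm)
    return near, far

# Same 50mm f/2 lens focused at 2 m, but the acceptable circle of
# confusion scales with the sensor (roughly a 1.5x crop between these
# two commonly quoted values).
for label, coc in [("full frame (CoC 0.030 mm)", 0.030),
                   ("APS-C      (CoC 0.020 mm)", 0.020)]:
    near, far = dof(50, 2.0, 2000, coc)
    print(f"{label}: DOF {far - near:.0f} mm")
```

With the same lens and settings, the smaller CoC of the cropped sensor yields the shallower computed depth of field, exactly as this viewpoint claims.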

3. The smaller the sensor, the greater the depth of field.

In practice, comparing the two raw frames may not be meaningful, because they show different framing of the scene. To achieve the same framing on a camera with a larger sensor, you need a longer focal-length lens, even though that means comparing completely different lenses. Longer focal-length lenses create a shallower depth of field, so background blur is more a result of the lens’s focal length than of the sensor size.

In defence of the third viewpoint, they provide an example showing that, in practice, larger sensors do a better job of blurring the background than smaller ones.
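The same-framing comparison can be sketched numerically. Assuming a subject 2 m away, f/2 on both cameras, and a 1.5x crop (so 75 mm on full frame frames the subject like 50 mm on APS-C), we can estimate the blur disc of a distant background as a fraction of the frame width. All numbers here are illustrative assumptions, not taken from the video:

```python
def bg_blur_fraction(focal_mm, f_number, subject_mm, sensor_width_mm):
    """Blur-disc diameter for a background at infinity,
    as a fraction of frame width (thin-lens approximation)."""
    aperture = focal_mm / f_number                  # entrance pupil diameter
    blur_on_sensor = aperture * focal_mm / (subject_mm - focal_mm)
    return blur_on_sensor / sensor_width_mm

# Same framing of a subject 2 m away, both lenses at f/2:
# full frame with 75 mm vs APS-C (~1.5x crop) with 50 mm.
ff  = bg_blur_fraction(75, 2.0, 2000, 36.0)
aps = bg_blur_fraction(50, 2.0, 2000, 23.6)
print(f"full frame, 75mm f/2: blur = {ff:.1%} of frame width")
print(f"APS-C,      50mm f/2: blur = {aps:.1%} of frame width")
```

Under these assumptions the full-frame setup blurs the background noticeably more for the same framing, which is the practical result the third viewpoint describes.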

As a bonus for those who can’t afford a camera with a large sensor but still want to achieve beautiful bokeh, there’s a way to enlarge the sensor size artificially: stitching together a panorama. This is also known as the Brenizer method.
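As a rough sketch of why stitching helps: each extra panel effectively enlarges the capture area, which scales down the equivalent focal length and f-number by the same factor. The grid size, overlap, and lens below are illustrative assumptions, and the estimate is only approximate:

```python
def brenizer_equivalent(focal_mm, f_number, cols, rows, overlap=0.3):
    """Approximate full-frame-equivalent focal length and f-number for a
    stitched panorama (Brenizer method). Each extra panel widens the
    field of view; overlapping frames reduce the gain per panel."""
    # Effective widening factor along each axis of the stitched grid
    kx = 1 + (cols - 1) * (1 - overlap)
    ky = 1 + (rows - 1) * (1 - overlap)
    k = min(kx, ky)  # conservative: limited by the tighter axis
    return focal_mm / k, f_number / k

eq_f, eq_n = brenizer_equivalent(85, 1.8, 3, 3)
print(f"3x3 stitch of 85mm f/1.8 ~ {eq_f:.0f}mm f/{eq_n:.2f} look")
```

This is why a stitched set of frames from a fast portrait lens can mimic the shallow depth of field of a much larger (and more expensive) format.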

This video shows that if you want to capture stunning bokeh and background blur, it’s still preferable to use a camera with a larger sensor. However, remember that other factors also influence depth of field, and as always, size isn’t everything.

If investing in an expensive large-format camera isn’t feasible, you can experiment with shooting overlapping composites instead.

2 responses to “How sensor size affects depth of field”

  1. Stephen Jenner

    Of course, the best sensors are the ones that you replace after every exposure.

    Also known as film.

  2. Thebes42

    Format affects the rate of change in the size of the Airy disc because it requires a different magnification ratio. As a practical matter, a portrait with only nose tip to ear tip in focus on 8×10 film will have a painterly blur to the buildings in the background, whereas a comparable full-frame portrait will blur the background beyond recognition.