This is why your camera shoots 29.97fps (not 30fps) and why it doesn’t really matter any more

Oct 5, 2016

John Aldred

John Aldred is a photographer with over 20 years of experience in the portrait and commercial worlds. He is based in Scotland and has been an early adopter – and occasional beta tester – of almost every digital imaging technology in that time. As well as his creative visual work, John uses 3D printing, electronics and programming to create his own photography and filmmaking tools and consults for a number of brands across the industry.


If you’re not dealing with broadcast, and you’re simply uploading to YouTube, Vimeo, Facebook, etc., then I’ll save you some time. You don’t have to stick with the 29.97fps framerate. It’s old, it’s obsolete, and it’s no longer technically relevant; shoot and play back at whatever framerate you like.

If you want to delve a little deeper into why 29.97fps is even a thing, check out this video from Matt Parker at Standup Maths. In it, he talks about how 30fps became 29.97fps in the first place. It basically boils down to a combination of the frequency of the electrical supply (60Hz) and the amount of broadcast “bandwidth” that was available to the first colour analogue TV signals.
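The arithmetic behind the drop is compact enough to check yourself. Here’s a quick sketch in Python; the 4.5 MHz sound carrier and the 286-line divisor are the published NTSC figures that the video covers, used here as assumptions rather than anything taken from this article:

```python
# Why NTSC colour runs at 29.97fps instead of exactly 30fps.
# Assumed published NTSC values: 4.5 MHz sound subcarrier, 525 lines/frame.

SOUND_SUBCARRIER_HZ = 4_500_000   # audio carrier offset, fixed for B&W compatibility
LINES_PER_FRAME = 525

# The colour designers locked the line rate to the sound carrier:
# 4.5 MHz divided by 286 lines per sound-carrier cycle.
line_rate_hz = SOUND_SUBCARRIER_HZ / 286      # ~15734.27 Hz (was exactly 15750)
frame_rate = line_rate_hz / LINES_PER_FRAME   # ~29.97 fps

# Equivalent to the familiar 1000/1001 drop from exactly 30fps.
assert abs(frame_rate - 30 * 1000 / 1001) < 1e-9
print(round(frame_rate, 5))
```

Running it prints 29.97003, the framerate your camera reports as 29.97.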

[Embedded YouTube video: Matt Parker, Standup Maths]

If you’re into the maths behind it, and why it really doesn’t matter any more, it’s a good watch. And yes, he goes a little technical, but he does break it down into simple terms that most of us can understand (one of the reasons I love Matt’s channel).

What I found most interesting is that NTSC’s whole framerate problem could have been solved if they’d simply bumped the resolution slightly. When they made the transition from black & white to colour, the maths meant they had to change something. They chose to alter the framerate, which was originally exactly 30fps. If they’d chosen to bump the resolution instead, future video editors would’ve had a much easier time.

Purely by mathematical coincidence, bumping it to the same resolution that PAL uses would’ve fixed it (despite the difference in framerate and bandwidth range between regions).


The 29.97fps framerate (and 25fps for most of the rest of the world) was basically chosen due to technical and mathematical limitations of the time. Those limitations no longer exist, so sticking with 25fps or 29.97fps really isn’t important any more.

Some people thought I was nuts for doing all my video work at 24fps (the so-called “film rate”) when I made the move from standard definition to HD. “It won’t be compatible with this, and that. It won’t play on TVs, etc.” It turns out, I had absolutely nothing to worry about.

Many of the people I knew then who worked with video also switched to 24fps when they made the move to shooting HD. Quite a few still stuck with the UK standard 25fps, though.

60fps has become very popular in gaming channels and for action camera videos. It’s probably about the highest framerate at which you’ll really see any benefit in terms of smoothness of the footage. But I think I’ll stick with 24fps for a while yet.

Do you still stick with the “standard” 25fps and 29.97fps framerates for your videos? Have you switched to 24fps? Do you shoot 60fps? What about 3D or VR? Do you find the faster framerates like 60fps offer a better viewing experience? Let us know your thoughts in the comments.


11 responses to “This is why your camera shoots 29.97fps (not 30fps) and why it doesn’t really matter any more”

  1. Kay O. Sweaver

    24fps forever.

    I saw some old film footage on YouTube recently that someone ran through a de-noise filter, image stabilized and most disturbingly interpolated to 60fps with Twixtor or some such software. It was awful and felt nothing like film.

    That said I’m so glad that the age of 29.97 is drawing to a close. That was some editing bullshit back in the day.

    1. x Hades Stamps

      Oh, yeah. Even in the age of computers, I physically cannot work with 29.97 FPS.

  2. Steve Solis

    Lol, like those that say to double-space after a period.

  3. Oleg Antoshkiv

    Богдан Тимків

  4. Ralph Hightower

    Well, the Europeans had the benefit of hindsight of the problems that the Americans faced and solved when introducing color TV and having it remain compatible with existing B&W TV receivers. My parents didn’t get a color TV until 1968.

    1. SAL_e

      Actually, this is not quite accurate. It’s true that TV standards in Europe were set after the USA set theirs, so the Europeans had the chance to avoid some of the ‘mistakes’ made in the USA. But the 625-line × 25-frame format was set long before the colour TV standards were. The main driving force was image flicker. As the video correctly points out, the main reason for selecting the frame rate was interference from the power supply: US power frequency is 60Hz, while in Europe it is 50Hz, so Europeans had to run at a 25Hz frame rate. The problem was that at 25Hz, 525 lines were not enough to produce a stable picture, so they had to increase the number of horizontal lines. They added 100 more, for a total of 625.

      That created its own problem: it interfered with the sound located in the 4.5MHz sub-band, so they had to move the sound and re-arrange the channels. What they did right was having the foresight to leave space for future use. Initially the plan was for additional sound tracks (different languages); that was never implemented, and the spare space later allowed the development of both PAL and SECAM. On paper SECAM can produce a better image than PAL or NTSC, but in real-life conditions PAL turned out to be the best balance between competing factors, which is why it became the dominant standard: a better picture in the real world, even if SECAM looks better in the lab. With improvements in electronics, NTSC performance got very close to PAL as well.

      PS. The old joke among PAL fanboys was: Never To See Correct (NTSC) colors.
      PPS. Do you know why CDs are encoded at 44.1 kHz? It is directly linked to TV format frequencies. ‘See the missing link’: https://en.wikipedia.org/wiki/Sony_PCM-501ES_digital_audio_processor
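The 44.1 kHz link works out numerically. A quick sketch, assuming the usable-line counts published for those PCM-to-video-tape adaptors (245 lines per field for NTSC, 294 for PAL, 3 audio samples stored per line; these figures are assumptions, not from the comment itself):

```python
# Early PCM adaptors recorded digital audio onto video tape,
# storing 3 samples per usable scan line.
SAMPLES_PER_LINE = 3

# NTSC variant: 245 usable lines/field, 60 fields/second.
ntsc_rate = 245 * 60 * SAMPLES_PER_LINE

# PAL variant: 294 usable lines/field, 50 fields/second.
pal_rate = 294 * 50 * SAMPLES_PER_LINE

# Both land on the same sample rate, which CDs inherited.
assert ntsc_rate == pal_rate == 44_100
print(ntsc_rate)
```

Both variants land on exactly 44100 samples per second, which is how the CD standard inherited that otherwise odd-looking number.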

      1. Kaouthia

        I’d heard “Never The Same Colour” :D

  5. Theuns Verwoerd

    “29.97fps is a completely arbitrary number you don’t need to worry about”

    … shoots at 24fps.

    1. x Hades Stamps

      It’s an integer framerate, and it’s fully compatible with 60 FPS (and, on interlaced video, with 30 FPS, or more properly 60 fields per second) via 3:2 pulldown.
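That 3:2 pulldown mapping is easy to sketch: 4 film frames become 10 video fields, so 24fps maps exactly onto 60 fields per second. A minimal illustration (the function name is just for this example):

```python
# 3:2 pulldown sketch: successive film frames are held for
# alternately 3 and 2 interlaced video fields.
from itertools import cycle

def pulldown_32(frames):
    """Yield video fields; each frame spans 3, then 2, then 3... fields."""
    for frame, n_fields in zip(frames, cycle([3, 2])):
        for _ in range(n_fields):
            yield frame

fields = list(pulldown_32(["A", "B", "C", "D"]))
print(fields)   # 10 fields from 4 frames
print(len(list(pulldown_32(range(24)))))   # 24 frames -> 60 fields
```

Four frames come out as the field sequence A A A B B C C C D D, and a second of film (24 frames) fills exactly 60 fields.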

  6. SQLGuy

    Not “slightly”. If they had increased the resolution to 625 lines, they would have broken backwards compatibility.

    The high voltage (flyback) transformer of the TVs that provided tens of thousands of volts to the CRT’s second anode ran at the 15750Hz horizontal scanning frequency. This also powered the horizontal deflection coil around the neck of the CRT. Changing to 625 lines would have changed the horizontal frequency to 18750Hz. That’s a big difference, and the circuit in question is highly tuned for correct resonance at 15750Hz. The change to 29.97fps only moved it to something like 15734Hz, so almost the same. At 18750Hz, though, the width of the picture would be wrong, and things would break. If they didn’t break right away, the CRT would also be emitting a lot more X-rays than it was meant to, because of the higher voltage produced at the higher scanning frequency.

    Net net, changing to 625 would have meant everyone would have had to buy new TVs. Going from SD to HD with a several year overlap, and adapter devices for old TVs, in the 2000s, is one thing… doing it in the 1950s when a TV was a huge purchase was another story indeed. The least expensive TVs in the early 1950s were over $1200 in 2020 equivalent.
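The scan-rate arithmetic in this comment checks out. A quick sketch (1000/1001 is the standard NTSC rate adjustment, assumed here rather than quoted from the comment):

```python
# Horizontal scanning frequency = lines per frame x frames per second.
LINES_NTSC = 525
LINES_PAL_STYLE = 625

print(LINES_NTSC * 30)                           # original B&W rate: 15750 Hz
print(round(LINES_NTSC * 30 * 1000 / 1001, 1))   # after the colour tweak: ~15734.3 Hz
print(LINES_PAL_STYLE * 30)                      # a 625-line bump: 18750 Hz
```

The 29.97fps change shifted the horizontal frequency by about 16Hz; a 625-line bump would have shifted it by 3000Hz, which is why it would have broken existing sets.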

    1. Ben Hutchinson

      Actually, a lower horizontal scanning frequency would have produced a higher voltage from the flyback transformer. A lower frequency means the voltage is applied to the primary coil of the flyback transformer for longer, and according to the equation for current in an inductor, the longer the voltage is applied, the more current flows. When the switching transistor for the flyback transformer is switched off (as it is at the end of each horizontal line), it switches off quite rapidly. The more current that’s flowing when this switch-off happens, the larger the change in current, and according to the equation for voltage across an inductor, the faster the current changes, the more voltage is produced. As a result of these factors, it turns out that running a flyback transformer at too LOW a frequency actually generates HIGHER voltages.

      Running it at a higher frequency would therefore lower the voltage, and thus the CRT would produce a dimmer image. It would also mean FEWER (not more) X-rays being generated.