Higher frame rates have been at the top of wish lists for camera users, smartphone owners and – perhaps most importantly – gamers.
With viewfinders on cameras like the Canon EOS R3 reaching 120fps, Apple’s ProMotion clocked at 120Hz, and a growing market for 144Hz (or higher) monitors and the GPUs to drive them, it’s hard not to fetishize the tech. But can people actually perceive the difference? Some, like FilmmakerIQ, say no.
Gamers also love to quote a test given to prospective USAF fighter pilots: an image of an aircraft is flashed on a screen for 1/220 sec, and the pilots can identify the aircraft they saw. They’ll tell you this means the human eye can see at least 220 frames per second (and some are spending accordingly). However, any photographer can see the flaw in this logic. After all, you can see a xenon flash go off, and it lasts only a fraction of a millisecond.
Viewed differently: if you’re shooting video at 24fps and a xenon flash fires, the frame during which it fires is brightened, even though the flash is “on” for only a fraction of the time the shutter is open. By the same token, even if the human eye were working at a mere 10fps, it would still perceive a 1/220 sec flash of an aircraft; the light is simply integrated into whatever “frame” it lands in.
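To make that integration argument concrete, here is a minimal Python sketch. All the numbers (the intensities, the flash brightness, the 10fps “eye frame”) are illustrative assumptions, not measurements:

```python
# Toy model: total light collected during one "frame" of a slow detector.
# All values are illustrative assumptions, not measured data.

frame_duration = 1 / 10    # a hypothetical 10fps "eye frame", in seconds
flash_duration = 1 / 220   # the USAF test flash, in seconds
ambient_level = 1.0        # baseline light intensity (arbitrary units)
flash_level = 500.0        # flash intensity (arbitrary units)

# Light collected is intensity multiplied by time.
baseline = ambient_level * frame_duration
with_flash = baseline + flash_level * flash_duration

print(f"baseline frame exposure: {baseline:.3f}")
print(f"frame containing flash:  {with_flash:.3f}")
print(f"brightness ratio:        {with_flash / baseline:.1f}x")
```

Even though the flash occupies under 5% of the “frame”, it dominates the total exposure, which is why that frame (or that percept) is visibly brighter.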
OK, but a faster refresh rate like ProMotion seems smoother, so doesn’t that prove it? No. Motion on a phone screen, an old 30Hz monitor, or a super-fast gaming screen is always a succession of still images; when things “move”, scientists call it “apparent motion”, and it’s the basis of all animation.
Move a mouse quickly and you’ll see small gaps and multiple mouse pointers on the screen. The faster the refresh rate, the more pointers and the smaller the gaps between them, but you will still see multiple instances. That actually supports the idea that the “refresh rate” of the eye is lower than that of the monitor.
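As a rough back-of-the-envelope check (the cursor speed here is an assumed figure), you can work out how far apart those pointer instances land at various refresh rates:

```python
# Gap between successive cursor images at a given refresh rate.
# The flick speed is an assumption chosen for illustration.

cursor_speed_px_s = 3000  # a fast flick of the mouse, in pixels per second

for refresh_hz in (30, 60, 120, 144):
    gap_px = cursor_speed_px_s / refresh_hz
    print(f"{refresh_hz:>3}Hz -> pointer images {gap_px:.0f}px apart")
```

Higher refresh rates shrink the gaps, but as long as the eye integrates over a longer window than a single screen refresh, several of those images land in the same percept and you still see a trail.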
Okay, where it gets really interesting is with the research of Adolphe-Moïse Bloch in 1885, who found that, below a certain duration (an “exposure”, let’s say), the eye perceives a light as less bright the more briefly it is seen. Above that duration, perceived brightness is unaffected. Bloch and later scientists found that the period in which perception depends on the duration of the exposure is – drum roll – 100 milliseconds. That’s a tenth of a second.
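Here is a minimal sketch of Bloch’s law as it is usually summarized: below the critical duration, perceived brightness tracks intensity times time; above it, duration stops mattering. The function name and units are illustrative assumptions:

```python
# Bloch's law, simplified: below a critical duration (~100ms),
# perceived brightness depends on intensity * duration;
# above it, extra duration adds nothing.

CRITICAL_DURATION_S = 0.100  # the ~100ms figure from Bloch's research

def perceived_brightness(intensity, duration_s):
    """Toy model of Bloch's law (arbitrary units)."""
    effective_time = min(duration_s, CRITICAL_DURATION_S)
    return intensity * effective_time

# A light shown for 50ms at double intensity reads like one shown
# for 100ms at single intensity, but 200ms looks no brighter than 100ms.
print(perceived_brightness(2.0, 0.050))  # 0.1
print(perceived_brightness(1.0, 0.100))  # 0.1 (same percept)
print(perceived_brightness(1.0, 0.200))  # 0.1 (no brighter)
```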
Unlike a camera, the eye has no digital clock governing capture and playback. It is always active, so there’s no need for an actual frame rate, and there isn’t one. The eye also has different areas of perception: the high-resolution fovea at the center sees color better but responds more slowly, while peripheral vision is better suited to detecting movement, for evolutionary reasons.
Even so, the eye usually cannot detect the flicker of a low-powered bulb, which typically falls between 60 and 90Hz. Videographers, though, will be well aware that cameras can easily pick up that strobing if the shutter speed is wrong.
The strobe effect can, however, be seen by the eye. You’ll know it mostly from videos of a spinning wheel which, at a certain point, appears to turn in the opposite direction. JF Schouten showed in 1967 that humans looking at a rotating subject in continuous light (no flicker) nevertheless saw a “subjective strobe”, the first occurring at 8-12 cycles per second (so, yes, around 10Hz again).
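That reversal is textbook aliasing. As a thought experiment only – the article’s whole point is that the eye is not literally a fixed-rate sampler – here is a sketch that treats perception as a 10Hz sampler and shows a wheel’s apparent direction flipping once its rotation rate passes half that:

```python
# Wagon-wheel effect modeled as sampling aliasing. Illustrative only:
# the eye is not literally a fixed-rate sampler.

SAMPLE_HZ = 10.0  # pretend perceptual "frame rate", per Schouten's ~10Hz

def apparent_rotation(true_hz):
    """Apparent rotation per sample, in revolutions, folded to [-0.5, 0.5]."""
    step = (true_hz / SAMPLE_HZ) % 1.0   # rotation between samples
    return step - 1.0 if step > 0.5 else step

for true_hz in (2, 4, 6, 8, 9, 11):
    rev = apparent_rotation(true_hz)
    direction = "forward" if rev >= 0 else "backward"
    print(f"{true_hz:>2}Hz wheel looks {direction} ({rev:+.2f} rev/sample)")
```

Under this toy model, a wheel spinning between roughly 5 and 10 revolutions per second looks like it’s going backward, which is in the same ballpark as Schouten’s 8-12 cycles per second.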
Since then, various researchers have pursued the idea that this reveals a frame rate (some basing their conclusions on how LSD users perceive their experiences). The most recent research, however, seems clear: there is no frame rate. The biology is just more complex.
All this is a very long way of explaining why Peter Jackson may have been wrong to choose HFR (High Frame Rate) for The Hobbit!
If you want to keep diving into frame rates, we can also answer “What is Variable Frame Rate (VFR)?” And if you are interested in capturing slow motion, definitely check out our best slow motion camera guide.