People looking at a strobing light start to see it as just “on” (not blinking anymore) at almost exactly 60 Hz.
In double-blind tests, pro gamers can’t reliably tell 90 fps from 120 fps.
There is, however, an unconscious improvement in reaction time all the way up to 240 fps, and possibly faster.
The real benefit of super-high refresh rates is the decrease in input latency. At lower rates the lag between an input and the next frame is extremely apparent; above roughly 144 Hz it’s much less noticeable.
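To put rough numbers on that, here’s a back-of-the-envelope sketch (Python). It only counts the refresh interval itself: an input that lands at a random point in a frame waits, on average, about half a frame before the next refresh can reflect it. Engine, driver, and display processing are ignored, so these are not real end-to-end latencies.

```python
# Rough sketch: latency contributed by the refresh interval alone.
# An input arriving at a random point in a frame waits ~half a frame
# (on average) before the next refresh can show its effect.
# Engine, driver, and display processing times are ignored.

def refresh_latency_ms(hz: float) -> tuple[float, float]:
    frame_ms = 1000.0 / hz
    return frame_ms, frame_ms / 2  # worst case, average case

for hz in (60, 90, 120, 144, 240):
    frame_ms, avg_ms = refresh_latency_ms(hz)
    print(f"{hz:>3} Hz: frame {frame_ms:5.2f} ms, avg added lag {avg_ms:5.2f} ms")
```

By this measure 60 Hz adds roughly 8 ms of average lag versus about 2 ms at 240 Hz, before any of the rest of the pipeline is counted.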
The other side effect of running at high fps is that when heavy processing causes frame-time spikes, they’re much less noticeable because the minimum fps is still very high. I usually tell people not to pay attention to the maximum fps, but to look at the average and minimum instead.
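As a quick illustration of why the average alone is misleading, here’s a small sketch comparing a steady trace with a spiky one. The frame-time numbers are made up for illustration, not measurements.

```python
# Why average fps hides stutter: compare average fps with the worst
# frames ("1% lows" and min fps) for two made-up frame-time traces.

def summarize(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    low_fps = 1000.0 / (sum(worst) / len(worst))   # fps over the slowest 1% of frames
    min_fps = 1000.0 / max(frame_times_ms)
    return avg_fps, low_fps, min_fps

steady = [7.0] * 990 + [9.0] * 10    # ~142 fps, mild variation
spiky  = [6.5] * 990 + [50.0] * 10   # higher average, but visible hitches

for name, trace in (("steady", steady), ("spiky", spiky)):
    avg, low, mn = summarize(trace)
    print(f"{name}: avg {avg:6.1f} fps, 1% low {low:6.1f} fps, min {mn:6.1f} fps")
```

The spiky trace actually has the higher average, but its 1% lows and minimum are far worse, and that’s what you feel as stutter.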
However, when the modulated light source contains a spatial high frequency edge, all viewers saw flicker artifacts over 200 Hz and several viewers reported visibility of flicker artifacts at over 800 Hz. For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate.
It seems to be more complicated than a hard 60 Hz cutoff.