The science on whether the human eye can perceive a difference beyond 60FPS is murky at best. Some studies say yes, some say no, but all say it won't matter on that old CRT you're using as a monitor.
What do you think? Do you notice a difference going beyond 60FPS? Or are you perfectly content to snipe across maps at 30FPS?
For all my interest in visual fidelity, I just cannot wrap my head around crazy high frame rates.
Of course, that might have something to do with the underwhelming refresh rate on my monitors...
High FPS is nice; consistently high FPS is better.
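To put rough numbers on that, here's a minimal sketch (the frame-time traces and the `summarize` helper are invented for illustration) of how two runs with the same average FPS can feel completely different once you look at the worst frames:

```python
# Two hypothetical frame-time traces in milliseconds, same average FPS.
smooth = [12.5] * 100                # every frame ~12.5 ms (steady 80 FPS)
stuttery = [10.0] * 95 + [60.0] * 5  # mostly fast, but 5% of frames hitch

def summarize(frame_times_ms):
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)
    return {
        "avg_fps": round(1000.0 / avg_ms, 1),            # the headline number
        "worst_frame_fps": round(1000.0 / worst_ms, 1),  # what a hitch feels like
    }

print(summarize(smooth))    # {'avg_fps': 80.0, 'worst_frame_fps': 80.0}
print(summarize(stuttery))  # {'avg_fps': 80.0, 'worst_frame_fps': 16.7}
```

Both traces average 80 FPS, but the second hitches to the equivalent of roughly 17 FPS on its worst frames, which is exactly the inconsistency being described.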
My 2080 Ti has a hard time driving the full resolution of my 49" ultrawide in games. I should have waited for the 3090...
I am just too used to 144Hz. I think 60 is fine, but I'd prefer 100-144Hz if possible. 200 is a bit overkill, though I'd still take 200 over 60, with 144Hz being my sweet spot.
I noticed a slight shift from FPS to refresh rate in the comments. The GPU writes to the frame buffer at the frame rate, but the frame buffer is scanned out to the screen at the refresh rate, so the refresh rate needs to be at or above your frame rate for a high frame rate to be wholly worthwhile. I usually turn on G-Sync if it's supported, which helps synchronize the two rates.
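As a rough illustration of that interaction, here's a sketch with made-up numbers and a made-up `displayed_fps` helper; it models classic double-buffered vsync, not any real driver API:

```python
import math

def displayed_fps(render_fps, refresh_hz, vsync=False):
    """Approximate what the screen can actually show.

    The display only scans out refresh_hz times per second, so frames
    rendered beyond that are wasted (and can tear without sync). With
    classic vsync, each frame waits for the next refresh, so the rate
    snaps down to refresh_hz / n for a whole number n.
    """
    if not vsync:
        return min(render_fps, refresh_hz)
    # A vsynced frame occupies a whole number of refresh intervals.
    intervals = max(1, math.ceil(refresh_hz / render_fps))
    return refresh_hz / intervals

print(displayed_fps(200, 144))             # 144 -- refresh rate is the ceiling
print(displayed_fps(90, 144, vsync=True))  # 72.0 -- vsync snaps 90 down to 144/2
print(displayed_fps(59, 60, vsync=True))   # 30.0 -- barely missing 60 halves the rate
```

Adaptive sync (G-Sync/FreeSync) avoids that snap-down by varying the refresh interval to match frame delivery, which is why turning it on lets the two rates track each other.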
That said, whether high FPS matters probably depends on the game I'm playing. If I'm playing a fast-paced game competitively (any PvP, for example), I will turn down the graphics options to minimize latency and maximize frame rate, hopefully gaining an advantage (or at least not disadvantaging myself further). But if I'm playing an open-world sandbox type of game, frame rate is less important to me than stunning graphics, as long as the frame rate is fast enough to be playable.