Aphasia wrote:As long as framerate is above refresh rate, consistency all the way. And yes, I hate tearing and thus also pay a price for using vsync. So as soon as decent FreeSync monitors are out in force, I'll get one, which might help that quite a ways considering I play a whole lot of FPS games, not to mention having more than 60fps would be nice too.
Damage wrote:I think we're wired to notice slowdowns more than variance in frame rates, in other words. That's what I've gathered by comparing frame time numbers to my perceptions over a lotta testing with fast, variable-refresh displays and the like.
ultima_trev wrote:What are the actual benefits of high framerates outside of competitive multiplayer twitch shooters? Surely it's not necessary for a slow-paced single-player campaign? I mean, I can definitely tell the difference between 30 and 80 frames per second in terms of actual responsiveness, especially the mouse look in shooters. But as long as FPS stays at 40 (25 ms frame times) and doesn't budge any lower, I couldn't care less about average or peak. I'd rather have a constant 40 as opposed to 60 or 120 with dips to below 40. Maybe this wouldn't fly for professional, competitive gamers though?
Krogoth wrote:Consistent framerate hands down.
It is the frame-dips and spikes that are annoying above all else. Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.
just brew it! wrote:Aphasia wrote:As long as framerate is above refresh rate, consistency all the way. And yes, I hate tearing and thus also pay a price for using vsync. So as soon as decent FreeSync monitors are out in force, I'll get one, which might help that quite a ways considering I play a whole lot of FPS games, not to mention having more than 60fps would be nice too.
Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.
rahulahl wrote:ultima_trev wrote:What are the actual benefits of high framerates outside of competitive multiplayer twitch shooters? Surely it's not necessary for a slow-paced single-player campaign? I mean, I can definitely tell the difference between 30 and 80 frames per second in terms of actual responsiveness, especially the mouse look in shooters. But as long as FPS stays at 40 (25 ms frame times) and doesn't budge any lower, I couldn't care less about average or peak. I'd rather have a constant 40 as opposed to 60 or 120 with dips to below 40. Maybe this wouldn't fly for professional, competitive gamers though?
Personally, I can tell the difference between 100 and 144 quite easily. That said, if the fps only occasionally dips down to 100, it's not really much of an annoyance. I still factor in the time-spent-beyond-X-milliseconds chart in TR reviews before I buy a GPU though. This is one of the reasons I chose to go Nvidia, before I even decided to get G-Sync. The dips hurt more when you already have a low FPS. But if you can manage to get your FPS to 100+ and even touching 144, these tiny dips won't be as distracting. Of course, having G-Sync helps as well, since with only V-Sync those tiny dips would halve my FPS for that duration.
Krogoth wrote:Consistent framerate hands down.
It is the frame-dips and spikes that are annoying above all else. Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.
Not all of them.
Most of the online games I play (Chivalry, Counter-Strike, Sanctum 2, Torchlight, etc.) have no such cap. I have yet to play a single game online which has a limit on FPS. Of course, it is quite possible that I simply don't play the ones that do, but there certainly are a number of online games that do not have this limit.
Krogoth wrote:just brew it! wrote:Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.
Actually, it is older than that. It is a limitation from the days of motion picture film (celluloid), where increasing the framerate cost $$$$ and filmmakers found that 24FPS was the sweet spot between film stock cost and a relatively smooth framerate, as long as you don't pan the camera around like a caffeine junkie.
Krogoth wrote:The examples you listed have hard-coded limits, or you would end up running into de-syncing issues. Any decent multiplayer game has built-in limits. FPS counters don't tell the whole story. Counter-Strike is the most famous example, and the CS junkies go as far as tweaking the server-client predictions.
Glorious wrote:Krogoth wrote:The examples you listed have hard-coded limits, or you would end up running into de-syncing issues. Any decent multiplayer game has built-in limits. FPS counters don't tell the whole story. Counter-Strike is the most famous example, and the CS junkies go as far as tweaking the server-client predictions.
Tick rate is different than frame rate. Obviously interrelated, but still distinct.
Fundamentally, if the tick rate is 100 and I'm rendering at 300 FPS, where is the problem? That everything will magically de-sync because I have three rendered frames for every one game state update? How? Why?
I don't think you know what you are talking about, again.
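The decoupling being described looks roughly like this in a modern engine's main loop. A rough sketch only, with made-up function names rather than any real engine's code, assuming a 100 Hz tick and uncapped rendering:
Code:
// Sketch: simulation ticks at a fixed 100 Hz no matter how fast frames render.
#include <chrono>
#include <cstdio>

using Clock = std::chrono::steady_clock;

static long ticks = 0, frames = 0;

void update_game_state(double) { ++ticks; }   // advance the simulation by one fixed tick
void render_frame(double)      { ++frames; }  // draw a frame, interpolating between ticks

int main() {
    const double tick_dt = 1.0 / 100.0;        // fixed 100 Hz simulation tick
    double accumulator = 0.0;
    const auto start = Clock::now();
    auto previous = start;

    // Run for two seconds of wall-clock time, rendering as often as possible.
    while (std::chrono::duration<double>(Clock::now() - start).count() < 2.0) {
        const auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= tick_dt) {       // fixed-rate game state updates
            update_game_state(tick_dt);
            accumulator -= tick_dt;
        }
        render_frame(accumulator / tick_dt);   // one rendered frame per loop pass
    }
    std::printf("ticks: %ld  frames: %ld\n", ticks, frames);  // ~200 ticks, far more frames
}
In a loop like that, three rendered frames per tick just means the renderer interpolates between the last two known game states; the simulation the server sees is unchanged.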
just brew it! wrote:Krogoth wrote:just brew it! wrote:Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.
Actually, it is older than that. It is a limitation from the days of motion picture film (celluloid), where increasing the framerate cost $$$$ and filmmakers found that 24FPS was the sweet spot between film stock cost and a relatively smooth framerate, as long as you don't pan the camera around like a caffeine junkie.
Umm... how does 24 FPS from film relate? NTSC is 60 fields / 30 frames per second, and I don't recall computer monitors ever running below 60 Hz, other than for specialized applications.
If they were really tying the refresh rate of electronic displays to film, NTSC would've been 48 fields / 24 frames per second. NTSC was actually problematic when it came to broadcasting content which was originally shot on film, since it required every 4th frame to be shown twice, or the use of "3:2 pulldown" to match the frame rates; this resulted in visual artifacts. (Modern digital processing techniques can mitigate this.)
The 60 Hz NTSC rate was actually chosen to match the AC power line frequency, to reduce visual artifacts caused by leakage of AC hum into the (analog) circuits of early televisions.
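For what it's worth, the 3:2 pulldown cadence mentioned above is easy to see in a toy sketch (purely illustrative, not broadcast-equipment code):
Code:
// Toy illustration of 3:2 pulldown: four film frames (24 fps) get spread
// across ten video fields (60 fields/s), alternating three and two fields
// per film frame.
#include <cstdio>

int main() {
    const char* film_frames = "ABCD";          // one second of film repeats this 6x
    int field = 0;
    for (int i = 0; i < 4; ++i) {
        int repeats = (i % 2 == 0) ? 3 : 2;    // the "3:2" in 3:2 pulldown
        for (int r = 0; r < repeats; ++r)
            std::printf("field %2d <- film frame %c\n", field++, film_frames[i]);
    }
    // 4 film frames -> 10 fields, i.e. 24 fps * (10/4) fields / 2 = 30 interlaced frames/s.
}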
Krogoth wrote:Because the limits from celluloid set the standard for the source material in A/V for over a century. 60Hz was chosen because it not only matched AC in residential grids in the USA/Canada but is also divisible by 12, like the 24FPS set by celluloid.
just brew it! wrote:Krogoth wrote:Because the limits from celluloid set the standard for the source material in A/V for over a century. 60Hz was chosen because it not only matched AC in residential grids in the USA/Canada but is also divisible by 12, like the 24FPS set by celluloid.
Being divisible by 12 was likely more of a happy accident; matching the line frequency was the overriding concern (otherwise they would've gone with an integer multiple of 24, i.e. 48 or 72 Hz). The European standard (PAL) is 50 Hz to match their line frequency, and is not divisible by 12; too bad for them! Going with double the line frequency (120 Hz) would probably have made television receivers too expensive for the average consumer, given the limitations of contemporary electronics back in the 1940s. Heck, the interlaced format was also a compromise dictated by technological limitations, which we have (finally!) ditched with the shift to digital.
Krogoth wrote:FPS junkies keep forgetting that games are programs at heart. Any decent programmer puts limits in games so they do not go into "turbo mode" and become unplayable when there is enough hardware prowess.
Krogoth wrote:You can see this with really old games which lack such coded limits on modern systems. Id's FPS games have a "timedemo" argument that disables the limits when they do a demo playback. That's why such demos run super fast on a modern system, unlike normal gameplay on the same system.
Krogoth wrote:That 300FPS on the FPS counter is somewhat misleading. The game is actually rendering the graphics at 85-100FPS; without the limits it would render at the full 300FPS, but it would be completely unplayable. It is also why the visual benefits of 120FPS+ rendering seem to "disappear".
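A cap like the one being described boils down to sleeping out the remainder of each frame. A bare-bones sketch, not lifted from any actual game:
Code:
// Sketch: cap the loop at roughly 85 FPS by sleeping off each frame's surplus time.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using namespace std::chrono;
    const auto frame_budget = duration<double>(1.0 / 85.0);  // ~11.8 ms per frame
    const auto start = steady_clock::now();
    int frames = 0;

    while (steady_clock::now() - start < seconds(1)) {
        auto frame_start = steady_clock::now();
        // ... simulate and render one frame here ...
        auto elapsed = steady_clock::now() - frame_start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);  // burn off the surplus
        ++frames;
    }
    std::printf("frames in one second: %d\n", frames);  // roughly 85, sleep granularity permitting
}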
Krogoth wrote:I understand the tick rate, but the same framerate limits also help prevent bizarre syncing and prediction issues in multiplayer games.
Krogoth wrote:The real reason the "more FPS = better" meme took off is because of interesting issues with the physics portion of the Quake 3 engine. Astute players discovered that they could jump slightly further and strafe-run faster if the framerate was high enough (100FPS+). This gave competitive players an edge over competitors running at slower framerates.
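The underlying mechanism is that the simulation result depends on the frame time. A toy sketch of that class of bug (not Quake 3's actual movement code, which involved integer-millisecond frame times; the constants here are made up):
Code:
// Toy demo: with naive per-frame Euler integration, a jump's apex depends on framerate.
#include <cstdio>

double jump_apex(double fps) {
    const double dt = 1.0 / fps;        // seconds per simulation frame
    const double gravity = 800.0;       // made-up units/s^2
    double y = 0.0, vy = 270.0;         // made-up jump velocity, units/s
    double apex = 0.0;
    while (vy > 0.0 || y > 0.0) {
        y += vy * dt;                   // integrate position, one frame at a time
        vy -= gravity * dt;
        if (y > apex) apex = y;
    }
    return apex;
}

int main() {
    std::printf("apex at  60 fps: %.2f\n", jump_apex(60.0));
    std::printf("apex at 125 fps: %.2f\n", jump_apex(125.0));
    // The two apexes differ, so anything gated on clearing a ledge of a given
    // height can behave differently at different framerates.
}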
Krogoth wrote:You really need to play some really old games that lack limiters to see the "turbo mode" problem at work.
Try any of the old games that use Unreal Engine 1.0 and turn off Vsync on a modern system. It is much worse with older titles coded back in the early 1990s and throughout the 1980s. It is actually the reason why the "Turbo" button existed back in the day, before programmers addressed the timing and speed issues.
Krogoth wrote:You really need to play some really old games that lack limiters to see the "turbo mode" problem at work.
Krogoth wrote:Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.
Krogoth wrote:Try any of the old games that use Unreal Engine 1.0 and turn off Vsync on a modern system. It is much worse with older titles coded back in the early 1990s and throughout the 1980s. It is actually the reason why the "Turbo" button existed back in the day, before programmers addressed the timing and speed issues.
At any rate, programmers didn't "address" the timing and speed issues in those programs. They may have written NEW programs that relied on different timing sources, but you can probably find those same programs on an abandonware site somewhere and they'll have the exact same problem they always did. These practices may have persisted into later games (like into the 90s and maybe even later...), but it just goes back to inertia and the fact that the same basic problem existed in a slightly different form: no high quality timer available/dire performance implications for non-cycle dependent timing.
But that's just history. The REAL POINT here is that for the last ten years or so (AT LEAST) the kind of problem you are vaguely alluding to is EXTREMELY UNCOMMON in multiplayer games. It is perhaps less rare in single-player games, as I can dimly remember a few that had scripts and a few other things break at extremely high framerates, but it is still very rare.
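For reference, the difference between the old cycle-dependent style and the timer-based style that replaced it boils down to something like this (illustrative only, not from any specific game):
Code:
// Sketch of the two timing approaches being contrasted above.
#include <chrono>

using Clock = std::chrono::steady_clock;

// Frame-locked: covers more ground per second the faster the machine loops ("turbo mode").
void move_per_frame(double& x) {
    x += 2.0;                                  // 2 units every iteration, period
}

// Time-based: covers 120 units per second no matter how often it is called.
void move_per_second(double& x, Clock::time_point& last) {
    const auto now = Clock::now();
    const double dt = std::chrono::duration<double>(now - last).count();
    last = now;
    x += 120.0 * dt;                           // units/second * elapsed seconds
}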
Glorious wrote:JBI wrote:The 60 Hz NTSC rate was actually chosen to match the AC power line frequency, to reduce visual artifacts caused by leakage of AC hum into the (analog) circuits of early televisions.
Yup, and that's equally why PAL is 50Hz, much to the chagrin of British/European/Australian old school console speed runners.
Krogoth wrote:You are just admitting that modern programmers use limiters in their code to prevent the issues that arose when computer hardware became able to run applications at "turbo speeds". That's why such issues are such a rarity these days.
Pancake wrote:We enjoyed our higher definition 625 line (per frame period) vs your crude low res 525 line displays.
Pancake wrote:Mind you, it took a bit of work to get sprites into the border on my C64 to use all that extra resolution goodness.
Westbrook348 wrote:Explanation of Damage Labs FCAT setup: http://techreport.com/review/24553/insi ... ture-tools